Initial commit.

master
josiah 3 years ago
commit 873bc174a3

@@ -0,0 +1,39 @@
# This workflow will install Python dependencies, then run various linting programs on a single Python version
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions
name: Lint
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
lint:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Python 3.9
uses: actions/setup-python@v2
with:
python-version: 3.9
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -U isort==5.6.4 flake8==3.8.4 flake8-comprehensions==3.3.1 black==20.8b1
- name: Check import statement sorting
run: |
isort -c --df molly/ molly tests
- name: Python syntax errors, undefined names, etc.
run: |
flake8 . --count --show-source --statistics
- name: PEP8 formatting
run: |
black --check --diff molly/ molly tests

@@ -0,0 +1,57 @@
# This workflow will install Python dependencies, then run unit testing across the earliest and latest supported Python versions
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions
name: Run unit tests
on:
push:
branches: [ master ]
pull_request:
branches: [ master ]
jobs:
python_36:
# We need to use 20.04 to get access to the libolm3 package
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- name: Set up Python 3.6
uses: actions/setup-python@v2
with:
python-version: 3.6
- name: Install project dependencies
run: |
# Install libolm, required for end-to-end encryption functionality
sudo apt install -y libolm-dev libolm3
# Install python dependencies
python setup.py install
- name: Run unit tests
run: |
python -m unittest
python_39:
# We need to use 20.04 to get access to the libolm3 package
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- name: Set up Python 3.9
uses: actions/setup-python@v2
with:
python-version: 3.9
- name: Install project dependencies
run: |
# Install libolm, required for end-to-end encryption functionality
sudo apt install -y libolm-dev libolm3
# Install python dependencies
python setup.py install
- name: Run unit tests
run: |
python -m unittest

.gitignore

@@ -0,0 +1,26 @@
# PyCharm
.idea/
# Python virtualenv environment folders
env/
env3/
.env/
# Bot local files
*.db
store/
# Config file
config.yaml
# Python
__pycache__/
*.egg-info/
build/
dist/
# Log files
*.log

@@ -0,0 +1,96 @@
# Contributing to nio-template
Thank you for taking interest in this little project. Below is some information
to help you with contributing.
## Setting up your development environment
See the
[Install the dependencies section of SETUP.md](SETUP.md#install-the-dependencies)
for help setting up a running environment for the bot.
If you would rather not run Docker, or are unable to, the following instructions
will explain how to install the project dependencies natively.
#### Install libolm
You can install [libolm](https://gitlab.matrix.org/matrix-org/olm) from source,
or alternatively, check your system's package manager. Version `3.0.0` or
greater is required.
**(Optional) postgres development headers**
By default, the bot uses SQLite as its storage backend. This is fine for a
few hundred users, but if you plan to support a much higher volume
of requests, you may consider using Postgres as a database backend instead.
If you want to use postgres as a database backend, you'll need to install
postgres development headers:
Debian/Ubuntu:
```
sudo apt install libpq-dev libpq5
```
Arch:
```
sudo pacman -S postgresql-libs
```
#### Install Python dependencies
Create and activate a Python 3 virtual environment:
```
virtualenv -p python3 env
source env/bin/activate
```
Install python dependencies:
```
pip install -e .
```
(Optional) If you want to use postgres as a database backend, use the following
command to install postgres dependencies alongside those that are necessary:
```
pip install ".[postgres]"
```
### Development dependencies
There are some python dependencies that are required for linting/testing etc.
You can install them with:
```
pip install -e ".[dev]"
```
## Code style
Please follow the [PEP8](https://www.python.org/dev/peps/pep-0008/) style
guidelines and format your import statements with
[isort](https://pypi.org/project/isort/).
## Linting
Run the following script to automatically format your code. This *should* make
the linting CI happy:
```
./scripts-dev/lint.sh
```
## What to work on
Take a look at the [issues
list](https://github.com/anoadragon453/nio-template/issues). Which
feature would you like to see, or which bug would you like fixed?
If you would like to talk any ideas over before working on them, you can reach
me at [@andrewm:amorgan.xyz](https://matrix.to/#/@andrewm:amorgan.xyz)
on matrix.

@@ -0,0 +1,177 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS

@@ -0,0 +1,160 @@
# Nio Template [![Built with matrix-nio](https://img.shields.io/badge/built%20with-matrix--nio-brightgreen)](https://github.com/poljar/matrix-nio) <a href="https://matrix.to/#/#nio-template:matrix.org"><img src="https://img.shields.io/matrix/nio-template:matrix.org?color=blue&label=Join%20the%20Matrix%20Room&server_fqdn=matrix-client.matrix.org" /></a>
A template for creating bots with
[matrix-nio](https://github.com/poljar/matrix-nio). The documentation for
matrix-nio can be found
[here](https://matrix-nio.readthedocs.io/en/latest/nio.html).
This repo contains a working Matrix echo bot that can be easily extended to your needs. Detailed documentation is included as well as a step-by-step guide on basic bot building.
Features include out-of-the-box support for:
* Bot commands
* SQLite3 and Postgres database backends
* Configuration files
* Multi-level logging
* Docker
* Participation in end-to-end encrypted rooms
## Projects using nio-template
* [anoadragon453/matrix-reminder-bot](https://github.com/anoadragon453/matrix-reminder-bot) - A matrix bot to remind you about things
* [gracchus163/hopeless](https://github.com/gracchus163/hopeless) - COREbot for the Hope2020 conference Matrix server
* [alturiak/nio-smith](https://github.com/alturiak/nio-smith) - A modular bot for @matrix-org that can be dynamically
extended by plugins
* [anoadragon453/msc-chatbot](https://github.com/anoadragon453/msc-chatbot) - A matrix bot for matrix spec proposals
* [anoadragon453/matrix-episode-bot](https://github.com/anoadragon453/matrix-episode-bot) - A matrix bot to post episode links
* [TheForcer/vision-nio](https://github.com/TheForcer/vision-nio) - A general purpose matrix chatbot
* [anoadragon453/drawing-challenge-bot](https://github.com/anoadragon453/drawing-challenge-bot) - A matrix bot to
post historical, weekly art challenges from reddit to a room
* [8go/matrix-eno-bot](https://github.com/8go/matrix-eno-bot) - A bot to be used as a) personal assistant or b) as
an admin tool to maintain your Matrix installation or server
* [elokapina/bubo](https://github.com/elokapina/bubo) - Matrix bot to help with community management
* [elokapina/middleman](https://github.com/elokapina/middleman) - Matrix bot to act as a middleman, for example as a support bot
* [chc4/matrix-pinbot](https://github.com/chc4/matrix-pinbot) - Matrix bot for pinning messages to a dedicated channel
Want your project listed here? [Edit this
page!](https://github.com/anoadragon453/nio-template/edit/master/README.md)
## Getting started
See [SETUP.md](SETUP.md) for how to set up and run the template project.
## Project structure
*A reference of each file included in the template repository, its purpose and
what it does.*
The majority of the code is kept inside the `molly` folder, which
is itself a [python package](https://docs.python.org/3/tutorial/modules.html);
the `__init__.py` file inside declares it as such.
To run the bot, the `molly` script in the root of the codebase is
available. It will import the `main` function from the `main.py` file in the
package and run it. To properly install this script into your python environment,
run `pip install -e .` in the project's root directory.
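As a rough sketch, and assuming the script does nothing more than import the package's `main` module (which, as shown in `main.py` later in this commit, starts the bot at import time), such an entry point can be as small as:
```
#!/usr/bin/env python3
# Minimal sketch of an entry-point script. Importing molly.main is enough to
# start the bot here, because main.py runs its main() coroutine in an asyncio
# event loop at import time (see main.py later in this commit).
from molly import main  # noqa: F401
```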
`setup.py` contains package information (for publishing your code to
[PyPI](https://pypi.org)) and `setup.cfg` just contains some configuration
options for linting tools.
`sample.config.yaml` is a sample configuration file. People running your bot
should be advised to copy this file to `config.yaml`, then edit it according to
their needs. Be sure never to check the edited `config.yaml` into source control
since it'll likely contain sensitive details such as passwords!
Below is a detailed description of each of the source code files contained within
the `molly` directory:
### `main.py`
Initialises the config file, the bot store, and nio's AsyncClient (which is
used to retrieve and send events to a matrix homeserver). It also registers
some callbacks on the AsyncClient to tell it to call certain functions when
particular events are received (such as an invite to a room, or a new message
in a room the bot is in).
It also starts the sync loop. Matrix clients "sync" with a homeserver by
constantly asking it for new events. Each time they do, the client gets a
sync token (stored in the `next_batch` field of the sync response). If the
client provides this token the next time it syncs (using the `since` parameter
on the `AsyncClient.sync` method), the homeserver will only return new events
that occurred *since* the one identified by the given token.
This token is saved and provided again automatically by using the
`client.sync_forever(...)` method.
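Condensed into a standalone sketch (the homeserver URL, user ID and password below are placeholders; the full `main.py` appears later in this commit), the callback registration and sync loop look roughly like this:
```
import asyncio

from nio import AsyncClient, AsyncClientConfig, MatrixRoom, RoomMessageText


async def on_message(room: MatrixRoom, event: RoomMessageText) -> None:
    print(f"{room.display_name}: {event.body}")


async def run() -> None:
    # store_sync_tokens=True makes nio persist the next_batch token, so
    # sync_forever() resumes from the last sync instead of replaying history.
    client = AsyncClient(
        "https://matrix.example.org",  # hypothetical homeserver URL
        "@molly:example.org",          # hypothetical bot user ID
        config=AsyncClientConfig(store_sync_tokens=True),
    )
    client.add_event_callback(on_message, (RoomMessageText,))
    await client.login("a-password")  # or set client.access_token directly
    # Each sync response carries a next_batch token; sync_forever() feeds it
    # back as the `since` parameter so only new events are returned.
    await client.sync_forever(timeout=30000, full_state=True)


asyncio.get_event_loop().run_until_complete(run())
```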
### `config.py`
This file reads a config file at a given path (hardcoded as `config.yaml` in
`main.py`), processes everything in it and makes the values available to the
rest of the bot's code so it knows what to do. Most of the options in the given
config file have default values, so things will continue to work even if an
option is left out of the config file. Some config values are required though,
such as the homeserver URL, username and access token; without these the bot
can't function.
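As a brief usage sketch (attribute names are taken from the `config.py` source later in this commit):
```
from molly.config import Config
from molly.errors import ConfigError

try:
    config = Config("config.yaml")
except ConfigError as e:
    # Raised when the file is missing, or when a required option such as
    # matrix.homeserver_url is absent or malformed.
    raise SystemExit(e)

# Parsed values are exposed as plain attributes to the rest of the bot.
print(config.homeserver_url)
print(config.user_id)
print(config.command_prefix)  # defaults to "!c " when not set in the file
```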
### `storage.py`
Creates (if necessary) and connects to a SQLite3 database and provides commands
to put or retrieve data from it. Table definitions should be specified in
`_initial_setup`, and any necessary migrations should be put in
`_run_migrations`. There's currently no defined method for how migrations
should work though.
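A hedged sketch of the intended pattern, using plain `sqlite3` for illustration (the real `Storage` class later in this commit also supports Postgres, and its exact methods may differ):
```
import sqlite3

# Bump this whenever a new migration is added below.
latest_migration_version = 1


def _initial_setup(conn: sqlite3.Connection) -> None:
    # Table definitions for a brand-new database go here.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS migration_version (version INTEGER PRIMARY KEY)"
    )
    conn.execute("INSERT INTO migration_version (version) VALUES (0)")
    conn.commit()


def _run_migrations(conn: sqlite3.Connection, current_version: int) -> None:
    # Apply every migration newer than the version stored in the database,
    # one step at a time, updating migration_version as we go.
    if current_version < 1:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, text TEXT)"
        )
        conn.execute("UPDATE migration_version SET version = 1")
        conn.commit()
```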
### `callbacks.py`
Holds callback methods which get run when the bot receives a certain type of
event from the homeserver during sync. The event type and the name of the
method to be called are specified in `main.py`. Currently there are two defined
methods: one that gets called when a message is sent in a room the bot is in,
and another that runs when the bot receives an invite to a room.
The message callback function, `message`, checks if the message was for the
bot, and whether it was a command. If both of those are true, the bot will
process that command.
The invite callback function, `invite`, processes the invite event and attempts
to join the room. This way, the bot will auto-join any room it is invited to.
### `bot_commands.py`
Where all the bot's commands are defined. New commands should be defined in
`process` with an associated private method. `echo` and `help` commands are
provided by default.
A `Command` object is created when a message comes in that's recognised as a
command from a user directed at the bot (either through the command prefix
defined in the bot's config file, or through a private message sent directly
to the bot). The `process` method is then called for the bot to act on that
command.
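As an illustration, a hypothetical `hello` command would involve two small additions (a sketch only; the command name, method name and reply text are made up):
```
# Sketch: the two additions needed for a hypothetical `hello` command.
#
# 1) In Command.process(), add a branch for the new keyword:
#
#        elif self.command.startswith("hello"):
#            await self._hello()
#
# 2) Add the matching private method to the Command class:

from molly.chat_functions import send_text_to_room


async def _hello(self):
    """Greet the user who issued the command (paste into the Command class)."""
    await send_text_to_room(
        self.client, self.room.room_id, f"Hello, {self.event.sender}!"
    )
```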
### `message_responses.py`
Where responses to messages that are posted in a room (but not necessarily
directed at the bot) are specified. `callbacks.py` will listen for messages in
rooms the bot is in, and upon receiving one will create a new `Message` object
(which contains the message text, amongst other things) and call `process()`
on it, which can send a message to the room as it sees fit.
A good example of this would be a GitHub bot that listens for people mentioning
issue numbers in chat (e.g. "We should fix #123") and immediately replies to
the room with the issue's title and a link.
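A hedged sketch of such a responder, following the `process()` pattern described above (the regex, repository URL and reply format are illustrative only):
```
import re

from nio import AsyncClient, MatrixRoom

from molly.chat_functions import send_text_to_room


class IssueMentionResponder:
    """Sketch of a responder that links GitHub issue numbers mentioned in chat."""

    ISSUE_PATTERN = re.compile(r"#(\d+)")
    REPO_URL = "https://github.com/example/repo"  # hypothetical repository

    def __init__(self, client: AsyncClient, room: MatrixRoom, message_content: str):
        self.client = client
        self.room = room
        self.message_content = message_content

    async def process(self) -> None:
        for issue_number in self.ISSUE_PATTERN.findall(self.message_content):
            # A real bot might query the GitHub API here for the issue title.
            await send_text_to_room(
                self.client,
                self.room.room_id,
                f"Issue #{issue_number}: {self.REPO_URL}/issues/{issue_number}",
            )
```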
### `chat_functions.py`
A separate file to hold helper methods related to messaging. Mostly just for
organisational purposes. Currently just holds `send_text_to_room`, a helper
method for sending formatted messages to a room.
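A minimal usage sketch (the `client` and `room` objects come from whatever callback or command handler is calling it):
```
from nio import AsyncClient, MatrixRoom

from molly.chat_functions import send_text_to_room


async def greet(client: AsyncClient, room: MatrixRoom) -> None:
    # Markdown in the message body is converted to HTML by default, and
    # notice=True sends the message as m.notice so that users are not pinged.
    await send_text_to_room(client, room.room_id, "Hello from **molly**!", notice=True)
```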
### `errors.py`
Custom error types for the bot. Currently there's only one, which is raised
when an error is found while the config file is being processed.
## Questions?
Any questions? Please ask them in
[#nio-template:amorgan.xyz](https://matrix.to/#/!vmWBOsOkoOtVHMzZgN:amorgan.xyz?via=amorgan.xyz)
and we'll help you out!

@@ -0,0 +1,173 @@
# Setup
nio-template is a sample repository of a working Matrix bot that can be taken
and transformed into one's own bot, service or whatever else may be necessary.
Below is a quick setup guide to running the existing bot.
## Install the dependencies
There are two paths to installing the dependencies for development.
### Using `docker-compose`
It is **recommended** to use Docker Compose to run the bot while
developing, as all necessary dependencies are handled for you. After
installation and ensuring the `docker-compose` command works, you need to:
1. Create a data directory and config file by following the
[docker setup instructions](docker#setup).
2. Create a docker volume pointing to that directory:
```
docker volume create \
--opt type=none \
--opt o=bind \
--opt device="/path/to/data/dir" data_volume
```
Run `docker/start-dev.sh` to start the bot.
**Note:** If you are trying to connect to a Synapse instance running on the
host, you need to allow the IP address of the docker container to connect. This
is controlled by `bind_addresses` in the `listeners` section of Synapse's
config. If present, either add the docker internal IP address to the list, or
remove the option altogether to allow all addresses.
### Running natively
If you would rather not run Docker, or are unable to, the following will
instruct you on how to install the dependencies natively:
#### Install libolm
You can install [libolm](https://gitlab.matrix.org/matrix-org/olm) from source,
or alternatively, check your system's package manager. Version `3.0.0` or
greater is required.
**(Optional) postgres development headers**
By default, the bot uses SQLite as its storage backend. This is fine for a few
hundred users, but if you plan to support a much higher volume of requests, you
may consider using Postgres as a database backend instead.
If you want to use postgres as a database backend, you'll need to install
postgres development headers:
Debian/Ubuntu:
```
sudo apt install libpq-dev libpq5
```
Arch:
```
sudo pacman -S postgresql-libs
```
#### Install Python dependencies
Create and activate a Python 3 virtual environment:
```
virtualenv -p python3 env
source env/bin/activate
```
Install python dependencies:
```
pip install -e .
```
(Optional) If you want to use postgres as a database backend, use the following
command to install postgres dependencies alongside those that are necessary:
```
pip install -e ".[postgres]"
```
## Configuration
Copy the sample configuration file to a new `config.yaml` file.
```
cp sample.config.yaml config.yaml
```
Edit the config file. At a minimum, the `matrix` section must be modified.
#### (Optional) Set up a Postgres database
Create a postgres user and database for the bot:
```
sudo -u postgres createuser nio-template -W  # prompts for a password
sudo -u postgres createdb -O nio-template nio-template
```
Edit the `storage.database` config option, replacing the `sqlite://...` string with `postgres://...`. The syntax is:
```
database: "postgres://username:password@localhost/dbname?sslmode=disable"
```
See also the comments in `sample.config.yaml`.
## Running
### Docker
Refer to the docker [run instructions](docker/README.md#running).
### Native installation
Make sure to source your python environment if you haven't already:
```
source env/bin/activate
```
Then simply run the bot with:
```
molly
```
You'll notice that "molly" is scattered throughout the codebase. When
it comes time to modify the code for your own purposes, you are expected to
replace every instance of "molly" and its variants with your own
project's name.
By default, the bot will run with the config file at `./config.yaml`. However, an
alternative relative or absolute filepath can be specified after the command:
```
molly other-config.yaml
```
## Testing the bot works
Invite the bot to a room and it should accept the invite and join.
By default nio-template comes with an `echo` command. Let's test this now.
After the bot has successfully joined the room, try sending the following
in a message:
```
!c echo I am a bot!
```
The message should be repeated back to you by the bot.
## Going forwards
Congratulations! Your bot is up and running. Now you can modify the code,
re-run the bot and see how it behaves. Have fun!
## Troubleshooting
If you had any difficulties with this setup process, please [file an
issue](https://github.com/anoadragon453/nio-template/issues) or come talk
about it in [the matrix room](https://matrix.to/#/#nio-template:matrix.org).

@@ -0,0 +1,5 @@
# Default environment variables used in docker-compose.yml.
# Overridden by the host's environment variables
# Where `localhost` should route to
HOST_IP_ADDRESS=127.0.0.1

@@ -0,0 +1,101 @@
# To build the image, run `docker build` command from the root of the
# repository:
#
# docker build -f docker/Dockerfile .
#
# There is an optional PYTHON_VERSION build argument which sets the
# version of python to build against. For example:
#
# docker build -f docker/Dockerfile --build-arg PYTHON_VERSION=3.8 .
#
# An optional LIBOLM_VERSION build argument which sets the
# version of libolm to build against. For example:
#
# docker build -f docker/Dockerfile --build-arg LIBOLM_VERSION=3.1.4 .
#
##
## Creating a builder container
##
# We use an initial docker container to build all of the runtime dependencies,
# then transfer those dependencies to the container we're going to ship,
# before throwing this one away
ARG PYTHON_VERSION=3.8
FROM docker.io/python:${PYTHON_VERSION}-alpine3.11 as builder
##
## Build libolm for matrix-nio e2e support
##
# Install libolm build dependencies
ARG LIBOLM_VERSION=3.1.4
RUN apk add --no-cache \
make \
cmake \
gcc \
g++ \
git \
libffi-dev \
yaml-dev \
python3-dev
# Build libolm
#
# Also build the libolm python bindings and place them at /python-libs
# We will later copy contents from both of these folders to the runtime
# container
COPY docker/build_and_install_libolm.sh /scripts/
RUN /scripts/build_and_install_libolm.sh ${LIBOLM_VERSION} /python-libs
# Install Postgres dependencies
RUN apk add --no-cache \
musl-dev \
libpq \
postgresql-dev
# Install python runtime modules. We do this before copying the source code
# such that these dependencies can be cached
# This speeds up subsequent image builds when the source code is changed
RUN mkdir -p /src/molly
COPY molly/__init__.py /src/molly/
COPY README.md molly /src/
# Build the dependencies
COPY setup.py /src/setup.py
RUN pip install --prefix="/python-libs" --no-warn-script-location "/src/.[postgres]"
# Now copy the source code
COPY *.py *.md /src/
COPY molly/*.py /src/molly/
# And build the final module
RUN pip install --prefix="/python-libs" --no-warn-script-location "/src/.[postgres]"
##
## Creating the runtime container
##
# Create the container we'll actually ship. We need to copy libolm and any
# python dependencies that we built above to this container
FROM docker.io/python:${PYTHON_VERSION}-alpine3.11
# Copy python dependencies from the "builder" container
COPY --from=builder /python-libs /usr/local
# Copy libolm from the "builder" container
COPY --from=builder /usr/local/lib/libolm* /usr/local/lib/
# Install any native runtime dependencies
RUN apk add --no-cache \
libstdc++ \
libpq \
postgresql-dev
# Specify a volume that holds the config file, SQLite3 database,
# and the matrix-nio store
VOLUME ["/data"]
# Start the bot
ENTRYPOINT ["molly", "/data/config.yaml"]

@@ -0,0 +1,71 @@
# This dockerfile is crafted specifically for development purposes.
# Please use `Dockerfile` instead if you wish to deploy for production.
#
# This file differs as it does not use a builder container, nor does it
# reinstall the project's python package after copying the source code,
# saving significant time during rebuilds.
#
# To build the image, run `docker build` command from the root of the
# repository:
#
# docker build -f docker/Dockerfile .
#
# There is an optional PYTHON_VERSION build argument which sets the
# version of python to build against. For example:
#
# docker build -f docker/Dockerfile --build-arg PYTHON_VERSION=3.8 .
#
# An optional LIBOLM_VERSION build argument which sets the
# version of libolm to build against. For example:
#
# docker build -f docker/Dockerfile --build-arg LIBOLM_VERSION=3.1.4 .
#
ARG PYTHON_VERSION=3.8
FROM docker.io/python:${PYTHON_VERSION}-alpine3.11
##
## Build libolm for matrix-nio e2e support
##
# Install libolm build dependencies
ARG LIBOLM_VERSION=3.1.4
RUN apk add --no-cache \
make \
cmake \
gcc \
g++ \
git \
libffi-dev \
yaml-dev \
python3-dev
# Build libolm
COPY docker/build_and_install_libolm.sh /scripts/
RUN /scripts/build_and_install_libolm.sh ${LIBOLM_VERSION}
# Install native runtime dependencies
RUN apk add --no-cache \
musl-dev \
libpq \
postgresql-dev \
libstdc++
# Install python runtime modules. We do this before copying the source code
# such that these dependencies can be cached
RUN mkdir -p /src/molly
COPY molly/__init__.py /src/molly/
COPY README.md run_molly /src/
COPY setup.py /src/setup.py
RUN pip install -e "/src/"
# Now copy the source code
COPY molly/*.py /src/molly/
COPY *.py /src/
# Specify a volume that holds the config file, SQLite3 database,
# and the matrix-nio store
VOLUME ["/data"]
# Start the app
ENTRYPOINT ["run_molly", "/data/config.yaml"]

@@ -0,0 +1,156 @@
# Docker
The docker image will run molly with a SQLite database and
end-to-end encryption dependencies included. For larger deployments, a
connection to a Postgres database backend is recommended.
## Setup
### The `/data` volume
The docker container expects the `config.yaml` file to exist at
`/data/config.yaml`. To easily configure this, it is recommended to create a
directory on your filesystem, and mount it as `/data` inside the container:
```
mkdir data
```
We'll later mount this directory into the container so that its contents
persist across container restarts.
### Creating a config file
Copy `sample.config.yaml` to a file named `config.yaml` inside of your newly
created `data` directory. Fill it out as you normally would, with a few minor
differences:
* The bot store directory should reside inside of the data directory so that it
is not wiped on container restart. Change it from the default to
`/data/store`. There is no need to create this directory yourself, it will be
created on startup if it does not exist.
* Choose whether you want to use SQLite or Postgres as your database backend.
Postgres has increased performance over SQLite, and is recommended for
deployments with many users.
If using SQLite, ensure your database file is
stored inside the `/data` directory:
```
database: "sqlite:///data/bot.db"
```
If using postgres, point to your postgres instance instead:
```
database: "postgres://username:password@postgres/molly?sslmode=disable"
```
**Note:** a postgres container is defined in `docker-compose.yaml` for your convenience.
If you would like to use it, set your database connection string to:
```
database: "postgres://postgres:somefancypassword@postgres/postgres?sslmode=disable"
```
The password `somefancypassword` is defined in the docker compose file.
Change any other config values as necessary. For instance, you may also want to
store log files in the `/data` directory.
## Running
First, create a volume for the data directory created in the above section:
```
docker volume create \
--opt type=none \
--opt o=bind \
--opt device="/path/to/data/dir" data_volume
```
Optional: If you want to use the postgres container defined in
`docker-compose.yaml`, start that first:
```
docker-compose up -d postgres
```
Start the bot with:
```
docker-compose up molly
```
This will run the bot and log the output to the terminal. You can instead run
the container detached with the `-d` flag:
```
docker-compose up -d molly
```
(Logs can later be accessed with the `docker logs` command).
This will use the `latest` tag from
[Docker Hub](https://hub.docker.com/somebody/molly).
If you would rather run from the checked out code, you can use:
```
docker-compose up local-checkout
```
This will build an optimized, production-ready container. If you are developing
instead and would like a development container for testing local changes, use
the `start-dev.sh` script and consult [CONTRIBUTING.md](../CONTRIBUTING.md).
**Note:** If you are trying to connect to a Synapse instance running on the
host, you need to allow the IP address of the docker container to connect. This
is controlled by `bind_addresses` in the `listeners` section of Synapse's
config. If present, either add the docker internal IP address to the list, or
remove the option altogether to allow all addresses.
## Updating
To update the container, navigate to the bot's `docker` directory and run:
```
docker-compose pull molly
```
Then restart the bot.
## Systemd
A systemd service file is provided for your convenience at
[molly.service](molly.service). The service uses
`docker-compose` to start and stop the bot.
Copy the file to `/etc/systemd/system/molly.service` and edit to
match your setup. You can then start the bot with:
```
systemctl start molly
```
and stop it with:
```
systemctl stop molly
```
To run the bot on system startup:
```
systemctl enable molly
```
## Building the image
To build a production image from source, use the following `docker build` command
from the repo's root:
```
docker build -t somebody/molly:latest -f docker/Dockerfile .
```

@@ -0,0 +1,32 @@
#!/usr/bin/env sh
#
# Call with the following arguments:
#
# ./build_and_install_libolm.sh <libolm version> <python bindings install dir>
#
# Example:
#
# ./build_and_install_libolm.sh 3.1.4 /python-bindings
#
# Note that if a python bindings installation directory is not supplied, bindings will
# be installed to the default directory.
#
set -ex
# Download the specified version of libolm
git clone -b "$1" https://gitlab.matrix.org/matrix-org/olm.git olm && cd olm
# Build libolm
cmake . -Bbuild
cmake --build build
# Install
make install
# Build the python3 bindings
cd python && make olm-python3
# Install python3 bindings
mkdir -p "$2" || true
DESTDIR="$2" make install-python3

@@ -0,0 +1,64 @@
version: '3.1' # specify docker-compose version
volumes:
# Set up with `docker volume create ...`. See docker/README.md for more info.
data_volume:
external: true
pg_data_volume:
services:
# Runs from the latest release
molly:
image: somebody/molly
restart: always
volumes:
- data_volume:/data
# Used for allowing connections to homeservers hosted on the host machine
# (while docker host mode is still broken on Linux).
#
# Defaults to 127.0.0.1 and is set in docker/.env
extra_hosts:
- "localhost:${HOST_IP_ADDRESS}"
# Builds and runs an optimized container from local code
local-checkout:
build:
context: ..
dockerfile: docker/Dockerfile
# Build arguments may be specified here
# args:
# PYTHON_VERSION: 3.8
volumes:
- data_volume:/data
# Used for allowing connections to homeservers hosted on the host machine
# (while docker host networking mode is still broken on Linux).
#
# Defaults to 127.0.0.1 and is set in docker/.env
extra_hosts:
- "localhost:${HOST_IP_ADDRESS}"
# Builds and runs a development container from local code
local-checkout-dev:
build:
context: ..
dockerfile: docker/Dockerfile.dev
# Build arguments may be specified here
# args:
# PYTHON_VERSION: 3.8
volumes:
- data_volume:/data
# Used for allowing connections to homeservers hosted on the host machine
# (while docker host networking mode is still broken on Linux).
#
# Defaults to 127.0.0.1 and is set in docker/.env
extra_hosts:
- "localhost:${HOST_IP_ADDRESS}"
# Starts up a postgres database
postgres:
image: postgres
restart: always
volumes:
- pg_data_volume:/var/lib/postgresql/data
environment:
POSTGRES_PASSWORD: somefancypassword

@@ -0,0 +1,16 @@
[Unit]
Description=A matrix bot that does amazing things!
[Service]
Type=simple
User=molly
Group=molly
WorkingDirectory=/path/to/molly/docker
ExecStart=/usr/bin/docker-compose up molly
ExecStop=/usr/bin/docker-compose stop molly
RemainAfterExit=yes
Restart=always
RestartSec=3
[Install]
WantedBy=multi-user.target

@@ -0,0 +1,49 @@
#!/bin/bash
# A script to quickly setup a running development environment
#
# Its primary purpose is to set up docker networking correctly so that
# the bot can connect to remote services as well as those hosted on
# the host machine.
#
# Change directory to where this script is located. We'd like to run
# `docker-compose` in the same directory to use the adjacent
# docker-compose.yml and .env files
cd `dirname "$0"`
function on_exit {
cd -
}
# Ensure we change back to the old directory on script exit
trap on_exit EXIT
# To allow the docker container to connect to services running on the host,
# we need to use the host's internal ip address. Attempt to retrieve this.
#
# Check whether the ip address has been defined in the environment already
if [ -z "$HOST_IP_ADDRESS" ]; then
# It's not defined. Try to guess what it is
# First we try the `ip` command, available primarily on Linux
export HOST_IP_ADDRESS="`ip route get 1 | sed -n 's/^.*src \([0-9.]*\) .*$/\1/p'`"
if [ $? -ne 0 ]; then
# That didn't work. `ip` isn't available on old Linux systems, or MacOS.
# Try `ifconfig` instead
export HOST_IP_ADDRESS="`ifconfig $(netstat -rn | grep -E "^default|^0.0.0.0" | head -1 | awk '{print $NF}') | grep 'inet ' | awk '{print $2}' | grep -Eo '([0-9]*\.){3}[0-9]*'`"
if [ $? -ne 0 ]; then
# That didn't work either, give up
echo "
Unable to determine host machine's internal IP address.
Please set HOST_IP_ADDRESS environment variable manually and re-run this script.
If you do not have a need to connect to a homeserver running on the host machine,
set HOST_IP_ADDRESS=127.0.0.1"
exit 1
fi
fi
fi
# Build and run latest code
docker-compose up --build local-checkout-dev

@@ -0,0 +1,8 @@
import sys
# Check that we're not running on an unsupported Python version.
if sys.version_info < (3, 5):
print("molly requires Python 3.5 or above.")
sys.exit(1)
__version__ = "0.0.1"

@@ -0,0 +1,95 @@
from nio import AsyncClient, MatrixRoom, RoomMessageText
from molly.chat_functions import react_to_event, send_text_to_room
from molly.config import Config
from molly.storage import Storage
class Command:
def __init__(
self,
client: AsyncClient,
store: Storage,
config: Config,
command: str,
room: MatrixRoom,
event: RoomMessageText,
):
"""A command made by a user.
Args:
client: The client to communicate to matrix with.
store: Bot storage.
config: Bot configuration parameters.
command: The command and arguments.
room: The room the command was sent in.
event: The event describing the command.
"""
self.client = client
self.store = store
self.config = config
self.command = command
self.room = room
self.event = event
self.args = self.command.split()[1:]
async def process(self):
"""Process the command"""
if self.command.startswith("echo"):
await self._echo()
elif self.command.startswith("react"):
await self._react()
elif self.command.startswith("help"):
await self._show_help()
else:
await self._unknown_command()
async def _echo(self):
"""Echo back the command's arguments"""
response = " ".join(self.args)
await send_text_to_room(self.client, self.room.room_id, response)
async def _react(self):
"""Make the bot react to the command message"""
# React with a star emoji
reaction = "⭐"
await react_to_event(
self.client, self.room.room_id, self.event.event_id, reaction
)
# React with some generic text
reaction = "Some text"
await react_to_event(
self.client, self.room.room_id, self.event.event_id, reaction
)
async def _show_help(self):
"""Show the help text"""
if not self.args:
text = (
"Hello, I am a bot made with matrix-nio! Use `help commands` to view "
"available commands."
)
await send_text_to_room(self.client, self.room.room_id, text)
return
topic = self.args[0]
if topic == "rules":
text = "These are the rules!"
elif topic == "commands":
text = "Available commands: ..."
else:
text = "Unknown help topic!"
await send_text_to_room(self.client, self.room.room_id, text)
async def _unknown_command(self):
await send_text_to_room(
self.client,
self.room.room_id,
f"Unknown command '{self.command}'. Try the 'help' command for more information.",
)

@@ -0,0 +1,198 @@
import logging
from nio import (
AsyncClient,
InviteMemberEvent,
JoinError,
MatrixRoom,
MegolmEvent,
RoomGetEventError,
RoomMessageText,
UnknownEvent,
)
from molly.bot_commands import Command
from molly.chat_functions import make_pill, react_to_event, send_text_to_room
from molly.config import Config
from molly.message_responses import Message
from molly.storage import Storage
logger = logging.getLogger(__name__)
class Callbacks:
def __init__(self, client: AsyncClient, store: Storage, config: Config):
"""
Args:
client: nio client used to interact with matrix.
store: Bot storage.
config: Bot configuration parameters.
"""
self.client = client
self.store = store
self.config = config
self.command_prefix = config.command_prefix
async def message(self, room: MatrixRoom, event: RoomMessageText) -> None:
"""Callback for when a message event is received
Args:
room: The room the event came from.
event: The event defining the message.
"""
# Extract the message text
msg = event.body
# Ignore messages from ourselves
if event.sender == self.client.user:
return
logger.debug(
f"Bot message received for room {room.display_name} | "
f"{room.user_name(event.sender)}: {msg}"
)
# Process as message if in a public room without command prefix
has_command_prefix = msg.startswith(self.command_prefix)
# room.is_group is often a DM, but not always.
# room.is_group does not allow room aliases
# room.member_count > 2 ... we assume a public room
# room.member_count <= 2 ... we assume a DM
if not has_command_prefix and room.member_count > 2:
# General message listener
message = Message(self.client, self.store, self.config, msg, room, event)
await message.process()
return
# Otherwise if this is in a 1-1 with the bot or features a command prefix,
# treat it as a command
if has_command_prefix:
# Remove the command prefix
msg = msg[len(self.command_prefix) :]
command = Command(self.client, self.store, self.config, msg, room, event)
await command.process()
async def invite(self, room: MatrixRoom, event: InviteMemberEvent) -> None:
"""Callback for when an invite is received. Join the room specified in the invite.
Args:
room: The room that we are invited to.
event: The invite event.
"""
logger.debug(f"Got invite to {room.room_id} from {event.sender}.")
# Attempt to join 3 times before giving up
for attempt in range(3):
result = await self.client.join(room.room_id)
if type(result) == JoinError:
logger.error(
f"Error joining room {room.room_id} (attempt %d): %s",
attempt,
result.message,
)
else:
break
else:
logger.error("Unable to join room: %s", room.room_id)
# Successfully joined room
logger.info(f"Joined {room.room_id}")
async def _reaction(
self, room: MatrixRoom, event: UnknownEvent, reacted_to_id: str
) -> None:
"""A reaction was sent to one of our messages. Let's send a reply acknowledging it.
Args:
room: The room the reaction was sent in.
event: The reaction event.
reacted_to_id: The event ID that the reaction points to.
"""
logger.debug(f"Got reaction to {room.room_id} from {event.sender}.")
# Get the original event that was reacted to
event_response = await self.client.room_get_event(room.room_id, reacted_to_id)
if isinstance(event_response, RoomGetEventError):
logger.warning(
"Error getting event that was reacted to (%s)", reacted_to_id
)
return
reacted_to_event = event_response.event
# Only acknowledge reactions to events that we sent
if reacted_to_event.sender != self.config.user_id:
return
# Send a message acknowledging the reaction
reaction_sender_pill = make_pill(event.sender)
reaction_content = (
event.source.get("content", {}).get("m.relates_to", {}).get("key")
)
message = (
f"{reaction_sender_pill} reacted to this event with `{reaction_content}`!"
)
await send_text_to_room(
self.client,
room.room_id,
message,
reply_to_event_id=reacted_to_id,
)
async def decryption_failure(self, room: MatrixRoom, event: MegolmEvent) -> None:
"""Callback for when an event fails to decrypt. Inform the user.
Args:
room: The room that the event that we were unable to decrypt is in.
event: The encrypted event that we were unable to decrypt.
"""
logger.error(
f"Failed to decrypt event '{event.event_id}' in room '{room.room_id}'!"
f"\n\n"
f"Tip: try using a different device ID in your config file and restart."
f"\n\n"
f"If all else fails, delete your store directory and let the bot recreate "
f"it (your reminders will NOT be deleted, but the bot may respond to existing "
f"commands a second time)."
)
red_x_and_lock_emoji = "❌ 🔐"
# React to the undecryptable event with some emoji
await react_to_event(
self.client,
room.room_id,
event.event_id,
red_x_and_lock_emoji,
)
async def unknown(self, room: MatrixRoom, event: UnknownEvent) -> None:
"""Callback for when an event with a type that is unknown to matrix-nio is received.
Currently this is used for reaction events, which are not yet part of a released
matrix spec (and are thus unknown to nio).
Args:
room: The room the reaction was sent in.
event: The event itself.
"""
if event.type == "m.reaction":
# Get the ID of the event this was a reaction to
relation_dict = event.source.get("content", {}).get("m.relates_to", {})
reacted_to = relation_dict.get("event_id")
if reacted_to and relation_dict.get("rel_type") == "m.annotation":
await self._reaction(room, event, reacted_to)
return
logger.debug(
f"Got unknown event with type to {event.type} from {event.sender} in {room.room_id}."
)

@@ -0,0 +1,154 @@
import logging
from typing import Optional, Union
from markdown import markdown
from nio import (
AsyncClient,
ErrorResponse,
MatrixRoom,
MegolmEvent,
Response,
RoomSendResponse,
SendRetryError,
)
logger = logging.getLogger(__name__)
async def send_text_to_room(
client: AsyncClient,
room_id: str,
message: str,
notice: bool = True,
markdown_convert: bool = True,
reply_to_event_id: Optional[str] = None,
) -> Union[RoomSendResponse, ErrorResponse]:
"""Send text to a matrix room.
Args:
client: The client to communicate to matrix with.
room_id: The ID of the room to send the message to.
message: The message content.
notice: Whether the message should be sent with an "m.notice" message type
(will not ping users).
markdown_convert: Whether to convert the message content to markdown.
Defaults to true.
reply_to_event_id: The event ID of the event that this message is a reply to, if any.
Returns:
A RoomSendResponse if the request was successful, else an ErrorResponse.
"""
# Determine whether to ping room members or not
msgtype = "m.notice" if notice else "m.text"
content = {
"msgtype": msgtype,
"format": "org.matrix.custom.html",
"body": message,
}
if markdown_convert:
content["formatted_body"] = markdown(message)
if reply_to_event_id:
content["m.relates_to"] = {"m.in_reply_to": {"event_id": reply_to_event_id}}
try:
return await client.room_send(
room_id,
"m.room.message",
content,
ignore_unverified_devices=True,
)
except SendRetryError:
logger.exception(f"Unable to send message response to {room_id}")
def make_pill(user_id: str, displayname: str = None) -> str:
"""Convert a user ID (and optionally a display name) to a formatted user 'pill'
Args:
user_id: The MXID of the user.
displayname: An optional displayname. Clients like Element will figure out the
correct display name no matter what, but other clients may not. If not
provided, the MXID will be used instead.
Returns:
The formatted user pill.
"""
if not displayname:
# Use the user ID as the displayname if not provided
displayname = user_id
return f'<a href="https://matrix.to/#/{user_id}">{displayname}</a>'
async def react_to_event(
client: AsyncClient,
room_id: str,
event_id: str,
reaction_text: str,
) -> Union[Response, ErrorResponse]:
"""Reacts to a given event in a room with the given reaction text
Args:
client: The client to communicate to matrix with.
room_id: The ID of the room to send the message to.
event_id: The ID of the event to react to.
reaction_text: The string to react with. Can also be (one or more) emoji characters.
Returns:
A nio.Response or nio.ErrorResponse if an error occurred.
Raises:
SendRetryError: If the reaction was unable to be sent.
"""
content = {
"m.relates_to": {
"rel_type": "m.annotation",
"event_id": event_id,
"key": reaction_text,
}
}
return await client.room_send(
room_id,
"m.reaction",
content,
ignore_unverified_devices=True,
)
async def decryption_failure(self, room: MatrixRoom, event: MegolmEvent) -> None:
"""Callback for when an event fails to decrypt. Inform the user"""
logger.error(
f"Failed to decrypt event '{event.event_id}' in room '{room.room_id}'!"
f"\n\n"
f"Tip: try using a different device ID in your config file and restart."
f"\n\n"
f"If all else fails, delete your store directory and let the bot recreate "
f"it (your reminders will NOT be deleted, but the bot may respond to existing "
f"commands a second time)."
)
user_msg = (
"Unable to decrypt this message. "
"Check whether you've chosen to only encrypt to trusted devices."
)
await send_text_to_room(
self.client,
room.room_id,
user_msg,
reply_to_event_id=event.event_id,
)

@@ -0,0 +1,136 @@
import logging
import os
import re
import sys
from typing import Any, List, Optional
import yaml
from molly.errors import ConfigError
logger = logging.getLogger()
logging.getLogger("peewee").setLevel(
logging.INFO
) # Prevent debug messages from peewee lib
class Config:
"""Creates a Config object from a YAML-encoded config file from a given filepath"""
def __init__(self, filepath: str):
self.filepath = filepath
if not os.path.isfile(filepath):
raise ConfigError(f"Config file '{filepath}' does not exist")
# Load in the config file at the given filepath
with open(filepath) as file_stream:
self.config_dict = yaml.safe_load(file_stream.read())
# Parse and validate config options
self._parse_config_values()
def _parse_config_values(self):
"""Read and validate each config option"""
# Logging setup
formatter = logging.Formatter(
"%(asctime)s | %(name)s [%(levelname)s] %(message)s"
)
log_level = self._get_cfg(["logging", "level"], default="INFO")
logger.setLevel(log_level)
file_logging_enabled = self._get_cfg(
["logging", "file_logging", "enabled"], default=False
)
file_logging_filepath = self._get_cfg(
["logging", "file_logging", "filepath"], default="bot.log"
)
if file_logging_enabled:
handler = logging.FileHandler(file_logging_filepath)
handler.setFormatter(formatter)
logger.addHandler(handler)
console_logging_enabled = self._get_cfg(
["logging", "console_logging", "enabled"], default=True
)
if console_logging_enabled:
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(formatter)
logger.addHandler(handler)
# Storage setup
self.store_path = self._get_cfg(["storage", "store_path"], required=True)
# Create the store folder if it doesn't exist
if not os.path.isdir(self.store_path):
if not os.path.exists(self.store_path):
os.mkdir(self.store_path)
else:
raise ConfigError(
f"storage.store_path '{self.store_path}' is not a directory"
)
# Database setup
database_path = self._get_cfg(["storage", "database"], required=True)
# Support both SQLite and Postgres backends
# Determine which one the user intends
sqlite_scheme = "sqlite://"
postgres_scheme = "postgres://"
if database_path.startswith(sqlite_scheme):
self.database = {
"type": "sqlite",
"connection_string": database_path[len(sqlite_scheme) :],
}
elif database_path.startswith(postgres_scheme):
self.database = {"type": "postgres", "connection_string": database_path}
else:
raise ConfigError("Invalid connection string for storage.database")
# Matrix bot account setup
self.user_id = self._get_cfg(["matrix", "user_id"], required=True)
if not re.match("@.*:.*", self.user_id):
raise ConfigError("matrix.user_id must be in the form @name:domain")
self.user_password = self._get_cfg(["matrix", "user_password"], required=False)
self.user_token = self._get_cfg(["matrix", "user_token"], required=False)
if not self.user_token and not self.user_password:
raise ConfigError("Must supply either user token or password")
self.device_id = self._get_cfg(["matrix", "device_id"], required=True)
self.device_name = self._get_cfg(
["matrix", "device_name"], default="nio-template"
)
self.homeserver_url = self._get_cfg(["matrix", "homeserver_url"], required=True)
self.command_prefix = self._get_cfg(["command_prefix"], default="!c") + " "
def _get_cfg(
self,
path: List[str],
default: Optional[Any] = None,
required: Optional[bool] = True,
) -> Any:
"""Get a config option from a path and option name, specifying whether it is
required.
Raises:
ConfigError: If required is True and the object is not found (and there is
no default value provided), a ConfigError will be raised.
"""
# Sift through the config until we reach our option
config = self.config_dict
for name in path:
config = config.get(name)
# If at any point we don't get our expected option...
if config is None:
# Raise an error if it was required
if required and not default:
raise ConfigError(f"Config option {'.'.join(path)} is required")
# or return the default value
return default
# We found the option. Return it.
return config

@@ -0,0 +1,12 @@
# This file holds custom error types that you can define for your application.
class ConfigError(RuntimeError):
"""An error encountered during reading the config file.
Args:
msg: The message displayed to the user on error.
"""
def __init__(self, msg: str):
super(ConfigError, self).__init__("%s" % (msg,))

@@ -0,0 +1,119 @@
#!/usr/bin/env python3
import asyncio
import logging
import sys
from time import sleep
from aiohttp import ClientConnectionError, ServerDisconnectedError
from nio import (
AsyncClient,
AsyncClientConfig,
InviteMemberEvent,
LocalProtocolError,
LoginError,
MegolmEvent,
RoomMessageText,
UnknownEvent,
)
from molly.callbacks import Callbacks
from molly.config import Config
from molly.storage import Storage
logger = logging.getLogger(__name__)
async def main():
"""The first function that is run when starting the bot"""
# Read user-configured options from a config file.
# A different config file path can be specified as the first command line argument
if len(sys.argv) > 1:
config_path = sys.argv[1]
else:
config_path = "config.yaml"
# Read the parsed config file and create a Config object
config = Config(config_path)
# Configure the database
store = Storage(config.database)
# Configuration options for the AsyncClient
client_config = AsyncClientConfig(
max_limit_exceeded=0,
max_timeouts=0,
store_sync_tokens=True,
encryption_enabled=True,
)
# Initialize the matrix client
client = AsyncClient(
config.homeserver_url,
config.user_id,
device_id=config.device_id,
store_path=config.store_path,
config=client_config,
)
if config.user_token:
client.access_token = config.user_token
client.user_id = config.user_id
# Set up event callbacks
callbacks = Callbacks(client, store, config)
client.add_event_callback(callbacks.message, (RoomMessageText,))
client.add_event_callback(callbacks.invite, (InviteMemberEvent,))
client.add_event_callback(callbacks.decryption_failure, (MegolmEvent,))
client.add_event_callback(callbacks.unknown, (UnknownEvent,))
# Keep trying to reconnect on failure (with some time in-between)
while True:
try:
if config.user_token:
# Use token to log in
client.load_store()
# Sync encryption keys with the server
if client.should_upload_keys:
await client.keys_upload()
else:
# Try to login with the configured username/password
try:
login_response = await client.login(
password=config.user_password,
device_name=config.device_name,
)
# Check if login failed
                    if isinstance(login_response, LoginError):
logger.error("Failed to login: %s", login_response.message)
return False
except LocalProtocolError as e:
# There's an edge case here where the user hasn't installed the correct C
# dependencies. In that case, a LocalProtocolError is raised on login.
logger.fatal(
"Failed to login. Have you installed the correct dependencies? "
"https://github.com/poljar/matrix-nio#installation "
"Error: %s",
e,
)
return False
# Login succeeded!
logger.info(f"Logged in as {config.user_id}")
await client.sync_forever(timeout=30000, full_state=True)
except (ClientConnectionError, ServerDisconnectedError):
logger.warning("Unable to connect to homeserver, retrying in 15s...")
# Sleep so we don't bombard the server with login requests
sleep(15)
finally:
# Make sure to close the client connection on disconnect
await client.close()
# Run the main function in an asyncio event loop when this module is executed directly
if __name__ == "__main__":
    asyncio.get_event_loop().run_until_complete(main())

@ -0,0 +1,52 @@
import logging
from nio import AsyncClient, MatrixRoom, RoomMessageText
from molly.chat_functions import send_text_to_room
from molly.config import Config
from molly.storage import Storage
logger = logging.getLogger(__name__)
class Message:
def __init__(
self,
client: AsyncClient,
store: Storage,
config: Config,
message_content: str,
room: MatrixRoom,
event: RoomMessageText,
):
"""Initialize a new Message
Args:
client: nio client used to interact with matrix.
store: Bot storage.
config: Bot configuration parameters.
message_content: The body of the message.
room: The room the event came from.
event: The event defining the message.
"""
self.client = client
self.store = store
self.config = config
self.message_content = message_content
self.room = room
self.event = event
async def process(self) -> None:
"""Process and possibly respond to the message"""
if self.message_content.lower() == "hello world":
await self._hello_world()
async def _hello_world(self) -> None:
"""Say hello"""
text = "Hello, world!"
await send_text_to_room(self.client, self.room.room_id, text)

@ -0,0 +1,126 @@
import logging
from typing import Any, Dict
# The latest migration version of the database.
#
# Database migrations are applied starting from the number specified in the database's
# `migration_version` table + 1 (or from 0 if this table does not yet exist) up until
# the version specified here.
#
# When a migration is performed, the `migration_version` table should be incremented.
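#
# For example, if the stored `migration_version` were 0 and `latest_migration_version`
# were 2, migrations 1 and 2 would be applied in order.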
latest_migration_version = 0
logger = logging.getLogger(__name__)
class Storage:
def __init__(self, database_config: Dict[str, str]):
"""Setup the database.
Runs an initial setup or migrations depending on whether a database file has already
been created.
Args:
database_config: a dictionary containing the following keys:
* type: A string, one of "sqlite" or "postgres".
                * connection_string: A string containing the connection string that
                    will be fed to each respective db library's `connect` method.
"""
self.conn = self._get_database_connection(
database_config["type"], database_config["connection_string"]
)
self.cursor = self.conn.cursor()
self.db_type = database_config["type"]
# Try to check the current migration version
migration_level = 0
try:
self._execute("SELECT version FROM migration_version")
row = self.cursor.fetchone()
migration_level = row[0]
except Exception:
self._initial_setup()
finally:
if migration_level < latest_migration_version:
self._run_migrations(migration_level)
logger.info(f"Database initialization of type '{self.db_type}' complete")
def _get_database_connection(
self, database_type: str, connection_string: str
) -> Any:
"""Creates and returns a connection to the database"""
if database_type == "sqlite":
import sqlite3
# Initialize a connection to the database, with autocommit on
return sqlite3.connect(connection_string, isolation_level=None)
elif database_type == "postgres":
import psycopg2
conn = psycopg2.connect(connection_string)
# Autocommit on
conn.set_isolation_level(0)
return conn
def _initial_setup(self) -> None:
"""Initial setup of the database"""
logger.info("Performing initial database setup...")
# Set up the migration_version table
self._execute(
"""
CREATE TABLE migration_version (
version INTEGER PRIMARY KEY
)
"""
)
# Initially set the migration version to 0
self._execute(
"""
INSERT INTO migration_version (
version
) VALUES (?)
""",
(0,),
)
# Set up any other necessary database tables here
logger.info("Database setup complete")
def _run_migrations(self, current_migration_version: int) -> None:
"""Execute database migrations. Migrates the database to the
`latest_migration_version`.
Args:
current_migration_version: The migration version that the database is
currently at.
"""
logger.debug("Checking for necessary database migrations...")
# if current_migration_version < 1:
# logger.info("Migrating the database from v0 to v1...")
#
# # Add new table, delete old ones, etc.
#
# # Update the stored migration version
# self._execute("UPDATE migration_version SET version = 1")
#
# logger.info("Database migrated to v1")
def _execute(self, *args) -> None:
"""A wrapper around cursor.execute that transforms placeholder ?'s to %s for postgres.
This allows for the support of queries that are compatible with both postgres and sqlite.
Args:
args: Arguments passed to cursor.execute.
"""
if self.db_type == "postgres":
self.cursor.execute(args[0].replace("?", "%s"), *args[1:])
else:
self.cursor.execute(*args)

@ -0,0 +1,10 @@
#!/usr/bin/env python3
import asyncio
try:
from molly import main
# Run the main function of the bot
asyncio.get_event_loop().run_until_complete(main.main())
except ImportError as e:
print("Unable to import molly.main:", e)

@ -0,0 +1,49 @@
# Welcome to the sample config file
# Below you will find various config sections and options
# Default values are shown
# The string to prefix messages with to talk to the bot in group chats
command_prefix: "!c"
# Options for connecting to the bot's Matrix account
matrix:
# The Matrix User ID of the bot account
user_id: "@bot:example.com"
# Matrix account password (optional if access token used)
user_password: ""
# Matrix account access token (optional if password used)
#user_token: ""
# The URL of the homeserver to connect to
homeserver_url: https://example.com
# The device ID to log in with. It must **not** be a pre-existing device ID
# If this device ID already exists, messages will be dropped silently in encrypted rooms
device_id: ABCDEFGHIJ
# What to name the logged in device
device_name: molly
storage:
# The database connection string
# For SQLite3, this would look like:
# database: "sqlite://bot.db"
# For Postgres, this would look like:
# database: "postgres://username:password@localhost/dbname?sslmode=disable"
database: "sqlite://bot.db"
# The path to a directory for internal bot storage
# containing encryption keys, sync tokens, etc.
store_path: "./store"
# Logging setup
logging:
# Logging level
# Allowed levels are 'INFO', 'WARNING', 'ERROR', 'DEBUG' where DEBUG is most verbose
level: INFO
# Configure logging to a file
file_logging:
# Whether logging to a file is enabled
enabled: false
# The path to the file to log to. May be relative or absolute
filepath: bot.log
# Configure logging to the console output
console_logging:
# Whether logging to the console is enabled
enabled: true

@ -0,0 +1,20 @@
#!/bin/sh
#
# Runs linting scripts over the local checkout
# isort - sorts import statements
# flake8 - lints and finds mistakes
# black - opinionated code formatter
set -e
if [ $# -ge 1 ]
then
files=$*
else
files="molly molly tests"
fi
echo "Linting these locations: $files"
isort $files
flake8 $files
python3 -m black $files

@ -0,0 +1,64 @@
#!/bin/bash -e
# Check that regex-rename is installed
if ! command -v regex-rename &> /dev/null
then
echo "regex-rename python module not found. Please run 'python -m pip install regex-rename'"
exit 1
fi
# GNU sed and BSD(Mac) sed handle -i differently :(
function is_gnu_sed(){
sed --version >/dev/null 2>&1
}
# Allow specifying either:
# * One argument, which is the new project name, assuming the old project name is "my project name"
# * Or two arguments, where one can specify 1. the old project name and 2. the new project name
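# e.g. running this script with the single argument "molly" would rename
# my-project-name -> molly and my_project_name -> molly throughout the checkout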
if [ $# -eq 1 ]; then
PLACEHOLDER="my project name"
REPLACEMENT=$1
elif [ $# -eq 2 ]; then
PLACEHOLDER=$1
REPLACEMENT=$2
else
echo "Usage:"
echo "./"$(basename "$0") "\"new name\""
echo "./"$(basename "$0") "\"old name\" \"new name\""
exit 1
fi
PLACEHOLDER_DASHES="${PLACEHOLDER// /-}"
PLACEHOLDER_UNDERSCORES="${PLACEHOLDER// /_}"
REPLACEMENT_DASHES="${REPLACEMENT// /-}"
REPLACEMENT_UNDERSCORES="${REPLACEMENT// /_}"
echo "Updating file and folder names..."
# Iterate over all directories (besides venv's and .git) and rename files/folders
# Yes this looks like some crazy voodoo, but it's necessary as regex-rename does
# not provide any sort of recursive functionality...
find . -type d -not -path "./env*" -not -path "./.git" -not -path "./.git*" \
-exec sh -c "cd {} && \
regex-rename --rename \"(.*)$PLACEHOLDER_DASHES(.*)\" \"\1$REPLACEMENT_DASHES\2\" && \
regex-rename --rename \"(.*)$PLACEHOLDER_UNDERSCORES(.*)\" \"\1$REPLACEMENT_UNDERSCORES\2\"" \; > /dev/null
echo "Updating references within files..."
# Iterate through each file and replace strings within files
for file in $(grep --exclude-dir=env --exclude-dir=venv --exclude-dir=.git --exclude="*.pyc" -lEw "$PLACEHOLDER_DASHES|$PLACEHOLDER_UNDERSCORES" -R * .[^.]*); do
echo "Checking $file"
if [[ $file != $(basename "$0") ]]; then
if is_gnu_sed; then
sed -i "s/$PLACEHOLDER_DASHES/$REPLACEMENT_DASHES/g" $file
sed -i "s/$PLACEHOLDER_UNDERSCORES/$REPLACEMENT_UNDERSCORES/g" $file
else
sed -i "" "s/$PLACEHOLDER_DASHES/$REPLACEMENT_DASHES/g" $file
sed -i "" "s/$PLACEHOLDER_UNDERSCORES/$REPLACEMENT_UNDERSCORES/g" $file
fi
echo " - $file"
fi
done
echo "Done!"

@ -0,0 +1,19 @@
[flake8]
# see https://pycodestyle.readthedocs.io/en/latest/intro.html#error-codes
# for error codes. The ones we ignore are:
# W503: line break before binary operator
# W504: line break after binary operator
# E203: whitespace before ':' (which is contrary to pep8?)
# E731: do not assign a lambda expression, use a def
# E501: Line too long (black enforces this for us)
ignore=W503,W504,E203,E731,E501
[isort]
line_length = 88
sections=FUTURE,STDLIB,THIRDPARTY,FIRSTPARTY,TESTS,LOCALFOLDER
default_section=THIRDPARTY
known_first_party=molly
known_tests=tests
multi_line_output=3
include_trailing_comma=true
combine_as_imports=true

@ -0,0 +1,59 @@
#!/usr/bin/env python3
import os
from setuptools import find_packages, setup
def exec_file(path_segments):
"""Execute a single python file to get the variables defined in it"""
result = {}
code = read_file(path_segments)
exec(code, result)
return result
def read_file(path_segments):
"""Read a file from the package. Takes a list of strings to join to
make the path"""
file_path = os.path.join(os.path.abspath(os.path.dirname(__file__)), *path_segments)
with open(file_path) as f:
return f.read()
version = exec_file(("molly", "__init__.py"))["__version__"]
long_description = read_file(("README.md",))
setup(
name="molly",
version=version,
url="https://github.com/anoadragon453/nio-template",
description="A matrix bot to do amazing things!",
packages=find_packages(exclude=["tests", "tests.*"]),
install_requires=[
"matrix-nio[e2e]>=0.10.0",
"Markdown>=3.1.1",
"PyYAML>=5.1.2",
],
extras_require={
"postgres": ["psycopg2>=2.8.5"],
"dev": [
"isort==5.0.4",
"flake8==3.8.3",
"flake8-comprehensions==3.2.3",
"black==19.10b0",
],
},
classifiers=[
"License :: OSI Approved :: Apache Software License",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
],
long_description=long_description,
long_description_content_type="text/markdown",
    # Allow the user to run the bot with `run_molly ...`
scripts=["run_molly"],
)

@ -0,0 +1,50 @@
import unittest
from unittest.mock import Mock
import nio
from molly.callbacks import Callbacks
from molly.storage import Storage
from tests.utils import make_awaitable, run_coroutine
class CallbacksTestCase(unittest.TestCase):
def setUp(self) -> None:
# Create a Callbacks object and give it some Mock'd objects to use
self.fake_client = Mock(spec=nio.AsyncClient)
self.fake_client.user = "@fake_user:example.com"
self.fake_storage = Mock(spec=Storage)
# We don't spec config, as it doesn't currently have well defined attributes
self.fake_config = Mock()
self.callbacks = Callbacks(
self.fake_client, self.fake_storage, self.fake_config
)
def test_invite(self):
"""Tests the callback for InviteMemberEvents"""
# Tests that the bot attempts to join a room after being invited to it
# Create a fake room and invite event to call the 'invite' callback with
fake_room = Mock(spec=nio.MatrixRoom)
fake_room_id = "!abcdefg:example.com"
fake_room.room_id = fake_room_id
fake_invite_event = Mock(spec=nio.InviteMemberEvent)
fake_invite_event.sender = "@some_other_fake_user:example.com"
# Pretend that attempting to join a room is always successful
self.fake_client.join.return_value = make_awaitable(None)
# Pretend that we received an invite event
run_coroutine(self.callbacks.invite(fake_room, fake_invite_event))
# Check that we attempted to join the room
self.fake_client.join.assert_called_once_with(fake_room_id)
if __name__ == "__main__":
unittest.main()

@ -0,0 +1,81 @@
import unittest
from unittest.mock import Mock
from molly.config import Config
from molly.errors import ConfigError
class ConfigTestCase(unittest.TestCase):
def test_get_cfg(self):
"""Test that Config._get_cfg works correctly"""
# Here's our test dictionary. Pretend that this was parsed from a YAML config file.
test_config_dict = {"a_key": 5, "some_key": {"some_other_key": "some_value"}}
# We create a fake config using Mock. _get_cfg will attempt to pull from self.config_dict,
# so we use a Mock to quickly create a dummy class, and set the 'config_dict' attribute to
# our test dictionary.
fake_config = Mock()
fake_config.config_dict = test_config_dict
        # Now let's make some calls to Config._get_cfg. We provide 'fake_config' as the first
        # argument as a substitute for 'self'. _get_cfg will then pull values from
        # fake_config.config_dict.
# Test that we can get the value of a top-level key
self.assertEqual(
Config._get_cfg(fake_config, ["a_key"]),
5,
)
# Test that we can get the value of a nested key
self.assertEqual(
Config._get_cfg(fake_config, ["some_key", "some_other_key"]),
"some_value",
)
# Test that the value provided by the default option is used when a key does not exist
self.assertEqual(
Config._get_cfg(
fake_config,
["a_made_up_key", "this_does_not_exist"],
default="The default",
),
"The default",
)
# Test that the value provided by the default option is *not* used when a key *does* exist
self.assertEqual(
Config._get_cfg(fake_config, ["a_key"], default="The default"),
5,
)
# Test that keys that do not exist raise a ConfigError when the required argument is True
with self.assertRaises(ConfigError):
Config._get_cfg(
fake_config, ["a_made_up_key", "this_does_not_exist"], required=True
)
        # Test that a ConfigError is not raised when a non-existent key is provided and required is False
self.assertIsNone(
Config._get_cfg(
fake_config, ["a_made_up_key", "this_does_not_exist"], required=False
)
)
# Test that default is used for non-existent keys, even if required is True
        # (Typically one shouldn't use a default with required=True anyway...)
self.assertEqual(
Config._get_cfg(
fake_config,
["a_made_up_key", "this_does_not_exist"],
default="something",
required=True,
),
"something",
)
# TODO: Test creating a test yaml file, passing the path to Config and _parse_config_values is called correctly
if __name__ == "__main__":
unittest.main()

@ -0,0 +1,22 @@
# Utility functions to make testing easier
import asyncio
from typing import Any, Awaitable
def run_coroutine(result: Awaitable[Any]) -> Any:
"""Wrapper for asyncio functions to allow them to be run from synchronous functions"""
loop = asyncio.get_event_loop()
result = loop.run_until_complete(result)
loop.close()
return result
def make_awaitable(result: Any) -> Awaitable[Any]:
"""
Makes an awaitable, suitable for mocking an `async` function.
This uses Futures as they can be awaited multiple times so can be returned
to multiple callers.
"""
future = asyncio.Future() # type: ignore
future.set_result(result)
return future