diff --git a/CHANGELOG.md b/CHANGELOG.md
index a8eb4ca36..1f097e979 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,3 +1,17 @@
+# 2019-10-05
+
+## Improved Postgres upgrading/importing
+
+Postgres [upgrading](docs/maintenance-postgres.md#upgrading-postgresql) and [importing](docs/importing-postgres.md) have been improved to support multiple databases and roles.
+
+Previously, the playbook would only take care of the `homeserver` database and the `synapse` user.
+We now back up and restore all databases and users on the Postgres server.
+
+For now, the playbook only uses that one database (`homeserver`) and that one user (`synapse`), so in practice nothing changes.
+However, in the future, additional components besides Synapse may also make use of the Postgres database server.
+One such example is the [matrix-appservice-slack](https://github.com/matrix-org/matrix-appservice-slack) bridge, which strongly encourages the use of Postgres in its v1.0 release. We have yet to upgrade to it.
+
+
 # 2019-10-04
 
 ## Postgres 12 support
diff --git a/docs/importing-postgres.md b/docs/importing-postgres.md
index 878888f03..f1adaa06a 100644
--- a/docs/importing-postgres.md
+++ b/docs/importing-postgres.md
@@ -12,6 +12,8 @@ If your database name differs, be sure to change `matrix_postgres_db_name` to yo
 
 The playbook supports importing Postgres dump files in **text** (e.g. `pg_dump > dump.sql`) or **gzipped** formats (e.g. `pg_dump | gzip -c > dump.sql.gz`).
 
+Importing multiple databases (as dumped by `pg_dumpall`) is also supported.
+
 Before doing the actual import, **you need to upload your Postgres dump file to the server** (any path is okay).
 
 
diff --git a/docs/maintenance-postgres.md b/docs/maintenance-postgres.md
index dd9d4cfe6..8a6e35fb1 100644
--- a/docs/maintenance-postgres.md
+++ b/docs/maintenance-postgres.md
@@ -40,16 +40,18 @@ To make a back up of the current PostgreSQL database, make sure it's running and
 ```bash
 docker run \
 --rm \
---network matrix \
+--network=matrix \
 --env-file=/matrix/postgres/env-postgres-psql \
 postgres:12.0-alpine \
-pg_dump -h matrix-postgres \
+pg_dumpall -h matrix-postgres \
 | gzip -c \
 > /postgres.sql.gz
 ```
 
 If you are using an [external Postgres server](configuring-playbook-external-postgres.md), the above command will not work, because the credentials file (`/matrix/postgres/env-postgres-psql`) is not available.
 
+Restoring a backup made this way can be done by [importing it](importing-postgres.md).
+
 
 ## Upgrading PostgreSQL
 
@@ -64,7 +66,7 @@ This playbook can upgrade your existing Postgres setup with the following comman
 
     ansible-playbook -i inventory/hosts setup.yml --tags=upgrade-postgres
 
-**The old Postgres data directory is backed up** automatically, by renaming to `/matrix/postgres-auto-upgrade-backup`.
+**The old Postgres data directory is backed up** automatically, by renaming it to `/matrix/postgres-auto-upgrade-backup`.
 To rename to a different path, pass some extra flags to the command above, like this: `--extra-vars="postgres_auto_upgrade_backup_data_path=/another/disk/matrix-postgres-before-upgrade"`
 
 The auto-upgrade-backup directory stays around forever, until you **manually decide to delete it**.
@@ -72,5 +74,4 @@ The auto-upgrade-backup directory stays around forever, until you **manually dec
 As part of the upgrade, the database is dumped to `/tmp`, an upgraded and empty Postgres server is started, and then the dump is restored into the new server.
 To use a different directory for the dump, pass some extra flags to the command above, like this: `--extra-vars="postgres_dump_dir=/directory/to/dump/here"`
 
-**ONLY one database is migrated** (the one specified in `matrix_postgres_db_name`, named `homeserver` by default).
-If you've created other databases in that database instance (something this playbook never does and never advises), data will be lost.
+**All databases, roles, etc. on the Postgres server are migrated**.
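For reference, a backup produced by the updated `pg_dumpall` command above can also be replayed by hand. The pipeline below is only a sketch, not part of the playbook: it assumes the default `matrix` Docker network, the `/matrix/postgres/env-postgres-psql` credentials file, the default `synapse` role / `homeserver` database names, a running `matrix-postgres` container and a dump at `/postgres.sql.gz`.

```bash
# Manual restore of a pg_dumpall backup into a freshly initialized matrix-postgres container.
# The two greps drop the statements that would conflict with the role/database the container
# already created on startup; the playbook's import task applies the same filtering.
gunzip -c /postgres.sql.gz \
| grep -vE '^CREATE ROLE synapse' \
| grep -vE '^CREATE DATABASE homeserver' \
| docker run --rm --interactive \
    --network=matrix \
    --env-file=/matrix/postgres/env-postgres-psql \
    postgres:12.0-alpine \
    psql -v ON_ERROR_STOP=1 -h matrix-postgres
```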
diff --git a/roles/matrix-postgres/tasks/import_postgres.yml b/roles/matrix-postgres/tasks/import_postgres.yml
index bbed1c95f..ba237d6a3 100644
--- a/roles/matrix-postgres/tasks/import_postgres.yml
+++ b/roles/matrix-postgres/tasks/import_postgres.yml
@@ -56,6 +56,10 @@
     msg: "Could not find existing Postgres installation"
   when: "not matrix_postgres_detected_existing|bool"
 
+# Starting the database container automatically created the default
+# role (`matrix_postgres_connection_username`) and database (`matrix_postgres_db_name`).
+# The dump most likely contains those same entries and would try to re-create them, leading to errors.
+# We need to skip over those lines.
 - name: Generate Postgres database import command
   set_fact:
     matrix_postgres_import_command: >-
@@ -67,17 +71,26 @@
       -v {{ server_path_postgres_dump }}:/{{ server_path_postgres_dump|basename }}:ro
       --entrypoint=/bin/sh
       {{ matrix_postgres_docker_image_latest }}
-      -c 'cat /{{ server_path_postgres_dump|basename }} |
+      -c "cat /{{ server_path_postgres_dump|basename }} |
       {{ 'gunzip |' if server_path_postgres_dump.endswith('.gz') else '' }}
-      psql -v ON_ERROR_STOP=1 -h matrix-postgres'
-
+      grep -vE '^CREATE ROLE {{ matrix_postgres_connection_username }}' |
+      grep -vE '^CREATE DATABASE {{ matrix_postgres_db_name }}' |
+      psql -v ON_ERROR_STOP=1 -h matrix-postgres"
+
+# This is a hack.
+# See: https://ansibledaily.com/print-to-standard-output-without-escaping/
+#
+# We want to run `debug: msg=".."`, but that dumps the message as JSON and escapes double quotes within it,
+# which ruins the command (`matrix_postgres_import_command`).
 - name: Note about Postgres importing alternative
-  debug:
-    msg: >-
-      Importing Postgres database using the following command: `{{ matrix_postgres_import_command }}`.
-      If this crashes, you can stop Postgres (`systemctl stop matrix-postgres`),
-      delete its existing data (`rm -rf {{ matrix_postgres_data_path }}/*`), start it again (`systemctl start matrix-postgres`)
-      and manually run the above import command directly on the server.
+  set_fact:
+    dummy: true
+  with_items:
+    - >-
+      Importing Postgres database using the following command: `{{ matrix_postgres_import_command }}`.
+      If this crashes, you can stop Postgres (`systemctl stop matrix-postgres`),
+      delete its existing data (`rm -rf {{ matrix_postgres_data_path }}/*`), start it again (`systemctl start matrix-postgres`)
+      and manually run the above import command directly on the server.
 
 - name: Perform Postgres database import
   command: "{{ matrix_postgres_import_command }}"
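To see why the two `grep -vE` filters in the generated command are needed: a `pg_dumpall` dump starts with global statements for every role and database on the server. With the playbook defaults (`synapse` role, `homeserver` database) the relevant part looks roughly like the sketch below; the exact statements vary by Postgres version.

```bash
# Abridged example of what pg_dumpall emits near the top of a dump for the default setup:
#
#   CREATE ROLE synapse;
#   ALTER ROLE synapse WITH LOGIN PASSWORD 'md5...';
#   CREATE DATABASE homeserver WITH TEMPLATE = template0 ENCODING = 'UTF8';
#
# The freshly started container has already created that role and database from its
# environment, so replaying the CREATE statements fails and, because of ON_ERROR_STOP=1,
# aborts the whole import. Dropping just those two lines lets the rest of the dump apply:
grep -vE '^CREATE ROLE synapse' dump.sql | grep -vE '^CREATE DATABASE homeserver'
```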
diff --git a/roles/matrix-postgres/tasks/upgrade_postgres.yml b/roles/matrix-postgres/tasks/upgrade_postgres.yml
index 64f3d64b4..096992dee 100644
--- a/roles/matrix-postgres/tasks/upgrade_postgres.yml
+++ b/roles/matrix-postgres/tasks/upgrade_postgres.yml
@@ -67,14 +67,21 @@
   delegate_to: 127.0.0.1
   become: false
 
+# We dump all databases, roles, etc.
+#
+# Because we'll be importing into a new container which initializes the default
+# role (`matrix_postgres_connection_username`) and database (`matrix_postgres_db_name`) by itself on startup,
+# we need to remove these from the dump when importing it, or we'll get errors saying they already exist.
 - name: Perform Postgres database dump
-  command: |
-    /usr/bin/docker run --rm --name matrix-postgres-dump \
-    --user={{ matrix_user_uid }}:{{ matrix_user_gid }} \
-    --network={{ matrix_docker_network }} \
-    --env-file={{ matrix_postgres_base_path }}/env-postgres-psql \
-    -v {{ postgres_dump_dir }}:/out \
-    {{ matrix_postgres_detected_version_corresponding_docker_image }} pg_dump -h matrix-postgres {{ matrix_postgres_db_name }} -f /out/{{ postgres_dump_name }}
+  command: >-
+    /usr/bin/docker run --rm --name matrix-postgres-dump
+    --user={{ matrix_user_uid }}:{{ matrix_user_gid }}
+    --network={{ matrix_docker_network }}
+    --env-file={{ matrix_postgres_base_path }}/env-postgres-psql
+    --entrypoint=/bin/sh
+    -v {{ postgres_dump_dir }}:/out
+    {{ matrix_postgres_detected_version_corresponding_docker_image }}
+    -c "pg_dumpall -h matrix-postgres > /out/{{ postgres_dump_name }}"
 
 - name: Ensure matrix-postgres is stopped
   service:
@@ -102,15 +109,43 @@
   delegate_to: 127.0.0.1
   become: false
 
+# Starting the database container automatically created the default
+# role (`matrix_postgres_connection_username`) and database (`matrix_postgres_db_name`).
+# The dump most likely contains those same entries and would try to re-create them, leading to errors.
+# We need to skip over those lines.
+- name: Generate Postgres database import command
+  set_fact:
+    matrix_postgres_import_command: >-
+      /usr/bin/docker run --rm --name matrix-postgres-import
+      --user={{ matrix_user_uid }}:{{ matrix_user_gid }}
+      --cap-drop=ALL
+      --network={{ matrix_docker_network }}
+      --env-file={{ matrix_postgres_base_path }}/env-postgres-psql
+      --entrypoint=/bin/sh
+      -v {{ postgres_dump_dir }}:/in:ro
+      {{ matrix_postgres_docker_image_latest }}
+      -c "cat /in/{{ postgres_dump_name }} |
+      grep -vE '^CREATE ROLE {{ matrix_postgres_connection_username }}' |
+      grep -vE '^CREATE DATABASE {{ matrix_postgres_db_name }}' |
+      psql -v ON_ERROR_STOP=1 -h matrix-postgres"
+
+# This is a hack.
+# See: https://ansibledaily.com/print-to-standard-output-without-escaping/
+#
+# We want to run `debug: msg=".."`, but that dumps the message as JSON and escapes double quotes within it,
+# which ruins the command (`matrix_postgres_import_command`).
+- name: Note about Postgres importing
+  set_fact:
+    dummy: true
+  with_items:
+    - >-
+      Importing Postgres database using the following command: `{{ matrix_postgres_import_command }}`.
+      If this crashes, you can stop Postgres (`systemctl stop matrix-postgres`),
+      delete the new database data (`rm -rf {{ matrix_postgres_data_path }}`)
+      and restore the automatically-made backup (`mv {{ postgres_auto_upgrade_backup_data_path }} {{ matrix_postgres_data_path }}`).
+
 - name: Perform Postgres database import
-  command: |
-    /usr/bin/docker run --rm --name matrix-postgres-import \
-    --user={{ matrix_user_uid }}:{{ matrix_user_gid }} \
-    --cap-drop=ALL \
-    --network={{ matrix_docker_network }} \
-    --env-file={{ matrix_postgres_base_path }}/env-postgres-psql \
-    -v {{ postgres_dump_dir }}:/in:ro \
-    {{ matrix_postgres_docker_image_latest }} psql -v ON_ERROR_STOP=1 -h matrix-postgres -f /in/{{ postgres_dump_name }}
+  command: "{{ matrix_postgres_import_command }}"
 
 - name: Delete Postgres database dump file
   file:
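Run outside Ansible, the new dump step amounts to roughly the command below. This is a sketch with the Jinja2 variables filled in by hand; `991:991`, the `matrix` network, the `postgres:11.5-alpine` image tag and the `matrix-postgres-dump.sql` file name are illustrative stand-ins for `matrix_user_uid`/`matrix_user_gid`, `matrix_docker_network`, `matrix_postgres_detected_version_corresponding_docker_image` and `postgres_dump_name`. The `--entrypoint=/bin/sh` wrapper is what performs the `>` redirection inside the container, where `/out` (the `/tmp` dump directory) is mounted.

```bash
# Sketch of the dump command with placeholder values substituted for the templated variables.
docker run --rm --name matrix-postgres-dump \
  --user=991:991 \
  --network=matrix \
  --env-file=/matrix/postgres/env-postgres-psql \
  --entrypoint=/bin/sh \
  -v /tmp:/out \
  postgres:11.5-alpine \
  -c 'pg_dumpall -h matrix-postgres > /out/matrix-postgres-dump.sql'
```

Switching from a per-database `pg_dump` to `pg_dumpall` here is what lets the upgrade carry over every database and role, matching the updated documentation above.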