Step 7: Setting up a Database
The Conference Guestbook website is about gathering feedback during conferences. We need to store the comments contributed by the conference attendees in a permanent storage.
A comment is best described by a fixed data structure: an author, their email, the text of the feedback, and an optional photo. The kind of data that can be best stored in a traditional relational database engine.
PostgreSQL is the database engine we will use.
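To make that structure concrete, here is a rough SQL sketch of the kind of table such comments imply. This is illustrative only: the column names are our own, and the real schema will be generated from PHP classes in a later step rather than written by hand:

```sql
-- Illustrative only: one row per comment, photo optional
CREATE TABLE comment (
    id SERIAL PRIMARY KEY,
    author VARCHAR(255) NOT NULL,
    email VARCHAR(255) NOT NULL,
    text TEXT NOT NULL,
    photo_filename VARCHAR(255)  -- NULL when no photo was uploaded
);
```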
Adding PostgreSQL to Docker Compose

On our local machine, we have decided to use Docker to manage services. Create a docker-compose.yaml file and add PostgreSQL as a service:

```yaml
version: '3'

services:
    database:
        image: postgres:13-alpine
        environment:
            POSTGRES_USER: main
            POSTGRES_PASSWORD: main
            POSTGRES_DB: main
        ports: [5432]
```
This will install a PostgreSQL server and configure some environment variables that control the database name and credentials. The values do not really matter.
We also expose the PostgreSQL port (5432) of the container to the local host. That will help us access the database from our machine.
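If you prefer a predictable port over a random one (for example, to configure a GUI client once and for all), Docker Compose also accepts an explicit host:container mapping. A sketch, assuming no other service on your machine already uses host port 5432:

```yaml
        ports:
            # host:container — pins the database to localhost:5432
            - "5432:5432"
```

Keep in mind that the random-port default is precisely what lets several PostgreSQL projects run side by side, so pin a port only when that trade-off is acceptable.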
The pdo_pgsql extension should have been installed when PHP was set up in a previous step.
Starting Docker Compose

Start Docker Compose in the background (-d):

```shell
$ docker-compose up -d
```
Wait a bit to let the database start up and check that everything is running fine:
```shell
$ docker-compose ps

        Name                      Command              State            Ports
---------------------------------------------------------------------------------------
guestbook_database_1   docker-entrypoint.sh postgres   Up      0.0.0.0:32780->5432/tcp
```
If there are no running containers or if the State column does not read Up, check the Docker Compose logs:

```shell
$ docker-compose logs
```
Accessing the Local Database

The psql command-line utility might prove useful from time to time. But you need to remember the credentials and the database name. Less obvious, you also need to know which local port on the host the database is mapped to. Docker chooses a random port so that you can work on more than one project using PostgreSQL at the same time (the local port is part of the docker-compose ps output).
If you run psql via the Symfony CLI, you don't need to remember anything. The Symfony CLI automatically detects the Docker services running for the project and exposes the environment variables that psql needs to connect to the database.

Thanks to these conventions, accessing the database via symfony run is much easier:

```shell
$ symfony run psql
```
If you don't have the psql binary on your local host, you can also run it via docker-compose:

```shell
$ docker-compose exec database psql -U main main
```
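Once connected, a couple of quick checks confirm you are talking to the right server (the queries below are generic PostgreSQL, nothing project-specific):

```sql
-- Should report PostgreSQL 13.x, matching the postgres:13-alpine image
SELECT version();
-- psql meta-command: shows the database, user, host and port in use
\conninfo
```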
Dumping and Restoring Database Data

Use pg_dump to dump the database data:

```shell
$ symfony run pg_dump --data-only > dump.sql
```

And restore the data:

```shell
$ symfony run psql < dump.sql
```
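Note that dump.sql is overwritten on every run. If you want to keep one dump per day, a dated filename helps; the naming scheme below is our own, not part of the project setup:

```shell
# Build a per-day dump filename such as dump-2023-04-01.sql
DUMP_FILE="dump-$(date +%Y-%m-%d).sql"
echo "$DUMP_FILE"

# Then dump into it exactly as before:
#   symfony run pg_dump --data-only > "$DUMP_FILE"
```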
Don't run docker-compose down if you don't want to lose data. Or, if you do, dump the data first so you can restore it afterwards.
Adding PostgreSQL to SymfonyCloud

For the production infrastructure on SymfonyCloud, adding a service like PostgreSQL should be done in the currently empty .symfony/services.yaml file:

```yaml
db:
    type: postgresql:13
    disk: 1024
    size: S
```
The db service is a PostgreSQL database (same version as for Docker) that we want to provision on a small container with 1GB of disk.

We also need to "link" the DB to the application container, which is described in .symfony.cloud.yaml:

```yaml
relationships:
    database: "db:postgresql"
```
The db service of type postgresql is referenced as database on the application container.
The last step is to add the pdo_pgsql extension to the PHP runtime:

```yaml
runtime:
    extensions:
        - pdo_pgsql
        # other extensions here
```
Here is the full diff for the .symfony.cloud.yaml file:

```diff
--- a/.symfony.cloud.yaml
+++ b/.symfony.cloud.yaml
@@ -4,6 +4,7 @@ type: php:7.4

 runtime:
     extensions:
+        - pdo_pgsql
         - apcu
         - mbstring
         - sodium
@@ -21,6 +22,9 @@ build:

 disk: 512

+relationships:
+    database: "db:postgresql"
+
 web:
     locations:
         "/":
```
Commit these changes and then re-deploy to SymfonyCloud:

```shell
$ git add .
$ git commit -m'Configuring the database'
$ symfony deploy
```
Accessing the SymfonyCloud Database

PostgreSQL is now running both locally via Docker and in production on SymfonyCloud.

As we have just seen, running symfony run psql automatically connects to the database hosted by Docker, thanks to the environment variables exposed by the Symfony CLI.

If you want to connect to PostgreSQL hosted on the production containers, you can open an SSH tunnel between the local machine and the SymfonyCloud infrastructure:

```shell
$ symfony tunnel:open --expose-env-vars
```
By default, SymfonyCloud services are not exposed as environment variables on the local machine. You must explicitly do so by using the --expose-env-vars flag. Why? Connecting to the production database is a dangerous operation. You can mess with real data. Requiring the flag is how you confirm that this is what you want to do.
Now, connect to the remote PostgreSQL database via symfony run psql as before:

```shell
$ symfony run psql
```
When done, don't forget to close the tunnel:

```shell
$ symfony tunnel:close
```
To run some SQL queries on the production database instead of getting a shell, you can also use the symfony sql command.
Exposing Environment Variables

Docker Compose and SymfonyCloud work seamlessly with Symfony thanks to environment variables.

Check all the environment variables exposed by symfony by executing var:export:

```shell
$ symfony var:export

PGHOST=127.0.0.1
PGPORT=32781
PGDATABASE=main
PGUSER=main
PGPASSWORD=main
# ...
```
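These PG* values also compose into a single connection URL, the form Symfony applications usually consume through a DATABASE_URL environment variable. The assembly below is our own illustration using the example values printed above; in practice the Symfony CLI wires this up for you:

```shell
# Example values, mirroring the var:export output above
PGHOST=127.0.0.1
PGPORT=32781
PGDATABASE=main
PGUSER=main
PGPASSWORD=main

# Assemble a PostgreSQL connection URL from its parts
DATABASE_URL="postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}:${PGPORT}/${PGDATABASE}"
echo "$DATABASE_URL"
# prints postgresql://main:main@127.0.0.1:32781/main
```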
The PG* environment variables are read by the psql utility. What about production? When a tunnel is open to SymfonyCloud with the --expose-env-vars flag set, the var:export command returns the remote environment variables:

```shell
$ symfony tunnel:open --expose-env-vars
$ symfony var:export
$ symfony tunnel:close
```
Describing your Infrastructure

You might not have realized it yet, but having the infrastructure stored in files alongside the code helps a lot. Docker and SymfonyCloud use configuration files to describe the project infrastructure. When a new feature needs an additional service, the code changes and the infrastructure changes are part of the same patch.
This work, including the code samples, is licensed under a Creative Commons BY-NC-SA 4.0 license.