Setting up a Database
The Conference Guestbook website is about gathering feedback during conferences. We need to store the comments contributed by conference attendees in permanent storage.
A comment is best described by a fixed data structure: an author, their email, the text of the feedback, and an optional photo. This is the kind of data best stored in a traditional relational database engine.
PostgreSQL is the database engine we will use.
On our local machine, we have decided to use Docker to manage services. The generated docker-compose.yml file already contains PostgreSQL as a service:
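The exact contents depend on the recipe version, but the service definition looks roughly like this (the image tag and credential values below are assumptions, not necessarily what the recipe generates):

```yaml
services:
    database:
        image: postgres:13-alpine
        environment:
            POSTGRES_DB: main
            POSTGRES_USER: main
            POSTGRES_PASSWORD: main
```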
This will install a PostgreSQL server and configure some environment variables that control the database name and credentials. The values do not really matter.
We also expose the PostgreSQL port (5432) of the container to the local host. That will help us access the database from our machine:
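Listing only the container port (without pinning a host port) lets Docker pick a free local port for each project; a sketch of what this looks like under the database service:

```yaml
services:
    database:
        # ...
        ports:
            # container port only: Docker maps it to a random free host port
            - "5432"
```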
The pdo_pgsql extension should have been installed when PHP was set up in a previous step.
Start Docker Compose in the background (-d):
$ docker-compose up -d
Wait a bit to let the database start up and check that everything is running fine:
$ docker-compose ps
        Name                     Command              State           Ports
---------------------------------------------------------------------------------------
guestbook_database_1   docker-entrypoint.sh postgres   Up      0.0.0.0:32780->5432/tcp
If there are no running containers or if the State column does not read Up, check the Docker Compose logs:
$ docker-compose logs
The psql command-line utility might prove useful from time to time. But you need to remember the credentials and the database name. Less obviously, you also need to know the local port the database runs on on the host. Docker chooses a random port so that you can work on more than one project using PostgreSQL at the same time (the local port is part of the docker-compose ps output).
If you run psql via the Symfony CLI, you don't need to remember anything. The Symfony CLI automatically detects the Docker services running for the project and exposes the environment variables that psql needs to connect to the database.
Thanks to these conventions, accessing the database via symfony run is much easier:
$ symfony run psql
If you don't have the psql binary on your local host, you can also run it via docker-compose:
$ docker-compose exec database psql main main
You can also use pg_dump to dump the database data:
$ symfony run pg_dump --data-only > dump.sql
And restore the data:
$ symfony run psql < dump.sql
For the production infrastructure on Platform.sh, adding a service like PostgreSQL should be done in the .platform/services.yaml file, which was already done via the webapp package recipe:
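The generated entry looks roughly like the sketch below (the PostgreSQL version number here is an assumption; 1024 is the disk size in MB):

```yaml
database:
    type: postgresql:13
    disk: 1024
```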
The database service is a PostgreSQL database (same version as for Docker) that we want to provision with 1GB of disk.
We also need to "link" the database to the application container, which is described in the application configuration: the database service of type postgresql is referenced as database on the application container.
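On Platform.sh this link is declared as a relationship in the application configuration file (conventionally .platform.app.yaml); the snippet below is a sketch of that declaration, not the exact generated file:

```yaml
relationships:
    # <local name>: "<service name>:<endpoint>"
    database: "database:postgresql"
```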
Check that the pdo_pgsql extension is already installed for the PHP runtime:
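In the same application configuration file, PHP extensions are enabled under the runtime key; a sketch of what the relevant lines look like:

```yaml
runtime:
    extensions:
        - pdo_pgsql
```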
PostgreSQL is now running both locally via Docker and in production on Platform.sh.
As we have just seen, running symfony run psql automatically connects to the database hosted by Docker thanks to the environment variables exposed by the Symfony CLI.
If you want to connect to PostgreSQL hosted on the production containers, you can open an SSH tunnel between the local machine and the Platform.sh infrastructure:
$ symfony cloud:tunnel:open
$ symfony var:expose-from-tunnel
By default, Platform.sh services are not exposed as environment variables on the local machine. You must explicitly do so by running the var:expose-from-tunnel command. Why? Connecting to the production database is a dangerous operation. You can mess with real data.
Now, connect to the remote PostgreSQL database via symfony run psql as before:
$ symfony run psql
When done, don't forget to close the tunnel:
$ symfony cloud:tunnel:close
To run some SQL queries on the production database instead of getting a shell, you can also use the symfony sql command.
Docker Compose and Platform.sh work seamlessly with Symfony thanks to environment variables.
Check all environment variables exposed by symfony by executing:
$ symfony var:export
PGHOST=127.0.0.1
PGPORT=32781
PGDATABASE=main
PGUSER=main
PGPASSWORD=main
# ...
The PG* environment variables are read by the psql utility. What about the others?
When a tunnel is open to Platform.sh and the variables are exposed via var:expose-from-tunnel, the var:export command returns the remote environment variables:
$ symfony cloud:tunnel:open
$ symfony var:expose-from-tunnel
$ symfony var:export
$ symfony cloud:tunnel:close
You might not have realized it yet, but having the infrastructure stored in files alongside the code helps a lot. Docker and Platform.sh use configuration files to describe the project infrastructure. When a new feature needs an additional service, the code changes and the infrastructure changes are part of the same patch.