Some whining

For many years I used a local PostgreSQL installation for my projects. But this always had many limitations, starting from the fixed DB engine version and ending with the need to maintain multiple databases for several projects, with migrations, rollbacks, and the occasional data corruption that happens during development.

Docker to the rescue!

So I moved to Docker, and that solved all these issues at once. Below I describe a simple setup for a project that runs on a PostgreSQL database, provide the configs, and set up DB access from localhost. All commands below assume you are in your project root directory.

Create directory layout for your database:

mkdir -p .dockerdb/db .dockerdb/initdb

When you start working on a project that has been around for some time, it usually already has a test database up and running. If this is the case, you can use it to initialise your local one. So first take a dump of it and save it locally:

pg_dump --host=<db_host> --port=5432 --dbname=<db_name> \
  --username=<access_user> --file=.dockerdb/initdb/db.sql

Otherwise, you can create .dockerdb/initdb/db.sql manually and fill it with SQL commands to create your tables and populate some initial data if you want. If you don't, that's also OK: you will just start with an empty DB, without any tables or data, on the first Docker startup.
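
For example, a minimal hand-written init script might look like this (the users table and its rows are placeholders of my own invention, not part of any real project):

```shell
# Write a hypothetical minimal init script; replace the table
# and rows below with your own schema and seed data.
mkdir -p .dockerdb/initdb
cat > .dockerdb/initdb/db.sql <<'EOF'
CREATE TABLE users (
    id    serial PRIMARY KEY,
    email text NOT NULL UNIQUE
);

INSERT INTO users (email) VALUES
    ('alice@example.com'),
    ('bob@example.com');
EOF
```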

Now create a .dockerdb/docker-compose.yml file with the following contents:

version: '3'

services:
  db:
    image: postgres:10.6-alpine  # use the necessary version here
    environment:
      - POSTGRES_PASSWORD=secure_password
      - POSTGRES_USER=myusername
      - POSTGRES_DB=example_project_db
    ports:
      - 15432:5432  # expose port 15432 for DB connections
    volumes:
      # Preserve DB contents between Docker runs by storing them locally
      - ./db:/var/lib/postgresql/data
      # Startup SQL scripts: if there's no DB upon startup,
      #  scripts from the ./initdb/ directory will be used to create one
      #  (that's why we saved the DB backup there):
      - ./initdb/:/docker-entrypoint-initdb.d

See the official Docker postgres image documentation for a list of available images and more information about the parameters above.

Run the database

And that’s all! Before the first start you should have these files and directories:

$ tree .dockerdb/
.dockerdb/
├── db
├── docker-compose.yml
└── initdb
    └── db.sql

I’d recommend running a DB in a separate terminal with these commands:

cd .dockerdb
docker-compose up; docker-compose down

The first command (up) will own the terminal and print a nice log of all the processes. When you are done with the DB, just hit Ctrl-C in this window: that terminates up and moves on to the graceful down command.
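
If you find yourself typing this pair often, you can wrap it into a tiny helper script (db.sh is my own naming, not part of the setup above):

```shell
# Hypothetical convenience wrapper: starts the DB in the foreground
# and guarantees the graceful `down` runs after Ctrl-C.
cat > db.sh <<'EOF'
#!/bin/sh
cd .dockerdb || exit 1
docker-compose up
docker-compose down
EOF
chmod +x db.sh
```

Alternatively, docker-compose up -d starts the containers in the background, and docker-compose down stops them when you are done.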

Connect to the database

To connect to the running DB, use the credentials you set up in the docker-compose.yml file. In our case:

$ psql --host=localhost --port=15432 example_project_db myusername
Password for user myusername:
psql (11.2, server 10.6)
Type "help" for help.
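
The same connection can also be made non-interactively, which is handy for scripts; a quick liveness check might look like this (PGPASSWORD is a standard libpq environment variable, and db_ping.sh is my own naming):

```shell
# Hypothetical helper: non-interactive connection check against
# the dockerised DB, using the credentials from docker-compose.yml.
cat > db_ping.sh <<'EOF'
#!/bin/sh
PGPASSWORD=secure_password psql --host=localhost --port=15432 \
    --username=myusername --dbname=example_project_db \
    --command='SELECT 1;' >/dev/null && echo "DB is up"
EOF
chmod +x db_ping.sh
```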


If you use Django in your project, fill in the same credentials in settings.py (or another settings file you use):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'example_project_db',
        'USER': 'myusername',
        'PASSWORD': 'secure_password',
        'HOST': 'localhost',
        'PORT': 15432,
    },
}


DB handling is super easy with Docker. All your DB files are preserved between runs in the .dockerdb/db directory:

$ tree .dockerdb/db -d
.dockerdb/db
├── base
│   ├── 1
│   ├── 13014
│   ├── 13015
│   └── 16384
├── global
├── pg_commit_ts
├── pg_dynshmem
├── pg_logical
│   ├── mappings
│   └── snapshots
├── pg_multixact
│   ├── members
│   └── offsets
├── pg_notify
├── pg_replslot
├── pg_serial
├── pg_snapshots
├── pg_stat
├── pg_stat_tmp
├── pg_subtrans
├── pg_tblspc
├── pg_twophase
├── pg_wal
│   └── archive_status
└── pg_xact

If your DB becomes corrupted at some point (invalid migrations, playing with data, etc.), you can easily recreate it. First stop the database (Ctrl-C in the docker-compose terminal) and clean up the DB files:

rm -rf .dockerdb/db/*

After that, start the DB as usual: it will create a new DB from the same dump (or an empty one if you have no dump).
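
To avoid losing recent work, you can refresh the init dump from the still-running local DB before wiping, so the recreated DB restores your latest data rather than the original backup. A sketch, assuming the credentials from the docker-compose.yml above (refresh_dump.sh is my own naming):

```shell
# Hypothetical helper: overwrite the init dump with the current
# state of the local DB; run it while the container is still up.
cat > refresh_dump.sh <<'EOF'
#!/bin/sh
set -e
PGPASSWORD=secure_password pg_dump --host=localhost --port=15432 \
    --username=myusername --dbname=example_project_db \
    --file=.dockerdb/initdb/db.sql
EOF
chmod +x refresh_dump.sh
```

Run it, then stop the container and remove .dockerdb/db/* as described above; the next startup imports the fresh dump.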