Building on top of our docker-machine tutorial, we’re going to use the Docker Toolbox to provision a server and deploy a Ruby on Rails application to the cloud.
Goals for this Tutorial:
- Create a Docker host with docker-machine.
- Deploy a Ruby on Rails application to the server using docker-compose.
Up to this point we’ve looked at how to use Docker in our development environment, how to package up our own images, and how to use docker-compose to orchestrate different services. These things are all great for having a sharable development environment, but there’s nothing stopping us from using them to deploy applications.
Our Rails Application
We’re actually going to build on top of the previous Docker in Development tutorial for our application. It shouldn’t be too difficult to take the basic application that we created in that tutorial and run it in the cloud.
The first thing that we need to do is grab the code. We’ll use git to pull the sample code for the previous tutorial from GitHub. We’re placing a . at the end of this command so that the contents of the repository are placed in our current directory.
$ git clone git@github.com:coderjourney/03-learn-to-use-docker-in-development.git .
We’ll need to change into the blog directory, but besides that we’re good to go with a Rails application.
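That’s a single command after the clone finishes:
$ cd blog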
Looking at the Docker Setup
Before we get too far we should refresh our memories as to how we’ve set up Docker and docker-compose in this application.
Dockerfile:
FROM ruby:2.3
RUN apt-get update -yqq \
    && apt-get install -yqq --no-install-recommends \
      postgresql-client \
    && rm -rf /var/lib/apt/lists
WORKDIR /usr/src/app
COPY Gemfile* ./
RUN bundle install
COPY . .
EXPOSE 3000
CMD rails server -b 0.0.0.0
docker-compose.yml:
version: "2"
volumes:
  db-data:
    external: false
services:
  db:
    image: postgres
    env_file: .env
    volumes:
      - db-data:/var/lib/postgresql/db-data
  app:
    build: .
    env_file: .env
    volumes:
      - .:/usr/src/app
    ports:
      - "3000:3000"
    depends_on:
      - db
We’ve also packaged up the .env file that holds the environment variables that we’re passing into the containers. Normally I wouldn’t store this file in version control because it contains secrets and passwords (see the note after the file for one way to handle that). Let’s look at that file again.
.env:
POSTGRES_USER=coderjourney
POSTGRES_PASSWORD=abcd1234
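As an aside, the sample repo includes this file on purpose so the tutorial works out of the box; in your own projects, adding it to .gitignore before your first commit is enough to keep it out of version control:
$ echo ".env" >> .gitignore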
Everything here looks pretty good so let’s see if we can deploy this as is.
docker-machine & DigitalOcean
There are essentially two things that we need to deploy a container for a web application into the wild:
1) A Docker host server to run the container
2) An image to download or a Dockerfile to build the image from
Thankfully, Docker provides us with a really nice tool for building Docker hosts easily on quite a few hosting providers. For this tutorial, we’re going to host with DigitalOcean (that link will get you a $10 credit for signing up) and create our host using docker-machine. You’ll need to sign up with DigitalOcean before we begin because you’ll need to grab your API token. Once you have your account you can go here to generate a new API token.
I have my token stored in the environment variable DO_TOKEN (you can set that for yourself using export DO_TOKEN="YOUR_TOKEN"). Now that we have our API token, we can use docker-machine to create a droplet and set it up as a Docker host for us.
$ docker-machine create --driver=digitalocean --digitalocean-access-token=$DO_TOKEN --digitalocean-size=1gb blog
Now we have our first Docker host running “in the cloud” 😀. There are a lot more configuration values that you can pass to the DigitalOcean driver, so check those out in the Docker docs.
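If you want to double-check that the droplet was created and is reachable, docker-machine will list every machine it manages along with its state:
$ docker-machine ls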
After our machine is up and running, we’ll want to set it as our active machine using the env command that docker-machine gives us:
$ eval $(docker-machine env blog)
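If you’re curious what that eval is doing, docker-machine env blog just prints a handful of shell exports, roughly like this (your IP and paths will differ):
export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://45.55.89.18:2376"
export DOCKER_CERT_PATH="/home/you/.docker/machine/machines/blog"
export DOCKER_MACHINE_NAME="blog"
# Run this command to configure your shell:
# eval $(docker-machine env blog)
With those variables set, the docker and docker-compose commands we run locally are actually executed against the droplet.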
Deploying with Docker
Just like when we were learning to develop a Rails application using Docker, we’re going to need to do a few things before our application will actually run. First, we need to create our database. We can do that by starting the services that run from pre-built images; those shouldn’t have any issues.
$ docker-compose up -d db
This will create the network that all of our containers connect to in order to communicate with one another, and it will also start our postgres container.
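If you’d like to confirm that the database container really started on the remote host, docker-compose can report the state of our services:
$ docker-compose ps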
Next up, we’ll need to build our app image before we can run commands. This is going to need to install all of the gems and dependencies, so it will likely take a while.
$ docker-compose build app
After we’ve built our image we can run some commands with it. First, we need to create and migrate our database:
$ docker-compose run --rm app rake db:create db:migrate
/usr/src/app/Gemfile not found
Well, that’s not good. This is an issue caused by our docker-compose.yml file mounting our local directory into the container as a volume so that we could develop using our normal tools and run the application in a container. That doesn’t work with a remote Docker host like the one we’re working with now.
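Specifically, it’s this piece of the app service from the file above that causes the failure, since the remote host has no copy of our source tree to mount:
volumes:
  - .:/usr/src/app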
To fix this problem we need to change the docker-compose.yml file, but that will mean that we won’t be able to work with our application in development 😱. Thankfully, we can get past that by creating a different compose file for our “production” deploys.
Let’s make a copy of our current file and call it docker-compose.prod.yml:
$ cp docker-compose.yml docker-compose.prod.yml
We can make a few tweaks to this file and we’ll be well on our way to having this app deployed.
docker-compose.prod.yml:
version: "2"
volumes:
  db-data:
    external: false
services:
  db:
    image: postgres
    env_file: .env
    volumes:
      - db-data:/var/lib/postgresql/db-data
  app:
    build: .
    env_file: .env
    environment:
      RAILS_ENV: production
    ports:
      - "3000:3000"
    depends_on:
      - db
We’ll also adjust the “production” block of our config/database.yml and add POSTGRES_HOST to our .env file to make sure we can connect to the database.
config/database.yml:
# Extra values excluded
production:
  <<: *default
  host: <%= ENV["POSTGRES_HOST"] %>
  database: blog_production
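The excluded values aren’t shown here, but for context, a default block for this kind of setup would typically read the same credentials we’re passing in through .env. This is only a sketch, not the repo’s actual file:
# Sketch of a typical default block; the real file may differ
default: &default
  adapter: postgresql
  encoding: unicode
  username: <%= ENV["POSTGRES_USER"] %>
  password: <%= ENV["POSTGRES_PASSWORD"] %>
  pool: 5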
.env:
POSTGRES_HOST=db
POSTGRES_USER=coderjourney
POSTGRES_PASSWORD=abcd1234
Since we’ve made a modification to the files that are in the image, we’ll need to rebuild our app image before we continue; thankfully, the caching will allow this to be quick. Notice that we’re using the -f flag to specify a different docker-compose file.
$ docker-compose -f docker-compose.prod.yml build app
After that’s finished we should be able to create our database:
$ docker-compose -f docker-compose.prod.yml run --rm app rake db:create db:migrate
Yay, our database was successfully created! Let’s start our application and see if it works. We’ll run it in the foreground so that we can see the logs.
$ docker-compose -f docker-compose.prod.yml up app
Connecting to the App
We should be running now. Our next step is to try to access the application. For that we’ll need the IP address of our machine (your IP will be different).
$ docker-machine ip blog
45.55.89.18
Now if we go to http://45.55.89.18:3000 we should see our application… Except we get this message:
An unhandled lowlevel error occurred. The application logs may have details.
Fixing the App
Thankfully, if we look at our logs we see this error:
#<RuntimeError: Missing `secret_key_base` for 'production' environment, set this value in `config/secrets.yml`>
We need to define a SECRET_KEY_BASE in our .env file since that’s the environment variable used in config/secrets.yml.
.env:
POSTGRES_HOST=db
POSTGRES_USER=coderjourney
POSTGRES_PASSWORD=abcd1234
SECRET_KEY_BASE=cb6b786b1ec490b51694594a1243cffef162655f93f5f0927ef7de7553039d7440560cba832a5ec53c0e7293f3382ce707628367bde3b8290255ada8a3f64737
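That particular value is public now that it’s in this post, so for a real deploy you’d want to generate your own. Rails ships a generator for this; one way to run it through Compose:
$ docker-compose -f docker-compose.prod.yml run --rm app rake secret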
After that’s finished we’ll run our application again, this time in the background (confidence!).
$ docker-compose -f docker-compose.prod.yml up -d app
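Since the container is detached this time, we can still peek at its output whenever we need to:
$ docker-compose -f docker-compose.prod.yml logs app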
If we try to connect to http://45.55.89.18:3000 now, we should see our fancy “Hello from Docker” message.
Recap
We’ve successfully deployed a “hello world” Rails application to a Docker host running in the cloud. There were a few snags that we had to deal with that don’t come up when running in development from a local VM, but we made it through. This deploy is not “good”. There’s quite a bit more that we should do to make it production ready, but we did get the application to run.
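One last note: the droplet keeps running (and billing) until it’s removed, and docker-machine can tear it back down when we’re done experimenting:
$ docker-machine rm blog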