Set up Node.js, Apache and an nginx reverse proxy with Docker
A complete example
A modern web app requires an environment to run JavaScript on a server and to request data from APIs running on other types of servers.
Edit Apr 28, 2018: removed the deprecated links docker-compose instructions.
Let’s make an example which does the following when the server receives a client request:
- An nginx reverse-proxy forwards incoming traffic to the appropriate server and directly serves static assets (images and scripts).
- A Node.js server builds pages with content pre-fetched from the PHP API (server-side rendering).
- A PHP API running on Apache provides content as JSON.
The code is on GitHub.
This could be extended to build a Progressive Web App (with Vue.js or React) on top of the WordPress API (or anything with a REST API).
tl;dr
# Install and run the code with:
$ git clone https://github.com/francoisromain/docker-nginx-nodejs-php.git && cd docker-nginx-nodejs-php
$ docker-compose up
# Go to http://localhost:8000 and see the result.
Prerequisite
- Install Docker on your local computer and your server.
Project structure
.
+-- nginx
+-- nodejs
| +-- index.js
| +-- …
+-- php
| +-- api
| | +-- index.php
| | +-- …
| +-- content
| | +-- image.jpg
| | +-- …
+-- static
| +-- scripts.js
Docker
You could install nginx, Apache, and Node.js individually on the host machine. However, if you ever need to duplicate this, on a local computer or another server, you would have to repeat the installation process step by step. Docker aims to solve this problem by making applications run inside isolated environments called containers. Containers can run on any host (server, local computer, etc.) and can be shared easily.
Let’s use Docker Compose to create a container description for each process and then start everything.
Create a file named docker-compose.yml at the root of the project with this content:
version: "3.1"

services:
  nginx:
    image: nginx:alpine
    ports:
      - "8000:80"
    volumes:
      - ./php/content:/srv/www/content
      - ./static:/srv/www/static
      - ./nginx/default.conf:/etc/nginx/conf.d/default.conf
    depends_on:
      - php
      - nodejs
  nodejs:
    image: node:alpine
    environment:
      NODE_ENV: production
    working_dir: /home/app
    restart: always
    volumes:
      - ./nodejs:/home/app
    depends_on:
      - php
    command: ["node", "index"]
  php:
    image: php:apache
    volumes:
      - ./php:/var/www/html
Inside this file, 3 services are defined: nginx, nodejs, and php. Each service has a list of options:
- image: defines the container content. This image is pulled from the Docker Hub.
- ports: exposes ports from the host to the container, as HOST:CONTAINER.
- volumes: shares content between the host and the container.
- depends_on: declares dependencies on the containers of other services. This also determines the starting order: a service starts after the services it depends on.
See the Docker Compose file reference for more detail.
nginx
nginx (pronounced "engine x") is an HTTP and reverse-proxy server. The role of a reverse proxy is to route incoming traffic to the appropriate server, which eventually handles the request. Even though nginx and Apache overlap in functionality, it is a good idea to use them together: nginx is fast at serving static files directly, which lets Apache handle only PHP.
Inside the /nginx directory, create a file named default.conf with this configuration:
server {
  listen 80;
  root /srv/www;

  proxy_set_header Host $host;
  proxy_set_header X-Real-IP $remote_addr;
  proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
  proxy_set_header X-Forwarded-Proto $scheme;

  location / {
    try_files $uri @nodejs;
  }

  location /api {
    rewrite ^([^.\?]*[^/])$ $1/ break;
    proxy_pass http://php:80;
  }

  location @nodejs {
    proxy_pass http://nodejs:8080;
  }
}
The server instruction defines our server:
- listen: the listening port.
- root: where the files are served from.
- proxy_set_header: defines the headers sent to the proxied servers.

Each location instruction defines what to proxy depending on the incoming request. nginx selects the most specific match in the list.
- /: the default instruction (if no other matches). try_files checks for the existence of files in the specified order. Here, it first tries to match the exact uri; then, if no file is found, it sends the request to the @nodejs aliased location.
- /api: proxies requests to http://php. The rewrite instruction handles the missing trailing slash.
- @nodejs: an alias used in the default location instruction, which proxies requests to http://nodejs.

Docker dynamically modifies the /etc/hosts file in each individual container for this to work.
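To make the selection rule concrete, here is a small JavaScript sketch (an illustration only, not how nginx actually works) of picking the upstream for a request path, assuming the three locations above and a stand-in list of static files:

```javascript
// Illustration only: mimic nginx picking an upstream for a request path.
// staticFiles stands in for what exists under root /srv/www.
const staticFiles = new Set(['/static/scripts.js', '/content/apple.jpg'])

const pickUpstream = path => {
  // location /api is more specific than location /, so it wins for /api/...
  if (path.startsWith('/api')) return 'http://php:80'
  // location / uses try_files: serve the file directly if it exists...
  if (staticFiles.has(path)) return 'file:' + path
  // ...otherwise fall through to the @nodejs alias
  return 'http://nodejs:8080'
}

console.log(pickUpstream('/api/'))              // http://php:80
console.log(pickUpstream('/static/scripts.js')) // file:/static/scripts.js
console.log(pickUpstream('/'))                  // http://nodejs:8080
```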
Apache
For the sake of this demo, let's mimic a very simple API with a single index.php file inside the /php/api/ directory.
<?php
$db = Array (
  Array ("name" => "apples", "value" => 5, "img" => "/content/apple.jpg"),
  Array ("name" => "oranges", "value" => 3, "img" => "/content/orange.jpg"),
  Array ("name" => "pears", "value" => 12, "img" => "/content/pear.jpg")
);

header("Content-type: application/json");
header("HTTP/1.1 200 OK");
echo json_encode($db);
?>
On every request to http://localhost:8000/api/, this returns a JSON array like so:
[
{ name: 'apples', value: 5, img: '/content/apple.jpg' },
{ name: 'oranges', value: 3, img: '/content/orange.jpg' },
{ name: 'pears', value: 12, img: '/content/pear.jpg' }
]
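This is the payload shape the Node.js server will rely on: an array of objects with name, value, and img fields. A minimal sketch of consuming it (the sample data is hard-coded here instead of fetched):

```javascript
// Parse the API payload and aggregate it, as the Node.js server will.
const payload = JSON.stringify([
  { name: 'apples', value: 5, img: '/content/apple.jpg' },
  { name: 'oranges', value: 3, img: '/content/orange.jpg' },
  { name: 'pears', value: 12, img: '/content/pear.jpg' }
])

const items = JSON.parse(payload)
const total = items.reduce((sum, entry) => sum + entry.value, 0)
console.log(total) // 20
```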
Note: by default, the mod_rewrite Apache module, useful for handling URL rewriting, is not enabled.
Node.js
The Node.js server has two jobs:
- Fetching data from the PHP API
- Serving pages on incoming requests, with fetched data included
To keep this example as simple as possible, we write everything in a single index.js file inside the /nodejs/ directory and avoid using any external dependencies.
1. Fetching data from the PHP API
To fetch data from the API, we use the native Node http.get method. Later, we will use this inside the method which handles user requests. So, to avoid a huge function full of callbacks and to keep things separated, let's wrap the http.get method inside a Promise, like so:
const apiUrl = 'http://php:80/api/'

// the promise wrapper
const apiFetch = () => {
  return new Promise((resolve, reject) => {
    // the http.get method, from the Node.js doc example:
    // https://nodejs.org/api/http.html#http_http_get_options_callback
    http.get(apiUrl, res => {
      const contentType = res.headers['content-type']
      let error
      let data = ''

      if (res.statusCode !== 200) {
        error = new Error('Api error / statusCode: ' + res.statusCode)
      } else if (!/^application\/json/.test(contentType)) {
        error = new Error('Api error / contentType: ' + contentType)
      }

      if (error) {
        console.error(error.message)
        res.resume() // consume the response to free memory
        reject(error)
        return // do not keep listening for data after a failure
      }

      res.setEncoding('utf8')
      res.on('data', chunk => { data += chunk })
      res.on('end', () => {
        try {
          const apiJson = JSON.parse(data)
          console.log('Api success / apiJson: ', apiJson)
          resolve(apiJson)
        } catch (e) {
          console.error('Api error / response: ', e.message)
          reject(e.message)
        }
      })
    }).on('error', e => {
      console.error('Api error / get: ', e.message)
      reject(e.message)
    })
  })
}
Notice that Node.js connects to the API via the php service name exposed by Docker, just like in the nginx config.
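The same callback-to-Promise pattern can be seen in isolation; a minimal sketch with a made-up callback-style function (readConfig and readConfigAsync are hypothetical names, not part of the project):

```javascript
// A made-up Node-style callback API (hypothetical, for illustration).
const readConfig = (name, callback) => {
  if (name === 'missing') callback(new Error('not found'))
  else callback(null, { name, debug: false })
}

// Wrap it in a Promise, just like apiFetch wraps http.get:
// resolve on success, reject on error.
const readConfigAsync = name =>
  new Promise((resolve, reject) => {
    readConfig(name, (err, config) => {
      if (err) reject(err)
      else resolve(config)
    })
  })

readConfigAsync('app').then(config => console.log(config.name)) // app
```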
2. Serving pages on incoming requests, with fetched data included
We use a simple Node.js server and include a call to the apiFetch function above:
const http = require('http')
const port = 8080

const server = http.createServer((req, res) => {
  console.log('request: ' + req.url)
  res.writeHead(200, { 'Content-Type': 'text/html' })

  // Fetch the api
  apiFetch().then(json => {
    res.write(htmlTemplateCreate(htmlListCreate(json)))
    res.end()
  }).catch(error => {
    res.write(htmlTemplateCreate(`<p>${error}</p>`))
    res.end()
  })
})

server.listen(port)
console.log('Server running at http://localhost:' + port + '/')
Finally, we need the two functions htmlTemplateCreate and htmlListCreate to render the html template and include the fetched content:
const htmlTemplateCreate = htmlString => `<!DOCTYPE html>
<html>
<head>
<title>Test</title>
</head>
<body>
<div id="client-list">
<h2>Client side rendered</h2>
</div>
<div id="server-list">
<h2>Server side rendered</h2>
${htmlString}
</div>
<script src="/static/scripts.js"></script>
</body>
</html>`

// Create an html list from json data
const htmlListCreate = json => {
  let htmlListString = '<ul>'
  json.forEach(entry => {
    htmlListString += `<li>
      <img src="${entry.img}">
      <span>${entry.name}: ${entry.value}</span>
    </li>`
  })
  htmlListString += '</ul>'
  return htmlListString
}
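For illustration, here is how the two helpers compose (redeclared in compact form so the snippet stands alone, with hard-coded sample data):

```javascript
// Compact restatement of the two helpers, for a standalone demo.
const htmlTemplateCreate = htmlString =>
  `<!DOCTYPE html><html><body><div id="server-list">${htmlString}</div></body></html>`

const htmlListCreate = json => {
  let htmlListString = '<ul>'
  json.forEach(entry => {
    htmlListString += `<li><img src="${entry.img}"><span>${entry.name}: ${entry.value}</span></li>`
  })
  return htmlListString + '</ul>'
}

// The server does exactly this with the fetched JSON instead of a literal.
const page = htmlTemplateCreate(htmlListCreate([
  { name: 'apples', value: 5, img: '/content/apple.jpg' }
]))
console.log(page.includes('<span>apples: 5</span>')) // true
```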
Start the machine
From this point, things should work fine. Let's boot everything up with a single command. From the root directory of the project:
$ docker-compose up
- Docker installs everything specified in the docker-compose.yml file.
- It starts and connects the services together.

Check the result at http://localhost:8000.
Bonus: client side rendering
Until now, no JavaScript is running in the browser. To fetch data from the PHP API from the client side, we create a scripts.js file in the /static/ directory. (This one is already linked from the html template made previously.)
const url = 'http://localhost:8000/api'
const $clientList = document.getElementById('client-list')
const createNode = element => document.createElement(element)
const append = (parent, el) => parent.appendChild(el)

const htmlListCreate = json => {
  const ul = createNode('ul')
  json.forEach(entry => {
    const li = createNode('li')
    const img = createNode('img')
    const span = createNode('span')
    img.src = entry.img
    span.innerHTML = entry.name + ': ' + entry.value
    append(li, img)
    append(li, span)
    append(ul, li)
  })
  return ul
}

fetch(url)
  .then(data => data.json())
  .then(json => {
    console.log(json)
    append($clientList, htmlListCreate(json))
  })
  .catch(error => console.log(error))
This fetches data from the API and inserts content into the page.
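One thing the snippet above skips is checking the response status before parsing. A sketch of a small validator (validateApiResponse is a made-up helper name), written against plain values so it runs outside the browser too:

```javascript
// Made-up helper: decide whether an API response is safe to parse as JSON.
// Returns null when ok, or a short description of the problem.
const validateApiResponse = (status, contentType) => {
  if (status !== 200) return 'bad status: ' + status
  if (!/^application\/json/.test(contentType || '')) return 'bad content-type: ' + contentType
  return null // null means: ok to call data.json()
}

console.log(validateApiResponse(200, 'application/json; charset=utf-8')) // null
console.log(validateApiResponse(404, 'text/html')) // bad status: 404
```

In the fetch chain, this check would run on data.status and the content-type header before calling data.json().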
Next Steps
- Include a js front-end framework and a build step.
- Make a real API.
- Deploy.