Creating a web app with a Raspberry Pi, Express and PostgreSQL

This post is a detailed guide for anyone who is interested in setting up a simple web application with an HTTP API on a Raspberry Pi that runs on their home network. It uses Express as the web application framework which integrates with a PostgreSQL database.

Using a Raspberry Pi for something like this enables various projects around the house. It’s also a great learning project to get comfortable with things like PostgreSQL and Linux, since the Raspberry Pi OS is based on a popular Linux distribution, Debian.

For this project, I wanted to have a web application with an HTTP API that could store data received from a homemade sensor over Wi-Fi, as well as an API to surface sensor measurements in an interface like Apple’s Home app. The homemade sensor and displaying the data in the Apple Home app will be covered in later posts. The focus for this post is setting up the API server on the Raspberry Pi, which sits in the middle of that chain.

I did this with a Raspberry Pi 4 Model B, but you could probably do the same thing with the more cost-effective Raspberry Pi Zero W. Remember that you’ll need a micro SD card and USB power supply to go with that.

This post goes into as much detail as possible, so if you’re a relatively new programmer you can follow along to learn more about building and deploying an Express project with PostgreSQL. If you’re already familiar with some of these then you might just want to skim over some sections.

This project doesn’t use an ORM as I wanted to get some hands-on experience with SQL. While I generally prefer an ORM for public-facing projects, skipping one here keeps our setup simple and gets us to results faster. Since this isn’t a public-facing service, we don’t really need all the validation provided by model schemas and so on. Various security precautions are also discussed in the post to keep things reasonably secure for its home use case.

The project we’ll be creating in this post is called Habitat.

Getting started

Okay, so the general approach we’re going to take with this project is to set up the web app and database on our own development machine (your laptop, home PC or whatever you use to write code) first, and then deploy those to the Raspberry Pi.

Software installation

The first step is to make sure you have the following software installed on your development machine:

  • Node.js - a popular JavaScript runtime. It lets you write JavaScript code that interacts with a computer’s OS, instead of just web pages in a browser.
  • PostgreSQL - a popular SQL database application
  • Git - a popular tool for managing and sharing versions of your code

You can check if any of these are installed by running the commands below in your command-line. If the software is installed, the command-line should output the version number after you’ve entered the command.

$ node -v
$ psql --version
$ git --version

If some of these are not installed, click on the above links to find installation instructions for your particular operating system.

Once you have the required software on your dev machine we can move on to setting up the database.

Database

If you’re not familiar with PostgreSQL, you might find this quick primer helpful.

When you install PostgreSQL it creates a default user called postgres. This is the superuser that we can use to set up a database and a user for our application. Let’s open up the command line and log in as the postgres user with the following command:

$ sudo -i -u postgres

Now we can start interacting with Postgres from the command line using the psql client that gets installed with it. While logged in as the postgres user in your terminal, you can start a psql session (to run SQL queries and other commands) by entering the command psql without any arguments:

$ psql
postgres=#

You should now see that your prompt has changed to the postgres=# prompt, which means you’re logged in as a superuser (denoted by the #) and connected to the default postgres database. You can now start typing in psql or SQL commands.
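
While you’re at this prompt, a few standard psql meta-commands are handy for looking around:

postgres=# \conninfo
postgres=# \l
postgres=# \du
postgres=# \dt

\conninfo shows which database and user you’re connected as, \l lists all databases, \du lists users (roles) and \dt lists the tables in the current database.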

You can exit psql by entering in the \q command:

postgres=# \q

Postgres also installs a few other shortcut utilities that you can use from the command-line. For example, let’s use createuser to create a new user to own our web application’s database. If you haven’t already, exit your psql session with \q and then enter the following into the command prompt:

$ createuser -P username

Replace username above with whatever you want your username to be. Once run, it should prompt you for a password, which you can then duly enter and confirm. Once a user has been created, we can then create a database and have the user set as the owner for it. We can use the createdb tool for this from the command-line too:

$ createdb -O username dbname

Replace username and dbname above with the user you created and the database name of your choice.
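
If you’d rather do this from inside a psql session instead of using the shell utilities, the equivalent SQL commands (with the same placeholder username and dbname) would be:

CREATE ROLE username WITH LOGIN PASSWORD 'your_password';
CREATE DATABASE dbname OWNER username;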

Now that we have a user and a fresh database we need to add some tables to the database. Let’s connect to our new database by running the psql command and passing the database name as an argument:

$ psql dbname

You should now see your command prompt change to dbname=#, which means you’re connected to your new database as a superuser and can start entering some SQL queries to create tables.

If it’s just a personal hobby project you can keep your tables simple, but I wanted to create a semi-real world web application. I’d like API requests to be authenticated with user credentials, and the ability to support multiple sensors, so I’m going to create three tables: users, sensors and measurements. Here are the SQL commands to create each table:

CREATE TABLE users (
   user_id SERIAL PRIMARY KEY,
   username VARCHAR(50),
   password VARCHAR(255),
   is_admin BOOLEAN DEFAULT FALSE
);
CREATE TABLE sensors(
    sensor_id SERIAL PRIMARY KEY,
    name VARCHAR(255)
);
CREATE TABLE measurements(
    id SERIAL PRIMARY KEY,
    sensor_id INT,
    celcius FLOAT,
    humidity FLOAT,
    created TIMESTAMP DEFAULT NOW(),
    CONSTRAINT sensor
        FOREIGN KEY(sensor_id)
            REFERENCES sensors(sensor_id)
);

If you’re wondering what some of these column data types are you can have a look at this post for a reference.
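
You can sanity-check that the tables were created with the \dt meta-command, and inspect an individual table’s columns and constraints with \d:

dbname=# \dt
dbname=# \d measurements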

Now that we have some tables, let’s insert a user into our users table. We’ll use these credentials to authenticate HTTP requests made to the web application. This is different from the database user we created earlier, which is used for querying the database. To insert a user into our web app’s users table we can enter the following SQL at the psql prompt:

INSERT INTO users (username, password) VALUES ('myuser', 'secret');

Next we’ll want to insert a sensor record that can be linked to the measurements that are logged. This is so that if we add multiple sensors to the system we know which readings belong to which sensors. Let’s create one using the insert statement again.

INSERT INTO sensors (name) VALUES ('NodeMCU ESP32');

I’ve just named my sensor after the NodeMCU ESP32 development board that I’ll be using for the sensor.
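
To double-check both inserts, you can query the tables from the same psql session:

SELECT * FROM users;
SELECT * FROM sensors;

The sensors query should show your new sensor with a sensor_id of 1, which is the value we’ll reference from the measurements table later on.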

Web application

Now that we have a database to interact with we can start setting up our web application. We’re going to use Express which is a Node.js library that helps us to write web applications.

Hello World

Exit the psql client (or open up a new terminal window), navigate to a directory where you’d like to store your project, create a folder for it and use npm init to start a new project. Follow the prompts to create your new project; you can just press enter to accept the default suggestions.

$ cd Desktop/
$ mkdir habitat
$ cd habitat
$ npm init

The npm (Node package manager) command-line tool is installed with Node.js. It enables us to easily set up a new project and install various Node libraries from the npm registry. Let’s use it to install our first library, Express.

$ npm install express

Okay, now open up the project folder in your preferred IDE. I just use Sublime to keep things quick and simple.

Create a new entry point file (where your application starts) in the project’s root folder. The default npm init setting expects this file to be called index.js, but name the file whatever you specified as the entry point when setting up your project with npm init. You can check this in the package.json file if you need to.

Once you’ve created the index.js file open it up and add the following code:

const express = require('express')
const app = express()
const port = 3000

app.get('/', (req, res) => {
  res.send('Hello World!')
})

app.listen(port, () => {
  console.log(`Habitat is up and running at http://localhost:${port}`)
})

Save the file, go back to your terminal and run the following command:

$ node index.js

The command line should then output some text saying ‘Habitat is up and running at http://localhost:3000’. Now open up your browser and visit http://localhost:3000/.

Nice! Let’s create a new Git repository to store snapshots of our code from this point.

Repository set up

If you’re familiar with Git feel free to skip this section. Otherwise, to create a new Git repository you can use Git’s init command in the root of your project. Enter the following into your command-line:

$ git init

This will create a new, empty Git repository in your project folder.

Now that we have a repo set up we can ‘commit’ snapshots of our code at various points. But before we do this we should tell Git to ignore certain files, like the node_modules directory that contains all of the packages our project uses, like Express. Including this directory would just clutter our repo. It’s easy enough for anyone who might want to clone and use the repo to just run npm install to install all of the required packages that are listed in the package.json file.

To tell Git what files to ignore you can create a .gitignore file in the project root and fill it with the names of files and directories that we don’t want Git to track.

$ touch .gitignore

Dot files like .gitignore are hidden files and directories. For the most part they are normal text files and directories that are simply hidden from the GUI and simple commands like ls, but they still exist in the file system. You can view them with a command like ls -la and edit them using a text editor like Nano. Let’s open up the .gitignore file:

$ nano .gitignore

and tell it what files and folders we don’t want git to track:

node_modules/
.git
.DS_Store

We can now save (Ctrl/⌘ + O) and exit (Ctrl/⌘ + X) the Nano text editor and make our first commit.

$ git add .
$ git commit -m "initial commit"

The first command above tells Git to add the current folder (specified with a .) to Git’s staging area to be committed. The second command tells Git to commit the staged items with a commit message of “initial commit”. Git will now store a snapshot of the current state of our code with this message.

In general, you want to commit at points where you’ve made a distinct change to a certain part of your code, so that if you review the commit log you can easily see what was changed where. If you commit versions of your code where several things have changed at once, this can complicate things if you ever want to roll back to a certain point or undo certain changes. You also don’t want to commit too frequently. Just find a balance that works for you.

You’ll also want to push your code to a code hosting service like GitHub, Gitlab or BitBucket so that you can easily download it to the Pi later. This also gives the benefit of backing up your code remotely and being able to share it with others who might want to participate in the project.

Head to one of those websites and create a new (empty) repository, then grab the new repo URL they give you. Come back to the command-line, add it as a remote origin for your local repo, and push your local commits to the remote repo:

$ git remote add origin <new_repo_url>
$ git push -u origin master

Great, you should now have pushed your initial commit. We’ll commit and push versions of the project code at certain points in this guide. Now let’s get back to building our web application.

Handling requests

If you’re familiar with how to handle requests in Express then you can skip this section. Otherwise, let’s recall this bit of code from index.js for a second:

app.get('/', (req, res) => {
  res.send('Hello World!')
})

The Express app that you initialise and store in the variable app has a get() method, which you can use to register a new function to handle a GET request. The get() method takes two arguments: a route and a function which handles the request sent to that route, also known as a controller function.

Whenever you enter a URL into your browser’s address bar and press enter, your browser performs a GET request to essentially ‘get’ the information at that URL. So when you now visit http://localhost:3000/ (with or without the trailing slash) you’ll get a message saying ‘Hello World!’, as this is how we’ve told Express to respond to GET requests made to that route.

Because the controller function above is relatively simple (sends a response of ‘Hello World!’), the entire function definition has been passed as the second argument.

You could also define your function outside of the get() method and pass it in as an argument like this:

function homepage(req, res) {
    res.send('Hello World!');
}

app.get('/', homepage);

This is essentially how you handle requests in Express: in this particular case, a user wanting to GET information from the homepage of your app.
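
As an aside, routes can also capture values from the URL itself via named segments, which Express exposes on req.params. This hypothetical route isn’t part of Habitat, but shows the idea:

function getSensor(req, res) {
    // ':id' in the route below is available as req.params.id
    res.send(`You asked for sensor ${req.params.id}`);
}

app.get('/sensors/:id', getSensor);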

Automatic reloads

In order to see any code changes made to your Node application take effect, you normally have to stop and restart the application. You can do this by pressing Ctrl/⌘ + C and then running node index.js again. This can get tedious quickly.

So instead we can use a utility called nodemon to automatically restart our application each time we save a file in our project. You can install it globally on your development machine, like this:

$ npm install -g nodemon

This makes nodemon available to all projects and doesn’t include it in your current project dependencies.
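
If you’d rather not install nodemon globally, a common alternative is to add it as a dev dependency ($ npm install --save-dev nodemon) and run it through an npm script in your package.json:

"scripts": {
  "dev": "nodemon index.js"
}

You can then start the app with npm run dev.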

Now if we start our app with nodemon like this:

$ nodemon index.js

It will automatically reload our application every time we save a file with some updated code. Now let’s get back to handling requests in Express.

Handling JSON

Say we wanted to create a route like http://localhost:3000/measurements/ to handle some JSON that is sent to our web app.

To do that we would create a POST request handler instead of a GET handler as when applications send information to a web server they typically use the POST method. At this point we also need to tell Express to use its JSON middleware parser which will create a body object on the request that we can access.

So we could add something like this to our index.js file:

app.use(express.json());

function postMeasurement(req, res) {
    var temp = req.body.temperature;
    res.send(`The temperature is: ${temp}`);
}

app.post('/measurements/', postMeasurement);

Now, if you use an API testing tool like Postman to make a POST request to /measurements/ with some JSON like this:

{"temperature": 22.6}

You should get a response back that says “The temperature is: 22.6”.
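
If you don’t have Postman handy, you can make the same request with curl from the command-line (assuming the app is still running on port 3000):

$ curl -X POST -H "Content-Type: application/json" -d '{"temperature": 22.6}' http://localhost:3000/measurements/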

Project structure

Now that we know how to handle requests, we could just add a bunch of route handlers and controller functions to our index.js file. However, it can become a mess quickly.

const express = require('express')
const app = express()
const port = 3000

app.use(express.json());

function homepage(req, res) {
    res.send('Hello World!');
}
app.get('/', homepage);

function postMeasurement(req, res, next) {
    var temp = req.body.temperature;
    res.send(`The temperature is: ${temp}`);
}
app.post('/measurements/', postMeasurement);

app.listen(port, () => {
  console.log(`Habitat is up and running at http://localhost:${port}`)
})

We can improve the structure of this app by moving the routes and controllers to their own respective folders. Let’s create two new folders in the root of the project called routes and controllers. Inside the routes directory we can create a file called measureRoutes.js and inside the controllers directory we can create a file called measureControllers.js. So our app folder structure should now look something like this:

habitat
├── controllers
│   └── measureControllers.js
├── index.js
├── node_modules
├── package-lock.json
├── package.json
└── routes
    └── measureRoutes.js

with the contents of measureRoutes.js looking like this:

var express = require('express');
var router = express.Router();
var measurements = require('../controllers/measureControllers');

router.post('/', measurements.postMeasurement);

module.exports = router;

and the contents of measureControllers.js looking like this:

exports.postMeasurement = function(req, res, next) {
    var temp = req.body.temperature;
    res.send(`The temperature is: ${temp}`);
}

and finally, we can change index.js to look like the below.

const express = require('express')
const app = express()
const port = 3000
const measurementsRouter = require('./routes/measureRoutes');

app.use(express.json());

app.use('/measurements', measurementsRouter);

app.listen(port, () => {
  console.log(`Habitat is up and running at http://localhost:${port}`)
})

This allows us to maintain a clean list of measurement-related routes in measureRoutes.js and controllers in measureControllers.js which we can then neatly include in our index.js file.

For example, if we want to add a new GET handler to return the latest measurement in the database when querying /measurements/latest/ we can simply add these to the measureRoutes.js and measureControllers.js files.

/* routes/measureRoutes.js */

var express = require('express');
var router = express.Router();
var measurements = require('../controllers/measureControllers');

router.post('/', measurements.postMeasurement);

router.get('/latest/', measurements.getLatestMeasurement);

module.exports = router;

/* controllers/measureControllers.js */

exports.postMeasurement = function(req, res, next) {
    var temp = req.body.temperature;
    res.send(`The temperature is: ${temp}`);
}

exports.getLatestMeasurement = function(req, res, next) {
    res.send('The latest measurement was')
}

Now would be a good time to make another commit and push our code to the origin repo at whichever service you’ve chosen.

$ git add .
$ git commit -m "add static POST and GET measurement endpoints"
$ git push origin master

As you’ve probably noticed, though, we’ve had to hardcode the value that gets returned from our new getLatestMeasurement handler because we’re not connected to the database yet, which is pretty boring. So let’s take a look at that now that we have some decent structure in place.

Database connection

We’re going to be using the node-postgres package to interact with our PostgreSQL database. Let’s go ahead and install it with npm:

$ npm install pg

This will give us what we need to set up a database Pool object (of clients) that we can use to make database queries.

Note on ORMs: A popular approach for web applications is to use an ORM. For Express apps it would be something like Sequelize. This gives you an object-oriented way to interact with your database. For example, you could fetch all records from your users table by writing something like const users = await User.findAll(); instead of writing a raw SQL query. However, ORMs require a bit more initial setup (defining a User model class and so on). So for this hobby project we’ll be using plain SQL queries with the node-postgres package to keep things simple, and to get familiar with SQL syntax, which is a useful skill to have.

Once you’ve installed node-postgres, we’ll need to create a new file that initializes our database Pool object. We can then import this into files where we’d like to make database queries.

Let’s create a new file called database.js in a directory called config, so that our project directory looks like this:

habitat
├── config
│   └── database.js
├── controllers
│   └── measureControllers.js
├── index.js
├── node_modules
├── package-lock.json
├── package.json
└── routes
    └── measureRoutes.js

Inside the database.js file we can place the following code:

const Pool = require('pg').Pool;
const pool = new Pool({
    host: 'localhost',
    port: '5432',
    database: 'db_name',
    user: 'db_user',
    password: 'db_password'
})

module.exports = pool;

This creates a new instance of a Pool class using the database details that are passed into it. This pool object can be exported for use in other parts of the web application, for example into the measureControllers.js file.

But obviously, we don’t want to hardcode our database user and password into this database.js file, as they would end up being committed to our code repository. Instead we’ll store these values in a .env file (which we’ll tell Git not to track) and then pull them into the database.js file. This keeps sensitive credentials out of our code repository and in one central place.

To make this easier we’ll use a package called dotenv. Let’s install that now:

$ npm install dotenv

Then we can create a .env file in our project root to store our credentials, and open it up using Nano

$ touch .env
$ nano .env

Now we can place our various application and database details in here:

PORT=3000
DB_NAME=dbname
DB_USER=db_user
DB_PASSWORD=db_password
DB_HOST=localhost
DB_PORT=5432

Then save and exit the file. Now we want to add the .env to the .gitignore file so that it isn’t tracked in our Git repo. To append some text to the contents of a file from the command line you can use this command:

$ echo ".env" >> ".gitignore"

It saves us having to open up Nano again. Now we can update our index.js file to require the dotenv package, with the path to our .env file specified. We’ll also require our database pool object. Both of these will then be loaded into the application’s scope when it starts up.

/* index.js */
require('dotenv').config({ path: __dirname + '/.env' })
const pool = require('./config/database');

const express = require('express')
const app = express()
const port = 3000
const measurementsRouter = require('./routes/measureRoutes');

app.use(express.json());

app.use('/measurements', measurementsRouter);

app.listen(port, () => {
  console.log(`Habitat is up and running at http://localhost:${port}`)
})

Then in our database.js file we can reference the environment variables that have been loaded into Node’s process.env, like this:

const Pool = require('pg').Pool;
const pool = new Pool({
    host: process.env['DB_HOST'],
    port: process.env['DB_PORT'],
    database: process.env['DB_NAME'],
    user: process.env['DB_USER'],
    password: process.env['DB_PASSWORD']
})

module.exports = pool;

Great, now our database configuration values are safely decoupled from our code and we have a Pool instance that we can use to make database queries.
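
If you’d like to confirm that the pool can actually reach the database before wiring it into any controllers, a quick throwaway check (not part of the final project) is to ask Postgres for its current time from index.js:

pool.query('SELECT NOW()', (error, results) => {
  if (error) {
    console.error('Database connection failed:', error);
  } else {
    console.log('Database connected, server time is:', results.rows[0].now);
  }
});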

Time for another code commit:

$ git add .
$ git commit -m "add database config"
$ git push origin master

Database queries

Now that we have a tool to make database queries with, let’s go back to measureControllers.js and require the database pool object. We can then use it to make SQL queries that are a bit more interesting:

var pool = require('../config/database');


exports.postMeasurement = function(req, res, next) {

    // Extract data from request body into variables
    var { sensor_id, celcius, humidity } = req.body;

    // Check if variables are int, float and float
    var dataValid = (
        Number.isInteger(sensor_id) &&
        typeof celcius == 'number' &&
        typeof humidity == 'number'
    )

    if (dataValid) {
        // DO NOT insert user generated values into the string directly
        var insertSQL = `INSERT INTO measurements (sensor_id, celcius, humidity) VALUES ($1, $2, $3);`
        var values = [sensor_id, celcius, humidity]
        // Pass an array of values as the second 
        // argument for pool.query() method to 
        // build the query string safely.
        pool.query(insertSQL, values, (error, result) => {
            if (error) {
                res.status(400).send(error);
            } else {
                res.status(200).send('Saved to database.\n');
            }
        });

    } else {
        res.status(400).send('Please check that your data types are correct');
    }

}

exports.getLatestMeasurement = function(req, res, next) {
    // Get most recent measurement from db and return as JSON.
    pool.query('SELECT * FROM measurements ORDER BY created DESC LIMIT 1;', (error, results) => {
        if (error)
            throw error;
        res.status(200).json(results.rows);
    });
}

SQL injection warning: Don’t insert any user generated values into your SQL query string directly, this leaves your web application vulnerable to SQL injection attacks. Instead place all user supplied values into an array and pass that array as the second argument to the pool.query() method. This method has provisions to escape user data safely when inserting it into the SQL query string.

Testing the API

Now let’s try to POST the following JSON to /measurements/

{
  "sensor_id": 1,
  "celcius": 24.9,
  "humidity": 56.2
}

If we do this in Postman, we should get a response back saying ‘Saved to database.’

Now let’s GET the latest measurement from /measurements/latest/, which should return the row we just inserted as JSON.

Awesome! Now our web application is a bit more interesting as it’s saving and retrieving information from the database.

Time for a commit:

$ git add .
$ git commit -m "update POST and GET endpoints to use database"
$ git push origin master

Adding authorization

Even though this project is going to be hosted on a private network, you may want to protect these endpoints with some form of authorization. For this we can use something like Passport.js. Let’s install it:

$ npm install passport passport-http

For this example, we’re going to set up a basic authentication strategy.

Create a file called passport.js in your config folder and put this code into it:

/* config/passport.js */

var passport = require('passport');
var BasicStrategy = require('passport-http').BasicStrategy;
var db = require('./database');

module.exports = passport.use('basic', new BasicStrategy((username, password, done) => {
  db.query('SELECT username, password FROM users WHERE username=$1', [username], (err, result) => {
    if (err)
      return done(err);

    if (result.rows.length > 0) {
      const user = result.rows[0];

      // Compare the submitted password with the stored one. This is a
      // plain-text comparison, matching the way we inserted the user
      // earlier; see the Security section for hashing with bcrypt.
      if (password === user.password) {
        done(null, { username: user.username });
      } else {
        done(null, false);
      }
    } else {
      done(null, false);
    }
  });
}));

Note: If you’re going to be setting up a public facing web application then you’ll want to beef up your web application’s security a bit more than this. We’ll discuss security in the next section.

You can now require this passport file inside your measureRoutes.js and pass it as a second argument to your route handlers:

/* routes/measureRoutes.js */

var express = require('express');
var router = express.Router();
var measurements = require('../controllers/measureControllers');

var passport = require('passport')
var basicAuth = passport.authenticate('basic', { session: false })

router.post('/', basicAuth, measurements.postMeasurement);
router.get('/latest/', basicAuth, measurements.getLatestMeasurement);

module.exports = router;

Finally, we just need to initialise it in our index.js file.

/* index.js */
require('dotenv').config({ path: __dirname + '/.env' })
const pool = require('./config/database');
const express = require('express')
const app = express()
const port = 3000
const passport = require('./config/passport');
const measurementsRouter = require('./routes/measureRoutes');

app.use(express.json());

app.use('/measurements', measurementsRouter);

app.listen(port, () => {
  console.log(`Habitat is up and running at http://localhost:${port}`)
})

Now when we try to make the same requests we get a response of 401 Unauthorized as our routes are now protected with Basic Authentication.

So to make POST and GET requests from this point, we’ll need to add Basic Authentication (with the username and password we inserted into the database) to our Postman request and try again.
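
You can also do this from the command-line with curl, which supports Basic Authentication via its -u flag (substitute the credentials you inserted into the users table earlier):

$ curl -u myuser:secret http://localhost:3000/measurements/latest/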

Great! Now that we’ve verified that we can make authenticated requests to the endpoints it seems like a good time to make another commit:

$ git add .
$ git commit -m "add basic authentication to endpoints"
$ git push origin master

Security

So far we’ve added basic authentication to these endpoints but this application is only secure because it’s on your own private home network which you control access to. This web application is not yet secure enough for larger corporate networks or the public internet. Here are some additional security considerations to bear in mind before making your web application available on a more public network.

Firstly, this web application only handles traffic over HTTP, so none of the traffic is encrypted and it’s entirely possible for someone on the same network to intercept this data, including the username and password you’re using. Enabling HTTPS for a web application on a local network involves a bit more work than on a public server with a domain name, which is beyond the scope of this post for now. But we’ll look at it in a future post. Enabling HTTPS support is crucial before deploying an application like this to a network which may have people on it that you don’t know or trust. HTTPS encrypts data being transmitted on the network which prevents people from intercepting it and abusing your application.

By default most home Wi-Fi routers have firewalls that block incoming traffic from the internet that may be trying to access devices on your network. So the risk of someone accessing your Raspberry Pi server from the outside internet with your router’s default settings is low. However, in a scenario where your web application is on a publicly accessible server it is imperative that you hash user passwords with something like bcrypt. This ensures that if someone gains access to your database there is still reasonable protection in place for your users’ passwords. Hashed passwords can’t be reversed; an attacker is forced into prohibitively expensive brute-force guessing. Some of your users may use the same password for your service and their email account. If that password is obtained by someone else then their email account could be accessed and used to reset passwords for a whole bunch of other services like online banking, shopping, government services and so on. If you’re asking users for a password to use your service then you have a responsibility to store it securely.
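
To give a flavour of what that involves, here’s a minimal sketch using the bcrypt npm package ($ npm install bcrypt). It isn’t wired into Habitat; you would hash the password before inserting the user, and swap the plain comparison in config/passport.js for bcrypt.compare:

const bcrypt = require('bcrypt');

// At user-creation time: store the hash, never the plain password.
bcrypt.hash('secret', 10, (err, hash) => {
  // e.g. INSERT INTO users (username, password) VALUES ('myuser', <hash>)
  console.log('Store this instead of the plain password:', hash);
});

// At login time: compare the submitted password against the stored hash.
const storedHash = '...'; // placeholder: the hash loaded from the users table
bcrypt.compare('secret', storedHash, (err, match) => {
  console.log(match ? 'Password OK' : 'Password rejected');
});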

Finally, if you want to expose your Raspberry Pi to the public internet you may want to look at securing it further.

The extra security precautions you put in place will obviously depend on your particular use case for the application, so choose the level of security you’re comfortable with. We could spend a lot of time and effort hardening this application, but that would go beyond the scope of this post’s use case. So we’re going to leave it at Basic Authentication for now.

Deployment

Okay, now we’re going to look at deploying our web application and database to the Pi. This guide assumes you have a fresh installation of Raspberry Pi OS but you shouldn’t have any issues deploying to one that you may have set up previously.

Make sure your Raspberry Pi is switched on and accessible via the network over Wi-Fi or Ethernet. Then we can SSH into it to begin the set up:

$ ssh pi@raspberrypi.local

The default username and password for a fresh Raspberry Pi OS installation is pi and raspberry, respectively. You’ll probably want to look at changing this using the passwd command once you’ve logged in.

Software installation

As with our development machine, we need to make sure that the Raspberry Pi has the required software installed to run this web application. It may be more convenient to log in as the root user in order to run a lot of these installation commands. To do that you can type in the command:

$ sudo -i -u root

Now let’s enter the following commands to install PostgreSQL, Node.js and Git (Express will be installed with npm when we set up the app):

$ sh -c 'echo "deb http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
$ wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo apt-key add -
$ apt-get update
$ apt-get install postgresql

$ curl -sL https://deb.nodesource.com/setup_14.x | bash -
$ apt-get install -y nodejs

$ apt-get install git

apt-get is used to install packages on Debian-based Linux distributions like Raspberry Pi OS. The other commands ensure that we’re downloading officially signed packages.

Now that we’ve installed the software we need, let’s look at setting up the database on the Pi.

Database

To get some practice with backing up and restoring PostgreSQL databases we can dump the database from our development machine and restore it to a new database on the Raspberry Pi.

Open up a new terminal window on your development machine and enter the following command:

$ pg_dump dbname > db_backup.sql

This pg_dump utility will dump the dbname database into a file called db_backup.sql in your current working directory. This file is basically a collection of SQL commands. When you restore it into a new database, it will set up the tables and data exactly as they were in the database you dumped them from.

Next we need to copy this .sql file from our development machine to the Pi. For Linux and Mac we can use scp from the command-line:

$ scp db_backup.sql pi@raspberrypi.local:/tmp/

This copies the db_backup.sql file from its current location to the /tmp directory on the Pi.

To copy files from Windows to Linux, you can use a tool like PuTTY, which includes the pscp command-line utility.

Once the .sql file has been copied over to the Pi, we can SSH into the Pi and restore this dump into a new database. Let’s SSH into the Pi now:

$ ssh pi@raspberrypi.local

Once we’re in, we can log in as the postgres user, then create a new user and database with the user assigned as the owner. Make sure you create the same username that you used on your development machine, as that’s what is referenced in the SQL dump.

$ sudo -i -u postgres
$ createuser -P username
$ createdb -O username dbname

Once you’ve created the user and database you can then navigate to the /tmp directory and import the dump into the new database like this:

$ cd /tmp
$ psql dbname < db_backup.sql

Your new database should now have the same table structure and data as the database on your development machine. With that in place we can now move on to setting up the web app.
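
A quick way to confirm the restore worked (while still logged in as the postgres user) is to run a couple of one-off queries using psql’s -c flag:

$ psql dbname -c 'SELECT * FROM users;'
$ psql dbname -c 'SELECT * FROM sensors;'

You should see the same rows that you inserted on your development machine.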

Web app set up

Log out of the postgres user by entering the exit command:

$ exit

This should return you to your normal pi user. You can now use Git to clone the web application to the Pi.

$ git clone <repo_url>

If you run the ls command you should now see a folder with your code repository inside it. Let’s change into that directory and install the project dependencies:

$ cd habitat
$ npm install

Once all of the dependencies have been installed we’ll need to create a .env file for the web application’s environment variables:

$ touch .env

Open it up with nano .env and copy the contents from the .env file on your development machine into this new file:

PORT=3000
DB_NAME=dbname
DB_USER=db_user
DB_PASSWORD=db_password
DB_HOST=localhost
DB_PORT=5432

Then save and exit the file (Ctrl/⌘ + O and Ctrl/⌘ + X). You should now be ready to run your application. Let’s try it out!

Run the web app

We know that we can run the application by entering the command node index.js or nodemon index.js. However, this only runs the web application for as long as the terminal window stays open. Now that it’s on the Pi we want this web application to run all the time, even after the terminal window has been closed. So we’ll need to run the job, move it to the background and then disown it. Here’s how we do that:

$ node index.js &

Then press Ctrl + Z to suspend the job and enter the following commands:

$ bg
$ disown

This will move the job to a background process and disown it. You should now be able to close your terminal window and the web application should continue to run but don’t do that just yet, we still need this window for one more thing.
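
As an aside, the bg/disown approach won’t survive a reboot. A more robust alternative, beyond what this post covers, is to run the app under systemd, which ships with Raspberry Pi OS. A minimal unit file, assuming the repo was cloned to /home/pi/habitat, might look like this (saved as /etc/systemd/system/habitat.service):

[Unit]
Description=Habitat web app
After=network.target postgresql.service

[Service]
User=pi
WorkingDirectory=/home/pi/habitat
ExecStart=/usr/bin/node index.js
Restart=on-failure

[Install]
WantedBy=multi-user.target

You would then enable and start it with sudo systemctl enable --now habitat.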

Testing the API

Okay, now that the web application is running on the Pi, we want to see if we can call its API from another device (our dev machine) on the same network. On your dev machine, update the Postman request by replacing localhost in the URL with raspberrypi.local, like this: http://raspberrypi.local:3000/measurements/latest/.

Remember to leave the port in the URL and include Basic Authentication credentials in your request. Now fire it off and see what you get back!

If all goes well you should get a response with some JSON showing the most recent measurement. Congratulations! You now have a web application API running and available on your private network.

From here you can choose what you’d like to do next: perhaps add more columns to your database to store different types of data, or add more endpoints. But essentially you now have a home server that you can use around the house.

In future posts we’ll be taking things further by looking at:

  • building a custom sensor to take temperature, humidity and other possible measurements, and post them to this API
  • surfacing this data in other interfaces like Apple’s Home app.

