Home Climate Monitoring with AWS Lambda and Raspberry Pi

I am building a home climate monitoring system for my house. Active sensors are built around Raspberry Pi computers paired with AM2302 temperature / humidity sensors. The Raspberry Pis read humidity and temperature data from the sensors every minute, add metadata such as home ID, room name and tenant ID — yeah, making it multi-tenant because I plan to share the infrastructure with friends — and make an HTTP POST call to an AWS Lambda function written in Java with the Spring Cloud Function framework and exposed over the web through the AWS API Gateway. The Lambda function stores the climate readings in an AWS RDS PostgreSQL instance whose schema is maintained with Liquibase. Other AWS Lambda functions will read the database, process the readings and serve temperature and humidity curves to an Angular web application where we can monitor the whole house.

Let’s see how it’s done, piece by piece.

Software Components

On the Raspberry Pi, a Python script is invoked every minute to read temperature and humidity data from the AM2302 sensor, enhance it with metadata and post it to the AWS Lambda function. The code is fairly simple, as seen in the snippet below. In order to run it you’ll first need to install the Adafruit DHT Python library.

In the code below, you’ll have to replace two tokens:

  • in line 26, use the AWS API Gateway URL that is mapped to the function
  • in line 28, you need to use the API key you have defined for the usage plan that guards your API
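Here’s a condensed sketch of what such a script can look like; the endpoint URL, API key and JSON field names are placeholders and assumptions you’ll need to adapt to your own setup:

```python
#!/usr/bin/env python
# Read temperature / humidity from the AM2302 sensor and POST it to the API Gateway endpoint.
# Requires the Adafruit DHT library (pip install Adafruit_DHT) and requests (pip install requests).
import requests
import Adafruit_DHT

SENSOR = Adafruit_DHT.AM2302
SENSOR_PIN = 2  # GPIO pin the sensor data line is connected to

# metadata identifying this sensor (placeholders)
TENANT_ID = 'my-tenant'
HOME_ID = 'my-home'
ROOM_NAME = 'living-room'

# placeholders: the API Gateway URL mapped to the function and the API key of your usage plan
API_URL = 'https://<api-id>.execute-api.<region>.amazonaws.com/production'
API_KEY = '<your-api-key>'

# read_retry retries a few times in case the sensor does not answer right away
humidity, temperature = Adafruit_DHT.read_retry(SENSOR, SENSOR_PIN)

if humidity is not None and temperature is not None:
    reading = {
        'tenantId': TENANT_ID,
        'homeId': HOME_ID,
        'roomName': ROOM_NAME,
        'temperature': temperature,
        'humidity': humidity
    }
    # API Gateway usage plans expect the key in the x-api-key header
    requests.post(API_URL, json=reading, headers={'x-api-key': API_KEY})
```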

So:

  • connect your AM2302 sensor to your Raspberry Pi,
  • copy this Python file onto it,
  • schedule it with CRON,
  • you’re almost set.

First, though, you’ll have to define the Lambda function that responds to the API and stores all these records.

The back-end is implemented as an AWS Lambda function written in Java, using the Spring Cloud Function framework. This Lambda function is exposed on the internet through the AWS API Gateway.

The advantage of using Spring Cloud Function is that it is actually Spring Boot, so you can take advantage of the entire ecosystem out of the box. For this project we’re going to use Spring Data to connect to a PostgreSQL instance on AWS RDS. We are also going to use Liquibase to manage the schema of the Postgres database.

You’ll also be able to run it on a variety of public cloud infrastructures. As of today, Spring Cloud Function supports AWS Lambda, Azure Functions and Apache OpenWhisk.

I have chosen to deploy the function on AWS Lambda, so a dedicated adapter class was added to the project.

The project can be found on GitHub here: https://github.com/entzik/iot-home-climate-monitoring

As mentioned, a Spring Cloud Function application is a Spring Boot application. It is also expected to define a bean of type java.util.function.Function, java.util.function.Consumer or java.util.function.Supplier, like in the code snippet below.

In this case we use a function that takes a ClimateReading as input, saves it to the database and returns the saved object.
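A minimal sketch of what that can look like, assuming a ClimateReading entity and the ClimateReadingRepository shown a bit further down (class, package and bean names are illustrative):

```java
import java.util.function.Function;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class ClimateMonitoringApplication {

    public static void main(String[] args) {
        SpringApplication.run(ClimateMonitoringApplication.class, args);
    }

    // The function bean: take a ClimateReading, persist it and return the saved entity.
    // Its name ("storeClimateReading") is the one to declare in application.properties.
    @Bean
    public Function<ClimateReading, ClimateReading> storeClimateReading(ClimateReadingRepository repository) {
        return repository::save;
    }
}
```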

The object is constructed from the JSON document produced by the python script above and saved using a very simple Spring Data repository:
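A sketch of such a repository, assuming the entity uses a Long primary key:

```java
import org.springframework.data.repository.CrudRepository;

// Spring Data generates the implementation at runtime; save() is all we need for now.
public interface ClimateReadingRepository extends CrudRepository<ClimateReading, Long> {
}
```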

Since we are only using the “save” operation we don’t need to add anything else to this interface for now.

You may have noticed the bean is named. You can name the bean any way you want, as long as you specify the function bean name in your application.properties file:
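With the bean name used in the sketch above, that could look like this (the exact property name depends on your Spring Cloud Function version; spring.cloud.function.definition is the one used in recent versions):

```properties
# expose the storeClimateReading bean as the function
spring.cloud.function.definition=storeClimateReading
```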

Because we want to run it on AWS Lambda and expose it to the internet over the AWS API Gateway, we need to add one more adapter class:
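Something along these lines, using the SpringBootRequestHandler base class from the spring-cloud-function-adapter-aws module (package and class names are illustrative):

```java
package com.example.climate;

import org.springframework.cloud.function.adapter.aws.SpringBootRequestHandler;

// AWS Lambda entry point: this is the class you declare as the "handler" when creating
// the function; the type parameters are the function's input and output types.
public class ClimateReadingHandler extends SpringBootRequestHandler<ClimateReading, ClimateReading> {
}
```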

The handler will receive the payload of the HTTP POST request, deserialize it and invoke the function. It will then get the saved object, serialize it to JSON and include it in the HTTP response.

So that’s pretty much it. The Spring Cloud Function application is now complete and can run on its own. Just run ./gradlew clean assemble and you’ll get an artifact named iot-home-climate-monitoring-1.0.0.BUILD-SNAPSHOT-aws.jar under build/libs, and that’s your packaged function that you will deploy on AWS Lambda.

Setting up the database

The database in which climate readings are stored is managed by Liquibase. Liquibase integrates nicely with Spring Boot, and each time a function comes up it will upgrade the database, if necessary.

Spring Boot will look for Liquibase change logs when the following properties are added to application.properties:
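For example (the change log location is an assumption; point it at wherever your master change log lives):

```properties
# enable Liquibase and tell Spring Boot where the master change log is
spring.liquibase.enabled=true
spring.liquibase.change-log=classpath:/db/changelog/db.changelog-master.xml
```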

The main change log describes the table that holds climate readings:
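An illustrative change set; the table and column names are assumptions derived from the data the Python script sends:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
            http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.6.xsd">

    <changeSet id="create-climate-reading-table" author="iot-home-climate-monitoring">
        <createTable tableName="climate_reading">
            <column name="id" type="bigint" autoIncrement="true">
                <constraints primaryKey="true" nullable="false"/>
            </column>
            <column name="tenant_id" type="varchar(64)"/>
            <column name="home_id" type="varchar(64)"/>
            <column name="room_name" type="varchar(64)"/>
            <column name="temperature" type="double precision"/>
            <column name="humidity" type="double precision"/>
            <column name="reading_timestamp" type="timestamp"/>
        </createTable>
    </changeSet>
</databaseChangeLog>
```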

Setting up the AWS environment

In order to do this you will of course need an AWS account and the AWS CLI installed and configured.

NOTE — below is a series of CLI commands and AWS console operations. All this can of course be automated, but that will be the topic of another post.

It is fairly easy to create an RDS database using the command line below:
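For example (the instance identifier, master user name and region are illustrative):

```bash
# create a small PostgreSQL instance that fits in the AWS free tier
aws rds create-db-instance \
    --db-instance-identifier climate-monitoring-db \
    --db-instance-class db.t2.micro \
    --engine postgres \
    --allocated-storage 20 \
    --master-username climate \
    --master-user-password <your-password> \
    --region eu-west-1
```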

This will create a database that will live in the AWS free tier. You will have to specify a connection password.

You may want to change the region to have the DB closer to where you are.

After creating the database you should make note of the information highlighted below; you will need it in order to configure the Lambda function.

You will need the endpoint below to configure the DB_HOST_NAME environment variable, and the security group information to enable the function to access this database.

In your AWS console go to IAM and create a role that will allow your function to access AWS resources — such as the database you have just created.

After creating the database, the next thing to do is create and deploy the Lambda function:
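Something along these lines should do it (the function name, handler class and the user / password variable names are illustrative; DB_HOST_NAME is the variable used for the database endpoint):

```bash
aws lambda create-function \
    --function-name store-climate-reading \
    --runtime java8 \
    --handler com.example.climate.ClimateReadingHandler \
    --zip-file fileb://build/libs/iot-home-climate-monitoring-1.0.0.BUILD-SNAPSHOT-aws.jar \
    --memory-size 512 \
    --role arn:aws:iam::<account-id>:role/<lambda-execution-role> \
    --environment "Variables={DB_HOST_NAME=<rds-endpoint>,DB_USER_NAME=<db-user>,DB_PASSWORD=<db-password>}" \
    --region eu-west-1 \
    --publish \
    --timeout 120
```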

This will create a lambda function on AWS and deploy your JAR file.

Note that:

  • you need to specify the role you have created for your function and assign it — line 7
  • we need to set a timeout of 120 seconds to cope with the rare situations where the cold start gets really cold… — line 11

In the AWS console go to your Lambda function and create an API Gateway trigger.

We will see a little bit later how to configure your API, for now let’s continue configuring the lambda function.

You will have to set values for the environment variables defined when you created the function via the command line, in order to configure access to your database. DB_HOST_NAME is supposed to be the same as the database endpoint — see the screenshot above — while the user name and password must match the values you specified when you created the database using the command line.

Then you need to configure the VPC and security groups so the function is able to access your database. Make sure you use the same VPC and security group values as the database.

Now you have to configure the trigger API for internet access. If you go to the API Gateway Console and click on the API you have created to trigger your function, it will look like this:

I recommend creating a stage called production. When you click on the stage you will see the URL where you can invoke the function.

You can either leave the API public — risky move, don’t do it — or decide to protect it with a usage plan of your choice. In my case I have created a usage plan associated with an API key and then added the usage plan to the API stage.

Hardware Components

Well, this one is easy. I used a Raspberry Pi and a wired humidity / temperature sensor which I connected to GPIO pin number 2. You can of course choose to connect it to another GPIO pin, in which case you will have to reflect that in line 10 of the Python script.

Here are links to components I used:

  • AM2302 sensor — this is a very good option for the sensor because it comes pre-wired with the required resistor and a set of small cables that make it very easy to connect to the GPIO pin of your choice on the Raspberry Pi
  • The Raspberry Pi itself — this is an expensive model; if you plan to equip multiple rooms you may want to look at the Raspberry Pi Zero W or, if like me you are not very fond of doing much soldering, a Raspberry Pi Zero WH.

Lead Architect at LiquidShare, building a cloud native, blockchain enabled, financial services SaaS platform.