
How to build an air quality alerting system with the Particle Boron and AWS

This tutorial guides you through the process of building a low-cost, portable air quality monitoring system using the Particle Boron microcontroller and a variety of Grove sensors.

Mithun Das · June 21, 2024

The Problem

Last summer, smoke from Canadian wildfires swept across Canada and the US, making it dangerous to breathe outside. Air quality issues have been common during fire season in California for years, but this time the smoke wasn't limited to dry areas; it covered most of the United States. This dangerous air was a threat to anyone with lungs, but it was especially dangerous to people with asthma. Outdoor air quality data was available in most places; however, the danger wasn't limited to the outdoors. Smoky air made its way inside as well, and most people had no way of tracking the air quality in their own space. Commercially available products were expensive and sold out quickly. Most people just had to hope that their air was fine.

 

Fire season is coming up again this year, and most people still do not have their own air quality monitoring systems.

The Solution

 

This tutorial guides you through the process of building a low-cost, portable air quality monitoring system using the Particle Boron microcontroller and a variety of Grove sensors. By following the step-by-step instructions, you’ll integrate sensors for measuring particulate matter, carbon dioxide, volatile organic compounds and more. 

 

The device will be programmed using Particle Workbench to log raw sensor data to the Particle Cloud. The raw data will then be translated into human-readable status values using Particle Logic in the Particle Cloud, without having to maintain that business logic inside the firmware.

The translated data will be sent to AWS, which will post an alert to a Slack channel. Once you have data in AWS, you can do a lot more than send a Slack message: write to a DynamoDB table, build a REST or GraphQL API, or build and deploy a front-end app to visualize the data.

Bill of materials

Hardware and sensors

Things I am using in this project:

  • Particle Boron 404X [buy]
  • Grove Air Quality Sensor
  • Grove eCO2 and VOC sensor [buy]
  • Grove Dust Sensor [buy]

 

Software

  • Visual Studio Code
  • AWS Account
  • Slack Account 
  • Slack Mobile App

 

Setup your Boron

First, go to setup.particle.io and follow the step-by-step instructions to set up your Boron. At the end, you should see a screen like the one below. Click on the “Go to Console” button.

 

 

By now your device should be online. Notice the product you created earlier. 

 

Connect the sensors

  • Connect the dust sensor to D4
  • Connect the air quality sensor to A2
  • Connect the eCO2 & VOC sensor to one of the I2C ports

From here, click on the “Web IDE” icon in the bottom left menu. Then choose the “Blink An LED” example and flash the firmware. After a few seconds the firmware should be updated and the LED on the board will start blinking. You should also see event data appearing in the Particle Cloud console. Now it’s time to write some custom firmware.

 

Create the project in workbench

Follow the instructions here to set up Particle Workbench and create your project; name it “firmware”. You should see “firmware.cpp” under the “src” folder.

Let’s first install the necessary libraries. Press Cmd + Shift + P and type “Particle: Install Library”. It will ask for the library name. Type “SGP30_Gas_Sensor_Grove” and hit Enter.

 

Repeat the same process to install the following libraries:

  • Grove_Air_quality_Sensor
  • JsonParserGeneratorRK

Now copy the code (firmware.cpp) from my GitHub gist and flash it. Within a few minutes you should start receiving events in your Particle Cloud console.
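If you want a feel for the firmware’s structure before opening the gist, here is a minimal sketch (not the gist itself): the sensor reads are stubbed out with fixed illustrative values, the JSON is built with snprintf rather than JsonParserGeneratorRK, and the “sensor-data” event name is only a placeholder for whatever name the real firmware uses.

#include "Particle.h"

// Pin assignments from the wiring section above
const int DUST_PIN = D4;        // Grove dust sensor
const int AIR_QUALITY_PIN = A2; // Grove air quality sensor
// The eCO2/TVOC sensor sits on the I2C port and is handled by its library in the real firmware

// --- Placeholder readings -------------------------------------------------
// These stubs stand in for the Grove_Air_quality_Sensor, SGP30_Gas_Sensor_Grove
// and dust-sensor code in the gist; they simply return fixed example values.
int readAirQuality()          { return 3; }    // 3 = fresh air ... 0 = danger
int readECO2()                { return 411; }  // ppm
int readTVOC()                { return 15; }   // ppb
float readDustConcentration() { return 10.0; } // particles / 0.01 cu ft
// ---------------------------------------------------------------------------

void setup() {
  pinMode(DUST_PIN, INPUT);
  pinMode(AIR_QUALITY_PIN, INPUT);
}

void loop() {
  // Build the JSON payload the Logic function parses: keys aq, eco2, tvoc, dustcon
  char payload[128];
  snprintf(payload, sizeof(payload),
           "{\"aq\":%d,\"eco2\":%d,\"tvoc\":%d,\"dustcon\":%.1f}",
           readAirQuality(), readECO2(), readTVOC(), readDustConcentration());

  // "sensor-data" is a placeholder event name; it must match the event name you
  // enter later when wiring up the Logic function.
  Particle.publish("sensor-data", payload, PRIVATE);

  delay(60000); // publish once a minute
}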

 

Particle Logic

As you can see, the data coming from the Boron is raw sensor readings such as air quality=3 or eco2=411. These values are not human readable. Wouldn’t it be better to say air quality=Fresh or Polluted? You can perform such a translation in the cloud, without modifying your firmware code, using Particle Logic. Logic is essentially a JavaScript function that is triggered when an event is received. After the translation we are going to publish another event, which will in turn post the data to the AWS queue.

Navigate to the Particle console and select your product.

While you are on the products page, click the “Logic” icon from the left menu. 

On the next page, you will find a few templates; choose “Reformat JSON data” to customize. Then paste the code below into the code editor.

 

import Particle from 'particle:core';

export default function reformat({ event }) {
  let data;
  try {
    data = JSON.parse(event.eventData);
  } catch (err) {
    console.error("Invalid JSON", event.eventData);
    throw err;
  }

  const reformatted = {
    aq: data.aq === 3 ? "Fresh Air" : data.aq === 2 ? "Low Pollution" : data.aq === 1 ? "High Pollution" : "Danger",
    eco2: data.eco2 <= 1000 ? "Normal" : data.eco2 <= 2000 ? "Poor" : data.eco2 <= 5000 ? "Danger" : "Extremely Dangerous",
    tvoc: data.tvoc <= 200 ? "Normal" : data.tvoc <= 3000 ? "Little Discomfort" : data.tvoc <= 25000 ? "Discomfort" : "Toxic",
    dust: data.dustcon <= 54 ? "Good" : data.dustcon <= 154 ? "Moderate" : data.dustcon <= 354 ? "Unhealthy" : "Hazardous"
  };

  // publish the translated values as a JSON string
  Particle.publish("data-reformatted", JSON.stringify(reformatted), { productId: event.productId });
}

Click on “Next” and then, on the next page, select the product from the drop-down and type the same event name you used in the firmware code.

Now if you navigate to the “Events” page, you will see both events – one coming from the device and a reformatted one from the Logic function.
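For illustration (using made-up but plausible readings), a raw device event like the first payload below would be republished by the Logic function as the second:

{"aq": 3, "eco2": 411, "tvoc": 15, "dustcon": 10}

{"aq": "Fresh Air", "eco2": "Normal", "tvoc": "Normal", "dust": "Good"}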

AWS Integration

Now let’s send the formatted data to AWS Simple Queue Service using Particle Integration. 

Working knowledge of AWS is required for this part of the tutorial. Setting up an AWS account and getting started is outside the scope of this tutorial.

First, create an IAM user called “particle-sqs-user”.

Then go to the user detail page and generate an access key and secret. Copy them somewhere secure; you will need them when you create the integration in the Particle console. Also make a note of the user ARN.

Next, create a standard queue by navigating to AWS SQS. Under “Access policy”, restrict access to only the user you just created: paste the user ARN in the box, leave all other fields at their defaults, and create the queue.
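If you prefer to edit the JSON directly, the resulting access policy should look roughly like the sketch below – ACCOUNT_ID, the region, and the queue name aqas-particle-queue are placeholders for your own values, and sqs:SendMessage is the permission Particle needs to deliver events (the console wizard may generate a slightly broader policy).

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowParticleSend",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::ACCOUNT_ID:user/particle-sqs-user"
            },
            "Action": "sqs:SendMessage",
            "Resource": "arn:aws:sqs:us-east-1:ACCOUNT_ID:aqas-particle-queue"
        }
    ]
}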

Now head over to the Particle console and navigate to the Integrations page. Then search for “AWS SQS”.

On the next page, enter a name for the integration of your choice. 

Under the event name, make sure to put the event name published by your Logic function (data-reformatted in the code above).

Under the URL, paste the URL from the AWS SQS console. 

Expand “Advanced settings”, put the same URL for “QueueUrl”, and then paste the AWS access key and secret.

Keep everything else at the defaults and enable the integration. After a few minutes, if you scroll down on the integration page, you will see the events being sent from the Particle Cloud.

Now, to confirm that AWS is receiving the messages, head over to the AWS SQS console and check the number of messages available in the queue.

Congratulations!!! You have successfully integrated AWS with Particle Cloud. 

 

Slack notification

The last piece of the puzzle is sending notifications to Slack. For that, we will create a Lambda function that is triggered whenever a message arrives on the SQS queue.

First, navigate to the AWS Lambda console and click on the “Create function” button. On the next page, enter “particleAQAS” as the function name, choose Node.js 20.x as the runtime, leave everything else at the defaults, and click on the “Create function” button. Within a few seconds the Lambda function will be created.

Now navigate to Configuration and then Permissions and click on the execution role. 

This will take you to the IAM role. Click on “Add permissions”, then “Create inline policy”. In the policy editor’s JSON view, paste the policy below. Make sure to change the account ID (and the region and queue name if yours differ). Follow the remaining steps and create the policy.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "sqs:DeleteMessage",
                "sqs:ReceiveMessage",
                "sqs:GetQueueAttributes"
            ],
            "Resource": "arn:aws:sqs:us-east-1:ACCOUNT_ID:aqas-particle-queue"
        }
    ]
}

 

Then head over to your Lambda function page and click on the “+ Add trigger” button. Select SQS as the trigger source, choose the queue you created earlier, and add the trigger.

Now if you go to “Monitor” -> “View CloudWatch logs”, you will see logs being generated.

Now let’s modify the Lambda code to extract the sensor data from the message and send a notification to Slack. I am assuming you already have a Slack incoming-webhook URL. Store its path (the part after hooks.slack.com) in an environment variable named SLACK_WEBHOOK_PATH under “Configuration” -> “Environment variables”, since the code reads it from process.env. Then paste the code below into your Lambda and deploy the function.

import * as http from 'https';

const defaultOptions = {
    host: 'hooks.slack.com',
    port: 443,
    headers: {
        'Content-Type': 'application/json',
    }
}

const post = (path, payload) => new Promise((resolve, reject) => {
    const options = { ...defaultOptions, path, method: 'POST' };
    const req = http.request(options, res => {
        let buffer = "";
        res.on('data', chunk => buffer += chunk)
        res.on('end', () => resolve(buffer))
    });
    req.on('error', e => reject(e.message));
    req.write(JSON.stringify(payload));
    req.end();
})

export const handler = async (event, context) => {
  console.log('Received event:', JSON.stringify(event, null, 2));
  console.log('Received context:', JSON.stringify(context, null, 2));
  for (const message of event.Records) {
    await processMessageAsync(message);
  }
  console.info('done');
};

async function processMessageAsync(message) {
  try {
    const data = JSON.parse(message.body)
    console.log(data);

    const text = Object.entries(JSON.parse(data.data)).map(([key, value]) => `${key}:${value}`).join("\n");

    await post(process.env.SLACK_WEBHOOK_PATH, { text: text });
  } catch (err) {
    console.error('An error occurred');
    throw err;
  }
}

 

Soon you will start receiving alerts in Slack like the one below.

Display data on Ubidots

 

As a bonus, let me show you how to display the data on an Ubidots dashboard. Let’s first modify our Logic function to include the raw sensor values (for plotting on the dashboard) along with the deviceId, which we will use to bind the data to an Ubidots device.

 

const reformatted = {
   aq: data.aq === 3 ? "Fresh Air" : data.aq === 2 ? "Low Pollution" : data.aq === 1 ? "High Pollution" : "Danger",
   eco2: data.eco2 <= 1000 ? "Normal" : data.eco2 <= 2000 ? "Poor" : data.eco2 <= 5000 ? "Danger" : "Extremely Dangerous",
   tvoc: data.tvoc <= 200 ? "Normal" : data.tvoc <= 3000 ? "Little Discomfort" : data.tvoc <= 25000 ? "Discomfort" : "Toxic",
   dust: data.dustcon <= 54 ? "Good" : data.dustcon <= 154 ? "Moderate" : data.dustcon <= 354 ? "Unhealthy" : "Hazardous",
   deviceId: event.deviceId,
   aq_raw: data.aq,
   eco2_raw: data.eco2,
   tvoc_raw: data.tvoc,
   dust_raw: data.dustcon
  };

 

 

If you are new to Ubidots, head over to https://ubidots.com/stem and create a free account. Then go to “Devices” from the top menu and create a blank device. Enter “aqas1” as the device name and the deviceId from the Particle console as the device label.

Once the device is created, go to your account page, open “API Credentials”, and copy the default token; you will need it later.

Next, go to your AWS Lambda console and open the particleAQAS Lambda you created earlier. Then go to “Configuration” -> “Environment variables”, add a new variable called “UBIDOTS_TOKEN”, and paste the token you copied from Ubidots.

 

Modify the processMessageAsync method inside the Lambda function.

 

async function processMessageAsync(message) {
  try {
    const data = JSON.parse(message.body)

    const jsonData = JSON.parse(data.data);
    console.log(jsonData);

    const humanReadableData = {
      aq: jsonData.aq,
      eco2: jsonData.eco2,
      tvoc: jsonData.tvoc,
      dust: jsonData.dust
    }

    const rawData = {
      aq: jsonData.aq_raw,
      eco2: jsonData.eco2_raw,
      tvoc: jsonData.tvoc_raw,
      dust: jsonData.dust_raw
    }

    const text = Object.entries(humanReadableData).map(([key, value]) => `${key}:${value}`).join("\n");

    // send the human-readable statuses to Slack
    await post(process.env.SLACK_WEBHOOK_PATH, { text: text });

    // post the raw sensor values to Ubidots for the dashboard
    await postUbidots(jsonData.deviceId, rawData);

  } catch (err) {
    console.error('An error occurred');
    throw err;
  }
}




Then add a new postUbidots method above the handler.




const postUbidots = (deviceId, payload) => new Promise((resolve, reject) => {
    const headers = {
      'Content-Type': 'application/json',
      'X-Auth-Token': process.env.UBIDOTS_TOKEN
    };

    const options = { host: 'industrial.api.ubidots.com', path: `/api/v1.6/devices/${deviceId}`, port: 443, headers: headers, method: 'POST' };
    const req = http.request(options, res => {
        let buffer = "";
        res.on('data', chunk => buffer += chunk)
        res.on('end', () => resolve(buffer))
    });
    req.on('error', e => reject(e.message));
    req.write(JSON.stringify(payload));
    req.end();
})
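For reference, the body posted to Ubidots is just a flat JSON object of raw readings, for example {"aq": 3, "eco2": 411, "tvoc": 15, "dust": 10} (illustrative values). The /api/v1.6/devices/<label> endpoint stores each key as a variable on the device whose label matches the deviceId, which is why we set the device label to the Particle deviceId earlier.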

 

Save and deploy the Lambda function. Within a few minutes (depending on the publish frequency in your device firmware) you should see data appearing in Ubidots.

Now go to “Data” -> “Dashboards” and create a new dashboard – name it “Air Quality Dashboard”.

Next, click on “Add new widget” and select the widget type you want to use. There are many different graphs to choose from. Then select the variable you want to bind to the widget.

Repeat this step for all of your variables. Your final Dashboard should look something like this. 

Be creative and build the graphs that are most meaningful for your needs.
