Proximity gesture detection with Particle’s B504e and Edge Impulse


Introduction
This project builds on a previous post about how to remotely gather training data for Edge Impulse over LTE with Particle.
Gesture detection is a common use case for Edge AI models. It's typically demonstrated with 3-axis accelerometer data, which implies the end user is holding the device while performing the gesture. A more realistic scenario might involve classifying a gesture the user makes near a static device.
For example, we could imagine a kiosk that uses gestures to change screens, or a dispenser waiting for a specific gesture to distribute product. To record "external" gestures in scenarios like these, we might first consider a camera module or something similar.
But, as a challenge and to keep costs and power draw low, in this post we’ll try to classify user gestures with a proximity and light sensor. We’ll use the specific pattern of proximity and external light readings to estimate when someone has waved their hand near the sensing element.
Hardware setup
For this project, we’re using an M.2 Breakout Board with the B504e as the main LTE CAT 1-enabled processor. We’ll use the VCNL4040 proximity breakout and an OLED FeatherWing from Adafruit for our sensing element and user interface.
Particle’s M.2 Breakout Board makes it simple to connect our external sensors and lets us get right to firmware development without any hardware debugging.
Make sure to cut the reset jumper on the M.2 breakout board if you plan to use the 128x64 OLED FeatherWing.
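Before writing any firmware, it can be worth confirming the wiring with a quick I2C scan; the OLED should acknowledge at address 0x3C and the VCNL4040 at 0x60. The sketch below is a minimal, throwaway check and isn't part of the project firmware:
#include "Particle.h"

SerialLogHandler logHandler(LOG_LEVEL_INFO);

void setup()
{
    // Enable auxiliary power so the OLED and Qwiic connector are powered (same as the project firmware)
    SystemPowerConfiguration powerConfig = System.getPowerConfiguration();
    powerConfig.auxiliaryPowerControlPin(D23).interruptPin(A6);
    System.setPowerConfiguration(powerConfig);

    Wire.begin();
}

void loop()
{
    // Probe every 7-bit address and log the ones that acknowledge
    for (uint8_t address = 1; address < 127; address++)
    {
        Wire.beginTransmission(address);
        if (Wire.endTransmission() == 0)
        {
            Log.info("I2C device found at 0x%02X", address);
        }
    }
    delay(5000);
}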
Configuring the build environment
Start by creating a new project with the Particle Workbench extension for VS Code. Configure the project for the latest Device OS release and the B-SoM device.
Now, we’ll need to install some libraries to interact with our peripherals. In this example, we can use a Particle-provided library for the display, but we’ll need to import a third-party library for the VCNL4040 module.
To install the display library, open the command palette and search for Adafruit_SH110X_Z. Running this command should automatically create a new lib folder with our SH110X display driver and its dependency.
Using the same steps as above, we also need to install the Adafruit_Bus_IO library.
Next, we’ll need to manually import the VCNL4040 library from Adafruit, as it’s not provided by Particle. There are a few things we need to change to make it work with our build environment, as described in the Particle documentation.
Start by downloading the library from Adafruit’s source repository and unzipping it to a local folder. You may rename the folder whatever you’d like; we’ll use Adafruit_VCNL4040 in this example.
Inside this new folder, create a /src directory and move Adafruit_VCNL4040.cpp and Adafruit_VCNL4040.h into it. Now the whole folder (Adafruit_VCNL4040) can be copied into the /lib directory of your Particle project.
Test that everything was properly installed by compiling some sample code:
#include "Particle.h"
#include <Adafruit_SH110X.h>
#include <Adafruit_VCNL4040.h>
#define SAMPLING_FREQ_HZ 50 // Sampling frequency (Hz)
#define SAMPLING_PERIOD_MS 1000 / SAMPLING_FREQ_HZ // Sampling period (ms)
#define NUM_SAMPLES 100 // 100 samples at 50 Hz is 2 sec window
#define BUTTON_A D4
#define BUTTON_B D3
#define BUTTON_C D22
Adafruit_SH1107 display = Adafruit_SH1107(64, 128, &Wire);
Adafruit_VCNL4040 vcnl4040 = Adafruit_VCNL4040();
Data acquisition firmware
Now that we can interface with our peripherals, we can start building out our training dataset. It’s recommended to first read through how to remotely gather training data for Edge Impulse over LTE with Particle as we’ll be employing many of the same methods.
The complete data acquisition firmware repository can be found here and is relatively simple. However, there are a few things worth calling out.
The setup routine configures the auxiliary power output from the power module so that the OLED and QWIIC connector can receive the necessary power.
SystemPowerConfiguration powerConfig = System.getPowerConfiguration();
powerConfig.auxiliaryPowerControlPin(D23).interruptPin(A6);
System.setPowerConfiguration(powerConfig);
The VCNL4040 configuration was determined experimentally, and the exact values aren’t critical. However, it’s important that the configuration does not change once the training dataset has been gathered, since the model will only see data captured under these settings.
vcnl4040.setProximityIntegrationTime(VCNL4040_PROXIMITY_INTEGRATION_TIME_8T);
vcnl4040.setAmbientIntegrationTime(VCNL4040_AMBIENT_INTEGRATION_TIME_80MS);
vcnl4040.setProximityLEDCurrent(VCNL4040_LED_CURRENT_120MA);
vcnl4040.setProximityLEDDutyCycle(VCNL4040_LED_DUTY_1_40);
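Since the inference firmware must use exactly the same sensor settings, one option (not something the project repository does, just a sketch) is to move the configuration into a small helper that both the acquisition and inference builds share:
#include <Adafruit_VCNL4040.h>

// Hypothetical shared helper: calling this from both firmware projects keeps
// the acquisition and inference sensor settings from drifting apart.
void configureVcnl4040(Adafruit_VCNL4040 &sensor)
{
    sensor.setProximityIntegrationTime(VCNL4040_PROXIMITY_INTEGRATION_TIME_8T);
    sensor.setAmbientIntegrationTime(VCNL4040_AMBIENT_INTEGRATION_TIME_80MS);
    sensor.setProximityLEDCurrent(VCNL4040_LED_CURRENT_120MA);
    sensor.setProximityLEDDutyCycle(VCNL4040_LED_DUTY_1_40);
}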
In the main loop, while waiting for the button to be pressed, we print the current readings to the display. Notice how the proximity and ambient light values are normalized between 0 and 1. This is important for producing consistent readings across varying lighting environments.
The ambientLightMax value is constantly re-evaluated based on the brightest reading seen so far, while proximityMax = 1000.0 was determined by watching a few “wave” actions and noting the approximate maximum.
while (digitalRead(BUTTON_A) == 1)
{
    uint16_t proximity = vcnl4040.getProximity();
    uint16_t ambientLight = vcnl4040.getAmbientLight();

    if (ambientLight > ambientLightMax)
        ambientLightMax = ambientLight;

    float normProximity =
        proximity < proximityMin ? 0.0
        : proximity > proximityMax ? 1.0
        : (float)(proximity - proximityMin) / (proximityMax - proximityMin);

    float normAmbientLight =
        ambientLight < ambientLightMin ? 0.0
        : ambientLight > ambientLightMax ? 1.0
        : (float)(ambientLight - ambientLightMin) / (ambientLightMax - ambientLightMin);

    display.clearDisplay();
    display.setTextSize(1);
    display.setTextColor(SH110X_WHITE);
    display.setCursor(0, 0);
    display.print("Norm Prox: ");
    display.println(normProximity);
    display.print("Norm Light: ");
    display.println(normAmbientLight);
    display.println("------");
    display.print("Proximity: ");
    display.println(proximity);
    display.print("Light: ");
    display.println(ambientLight);
    display.display();
}
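If you need to re-derive proximityMax for different hardware or mounting, one simple approach is to log the running maximum of the raw proximity reading while performing a few waves and use the plateau as your constant. Here’s a minimal standalone sketch of that idea (not part of the project firmware):
#include "Particle.h"
#include <Adafruit_VCNL4040.h>

Adafruit_VCNL4040 vcnl4040 = Adafruit_VCNL4040();
SerialLogHandler logHandler(LOG_LEVEL_INFO);

uint16_t observedMax = 0;

void setup()
{
    // Enable auxiliary power so the Qwiic connector is powered (same as the project firmware)
    SystemPowerConfiguration powerConfig = System.getPowerConfiguration();
    powerConfig.auxiliaryPowerControlPin(D23).interruptPin(A6);
    System.setPowerConfiguration(powerConfig);

    vcnl4040.begin();
}

void loop()
{
    // Track the largest raw proximity value seen so far
    uint16_t proximity = vcnl4040.getProximity();
    if (proximity > observedMax)
    {
        observedMax = proximity;
        Log.info("New raw proximity max: %u", observedMax);
    }
    delay(20);
}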
Once the button is pressed, we fill up a sample buffer with our normalized data:
// Record samples in buffer
startTimestamp = millis();
Variant samplesArray;
for (int i = 0; i < NUM_SAMPLES; i++)
{
    // Take timestamp so we can hit our target frequency
    timestamp = millis();
    unsigned long timeDelta = timestamp - startTimestamp;

    uint16_t proximity = vcnl4040.getProximity();
    uint16_t ambientLight = vcnl4040.getAmbientLight();

    float normProximity =
        proximity < proximityMin ? 0.0
        : proximity > proximityMax ? 1.0
        : (float)(proximity - proximityMin) / (proximityMax - proximityMin);

    float normAmbientLight =
        ambientLight < ambientLightMin ? 0.0
        : ambientLight > ambientLightMax ? 1.0
        : (float)(ambientLight - ambientLightMin) / (ambientLightMax - ambientLightMin);

    Log.info("Time: %ld, Proximity: %f, ambientLight: %f", timeDelta, normProximity, normAmbientLight);

    Variant entryArray;
    entryArray.append(normProximity);
    entryArray.append(normAmbientLight);
    samplesArray.append(entryArray);

    // Wait just long enough for our sampling period
    while (millis() < timestamp + SAMPLING_PERIOD_MS)
        ;
}
Then, we send it off to the Particle cloud using a blocking publish operation:
event.name("samples"); event.data(samplesArray); Particle.publish(event); waitForNot(event.isSending, 60000); if (event.isSent()) { Log.info("publish succeeded"); event.clear(); } else if (!event.isOk()) { Log.info("publish failed error=%d", event.error()); event.clear(); }
It’s recommended to review the rest of the acquisition firmware to get a full idea of what the code is doing.
Particle webhook configuration
Many of the following steps are covered in the earlier post: How to remotely gather training data for Edge Impulse over LTE with Particle. But, there are a few key differences with the webhook configuration this time around.
Before starting on this step, be sure to have created an Edge Impulse project.
The JSON data being sent to the Edge Impulse data acquisition API is specific to this project and differs from what was provided in the previous post. Notice that the interval_ms value of 20 matches the 50 Hz sampling frequency defined in our firmware. We also define two sensors: proximity and ambient_light.
{
"payload": {
"device_name": "{{{PARTICLE_DEVICE_ID}}}",
"device_type": "B5SoM",
"interval_ms": 20,
"sensors": [
{
"name": "proximity",
"units": "raw"
},
{
"name": "ambient_light",
"units": "raw"
}
],
"values": {{{PARTICLE_EVENT_VALUE}}}
},
"protected": {
"alg": "none",
"ver": "v1"
},
"signature": "00"
}
In the HTTP Headers section, we’ll need to update the x-label header depending on which scenario we’re recording data for. In the screenshot below, it’s configured for the “idle” classification. But, when recording the “wave” gesture, be sure to change the x-label accordingly.
The x-api-key can be found in the Edge Impulse console in the “API Keys” section for your specific project.
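For example, when recording the “idle” class, the two headers would look something like this (the key shown is a placeholder; use your own project’s API key):
x-label: idle
x-api-key: ei_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx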
Gathering training data
With everything wired up, we can start the data gathering process, starting with the “idle” label as defined in the HTTP headers in the previous step. To gather an “idle” sample, press button A on the OLED FeatherWing to generate a two-second window of data.
You should see this sample get forwarded directly to your Edge Impulse project.
Repeat this process as many times as you’d like. Remember that the more data you gather, the more robust your algorithm will end up being. We recommend changing the environment a bit as well as moving in front of the sensor a little, without performing a full-on “wave” gesture, to add some sample variance.
Next, go back to the custom webhook integration in your Particle Console and change the x-label to “wave.” Gather a number of samples where you wave your hand in front of the sensor during the sampling period. You should see a pretty clear pattern start to emerge in your training dataset. As before, remember to vary the wave motion a bit to ensure a robust training set.
Training the model
With a proper dataset gathered, we can start to train the model from the Edge Impulse console. Navigate to the “Create impulse” page in your Edge Impulse project and configure a new impulse as shown below.
Here, the window size is set to two seconds to match the sample window captured by the firmware (100 samples at 50 Hz), and the frequency will be inferred from your dataset. A window increase of one second seems to work well. We are interested in the spectral features for proximity and ambient_light and intend to classify between idle and wave.
Choose “Spectral analysis” for the processing block.
Choose “Classification” for the learning block.
Move on to the “Spectral features” section, leave the defaults, and choose “Save parameters.”
Then select “Calculate feature importance” and “Generate features” to kick off the feature generation job.
Once complete, we can move on to the “Classifier” section to train our model and see how everything performs.
The default settings for the neural network seem to work well for this application. So jump right into the training process by selecting “Save & train” towards the bottom. This will kick off the training job.
Once the job completes, you should get a readout that grades the model’s performance. In this case, it seems to have performed exceptionally well within the test dataset.
Deploying the model
On the left navigation panel, scroll all the way down to the “Deployment” tab. Under “Configure your deployment” search for “Particle library.”
It’s recommended to select “quantized” and use the “EON Compiler” for the library export. Click “Build” at the bottom of the page to generate the library. Once built, you will get a .zip archive containing the library source, ready to be used with the Particle environment.
Unzip the folder and open it in VSCode. Copy the /lib folder from the data acquisition firmware project into the new inferencing project. Then, replace main.cpp with the following code.
#include "Particle.h"
#include <Proximity_Gesture_Detection_inferencing.h>
#include <Adafruit_SH110X.h>
#include <Adafruit_VCNL4040.h>
#define SAMPLING_FREQ_HZ 50 // Sampling frequency (Hz)
#define SAMPLING_PERIOD_MS 1000 / SAMPLING_FREQ_HZ // Sampling period (ms)
#define NUM_SAMPLES 100 // 100 samples at 50 Hz is 2 sec window
#define BUTTON_A D4
#define BUTTON_B D3
#define BUTTON_C D22
Adafruit_SH1107 display = Adafruit_SH1107(64, 128, &Wire);
Adafruit_VCNL4040 vcnl4040 = Adafruit_VCNL4040();
ei_impulse_result_t result = {0};
float ambientLightMin = 0.0;
float ambientLightMax = 0.0; // dynamically determined based on training environment
float proximityMin = 0.0;
float proximityMax = 1000.0; // experimentally determined
SerialLogHandler logHandler(LOG_LEVEL_ERROR);
int raw_feature_get_data(size_t offset, size_t length, float *out_ptr);
void setup();
void loop();
static float features[200];
int raw_feature_get_data(size_t offset, size_t length, float *out_ptr)
{
memcpy(out_ptr, features + offset, length * sizeof(float));
return 0;
}
void print_inference_result(ei_impulse_result_t result);
void setup()
{
SystemPowerConfiguration powerConfig = System.getPowerConfiguration();
powerConfig.auxiliaryPowerControlPin(D23).interruptPin(A6);
System.setPowerConfiguration(powerConfig);
pinMode(BUTTON_A, INPUT_PULLUP);
pinMode(BUTTON_B, INPUT_PULLUP);
pinMode(BUTTON_C, INPUT_PULLUP);
if (!vcnl4040.begin())
{
ei_printf("Couldn't find VCNL4040 chip");
}
vcnl4040.setProximityIntegrationTime(VCNL4040_PROXIMITY_INTEGRATION_TIME_8T);
vcnl4040.setAmbientIntegrationTime(VCNL4040_AMBIENT_INTEGRATION_TIME_80MS);
vcnl4040.setProximityLEDCurrent(VCNL4040_LED_CURRENT_120MA);
vcnl4040.setProximityLEDDutyCycle(VCNL4040_LED_DUTY_1_40);
// Initialize the OLED display
display.begin(0x3C, true); // Address 0x3C default
// Clear the buffer.
display.clearDisplay();
display.display();
display.setRotation(1);
}
void loop()
{
unsigned long timestamp;
while (digitalRead(BUTTON_A) == 1)
{
uint16_t proximity = vcnl4040.getProximity();
uint16_t ambientLight = vcnl4040.getAmbientLight();
if (ambientLight > ambientLightMax)
ambientLightMax = ambientLight;
float normProximity =
proximity < proximityMin ? 0.0
: proximity > proximityMax ? 1.0
: (float)(proximity - proximityMin) / (proximityMax - proximityMin);
float normAmbientLight =
ambientLight < ambientLightMin ? 0.0
: ambientLight > ambientLightMax ? 1.0
: (float)(ambientLight - ambientLightMin) / (ambientLightMax - ambientLightMin);
display.clearDisplay();
display.setTextSize(1);
display.setTextColor(SH110X_WHITE);
display.setCursor(0, 0);
display.print("Norm Prox: ");
display.println(normProximity);
display.print("Norm Light: ");
display.println(normAmbientLight);
display.println("------");
// Print how long it took to perform inference
ei_printf("Timing: DSP %d ms, inference %d ms, anomaly %d ms\r\n",
result.timing.dsp,
result.timing.classification,
result.timing.anomaly);
ei_printf("Predictions:\r\n");
for (uint16_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++)
{
ei_printf(" %s: ", ei_classifier_inferencing_categories[i]);
ei_printf("%.5f\r\n", result.classification[i].value);
display.print(ei_classifier_inferencing_categories[i]);
display.print(": ");
display.println(result.classification[i].value);
}
display.display();
}
// Record samples in buffer
int j = 0;
for (int i = 0; i < NUM_SAMPLES; i++)
{
// Take timestamp so we can hit our target frequency
timestamp = millis();
uint16_t proximity = vcnl4040.getProximity();
uint16_t ambientLight = vcnl4040.getAmbientLight();
float normProximity =
proximity < proximityMin ? 0.0
: proximity > proximityMax ? 1.0
: (float)(proximity - proximityMin) / (proximityMax - proximityMin);
float normAmbientLight =
ambientLight < ambientLightMin ? 0.0
: ambientLight > ambientLightMax ? 1.0
: (float)(ambientLight - ambientLightMin) / (ambientLightMax - ambientLightMin);
features[j] = normProximity;
features[j + 1] = normAmbientLight;
j += 2;
// Wait just long enough for our sampling period
while (millis() < timestamp + SAMPLING_PERIOD_MS)
;
}
// Wrap the features buffer in a signal_t so the classifier can read it on demand
signal_t features_signal;
features_signal.total_length = sizeof(features) / sizeof(features[0]);
features_signal.get_data = &raw_feature_get_data;
// invoke the impulse
EI_IMPULSE_ERROR res = run_classifier(&features_signal, &result, false);
if (res != EI_IMPULSE_OK)
{
ei_printf("ERR: Failed to run classifier (%d)\n", res);
return;
}
// print inference return code
ei_printf("run_classifier returned: %d\r\n", res);
// Wait for button A to be released before returning to the top of the loop
while (digitalRead(BUTTON_A) == 0)
;
}
Notice that we carry over most of the same logic from the acquisition firmware, with some minor updates to fill a features buffer that gets passed into the Edge Impulse library. The new code is highlighted below.
// Record samples in buffer
int j = 0;
for (int i = 0; i < NUM_SAMPLES; i++)
{
    ...
    features[j] = normProximity;
    features[j + 1] = normAmbientLight;
    j += 2;
}

// Wrap the features buffer in a signal_t so the classifier can read it on demand
signal_t features_signal;
features_signal.total_length = sizeof(features) / sizeof(features[0]);
features_signal.get_data = &raw_feature_get_data;

// invoke the impulse
EI_IMPULSE_ERROR res = run_classifier(&features_signal, &result, false);
if (res != EI_IMPULSE_OK)
{
    ei_printf("ERR: Failed to run classifier (%d)\n", res);
    return;
}

// print inference return code
ei_printf("run_classifier returned: %d\r\n", res);
The previous prediction gets passed to the display so we can track our performance.
ei_printf("Predictions:\r\n"); for (uint16_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) { ei_printf(" %s: ", ei_classifier_inferencing_categories[i]); ei_printf("%.5f\r\n", result.classification[i].value); display.print(ei_classifier_inferencing_categories[i]); display.print(": "); display.println(result.classification[i].value); }
We can now test everything out to make sure it works as expected! Our model was 78% sure of the wave demonstrated below.
Summary
This project demonstrates how you might build a more practical gesture detection system for something like a kiosk or dispenser. Rather than relying on a handheld accelerometer, we show that a gesture can be reliably detected using proximity and ambient light data. Particle makes it easy to gather training data from anywhere by leveraging an out-of-the-box LTE connection and very little hardware configuration. The same Particle module can easily handle inference on the edge once the model is trained and deployed.
Ready to get started?
Order your B504e from the store.
