Internet of Things Project 1


IoT Project Challenge

Session-1
Background and Motivation
IoT architecture
Simply put, IoT means connecting devices to the internet so their data can be
viewed from anywhere in the world. These devices can also be controlled
from anywhere over the internet.
Solving IoT challenges
By running ML models directly on embedded devices, we can get benefits
such as:
TinyML, What is it?
TinyML is a branch of machine learning and embedded systems
research that looks into the types of models that can be run on small,
low-power devices like microcontrollers.
So tiny, how?
TF Lite for Microcontrollers is a modified version of the TensorFlow Lite
framework that is meant to run on embedded devices with only a few
tens of kilobytes of memory.
It supports Android, iOS, Arduino, etc.
Not only Python: you can also use C, C++, and Java, and pretrained models are available.
Seeed Studio XIAO ESP32S3: Pinout Diagram
Seeed Studio XIAO ESP32S3: Camera Module
Seeed Studio XIAO ESP32S3: Front View
Seeed Studio XIAO ESP32S3: Back View
TinyML Project flow
AIRGLIDE - Gesture-driven
drone simulator
Project - 1
Problem Statement

● Develop a robust gesture recognition system capable of accurately
interpreting hand movements and translating them into commands for
the drone.
● Create DroneKit code that takes the input command and converts it
into action in the drone simulation.
Architecture Diagram
Hardware Setup
• Seeed Studio XIAO ESP32S3
• Seeed Studio Camera Module
• MPU6050 Sensor
MPU6050 IMU Sensor
Circuit Diagram

MPU6050 to XIAO ESP32S3 wiring: GND, 3.3 V, SDA, SCL
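
Before moving to the software setup, it can help to confirm this wiring with a quick I2C bus scan. A minimal sketch is shown below, assuming the MPU6050 is connected to the board's default SDA/SCL pins; the sensor normally answers at address 0x68.

#include <Wire.h>

void setup() {
  Serial.begin(115200);
  Wire.begin();                              // default SDA/SCL pins of the board
}

void loop() {
  // Probe every possible 7-bit address and report any device that acknowledges.
  for (uint8_t addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {       // 0 means a device answered
      Serial.print("I2C device found at 0x");
      Serial.println(addr, HEX);
    }
  }
  delay(5000);                               // rescan every five seconds
}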
Software Setup
• Arduino IDE setup
• Setting up a drone simulation environment
• Visual Studio Code (Latest version)
• Edge Impulse platform setup
Arduino IDE setup
Drivers
➢ Silicon Labs CP210x USB to UART Bridge driver
Software
➢ Arduino IDE – Version (latest)
➢ MPU6050 by Seeed Studio
➢ WebSockets
➢ ESP32 by Espressif
Silicon Labs CP210x USB to UART Bridge VCP Drivers
• You need to install drivers for the USB-to-Serial chip on your ESP32 board.
https://www.silabs.com/developers/usb-to-uart-bridge-vcp-drivers?tab=downloads
• Extract the downloaded folder.
• Open Device Manager and go to Ports.
• Connect your ESP32 to the laptop using a USB cable.
• A new port appears; right-click it and select Update Driver.
• Browse to the extracted driver folder and click Next.
• After the driver is installed, the port is listed as a Silicon Labs CP210x USB to UART Bridge (COMx).
ESP32 BOARD SUPPORT TO ARDUINO IDE
• Open Arduino IDE.
• Go to File -> Preferences.
• In the "Additional Boards
Manager URLs" field, add this
URL: https://dl.espressif.com/dl/packa
ge_esp32_index.json

• Click OK to close the


Preferences window.
INSTALL ESP32 BOARD PACKAGE
• Go to Tools -> Board -> Boards Manager.
• Type "ESP32" in the search bar.
• Install "esp32" by Espressif Systems.
• Go to Tools -> Board.
• Select your ESP32 board from the list ("XIAO_ESP32S3" for the Seeed
Studio XIAO ESP32S3).
• Open the Arduino IDE and select the board and port.
• After selecting the board and port, you are good to
upload your code.
Arduino program structure

• A sketch can be divided into two parts:
setup()
loop()
• The setup() function is where the code starts, just like the main()
function in C and C++.
• I/O variables and pin modes are initialized in the
setup() function.
• The loop() function, as the name suggests, repeats the
specified task in the program (see the minimal sketch below).
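
A minimal sketch illustrating this structure; it only prints to the Serial Monitor, so it runs on any board.

void setup() {
  // Runs exactly once after power-up or reset.
  Serial.begin(115200);
  Serial.println("setup() finished");
}

void loop() {
  // Runs over and over for as long as the board is powered.
  Serial.println("looping");
  delay(1000);                // wait one second between messages
}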
Arduino Function Libraries
• Input/Output functions:
• The Arduino pins can be configured to act as input or
output pins using the pinMode() function:

void setup() {
  pinMode(pin, mode);
}

• pin: pin number on the Arduino board
• mode: INPUT or OUTPUT
Arduino Function Libraries
• digitalWrite(): writes a HIGH or LOW value to a digital pin.

• analogRead(): reads from an analog input pin, i.e., the voltage applied
across the pin.

• Character functions such as isdigit(), isalpha(), isalnum(), isxdigit(),
islower(), isupper(), and isspace() return 1 (true) or 0 (false).

• delay() is one of the most common time manipulation functions, used
to pause for a specified time. It accepts an integer value (time in
milliseconds). A short example combining these functions follows below.
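
A small sketch tying these functions together. LED_BUILTIN and the analog pin A0 are assumptions; substitute pins that suit your board.

void setup() {
  Serial.begin(115200);
  pinMode(LED_BUILTIN, OUTPUT);      // configure the LED pin as an output
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);   // LED on
  delay(500);                        // wait 500 milliseconds
  digitalWrite(LED_BUILTIN, LOW);    // LED off
  delay(500);

  int raw = analogRead(A0);          // 0..4095 on the ESP32's 12-bit ADC
  Serial.println(raw);
}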
Setting up a Drone Simulation Environment
Software Requirements
➢ Python - Version 3.7.0 (include the following libraries)
❏ Dronekit - Version 2.9.2
❏ Dronekit Sitl - Version 3.3.0
❏ Pymavlink - Version 2.4.8
❏ Websocket-client – Version 1.6.1
➢ Mavproxy - Version 1.8.69
➢ Visual Studio Code
Edge Impulse Setup
Software Requirements

➢ Edge Impulse
○ Node.js (latest version)
○ Edge Impulse CLI
Software Installation Commands (Windows)
• Python - Version 3.7.0 (Install)
• Dronekit - Version 2.9.2
▪ py -3.7 -m pip install dronekit==2.9.2
• Dronekit Sitl - Version 3.3.0
▪ py -3.7 -m pip install dronekit-sitl==3.3.0
• Pymavlink - Version 2.4.8
▪ py -3.7 -m pip install pymavlink==2.4.8
• Websocket-client – Version 1.6.1
▪ py -3.7 -m pip install websocket-client==1.6.1
• Mavproxy - Version 1.8.69 (Install)
Simulation Environment setup
● Run dronekit-sitl in command prompt using
○ py -3.7 -m dronekit-sitl copter
(Or)
○ dronekit-sitl copter
● Open another terminal and run mavproxy map
○ mavproxy.exe --master tcp:127.0.0.1:5760 --out
127.0.0.1:14550 --out 127.0.0.1:14551 --map
MPU6050 Sensor
Calibrating MPU 6050 Sensor
● The MPU6050 can be calibrated using the sketch:
mpu6050-calibration.ino
● Run the code. The following will be displayed on the Serial
Monitor:
● Send any character (in the serial monitor), and the calibration
should start.
● In the end, you will receive the offset values to be used on all
your sketches:

● Write down your offsets so you can set them in your projects.
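
As a starting point for the sketches that will use those offsets, here is a minimal, hedged example that wakes the MPU6050 over I2C and streams raw accelerometer values as comma-separated text. The register addresses are the standard MPU6050 ones; apply your calibration offsets to these raw readings in your own code.

#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;        // default MPU6050 I2C address (AD0 low)

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                   // PWR_MGMT_1 register
  Wire.write(0);                      // clear the sleep bit to wake the sensor
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                   // ACCEL_XOUT_H, first of six accel bytes
  Wire.endTransmission(false);
  Wire.requestFrom((int)MPU_ADDR, 6);
  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();
  // Comma-separated output is also the format the Edge Impulse data forwarder expects.
  Serial.print(ax); Serial.print(',');
  Serial.print(ay); Serial.print(',');
  Serial.println(az);
  delay(20);                          // roughly 50 Hz
}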
Training the model
• For training the model, we use Edge Impulse platform.
• Edge Impulse trains machine learning models to be deployed to
your microcontroller or other small, low-powered device. You
then perform inference on those small devices without the
need for an internet connection.
• Create an Edge Impulse account.
• Create new project.
• To connect to XIAO ESP32S3, we need to install
Edge Impulse CLI.
Installation of Edge Impulse CLI
• Install Node.js v18 on
your host computer.
• For Windows users, install
the Additional necessary
Node.js tools when
prompted.
• Install the CLI tools via:
npm install -g edge-impulse-cli --force
• Run the Edge Impulse data forwarder via:
edge-impulse-data-forwarder
The data forwarder will ask you for the server you want to
connect to, prompt you to log in, and then configure the device.

• Enter the username and password of your Edge Impulse account.
• Enter the COM port that the device is connected to on your
computer.
• Select your project by clicking it or by entering the name of
your project.
• Assign the device the name XIAO ESP32S3.
Example of the output of the forwarder
Edge Impulse data forwarder v1.5.0
? What is your user name or e-mail address (edgeimpulse.com)? jan@edgeimpulse.com
? What is your password? [hidden]
Endpoints:
Websocket: wss://remote-mgmt.edgeimpulse.com
API: https://studio.edgeimpulse.com
Ingestion: https://ingestion.edgeimpulse.com

[SER] Connecting to /dev/tty.usbmodem401203
[SER] Serial is connected
[WS ] Connecting to wss://remote-mgmt.edgeimpulse.com
[WS ] Connected to wss://remote-mgmt.edgeimpulse.com
? To which project do you want to add this device? accelerometer-demo-1
? 3 sensor axes detected. What do you want to call them? Separate the names with
',': accX, accY, accZ
? What name do you want to give this device? Jan's DISCO-L475VG
[WS ] Authenticated
Edge Impulse Process Flow
● Collect data from the device and forward it using the
data forwarder.
● Start sampling in Edge Impulse after the device connects
successfully via the data forwarder.
● Pre-process the sampled data.
● Extract features from the pre-processed data.
Training in Edge Impulse
● Open Edge Impulse, open your project.
● Open data acquisition tab and check whether the device is
connected.

● Start the data collection; it collects the data from your
device to train the model.
● Assign labels to the data during data acquisition.
● Create an impulse design.
● Set the timing and frequency for training your model in
the impulse design tab.
● Go through the parameters in the other tabs of the impulse
design and generate features for the model.
● Start training the model.
● After the model is trained, start the deployment.
● Go to Deployment and search for the Arduino library option to
select it.

● Scroll down to find the EON Compiler and choose 'Unoptimized'.

● A ZIP file will be created and downloaded to your computer.
● In the Arduino IDE, go to the Sketch tab, select Add .ZIP
Library, and choose the .zip file downloaded by the Studio.
Coding
● Write an Arduino sketch, using the example code from the
extracted ZIP file, that outputs the labelled values from the
device such as left tilt, right tilt, straight, up, and down
(a hedged sketch of this part follows below).
● Write a Python script that receives the orientation of the
device (from the Arduino sketch) through WebSockets and
forwards it to the MAVProxy map.
● Run dronekit-sitl, MAVProxy, the Arduino sketch, and the
Python script simultaneously.
● The copter on the MAVProxy map should then change direction
as the device is tilted in the corresponding directions.
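
A hedged sketch of the Arduino side only. It assumes the exported Edge Impulse library header is named airglide_inferencing.h and that the Python WebSocket server listens on 192.168.1.10:8765; the Wi-Fi credentials, the header name, the server address, and the three-axis sample layout are placeholders to replace with your own project's values. It reuses the raw MPU6050 read from the earlier section.

#include <airglide_inferencing.h>   // header name depends on your Edge Impulse project
#include <WiFi.h>
#include <Wire.h>
#include <WebSocketsClient.h>       // "WebSockets" library listed in the software setup

WebSocketsClient webSocket;
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

// One raw accelerometer sample (x, y, z), as in the MPU6050 sketch above.
void readAccelSample(float *x, float *y, float *z) {
  Wire.beginTransmission(0x68);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(0x68, 6);
  *x = (int16_t)((Wire.read() << 8) | Wire.read());
  *y = (int16_t)((Wire.read() << 8) | Wire.read());
  *z = (int16_t)((Wire.read() << 8) | Wire.read());
}

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(0x68);
  Wire.write(0x6B);
  Wire.write(0);                                // wake the MPU6050
  Wire.endTransmission();
  WiFi.begin("YOUR_SSID", "YOUR_PASSWORD");
  while (WiFi.status() != WL_CONNECTED) delay(200);
  webSocket.begin("192.168.1.10", 8765, "/");   // address of the Python WebSocket server
}

void loop() {
  webSocket.loop();                             // keep the connection alive

  // Collect one window of raw samples (3 axes per frame) at the interval
  // set in the impulse design.
  for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
    readAccelSample(&features[i], &features[i + 1], &features[i + 2]);
    delay((uint32_t)EI_CLASSIFIER_INTERVAL_MS);
  }

  signal_t signal;
  numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

  ei_impulse_result_t result;
  if (run_classifier(&signal, &result, false) == EI_IMPULSE_OK) {
    // Send the label with the highest confidence to the Python script.
    size_t best = 0;
    for (size_t i = 1; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
      if (result.classification[i].value > result.classification[best].value) best = i;
    }
    webSocket.sendTXT(result.classification[best].label);
    Serial.println(result.classification[best].label);
  }
}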
IoT-based Smart Locking System
Project - 2
Workflow
1. Face Recognition
1.1. Setup Camera Web-Server
• Use ESP32S3 to run a web server for capturing facial images.
• Save the captured images in Flash memory.
1.2. Image Enrollment
• Enroll the captured image into the system for future
recognition.
1.3. Face Recognition
• Utilize the ESP Face Recognition Deep Learning library by Espressif to
recognize faces.
2. Keyword Detection
2.1. Model training
• Train a keyword recognition model using Edge Impulse.
• Build the trained model into an Arduino file for deployment.
Workflow (cont.)
3. Combine Face & Keyword Recognition:
• Integrate both face and keyword recognition processes into
a single file.
• Implement a sequential workflow where:
• Face recognition runs first.
• If a face is successfully recognised, it stops the
face recognition process and transitions to
keyword recognition.
• If the keyword is successfully recognised, the LED
starts to blink rapidly, indicating successful
execution (see the control-flow sketch below).
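
A minimal sketch of this sequential flow only. Here faceRecognized() and keywordDetected() are hypothetical stubs standing in for the Espressif face-recognition result and the Edge Impulse keyword model, and the LED pin is assumed to be LED_BUILTIN.

// Replace these stubs with the real face-recognition and keyword-model calls.
bool faceRecognized()  { return false; }   // true when an enrolled face is matched
bool keywordDetected() { return false; }   // true when the trained keyword is heard

enum LockState { WAIT_FACE, WAIT_KEYWORD, UNLOCKED };
LockState state = WAIT_FACE;

void setup() {
  Serial.begin(115200);
  pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
  switch (state) {
    case WAIT_FACE:
      if (faceRecognized()) {
        Serial.println("Face recognised, listening for keyword...");
        state = WAIT_KEYWORD;            // stop face recognition, move on
      }
      break;
    case WAIT_KEYWORD:
      if (keywordDetected()) {
        Serial.println("Keyword recognised");
        state = UNLOCKED;
      }
      break;
    case UNLOCKED:
      digitalWrite(LED_BUILTIN, HIGH);   // blink rapidly to signal success
      delay(100);
      digitalWrite(LED_BUILTIN, LOW);
      delay(100);
      break;
  }
}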
Keyword Spotting using Edge Impulse

Keyword Spotting (KWS) is integral to many voice recognition
systems, enabling devices to respond to specific words or phrases.
While this technology underpins popular devices like Google
Assistant or Amazon Alexa, it's equally applicable and achievable
on smaller, low-power devices. In this project we'll be
implementing a KWS system using TinyML on the XIAO ESP32S3
microcontroller board.
Machine learning Workflow:
