Smart Fan
Smart fan with adaptive speed control and intelligent environment sensing
This project introduces a Smart Fan that intelligently adapts its rotation and speed based on real-time environmental data and user presence. The system leverages face detection and distance measurement to optimize airflow, enhancing user comfort while conserving energy.
My Contribution
I was the project lead for this measurements and instrumentation project. I developed the custom sensor application that adjusts fan speed and rotation angle, integrating face detection for distance estimation with servo motor control. This involved capturing video, detecting faces, calculating distances, and driving the servo motor based on the detected face positions.
Project Objectives
The main objectives of this project are:
- Build a smart system using actuators and sensors (one custom) and integrate them using LabVIEW and NI-DAQmx.
- Regulate power using a variable DC power supply controlled through Rigol Ultra Sigma.
Features
- Smart Rotation: Utilizes face detection to adjust the fan’s rotation angle based on user location within a 5-meter range.
- Distance-Based Speed Control: Generates PWM signals based on the user’s distance, enabling precise fan speed adjustments through LabVIEW.
- Environmental Sensing: Monitors and adjusts fan operation based on temperature and humidity.
- Energy Efficiency: Reduces energy consumption by dynamically adjusting fan operations.
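The distance-based speed control boils down to mapping distance bands to PWM duty cycles. A minimal sketch of that mapping, using the same three bands and duty-cycle values that appear in the control loop later in this page:

```python
def duty_for_distance(distance_m: float) -> float:
    """Map an estimated user distance (meters) to a fan PWM duty cycle (%).

    Bands mirror the thresholds used in the control loop:
    close (<1 m) -> low airflow, mid (1-2 m) -> medium, far (>=2 m) -> high.
    """
    if distance_m < 1:
        return 3
    elif distance_m < 2:
        return 15
    return 26
```

Keeping the mapping in one function makes the bands easy to retune after calibration.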
Real-Time Distance Estimation
The system uses a MediaPipe model for face detection and OpenCV for processing the video feed from the webcam connected to the Raspberry Pi.
- Face Detection: The algorithm detects faces in real-time, drawing bounding boxes around them.
- Distance Measurement: The area of the bounding box is used to estimate the distance to the face, while its horizontal position is used to compute the servo angle.
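The area-to-distance conversion can be sketched as follows. Since a face's real size is roughly constant, its bounding-box area falls off with the square of distance, so distance scales with the inverse square root of area. The calibration constant `AREA_AT_1M` below is hypothetical and would be fit from measurements; the project's actual `pixels_to_meters` may differ:

```python
import math

# Hypothetical calibration constant: bounding-box area (px^2) observed at 1 m.
AREA_AT_1M = 40000.0

def pixels_to_meters(bbox_area: float) -> float:
    """Estimate distance from bounding-box area via an inverse-square model:
    area ~ AREA_AT_1M / distance^2  =>  distance = sqrt(AREA_AT_1M / area)."""
    return math.sqrt(AREA_AT_1M / bbox_area)
```

Under this model, a face whose box shrinks to a quarter of the 1 m area reads as 2 m away.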
Here is a brief snippet of the core logic for face detection and servo control:
```python
face_centers = obj_data(frame)  # list of (x, y, bbox_area) per detected face

if face_centers:
    if len(face_centers) == 1:
        # Single face: aim the servo at its horizontal position.
        face_x = face_centers[0][0]
        angle = -90 + (180 * (face_x / width))

        farthest_face = min(face_centers, key=lambda p: p[2])  # smallest box = farthest
        farthest_distance_meters = pixels_to_meters(farthest_face[2])

        # Map distance bands to fan-speed PWM duty cycles.
        if farthest_distance_meters < 1:
            pwm_distance.ChangeDutyCycle(3)
        elif 1 <= farthest_distance_meters < 2:
            pwm_distance.ChangeDutyCycle(15)
        else:
            pwm_distance.ChangeDutyCycle(26)

        distance_text = f"Distance to Farthest Face: {farthest_distance_meters:.2f} m"
        print(distance_text)
        cv2.putText(frame, distance_text, (50, 200), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)

        # Clamp the angle, then snap it to one of three servo positions.
        if angle < -80:
            angle = -80
        elif angle > 80:
            angle = 80
        print(angle)
        angle = abs(round(angle))
        if angle < 30:
            setAngle(30)
        elif angle < 60:
            setAngle(60)
        else:
            setAngle(80)
    else:
        # Multiple faces: draw pairwise connections for visualization.
        for i in range(len(face_centers)):
            for j in range(i + 1, len(face_centers)):
                dist = distance(face_centers[i][0], face_centers[i][1],
                                face_centers[j][0], face_centers[j][1])
                cv2.line(frame, (face_centers[i][0], face_centers[i][1]),
                         (face_centers[j][0], face_centers[j][1]), (0, 255, 0), 2)

        leftmost = min(face_centers, key=lambda p: p[0])
        rightmost = max(face_centers, key=lambda p: p[0])
        left_right_distance = distance(leftmost[0], leftmost[1], rightmost[0], rightmost[1])

        farthest_face = min(face_centers, key=lambda p: p[2])
        farthest_distance_meters = pixels_to_meters(farthest_face[2])
        if farthest_distance_meters < 1:
            pwm_distance.ChangeDutyCycle(3)
        elif 1 <= farthest_distance_meters < 2:
            pwm_distance.ChangeDutyCycle(15)
        else:
            pwm_distance.ChangeDutyCycle(26)

        distance_text = f"Distance to Farthest Face: {farthest_distance_meters:.2f} m"
        print(distance_text)
        cv2.putText(frame, distance_text, (50, 200), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)

        if left_right_distance > 0:
            left_right_distance_meters = pixels_to_meters(left_right_distance)
            cv2.line(frame, (leftmost[0], leftmost[1]), (rightmost[0], rightmost[1]), (255, 0, 0), 2)
            lr_distance_text = f"Left to Right Distance: {left_right_distance_meters:.2f} m"
            print(lr_distance_text)
            cv2.putText(frame, lr_distance_text, (50, 150), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)

        # Point the servo based on the horizontal span covered by the faces.
        face_range = rightmost[0] - leftmost[0]
        if face_range > 0:
            angle = abs(round(-90 + (180 * face_range / width)))
            if angle < 30:
                setAngle(30)
            elif angle < 60:
                setAngle(60)
            else:
                setAngle(80)
            angle_text = f"Servo Angle: {angle:.2f} degrees"
            print(angle_text)
            print("\n")
            cv2.putText(frame, angle_text, (50, 100), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 0), 2)
else:
    # No face detected: stop the fan PWM and center the servo.
    print("No face detected. Servo remains in current position.")
    pwm_distance.ChangeDutyCycle(0)
    pwm_servo.ChangeDutyCycle(7.5)  # 7.5% duty at 50 Hz ~ center position
    sleep(1)
    pwm_servo.start(0)  # release the servo signal to reduce jitter
```
Servo Motor Control
Based on the detected faces’ positions, the system adjusts the angle of a servo motor using RPi.GPIO.
- Angle Calculation: The horizontal position of the faces is used to calculate the appropriate angle for the servo.
- PWM Control: The Raspberry Pi uses PWM at 50Hz to control the servo motor, aligning it with the detected faces.
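The angle-to-duty conversion inside a `setAngle` helper can be sketched as below. At 50 Hz one period is 20 ms, and standard hobby servos map roughly 0.5–2.5 ms pulses to 0–180°, i.e. 2.5–12.5% duty. The exact endpoints are assumptions here; the MG-995's usable range may differ slightly:

```python
def angle_to_duty(angle_deg: float) -> float:
    """Convert a servo angle (0-180 deg) to a PWM duty cycle (%) at 50 Hz.

    Assumes 0.5 ms pulse (2.5% duty) = 0 deg and 2.5 ms pulse (12.5% duty)
    = 180 deg, giving duty = 2.5 + angle / 18.
    """
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to the servo's range
    return 2.5 + angle_deg / 18.0
```

With RPi.GPIO the result feeds `pwm_servo.ChangeDutyCycle(angle_to_duty(angle))`; note 90° lands on the 7.5% centering duty used in the control loop.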
System Workflow
The block diagram below illustrates how the components of the Smart Fan system interact:
- Raspberry Pi: Processes video input from the webcam for face detection using OpenCV and MediaPipe, calculates distance, and controls the fan’s rotation angle.
- Webcam: Captures real-time video to detect user presence and measure distance.
- LabVIEW: Integrates data from sensors and controls the system, managing fan speed and rotation.
- Servo Motor MG-995: Adjusts the fan’s direction based on the user’s position, driven by the Raspberry Pi.
- DC Fan: Modulates fan speed according to PWM signals sent from the Raspberry Pi to the power supply controlled by LabVIEW.
- NI-DAQmx: Acquires data from the sensors and actuators, providing real-time feedback to the LabVIEW interface.
- Rigol Variable DC Power Supply: Modifies fan speed based on control signals from LabVIEW.
- LM35 Temperature Sensor: Measures ambient temperature.
- AMT1001 Humidity Sensor: Monitors humidity levels in the environment.
Calibration Curves
To ensure the system responds accurately to environmental changes, calibration curves were developed for distance, temperature, and humidity.
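As an illustration of how such a curve can be produced, the sketch below fits a straight line to sensor readings by least squares. The LM35 outputs about 10 mV per °C, so the fitted slope should land near 100 °C/V; the data points here are made up for the example, not the project's measurements:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = m*x + b; returns (m, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - m * mean_x
    return m, b

# Hypothetical LM35 calibration points: (sensor voltage in V, reference temp in C).
volts = [0.20, 0.25, 0.30, 0.35, 0.40]
temps = [20.1, 24.9, 30.2, 34.8, 40.0]

slope, offset = fit_line(volts, temps)
print(f"temperature ~ {slope:.1f} * V + {offset:.1f}")
```

The same fitting step applies to the distance and humidity curves with their own reference data.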
How It Works
- Face Detection:
  - The system captures frames from the video feed and processes them to detect faces.
  - It can consider up to four faces, limiting the processing load and focusing on the most relevant data.
- Distance Calculation:
  - For each detected face, the system calculates the area of the bounding box.
  - The area is then converted into a distance measurement, estimating how far the face is from the camera.
- Servo Angle Adjustment:
  - Based on the horizontal positions of the detected faces, the system calculates an appropriate angle for the servo motor.
  - The servo motor adjusts to this angle, aligning with the direction of the detected faces.
- PWM Output:
  - The system also adjusts the PWM signal output based on the distance to the farthest face.
  - This output could be used for further control or feedback mechanisms.
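The steps above can be condensed into a single hardware-free function, which is also how the pipeline can be unit-tested. Everything here is a stand-in sketch (the inverse-square calibration constant and the tuple layout are assumptions, and no GPIO is touched):

```python
def control_step(faces, frame_width):
    """One iteration of the detect -> estimate -> actuate pipeline.

    `faces` is a list of (x, y, bbox_area) tuples; returns the chosen
    (servo_angle, fan_duty) pair instead of driving real hardware.
    """
    if not faces:
        return None, 0  # no target: hold the servo, stop the fan PWM
    # Distance: farthest face has the smallest bounding box.
    area = min(f[2] for f in faces)
    dist = (40000.0 / area) ** 0.5  # hypothetical inverse-square calibration
    duty = 3 if dist < 1 else 15 if dist < 2 else 26
    # Angle: horizontal position mapped across +/-90 deg, snapped to 30/60/80.
    x = faces[0][0]
    raw = abs(round(-90 + 180 * x / frame_width))
    angle = 30 if raw < 30 else 60 if raw < 60 else 80
    return angle, duty
```

Separating the decision logic from `ChangeDutyCycle`/`setAngle` calls keeps the thresholds testable off the Pi.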
Conclusion
The project demonstrates a practical application of computer vision with physical hardware, effectively detecting faces and controlling a servo motor.
For more details, you can explore the GitHub repository.