A mix of Arduino experiments.
RedBearDuoReadPotSetRGBLED Demo Video
Circuit + physical controls: Designed the light using 2 RGB LEDs. The individual color hues are selectable using custom physical slider controls. The brightness of the LEDs changes automatically based on ambient light (inversely proportional to the light level).
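The inverse light-to-brightness mapping can be sketched as a small helper. This is a minimal illustration, not the original sketch: it assumes a 10-bit ADC reading (0-1023, as on the RedBear Duo) from a photoresistor and an 8-bit PWM output (0-255); the function name is made up for this example.

```cpp
#include <algorithm>

// Hypothetical helper: map a raw photoresistor reading (0-1023) to a
// PWM brightness (0-255), inverted so the LED gets brighter as the
// room gets darker.
int brightnessFromLight(int lightReading) {
    // Clamp to the valid ADC range first.
    lightReading = std::min(std::max(lightReading, 0), 1023);
    // Invert: bright room (high reading) -> dim LED; dark room -> full brightness.
    return 255 - (lightReading * 255) / 1023;
}
```

In the Arduino loop this would be called as something like `analogWrite(ledPin, brightnessFromLight(analogRead(photoPin)))`.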
Android Smartphone app: A simple Android-based smartphone app that lets you select a color via RGB color sliders, voice control, or the accelerometer.
Lo-fi enclosure: A lo-fi enclosure that diffuses the LEDs while exposing the controls and power was created from a crocheted case stuffed with cotton balls.
Arduino, circuit, sensors+actuators: Designed the smart space device using a servo motor mounted on top of the box; it responds in real time to the location of the user as detected by the phone's camera. An ultrasonic sensor is mounted on top of the servo, enabling it to rotate towards the user. The ultrasonic sensor measures the distance to the detected user and sends it to the Android app for consumption.
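Two conversions sit at the heart of this behavior: mapping the face's horizontal position in the camera frame to a servo angle, and converting an ultrasonic echo pulse to a distance. The sketch below illustrates both under stated assumptions (a 0-180 degree servo, an HC-SR04-style sensor, and a mirrored mapping because the camera faces the user); all names are illustrative, not taken from the original code.

```cpp
// Map faceX in [0, frameWidth) to a servo angle in [0, 180],
// mirrored because the camera looks back at the user.
int servoAngleForFace(int faceX, int frameWidth) {
    if (faceX < 0) faceX = 0;
    if (faceX >= frameWidth) faceX = frameWidth - 1;
    return 180 - (faceX * 180) / (frameWidth - 1);
}

// Convert an echo pulse width (microseconds) to centimeters.
// Sound travels ~0.0343 cm/us and the pulse covers the round trip,
// so distance = echoMicros * 0.0343 / 2 = echoMicros * 343 / 20000.
long distanceCmFromEcho(long echoMicros) {
    return (echoMicros * 343) / 20000;
}
```

On the Arduino side these would be fed by the face coordinate received over serial and by `pulseIn()` on the ultrasonic echo pin, respectively.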
Android Smartphone app: The Android-based smartphone app detects faces in real time, tracks their location on the screen, and sends the location back to the Arduino so that the servo can rotate towards it. It also receives the distance measurement from the Arduino and displays it in the top-right corner of the screen.
Lo-fi enclosure: A lo-fi enclosure made from a cardboard box holds the servo and ultrasonic sensor on top, with all the circuitry hidden inside. Lastly, the Android phone is glued to the front of the box with its camera facing the user.
3D enclosure: I later 3D-printed a stand so the servo motor can sit on a flat surface, and a mount that attaches the ultrasonic sensor to the servo horn.
Creative Component 1: My creative component is an accessibility feature that vocally describes the face that has been detected. It will say something like: "I see a female of about age 31, wearing reading glasses and about 200 centimeters away."
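Assembling that spoken sentence is just string composition over the attributes the face API returns plus the ultrasonic distance. A minimal sketch, assuming illustrative field values and a hypothetical function name (the real app would pass the result to Android's text-to-speech engine):

```cpp
#include <string>
#include <sstream>

// Hypothetical helper: build the spoken description from detected
// face attributes and the measured distance. Parameter names and
// formats are assumptions for this example.
std::string describeFace(const std::string& gender, int age,
                         const std::string& glasses, int distanceCm) {
    std::ostringstream out;
    out << "I see a " << gender << " of about age " << age;
    if (!glasses.empty())
        out << ", wearing " << glasses;
    out << " and about " << distanceCm << " centimeters away";
    return out.str();
}
```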
Creative Component 2: My second creative component (I call it the Snapchat effect :) ) uses the Android smiling probability, a graphics overlay, and face landmarks to show the user's detected emotional state via a smiley with two states: happy and sad.
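The emotion mapping reduces to thresholding the smiling probability (a value in [0, 1] reported per face) to pick which overlay to draw. A minimal sketch; the 0.5 cutoff and the names are assumptions, not taken from the original app:

```cpp
// Pick the overlay state from the detector's smiling probability.
enum class Mood { Happy, Sad };

Mood moodFromSmileProbability(float p) {
    // Assumed cutoff: at or above 0.5 counts as smiling.
    return p >= 0.5f ? Mood::Happy : Mood::Sad;
}
```

The renderer would then draw the happy or sad smiley at the face-landmark positions depending on the returned state.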
Update Demo with Fabrication:
Addendum: The Microsoft Cognitive Services API (https://westus.dev.cognitive.microsoft.com/docs/services/563879b61984550e40cbbe8d/operations/563879b61984550f30395236) requires a subscription key. You need to update the key in the code and the region in the URL as well.