One of our group members had a prior interest in how a virtual reality environment could be implemented using Unreal Engine and Blender. We saw this final project as an opportunity to take that interest to the next level.
Our concept is similar to motion capture (mo-cap) technology, which is commonly used in video game development to make the movement of players as life-like as possible. Mo-cap records the orientation of objects so that the recordings can be manipulated to derive a variety of different motions. For our Cyber-Hand, we used flex sensors along the fingers as an alternative. Instead of recording the orientation of the joints and fingers visually (as mo-cap would), we use these sensors to detect a change in resistance and its inversely proportional change in output voltage, which we then translate into an input for Unreal Engine, mapping the range of motion of the real hand onto the virtual hand we generated with the Unreal and Blender software.
Velostat is a material used for packaging electrical devices that can be damaged by electrostatic discharge. Its unique property is that its resistance changes with applied pressure. By using this material in our flex sensors and putting them in series with another resistor, we created a voltage divider: when the finger is bent we receive upwards of 1.65 V, and when the finger is straightened out, roughly 0.3 V. These values are delivered to the analog-to-digital converter and then converted to percentages that Unreal processes as position.
(Note: In the video we used a 5 V applied voltage, as opposed to the 3.3 V input. This is why the output voltage values in the video range from roughly 0.45 to 2.5 V.)
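The divider math above can be sketched in a few lines. The 0.3–1.65 V endpoints come from our build; the supply voltage and the equal-resistance example below are illustrative assumptions.

```python
# Flex-sensor voltage divider: the Velostat sensor in series with a
# fixed resistor, with the output measured across the fixed resistor.
# Bending lowers the sensor's resistance, so the output voltage rises.
V_IN = 3.3          # supply voltage (V) - assumed, matching the 3.3 V input
V_STRAIGHT = 0.3    # divider output with the finger straight (from our build)
V_BENT = 1.65       # divider output with the finger fully bent (from our build)

def divider_out(v_in, r_fixed, r_sensor):
    """Voltage across the fixed resistor in a series divider."""
    return v_in * r_fixed / (r_fixed + r_sensor)

def bend_percent(v_out):
    """Map a measured voltage onto 0-100% flexion, clamped to range."""
    pct = 100.0 * (v_out - V_STRAIGHT) / (V_BENT - V_STRAIGHT)
    return max(0.0, min(100.0, pct))
```

The `bend_percent` output is the position value handed to Unreal; clamping keeps sensor noise outside the calibrated endpoints from producing impossible positions.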
The fundamental goal of this project was to recreate the same type of crosswalk found at the intersection of Eastern Parkway and Speed School. The scope of the project included two stop lights (six LEDs total), walk/don’t walk graphics, a battery-powered crosswalk button, a buzzer that lets pedestrians locate the crossing by sound, and a “time remaining” indicator.
The A3BU board worked well for every function we needed to complete this project. The board provided enough GPIO pins to power all six of the 3 V LEDs, as well as pulse-width modulation (PWM) to control the buzzer. An analog-to-digital converter (ADC) pin was also used to detect a digital high when the crosswalk button was pressed. The button’s battery provided a voltage that was tested against a set range, and if the reading fell in that range, the crosswalk logic would trigger.
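The button-detect step above amounts to converting an ADC reading to volts and checking it against a window. A minimal sketch, assuming a 12-bit ADC, a 3.3 V reference, and an illustrative trigger window (the real thresholds depend on the battery used):

```python
# Crosswalk button detection: sample the battery-driven line on an ADC
# pin and trigger only when the voltage falls in the expected window.
ADC_MAX = 4095      # 12-bit ADC full scale - assumed configuration
V_REF = 3.3         # ADC reference voltage - assumed configuration

def adc_to_volts(raw):
    """Convert a raw ADC count to volts."""
    return V_REF * raw / ADC_MAX

def button_pressed(raw, lo=2.5, hi=3.2):
    """True when the sampled voltage sits inside the trigger window."""
    return lo <= adc_to_volts(raw) <= hi
```

Testing against a range rather than a single threshold rejects both a floating line and an over-voltage fault, so only a healthy battery through a pressed button starts the crossing sequence.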
LED Layout with A3BU, buzzer, and button.
The LCD display served a large purpose, as it displayed all of our crosswalk functions. When the crosswalk button is pressed and the light turns red, an ASCII graphic of a “walkman” appears, letting you know it’s safe to cross. An incrementing bar also appears, gradually growing larger so the walker can see how much time they have left to cross. Once this timer ends, an ASCII “stop hand” appears, letting the user know it’s no longer safe to cross.
LCD displaying the ‘walkman’ and the timer bar at the top
The buzzer was turned off and on by varying the duty cycle on the GPIO output, while the pitch was changed by editing the frequency of the pulses. Here is a video displaying all the functions.
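In timer terms, that control scheme boils down to two register values: the period sets the pitch, and the compare value sets the duty cycle. A sketch of the arithmetic, assuming a 2 MHz timer clock (the actual A3BU clock configuration may differ):

```python
# Buzzer control via PWM: the timer period fixes the pitch, the
# compare (duty) value turns the tone on or off.
TIMER_CLOCK_HZ = 2_000_000   # assumed timer clock frequency

def period_for(freq_hz):
    """Timer period (ticks) that produces the requested pitch."""
    return round(TIMER_CLOCK_HZ / freq_hz)

def duty_ticks(period, duty):
    """Compare value for a duty cycle: 0.0 silences the buzzer,
    0.5 gives the loudest square wave."""
    return round(period * duty)
```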
Protect the Brew. After a bad experience collecting money for a Beer Olympics, I came up with the idea for our project: a smart beverage dispenser, one that would limit access to the delicious golden ale inside. Despite my original intentions, this project can also be implemented to prevent underage drinking and to keep track of how much people drink. So don’t tell me you had 6 beers when you only drank 4.
The key aspect of this project is the fingerprint scanner. It provides the security we wanted at an affordable price. With two microcontrollers, a Homebrew Draft System, the fingerprint scanner, and a solenoid valve, the project began. The system is designed so the beverage will only dispense after your fingerprint is verified. We used an Arduino to communicate between the scanner and the A3BU, which controlled the other functions. The system identifies who accessed it and displays their name on the A3BU LCD. It also activates one of the LEDs that indicate the status of the fingerprint: red if denied, green if approved. Once the approval signal is received by the A3BU, it sends a signal that activates the solenoid valve through a transistor circuit. The system then dispenses the beverage for 20 seconds, which at 15 psi will fill up a cup.
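The decision flow described above can be sketched as a single lookup-and-dispatch step. The 20-second pour is from our design; the enrolled-user table below is a made-up example, not real enrollment data.

```python
# Fingerprint decision flow: a recognized print ID maps to a name,
# a green LED, and a timed pour; anything else gets the red LED.
DISPENSE_SECONDS = 20            # pour length from the write-up

USERS = {1: "Alice", 2: "Bob"}   # hypothetical enrolled print IDs

def handle_scan(finger_id):
    """Return (approved, name, led, pour_seconds) for one scan result."""
    if finger_id in USERS:
        return True, USERS[finger_id], "green", DISPENSE_SECONDS
    return False, None, "red", 0
```

On the real hardware the tuple's fields become actions: the name goes to the A3BU LCD, the LED color to a GPIO pin, and the pour time to the transistor driving the solenoid valve.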
An amazing video of our project (includes excellent music)
2. Schematic action
3. How it all works
Hardware for the skittles sorter included two 180° servos, one 360° servo, a TCS34725 RGB sensor from Adafruit, an RGB LED, an Arduino Uno, and the Atmel A3BU board. The Arduino sent PWM signals to the two 180° servos, which corresponded to angle values between 0° and 180°. The continuous servo also received a PWM value but translated this value to a speed and direction. The Arduino was incapable of supplying power to all components, so the A3BU served as an additional power source.
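The angle-to-PWM relationship for the 180° servos is a simple linear map. A sketch, assuming the standard hobby-servo convention of a 1000–2000 µs pulse every 20 ms (exact endpoints vary by servo):

```python
# Standard hobby-servo PWM: pulse width maps linearly onto the
# commanded angle; 1000 us ~ 0 degrees, 2000 us ~ 180 degrees.
MIN_US, MAX_US = 1000, 2000   # assumed pulse-width endpoints

def pulse_for_angle(angle_deg):
    """Pulse width (us) commanding the given angle, clamped to 0-180."""
    angle_deg = max(0, min(180, angle_deg))
    return round(MIN_US + (MAX_US - MIN_US) * angle_deg / 180)
```

The continuous-rotation servo reads the same pulse differently: widths near 1500 µs mean stop, and the offset from 1500 µs sets speed and direction.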
The star of the show, the RGB sensor (TCS3472), used a photodiode array composed of red-, green-, and blue-filtered photodiodes plus clear photodiodes, and an analog-to-digital converter (ADC) that converted the photodiode currents to 16-bit values. The sensor communicated with the Arduino via the I2C communication protocol. The Adafruit TCS3472 package also incorporated a 3.3 V regulator and an onboard LED used to illuminate the target (the skittle). The TCS3472 interfaced with the Arduino by connecting the sensor’s I2C clock line to the SCL (serial clock line) input on the Arduino and the sensor’s I2C data line to the SDA (serial data line) on the Arduino. Adafruit also provided a tutorial and sample code for using the TCS3472.
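Once the 16-bit channel values arrive over I2C, sorting reduces to deciding which color a reading is closest to. A minimal sketch of one common approach: normalize by the clear channel, then pick the nearest reference color. The reference values below are illustrative, not calibration data from our rig.

```python
# Classify a skittle from raw 16-bit (r, g, b, clear) sensor readings
# by nearest-reference match on clear-normalized channel ratios.
REFERENCE = {                      # hypothetical calibration ratios
    "red":    (0.60, 0.22, 0.18),
    "green":  (0.25, 0.55, 0.20),
    "yellow": (0.45, 0.42, 0.13),
}

def classify(r, g, b, c):
    """Return the reference color nearest to the normalized reading."""
    norm = (r / c, g / c, b / c)   # dividing by clear cancels brightness
    def dist(ref):
        return sum((x - y) ** 2 for x, y in zip(norm, ref))
    return min(REFERENCE, key=lambda name: dist(REFERENCE[name]))
```

Normalizing by the clear channel makes the match insensitive to how brightly the onboard LED happens to illuminate each skittle.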
We planned on using an RGB LED as an indicator light. We intended for the LED to change color based on the input received from the RGB sensor. However, mid-project the LED broke and its replacement was not fully functional. Due to time constraints this issue was not investigated further.
The plan for the A3BU was to display a running tab of the number of skittles sorted on its LCD. We intended to send a digital signal from the Arduino to the A3BU, use this signal to update a variable on the A3BU and display this value on the LCD. Unfortunately, due to time constraints this sub-project was abandoned. Ultimately, the A3BU was used as an additional power source and to display a simple text phrase on an illuminated LCD.
Idea: Use an infrared (IR) LED signal to send a coded message from one A3BU to another.
Process: We built an emitter that used the A3BU’s on-board button to pulse an infrared LED. Another A3BU, with a photoresistor circuit connected to it, received the signal and translated it into a letter. For example, three short pulses would be ‘A’.
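The receive side’s translation step can be sketched as a table lookup on pulse counts. Only the three-pulses-for-‘A’ example comes from our scheme; the rest of the table below is invented for illustration.

```python
# Decode received IR pulse groups into letters: each letter is sent
# as a fixed number of short pulses, separated by a longer gap.
CODE = {3: "A", 1: "E", 2: "T"}   # hypothetical pulse-count alphabet

def decode(pulse_counts):
    """Translate a list of per-letter pulse counts into text."""
    return "".join(CODE.get(n, "?") for n in pulse_counts)
```

Unknown counts decode to ‘?’ so a missed or extra pulse corrupts only one letter instead of the whole message.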
We wanted to be able to control the robot remotely, so we chose to set up a Raspberry Pi to get the job done. The Pi ran a Python program for the GUI, made with TKinter. All we had to do was use a remote desktop viewer; that way, we could access the GUI on the Pi from an external tablet.
We chose TKinter for the GUI mostly for its ease of use. We were using Python 2 and its built-in editor. All it took was importing the TKinter library, and we had access to TKinter commands. We simply created a window, added buttons, labels, and a slider, and from there it was a lot of customization. When a button is pressed, it calls the associated method, which in turn sends a character over serial.
Here is an example of our GUI code for how a button and its method works:
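A minimal sketch of that pattern (shown in Python 3 syntax; the button labels, command characters, and serial port name are assumptions, not our exact code):

```python
def make_sender(port, char):
    """Return a button callback that writes one command character
    to the serial port."""
    def send():
        port.write(char.encode("ascii"))
    return send

def build_gui(port):
    # Imported here so the handlers above stay testable without a display.
    import tkinter as tk
    root = tk.Tk()
    root.title("Robot Control")
    # Each button sends a single-character command over serial.
    tk.Button(root, text="Forward", command=make_sender(port, "f")).pack()
    tk.Button(root, text="Stop", command=make_sender(port, "s")).pack()
    return root

if __name__ == "__main__":
    import serial                          # pyserial
    build_gui(serial.Serial("/dev/ttyUSB0", 9600)).mainloop()
```

Building each callback with a small factory like `make_sender` avoids duplicating a method per button: the character is the only thing that differs.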
It may look a bit messy here, but this bad boy is inspired. We based our idea on the laser harp MIDI instrument. Although we weren’t able to use real lasers due to power constraints and safety issues (pff, safety…), we found that infrared distance sensors get the job done, with the added feature of allowing multiple notes per “string”! Basically, a MIDI instrument sends a digital signal rather than an audio signal. The instrument is plugged into a computer via USB, and through one of many MIDI software programs, the digital signal is converted to musical data that can be output as real sound by the computer. The MIDI software controls the sound output, allowing us to produce everything from harp sounds to synths to drums, all with our nifty little gadget. But be not fooled, ye mortals: this contraption was no walk in the park. Read on to find out more about the ups and downs and struggles of Mellow Mushroom and the MIDI harp.
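What “sends a digital signal” means concretely: a MIDI Note On is three bytes: a status byte (0x90 OR’d with the channel), a note number, and a velocity. A sketch of packing one, plus the multiple-notes-per-string idea as a distance-band lookup (the note table and 10 cm band width are invented examples, not our real tuning):

```python
def note_on(note, velocity=100, channel=0):
    """Pack a MIDI Note On message (status, note, velocity)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def band_to_note(distance_cm, notes=(60, 62, 64, 65)):
    """Pick one of several notes per 'string' from the sensed distance:
    each 10 cm band selects the next note in the table."""
    band = min(int(distance_cm // 10), len(notes) - 1)
    return notes[band]
```

This is the feature real lasers can’t give you: a beam only reports broken/unbroken, while a distance sensor reports where along the “string” your hand is.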
The objective of our final project was to create a musical instrument using an LED matrix controlled by buttons and implement touch sensors that control certain frequencies on a speaker. Basically an LED piano. We used a speaker/buzzer to play the frequencies and we implemented capacitive touch for the “playing” of the instrument. The wire you choose to touch determines the note played. We housed the circuit in a cardboard box modeled as a piano to make it more appealing to the eye and to create an actual playable instrument.
On the inside of the piano box, the touch sensor wires are connected to rectangular pieces of aluminum foil that, on the top of the box, represent the “keys” of the piano. When a piece of foil is touched, the respective note is played. This resulted in a playable piano!
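The key-to-note step above is a table lookup into equal-temperament frequencies. A sketch, assuming an eight-key C-major layout from C4 to C5 (the actual key count and tuning of our build may differ):

```python
# Map a touched foil key to the buzzer frequency for its note,
# using equal temperament tuned to A4 = 440 Hz.
A4 = 440.0

def note_freq(semitones_from_a4):
    """Equal-temperament frequency for a note offset from A4."""
    return A4 * 2 ** (semitones_from_a4 / 12)

# White keys C4..C5 as semitone offsets from A4 - assumed layout
KEY_OFFSETS = [-9, -7, -5, -4, -2, 0, 2, 3]

def key_to_freq(key_index):
    """Frequency for the key at the given position on the box."""
    return note_freq(KEY_OFFSETS[key_index])
```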
We ended up having trouble using multiple buttons for the LEDs, so we decided to use a single button that plays a pre-programmed song. The LEDs are now placed in the box in groups of four above each key. We combined the code for the LEDs and the capacitive touch so that our instrument lights up the LEDs above whichever key is touched.