Music Keypad – Code Snippet

One of the most difficult parts of our project was figuring out how to initialize our keypad and retrieve its inputs on the A3BU. In the code below, we initialize the 4 ADC pins that receive inputs from the 4 buttons on the keypad, then set each of the inputs to -1 initially. An if statement turns on the microcontroller's LED0 when button one is pushed, and we added a line of code to display the pressed button on the LCD screen. At first the LCD screen behaved a bit intermittently; after speaking with the professor, we added a 10k resistor, and with it button one worked successfully.

We then started to implement button two, but it did not work properly even though both buttons were wired exactly the same and the code was identical. After a bit of troubleshooting, the code was changed from “if (ioport_get_pin_level(COL1) > 0)” to “if (ioport_get_pin_level(COL1) == 1)”. Setting a more definite condition cleared up the issue, and buttons one and two both worked successfully. We then added buttons three and four, which were tested and passed with no errors. When a keypad button is not pressed, it reads 0 (off); when pressed, it reads 1 (on).

Code to initialize our inputs from the numeric keypad. COL1 is for button 1, COL2 for button 2, COL3 for button 3, and COL4 for button 4.
Code to read the input from the keypad on the board
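The snippets themselves are shown above as images. As a rough sketch of what that initialization and polling can look like with the ASF ioport driver (the port/pin assignments here are placeholders rather than our exact wiring, and board_init()/ioport_init() are assumed to have run):

```c
#include <asf.h>

/* Example pin assignments for the four keypad columns -- placeholders only,
 * the real project may use different port pins. */
#define COL1 IOPORT_CREATE_PIN(PORTA, 0)
#define COL2 IOPORT_CREATE_PIN(PORTA, 1)
#define COL3 IOPORT_CREATE_PIN(PORTA, 2)
#define COL4 IOPORT_CREATE_PIN(PORTA, 3)

static void keypad_init(void)
{
	/* Configure each column as an input with a pull-down, so an
	 * unpressed button reads 0 and a pressed button reads 1. */
	ioport_set_pin_dir(COL1, IOPORT_DIR_INPUT);
	ioport_set_pin_mode(COL1, IOPORT_MODE_PULLDOWN);
	ioport_set_pin_dir(COL2, IOPORT_DIR_INPUT);
	ioport_set_pin_mode(COL2, IOPORT_MODE_PULLDOWN);
	ioport_set_pin_dir(COL3, IOPORT_DIR_INPUT);
	ioport_set_pin_mode(COL3, IOPORT_MODE_PULLDOWN);
	ioport_set_pin_dir(COL4, IOPORT_DIR_INPUT);
	ioport_set_pin_mode(COL4, IOPORT_MODE_PULLDOWN);
}

static int8_t keypad_read(void)
{
	/* Return the number of the pressed button, or -1 if none is pressed.
	 * The "== 1" comparison mirrors the fix described above. */
	if (ioport_get_pin_level(COL1) == 1) return 1;
	if (ioport_get_pin_level(COL2) == 1) return 2;
	if (ioport_get_pin_level(COL3) == 1) return 3;
	if (ioport_get_pin_level(COL4) == 1) return 4;
	return -1;
}
```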

Once we programmed our A3BU to receive inputs from the keypad, we had to create the desired output. To create our short jingles, we used pulse width modulation to vary the frequencies and duty cycles. Each desired note corresponded to a specific frequency, and we chose 85 as the optimum duty cycle for the clearest tone. By adding delays, we were able to control the length of each note and the length of the pauses between notes.

Code snippet from part of the song “Jingle Bells”

“Jingle Bells”
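Our full snippet is shown above. As a simplified sketch of the same idea using the ASF PWM and delay services (the timer/channel choice and the note frequencies and lengths below are illustrative, not the exact values from our code):

```c
#include <asf.h>

/* Play one note: re-initialize the PWM at the note's frequency, run it at the
 * chosen duty cycle, hold for the note length, then rest briefly. */
static void play_note(uint16_t freq_hz, uint16_t length_ms)
{
	struct pwm_config buzzer;

	pwm_init(&buzzer, PWM_TCE0, PWM_CH_A, freq_hz); /* timer/channel are placeholders */
	pwm_start(&buzzer, 85);                         /* 85% duty gave the clearest tone */
	delay_ms(length_ms);                            /* note length */
	pwm_stop(&buzzer);
	delay_ms(50);                                   /* short pause between notes */
}

int main(void)
{
	sysclk_init();
	board_init();

	/* First notes of "Jingle Bells": E E E, E E E (values are approximate). */
	for (uint8_t i = 0; i < 2; i++) {
		play_note(659, 200);
		play_note(659, 200);
		play_note(659, 400);
	}

	while (1) { }
}
```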

Audio Visualizer — Big Picture

The goal of our project was to create a visual representation of the input frequencies collected by a microphone. To do this, we had to receive an analog signal from a microphone and use this signal to drive the speeds of three different fans. After receiving the signal from the microphone, we ran the raw electrical signal through a Fourier transform to distinguish the various frequencies within the signal. Once we had the frequencies that made up our signal, we divided them into three separate bins: low, medium, and high. We used the average of each bin to determine the speed at which the corresponding fan should run. If the average of the frequencies contained in the low bin increases, the speed at which fan 1 rotates also increases. Fan 2, which corresponds to the medium-range frequencies, and fan 3, which corresponds to the high-range frequencies, operate in the same manner. Our goal was achieved: we were able to get the fans to fluctuate in speed based on the various frequencies collected.
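As a hedged sketch of the binning and fan-speed mapping (the bin boundaries, the magnitude scaling, and the set_fan_speed() helper are illustrative placeholders; the FFT itself is assumed to have already produced an array of magnitudes):

```c
#include <stdint.h>

#define NUM_BINS  64   /* number of FFT magnitude bins (illustrative) */
#define LOW_END   8    /* assumed boundary between the low and medium bins */
#define MID_END   24   /* assumed boundary between the medium and high bins */

/* Placeholder for however the fan PWM outputs are actually driven. */
void set_fan_speed(uint8_t fan, uint8_t duty_percent);

/* Average the FFT magnitudes that fall in [start, end). */
static float bin_average(const float *mag, int start, int end)
{
	float sum = 0.0f;
	for (int i = start; i < end; i++) {
		sum += mag[i];
	}
	return sum / (float)(end - start);
}

/* Map an average magnitude onto a 0-100% fan duty cycle (scale is illustrative). */
static uint8_t fan_duty(float avg, float max_expected)
{
	float duty = 100.0f * avg / max_expected;
	return (uint8_t)(duty > 100.0f ? 100.0f : duty);
}

/* Higher average magnitude in a bin -> faster corresponding fan. */
void update_fans(const float *mag)
{
	set_fan_speed(1, fan_duty(bin_average(mag, 0, LOW_END), 50.0f));
	set_fan_speed(2, fan_duty(bin_average(mag, LOW_END, MID_END), 50.0f));
	set_fan_speed(3, fan_duty(bin_average(mag, MID_END, NUM_BINS), 50.0f));
}
```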

Anagram Solver Explained and Demo – Team SilverCharm

This is a quick demo of our team's final project. In it you can see a quick run-through of the basic functionality we implemented, with close-ups and reruns. We show our whole board and explain step-by-step what is happening and what's to be expected.


Cyber-Hand: The Big Picture

One of our group members had a prior interest in how a virtual reality environment could be implemented using Unreal Engine and Blender. We saw this final project as an opportunity to take that interest to the next level.

Our concept is similar to Motion Capture (Mo-Cap) technology, which is commonly used in video game development to make the movement of players as life-like as possible. Mo-Cap records the orientation of objects in a way that lets them be manipulated to derive a variety of different motions. For our Cyber-Hand, we used flex sensors along the fingers as an alternative. Instead of recording the orientation of the joints and fingers visually (as mo-cap would), we use these sensors to detect a change in resistance and the corresponding inversely proportional change in output voltage. We can then translate that voltage into an input for the Unreal Engine, which maps the range of motion of the real hand onto the virtual hand we generated with the Unreal and Blender software.

Cyber-Hand: Voltage Divider Schematic

Velostat is a material used for packaging electrical devices that can be damaged by electrostatic discharge. Its unique properties allow its resistance to change with applied pressure. By using this material in our flex sensors and putting them in series with another resistor, we created a voltage divider: when the finger is bent we read upwards of 1.65 V, and when the finger is straightened out, roughly 0.3 V. These values are delivered to the analog-to-digital converter and then converted to percentages that Unreal processes as position.

(Note: In the video we used a 5 V applied voltage instead of the 3.3 V input, which is why the output voltage values in the video range from roughly 0.45 V to 2.5 V.)
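As a rough illustration of how those divider readings could be turned into a bend percentage for the virtual hand (the 0.3 V and 1.65 V endpoints come from the measurements above; the 12-bit resolution and 3.3 V reference are assumptions):

```c
#include <stdint.h>

#define ADC_MAX     4095.0f   /* assumed 12-bit ADC */
#define V_REF       3.3f      /* assumed ADC reference voltage */
#define V_STRAIGHT  0.3f      /* divider output with the finger straight (measured) */
#define V_BENT      1.65f     /* divider output with the finger fully bent (measured) */

/* Convert a raw ADC reading into a 0-100% bend value for the virtual hand.
 * The divider output follows Vout = Vin * R_fixed / (R_fixed + R_flex),
 * so Vout rises and falls as the flex sensor's resistance changes. */
static float flex_percent(uint16_t adc_reading)
{
	float volts = (adc_reading / ADC_MAX) * V_REF;
	float pct = 100.0f * (volts - V_STRAIGHT) / (V_BENT - V_STRAIGHT);

	/* Clamp so noise outside the measured range doesn't over-rotate the finger. */
	if (pct < 0.0f)   pct = 0.0f;
	if (pct > 100.0f) pct = 100.0f;
	return pct;
}
```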

Crosswalk Simulator/Stoplight: The Big Picture

The fundamental goal of this project was to recreate the same type of crosswalk that can be found at the intersection of Eastern Parkway and Speed School. The scope of the project included two stoplights (6 LEDs total), walk/don't walk graphics, a battery-powered crosswalk button, a buzzer for audible signaling, and a “time remaining” indicator.

The A3BU board worked well for all the functions we needed to complete this project. The board provided enough GPIO pins to power and control the six 3 V LEDs, as well as pulse-width modulation to control the buzzer. An analog-to-digital converter (ADC) pin was also used to detect a digital high when the crosswalk button was pressed. The battery provided a voltage that was tested against a certain range, and if the value fell in that range, the crosswalk logic would trigger.

LED Layout with A3BU, buzzer, and button.

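A hedged sketch of that button check using the ASF XMEGA ADC driver is shown below; the channel, the threshold window, and the assumption that the ADC has already been configured (single-ended, 12-bit, VCC reference) are placeholders rather than our exact setup.

```c
#include <asf.h>

/* Assumed window: readings from the battery-powered button land here when
 * the button is pressed (threshold values are placeholders). */
#define BUTTON_MIN  2000
#define BUTTON_MAX  4095

static bool crosswalk_button_pressed(void)
{
	uint16_t result;

	/* The ADC is assumed to already be configured for channel 0 on ADCA. */
	adc_enable(&ADCA);
	adc_start_conversion(&ADCA, ADC_CH0);
	adc_wait_for_interrupt_flag(&ADCA, ADC_CH0);
	result = adc_get_result(&ADCA, ADC_CH0);

	/* Trigger the crosswalk logic only if the reading falls inside the window. */
	return (result >= BUTTON_MIN && result <= BUTTON_MAX);
}
```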

The LCD display served a large purpose, as it displayed all of our crosswalk functions. When the crosswalk button is pressed and the light turns red, an ASCII graphic of a “walkman” appears, letting you know it's safe to cross. An incrementing bar also appears, gradually growing larger so the walker can see how much time they have left to cross. Once this timer ends, an ASCII “stop-hand” appears, letting the user know it's no longer safe to cross.

 

LCD displaying the ‘walkman’ and the timer bar at the top

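A simplified sketch of how the growing timer bar can be drawn with the ASF monochrome graphics service on the A3BU's LCD (gfx_mono_init() and the system font are assumed to be set up; the coordinates, bar size, and the text stand-ins for the ASCII “walkman” and “stop-hand” graphics are illustrative):

```c
#include <asf.h>

#define BAR_X       0     /* bar position and size are illustrative */
#define BAR_Y       0
#define BAR_HEIGHT  4
#define BAR_MAX     128   /* LCD width in pixels */

/* Show the walk indicator and grow the timer bar as crossing time elapses. */
static void draw_walk_screen(uint8_t elapsed, uint8_t total)
{
	gfx_mono_draw_string("WALK", 0, 12, &sysfont);

	gfx_coord_t width = (gfx_coord_t)((uint32_t)BAR_MAX * elapsed / total);
	gfx_mono_draw_filled_rect(BAR_X, BAR_Y, width, BAR_HEIGHT, GFX_PIXEL_SET);
}

/* Clear the bar and show the stop indicator once the timer runs out. */
static void draw_stop_screen(void)
{
	gfx_mono_draw_filled_rect(BAR_X, BAR_Y, BAR_MAX, BAR_HEIGHT, GFX_PIXEL_CLR);
	gfx_mono_draw_string("STOP", 0, 12, &sysfont);
}
```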

The buzzer was turned off and on by varying the duty cycle on the GPIO output, while the pitch was changed by editing the frequency of the pulses. Here is a video displaying all the functions.
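A minimal sketch of that on/off and pitch control with the ASF PWM service, assuming the same placeholder timer/channel as the earlier Jingle Bells sketch:

```c
#include <asf.h>

/* Turn the buzzer on at a given pitch: the pulse frequency sets the pitch,
 * and a non-zero duty cycle produces the tone. */
static void buzzer_on(struct pwm_config *buzzer, uint16_t freq_hz, uint8_t duty)
{
	pwm_init(buzzer, PWM_TCE0, PWM_CH_A, freq_hz); /* timer/channel are placeholders */
	pwm_start(buzzer, duty);
}

/* Silence the buzzer by stopping the PWM output. */
static void buzzer_off(struct pwm_config *buzzer)
{
	pwm_stop(buzzer);
}
```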