The ball’s rotation is driven by a timer compare interrupt. Each time the interrupt fires, it sets a global flag. The main program loop checks this flag on every pass and, whenever it is set, updates the servo position, clears the flag, and continues until the next interrupt. A snippet of the code that changes the servo position is shown below, along with a video of the spinning.
One of the most difficult parts of our project was figuring out how to initialize our keypad and get its inputs into the A3BU. In the code below, we initialize four ADC pins to receive inputs from the four keypad buttons, then set each input to -1 initially. The code turns on LED0 on the microcontroller when button one is pressed, via an if statement, and we added code to display the pressed button on the LCD screen. At first the LCD behaved intermittently; after speaking with the professor, we added a 10k resistor, which got button one working reliably. When we implemented button two, however, it did not work, even though both buttons were wired identically and the code was the same. After some troubleshooting, we changed the condition from “if (ioport_get_pin_level(COL1) > 0)” to “if (ioport_get_pin_level(COL1) == 1)”. This stricter condition cleared the issue up, and buttons one and two both worked. We then added buttons three and four, which tested and passed with no errors. When not pressed, a keypad button reads 0 (off); when pressed, it reads 1 (on).
Once we had programmed our A3BU to receive inputs from the keypad, we had to create the desired output. To produce our short jingles, we used pulse width modulation, varying the frequency and duty cycle. Each desired note corresponds to a specific frequency, and we chose 85 as the duty cycle that gave the clearest tone. By adding delays, we controlled the length of each note and of the pauses between notes.
The goal of our project was to create a visual representation of the input frequencies picked up by a microphone. To do this, we received an analog signal from a microphone and used it to drive the speeds of three fans. After receiving the signal, we ran it through a Fourier transform to distinguish the various frequencies within it. We then divided those frequencies into three bins: low, medium, and high, and used the average of each bin to determine how fast the corresponding fan should run. If the average of the frequencies in the low bin increases, the speed of fan 1 increases as well; fan 2, which corresponds to the medium-range frequencies, and fan 3, which corresponds to the high range, operate in the same manner. We achieved our goal: the fans fluctuate in speed based on the frequencies collected.
This is a quick demo of our team’s final project. In it you can see a quick run-through of the basic functionality we implemented, with close-ups and reruns. We show the whole board and explain step by step what is happening and what to expect.
One of our group members had a prior interest in how a virtual reality environment could be built using Unreal Engine and Blender. We saw this final project as an opportunity to take that interest to the next level.
Our concept is similar to motion capture (mo-cap) technology, which is commonly used in video game development to make player movement as life-like as possible. Mo-cap records the orientation of objects so that the recordings can be manipulated to derive a variety of motions. For our Cyber-Hand, we used flex sensors along the fingers as an alternative. Instead of recording the orientation of the joints and fingers visually (as mo-cap would), the sensors register a change in resistance and the inversely proportional change in output voltage that comes with it. We then translate that voltage into an input for Unreal Engine, which maps the hand’s range of motion onto the virtual hand we built with Unreal and Blender.