Step 1: To make one of these sensors, start by cutting a piece of duct tape the length of the finger it will be used for.
Step 2: Then cut a piece of conductive thread long enough to form a U-shape that hangs off the end of the duct tape, with enough extra length to attach to a wire and connect to a breadboard, and place it on one side of the duct tape. Then do the same with a second piece of conductive thread, but have it hang off the other end of the duct tape.
Step 3: Cut out two pieces of Velostat (one for each conductive thread) and cover each thread with Velostat. Then place a third piece of Velostat between the two pieces covering the threads. Folding the duct tape over is the final step in making the flex sensor. It's important to make sure the threads don't touch in any way, as contact would cause a short circuit.
Step 4: After making the sensor, a multimeter can be used to measure its resistance. Attaching the two terminals of the multimeter to the sensor's two threads displays a resistance reading, and this resistance changes as the sensor bends. As the finger bends, the fingertip moves closer to the palm; as the ends of the sensor draw closer together, the pressure on the Velostat changes and the resistance changes with it.
One of our group members had a previous interest in how a virtual reality environment could be implemented using Unreal Engine and Blender. We saw this final project as an opportunity to take that interest to the next level.
Our concept is similar to Motion Capture (Mo-Cap) technology, which is commonly used in video game development to make the movement of players as life-like as possible. Mo-Cap records the orientation of objects so that the recordings can be manipulated to derive a variety of different motions. For our Cyber-Hand, we used flex sensors along the fingers as an alternative. Instead of recording the orientation of the joints and fingers visually (as Mo-Cap would), we use these sensors to detect a change in resistance and its inversely proportional change in output voltage, which we then translate into an input for Unreal Engine so that the range of motion of the real hand is reflected in the virtual hand we generated with the Unreal and Blender software.
The best way to display the bend graphically was to show a 3D representation of a hand performing the same action. The first approach that came to mind was to make a small “game” simulation in Unreal Engine 4.
The most important parts of this simulation are the UpdateFingers function, which takes input from the A3BU, and the Animation Blending, which takes the input and displays the appropriate graphical representation.
The UpdateFingers function reads a line from the serial stream of the A3BU as a string of the form “100,100,100,100,100”, where the fields correspond to “thumb%,index%,middle%,ring%,pinkie%”. It then parses this string into 5 discrete values and passes those values to local percentage variables. Those percentage variables are then used for the Animation Blending.
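The parsing step can be sketched in plain C++ outside of Unreal Engine (the serial format comes from the text; the helper name ParseFingerLine and the fallback-to-0% behavior are our assumptions, not the project's actual code):

```cpp
#include <array>
#include <sstream>
#include <string>

// Parse a serial line of the form "100,100,100,100,100" into five
// percentage values ordered thumb, index, middle, ring, pinkie.
// Any missing or malformed field is left at 0%.
std::array<float, 5> ParseFingerLine(const std::string& line) {
    std::array<float, 5> percents{};  // value-initialized to 0.0f
    std::stringstream ss(line);
    std::string field;
    for (int i = 0; i < 5 && std::getline(ss, field, ','); ++i) {
        try {
            percents[i] = std::stof(field);
        } catch (...) {
            percents[i] = 0.0f;  // keep malformed fields at 0%
        }
    }
    return percents;
}
```

Inside the engine, UpdateFingers would copy these five values into the member variables consumed by the animation step.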
Processing for Animation Blending:
The processing step takes the percentage variables and scales them to the total time of each animation. These time values are then saved in new time variables.
Animation Blending takes the time variables and sets the close animation for each finger to the corresponding point in time. These animations are then blended into the final pose.
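The processing step above reduces to a linear scale from the 0–100% range onto each animation's length. A minimal sketch, assuming each finger's close animation has a known fixed length (the function names here are hypothetical, not the project's actual identifiers):

```cpp
#include <array>
#include <cstddef>

// Scale a finger's close percentage (0-100) to a playback time within
// its close animation, clamping out-of-range serial values.
float PercentToAnimTime(float percent, float animLengthSeconds) {
    if (percent < 0.0f)   percent = 0.0f;
    if (percent > 100.0f) percent = 100.0f;
    return (percent / 100.0f) * animLengthSeconds;
}

// Convert all five finger percentages into the time variables the
// blending step consumes, one per finger.
std::array<float, 5> PercentsToTimes(const std::array<float, 5>& percents,
                                     const std::array<float, 5>& animLengths) {
    std::array<float, 5> times{};
    for (std::size_t i = 0; i < 5; ++i)
        times[i] = PercentToAnimTime(percents[i], animLengths[i]);
    return times;
}
```

The blending step then samples each finger's close animation at its computed time and combines the five per-finger poses into the final hand pose.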
Velostat is a material used for packaging electrical devices that can be damaged by electrostatic discharge. Its unique properties allow its resistance to change with applied pressure. By using this material in our flex sensors and putting them in series with another resistor, we created a voltage divider: when the finger is bent we receive upwards of 1.65 V, and when the finger is straightened out, roughly 0.3 V. These values are delivered to the Analog to Digital Converter and then converted to percentages that Unreal processes as position.
(Note: In the video we used a 5 V applied voltage, as opposed to the 3.3 V input. This is why the output voltage values in the video range from 0.45 V to 2.5 V.)