IBTS
Team members:
- Eric Anderson
- Chris Messick
- John O'Neal
- Michael Santoro
We intend to build a system that can accurately track the position and orientation of a user's hands. This will provide a computer interface for a variety of possible applications such as 3D modeling and remote surgery. We will track the hands using ultrasonic trilateration. A base station will control ultrasonic emitters, which will send acoustic "pings" to circuits on the hands at the same time as a synchronizing RF signal. Small receivers on the tip of each finger and the back of the hand will listen for these pings. Because the RF signal arrives effectively instantaneously while sound travels much more slowly, the delay between the RF pulse and each ping gives a receiver's distance from the corresponding emitter. From these distances, the 3D location of the six points will be calculated and relayed back to the base station using a microprocessor and RF transceiver.
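As a rough illustration of the trilateration step, the sketch below converts RF-to-ping delays into distances and solves for a receiver's position by least squares; with at least four emitters the linearized system has a unique 3D solution. The emitter layout, nominal speed of sound, and helper names (`ping_delays_to_distances`, `trilaterate`) are assumptions made for this example, not the actual firmware, which will need calibration and will run on the microprocessor rather than in NumPy.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at ~20 C; assumed nominal value, calibrated in practice

def ping_delays_to_distances(delays_s):
    """Convert measured RF-to-ping delays (seconds) into distances (meters).

    The RF sync pulse arrives effectively instantaneously, so the delay until
    each acoustic ping is heard is the ultrasonic time of flight.
    """
    return np.asarray(delays_s, dtype=float) * SPEED_OF_SOUND

def trilaterate(emitters, distances):
    """Estimate a receiver's 3D position from distances to known emitters.

    emitters:  (N, 3) array of emitter coordinates, N >= 4
    distances: (N,) array of measured distances to each emitter

    Linearizes the sphere equations ||x - e_i||^2 = d_i^2 against the first
    emitter and solves the resulting linear system by least squares.
    """
    e = np.asarray(emitters, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtracting the equation for emitter 0 cancels the ||x||^2 term.
    A = 2.0 * (e[1:] - e[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(e[1:] ** 2, axis=1) - np.sum(e[0] ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example: four emitters at known positions, receiver at (0.3, 0.4, 0.2) m.
if __name__ == "__main__":
    emitters = np.array([[0.0, 0.0, 0.0],
                         [1.0, 0.0, 0.0],
                         [0.0, 1.0, 0.0],
                         [0.0, 0.0, 1.0]])
    true_pos = np.array([0.3, 0.4, 0.2])
    delays = np.linalg.norm(emitters - true_pos, axis=1) / SPEED_OF_SOUND
    print(trilaterate(emitters, ping_delays_to_distances(delays)))  # ~[0.3 0.4 0.2]
```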
Success of the system will be gauged by several factors. The first is how accurately we can track the individual points on the hands; to measure this, we will compare each calculated location to the actual physical position of the sensor with respect to the emitters. The second factor will be how well we can extrapolate finger position and hand orientation in the 3D model. The final metric will be the performance of the system as a whole: how quickly we can track the hands and how smoothly we can render the model, so that the model appears to update instantaneously as the user's physical hands move.
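For the accuracy metric, a minimal sketch of how a trial could be scored is below: per-point Euclidean error between the calculated and surveyed sensor positions, summarized as a single RMS figure. The function names and sample values are illustrative only, not part of the planned test procedure.

```python
import numpy as np

def tracking_error(estimated, actual):
    """Per-sample Euclidean error (meters) between estimated and true positions.

    estimated, actual: (N, 3) arrays of tracked-point coordinates over a trial.
    """
    return np.linalg.norm(np.asarray(estimated) - np.asarray(actual), axis=1)

def rms_error(estimated, actual):
    """Summarize a trial as the root-mean-square of the per-sample errors."""
    err = tracking_error(estimated, actual)
    return float(np.sqrt(np.mean(err ** 2)))

# Example: two estimates compared against surveyed sensor positions.
if __name__ == "__main__":
    actual = np.array([[0.30, 0.40, 0.20],
                       [0.32, 0.41, 0.22]])
    estimated = np.array([[0.31, 0.39, 0.21],
                          [0.33, 0.42, 0.21]])
    print(tracking_error(estimated, actual))  # per-sample error in meters
    print(rms_error(estimated, actual))       # overall RMS error for the trial
```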