Tuesday, 30 April 2013

My Wearable Interfaces

Fusion Band




The Fusion Band fuses muscle and motion sensors into a wearable wrist/armband that can be used as a computer interface.

Two versions were experimented with:
1) Gyro version
2) Accelerometer and magnetometer version

Both versions use an EMG sensor to detect muscle activation.
The electronics include an Arduino Pro Mini.
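As a rough sketch of the idea (the window size and threshold here are made-up values, not the band's actual firmware), detecting muscle activation from a raw EMG signal typically means rectifying it, smoothing it with a moving average, and comparing against a threshold:

```python
# Illustrative sketch of EMG activation detection. Window and threshold
# are assumed values for the example, not the Fusion Band's real settings.

def emg_activation(samples, window=8, threshold=0.5):
    """Return True once the rectified moving average exceeds the threshold."""
    rectified = [abs(s) for s in samples]  # EMG swings both ways; keep magnitude
    if len(rectified) < window:
        return False
    avg = sum(rectified[-window:]) / window  # smooth over the last `window` samples
    return avg > threshold

# A relaxed muscle (small noisy values) stays below the threshold,
# while a contraction (large swings) crosses it.
relaxed = [0.05, -0.04, 0.03, -0.06, 0.05, -0.03, 0.04, -0.05]
tensed = [0.9, -0.8, 1.1, -0.7, 0.95, -0.85, 1.0, -0.9]
print(emg_activation(relaxed))  # False
print(emg_activation(tensed))   # True
```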

Applications Include:

3D Object Manipulation
Writing
Computer Control
Robotic Arm Control
Robot Simulation Control

These can be found at:  Fusion Band Applications






Fabric Based Force Sensor Matrix




This is a wearable touch interface made from a fabric-based force sensor.
It is a 6-by-8 (48-point) force sensor matrix.
Resistive fabric sandwiched between strips of conductive fabric forms the base of the sensor.

The graph was made using Processing.
The electronics were implemented on an Arduino Pro Mini.
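A matrix like this is usually read by driving one row at a time and sampling each column, giving one pressure value per crossing point. The sketch below shows that scan loop in Python; `read_point` stands in for the analog read the Arduino would perform, so the wiring details are assumptions:

```python
# Hedged sketch of scanning a resistive force-sensor matrix: one reading
# per row/column crossing. read_point() is a stand-in for the Arduino's
# analog read of the selected row/column pair.

ROWS, COLS = 6, 8  # the 6-by-8 (48-point) matrix described above

def scan_matrix(read_point):
    """Scan every crossing and return a ROWS x COLS grid of force values."""
    return [[read_point(r, c) for c in range(COLS)] for r in range(ROWS)]

# Fake sensor for demonstration: pressure applied only at crossing (2, 3).
frame = scan_matrix(lambda r, c: 100 if (r, c) == (2, 3) else 0)
print(len(frame), len(frame[0]))  # 6 8
print(frame[2][3])                # 100
```

Each scan produces one full frame, which is what the Processing graph would visualise.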

My simpler fabric force sensors can be found at: Simple Fabric Force Sensors


Fusion Band Applications


This project involved building and designing the Fusion Band, a wearable human-machine interface that uses movement and muscle sensors to acquire human intention and control a variety of different applications. Two versions of the Fusion Band were created: the MEA Fusion Band (magnetometer, EMG, and accelerometer) and the GE Fusion Band (gyro and EMG).
Both versions use an EMG sensor to detect muscle activation, while the gyro, accelerometer, and magnetometer detect motion.
Applications Include:
  • 3D Object Manipulation 
  • Writing 
  • Computer Control
  • Robotic Arm Control 
  • Robot Simulation Control 
The full video demonstrating the potential of the Fusion Band is provided below.




Robotic Arm Application

This project involved building and designing the Fusion Band, a wearable human-machine interface that uses movement and muscle sensors to acquire human intention and control a variety of different applications. The application shown in this video demonstrates the control of a robotic arm and gripper. Human motion controls the yaw and pitch of the robotic arm, while muscle activation opens and closes the gripper.
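The mapping can be sketched roughly as follows (the angle limits and EMG threshold are invented for illustration, not taken from the project): orientation angles are clamped and scaled to servo positions, and the gripper state follows the EMG level.

```python
# Illustrative mapping from wrist orientation to arm servos, plus an
# EMG-gated gripper. All numeric limits here are assumed values.

def to_servo(angle, lo=-90.0, hi=90.0):
    """Clamp an orientation angle (degrees) and map it to a 0-180 servo angle."""
    angle = max(lo, min(hi, angle))
    return (angle - lo) / (hi - lo) * 180.0

def gripper_closed(emg_level, threshold=0.6):
    """Close the gripper while muscle activation exceeds the threshold."""
    return emg_level > threshold

print(to_servo(0.0))        # 90.0  (centred arm)
print(to_servo(120.0))      # 180.0 (clamped at the joint limit)
print(gripper_closed(0.8))  # True  (muscle tensed -> gripper closes)
```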




3D Object Manipulation Application

This project involved building and designing the Fusion Band, a wearable human-machine interface that uses movement and muscle sensors to acquire human intention and control a variety of different applications. The application shown in this video demonstrates how the Fusion Band can be used for 3D object manipulation in virtual worlds.






Robot Simulation Control Application

This project involved building and designing the Fusion Band, a wearable human-machine interface that uses movement and muscle sensors to acquire human intention and control a variety of different applications. The application shown in this video demonstrates how the Fusion Band can be used to control Mobile Sim, a robotics simulator. In this example application the robot can only adjust its heading when the EMG signal reaches an appropriate level, resulting in a steering-wheel effect (the robot only turns while the muscle is tensed).
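The "steering wheel" gate boils down to one condition, shown here as a small sketch (the threshold is an assumed value): the heading command from the band is only passed through while the EMG level is above the grip threshold, otherwise the robot holds its course.

```python
# Sketch of the EMG-gated steering behaviour. grip_threshold is an
# assumed value, not the one used in the original demo.

def steering_command(heading_delta, emg_level, grip_threshold=0.5):
    """Pass the heading change through only while the muscle is tensed."""
    return heading_delta if emg_level > grip_threshold else 0.0

print(steering_command(15.0, emg_level=0.8))  # 15.0 - turning while gripped
print(steering_command(15.0, emg_level=0.1))  # 0.0  - ignored when relaxed
```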




Computer Control Application

This project involved building and designing the Fusion Band, a wearable human-machine interface that uses movement and muscle sensors to acquire human intention and control a variety of different applications. The application shown in this video demonstrates how a computer can be controlled using the Fusion Band. The EMG signal acts as a click, while arm movements move the mouse.
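A minimal sketch of that mapping might look like this (the gain is invented, and the edge-detection detail is my assumption about how a held contraction avoids firing repeated clicks): motion rates become cursor deltas, and a click fires only when the EMG activation state goes from off to on.

```python
# Illustrative mouse mapping: motion rates -> cursor deltas, EMG rising
# edge -> single click. The gain value is an assumption.

def cursor_delta(rate_x, rate_y, gain=4.0):
    """Scale motion-sensor rates into pixel deltas."""
    return (rate_x * gain, rate_y * gain)

def click_events(activations):
    """Emit a click only on False -> True transitions of the EMG state,
    so holding a contraction produces one click, not many."""
    clicks, prev = [], False
    for active in activations:
        clicks.append(active and not prev)
        prev = active
    return clicks

print(cursor_delta(2.0, -1.5))                         # (8.0, -6.0)
print(click_events([False, True, True, False, True]))  # [False, True, False, False, True]
```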





Writing Application

This project involved building and designing the Fusion Band, a wearable human-machine interface that uses movement and muscle sensors to acquire human intention and control a variety of different applications. The application shown in this video demonstrates how a user can write on a computer screen using the Fusion Band. This may have interesting applications for disabled users.



Simple Test of an IR Distance Sensor


This was a simple test to demonstrate how an IR LED and a photodiode light detector can be used to detect the distance of an object such as a human hand. The sensor can detect objects up to a few centimeters away. This is a simple setup using a couple of resistors, an Arduino Mega, and Processing for the graphical representation.
This could be used on the front of a robot to detect whether the robot is about to crash.
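As a purely illustrative model of why this works: reflected IR intensity falls off steeply with distance, so a higher photodiode reading means a closer object. The constants below are made up, not measured from the actual circuit.

```python
# Simplified model of the IR proximity reading. The reflection constant
# and the "too close" threshold are invented values for illustration.

def adc_reading(distance_cm, k=1023.0):
    """Fake photodiode ADC value for an object at the given distance:
    reflected intensity falls roughly with the square of distance."""
    return min(1023.0, k / (distance_cm ** 2))

def about_to_crash(reading, threshold=100.0):
    """Crude obstacle alarm, as might sit on the front of a robot."""
    return reading > threshold

near, far = adc_reading(2.0), adc_reading(10.0)
print(near > far)            # True - closer objects reflect more light
print(about_to_crash(near))  # True
print(about_to_crash(far))   # False
```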


Arduino Touche Projects


This project is an Arduino version of 'Touche', originally made by Disney Research.
It is a capacitive sensor that can detect a range of different touch gestures. The video demonstrates 4 different classes of gesture: no touch, one-finger touch, two-finger touch, and touching water. It is accomplished through swept-frequency capacitive sensing, which analyzes the capacitance at a range of frequencies. More details on the components and software used can be found in the YouTube video description.
This project was based on a tutorial found on Instructables.
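The classification idea can be sketched as follows (the profile numbers are invented for illustration): each gesture produces a characteristic capacitance response across the swept frequencies, and an unknown sweep is matched to the closest stored profile, here with a simple nearest-neighbour comparison.

```python
# Sketch of swept-frequency gesture classification. The stored profiles
# are made-up numbers; a real system would record them by calibration.

PROFILES = {
    "no touch":    [10, 12, 15, 12, 10],
    "one finger":  [20, 35, 30, 22, 15],
    "two fingers": [25, 45, 40, 30, 20],
    "water":       [40, 42, 44, 43, 41],
}

def classify(sweep):
    """Return the gesture whose stored profile is closest in squared error."""
    def dist(profile):
        return sum((a - b) ** 2 for a, b in zip(sweep, profile))
    return min(PROFILES, key=lambda name: dist(PROFILES[name]))

print(classify([21, 34, 31, 21, 16]))  # one finger
```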




Force Sensor Projects

4 by 4 Fabric Sensor Matrix

This project is a force sensor matrix. It is resistive-based, meaning that as you apply pressure to the sensor its resistance changes, which can be monitored by the electronics. This force sensor is made entirely of fabric, meaning it can be incorporated into soft materials like blankets, shoes, or clothes.



Simple Fabric-Based Force Sensor

This is a single force sensor made out of fabric, which can be sewn into clothing to incorporate wearable technology. It is ideal for simple wearable buttons, and could be used to control your phone, music player, or any other form of portable technology.





Low-Cost Simple Force Sensor

This project attempted to make a very low-cost and simple force sensor. It used aluminium tape and the conductive/resistive foam that microcontrollers are often packaged in to reduce static charge. The video shows one part of the force sensor matrix being tested.




EMG Control of a Virtual Hand

This project involved research into electromyography (EMG), which studies the electrical properties of our muscles. I used the rectified moving-average value of 5 electrodes (the electrodes acquire the signal) placed at specific locations on the forearm. The amplitude and order of the signals were used to determine which finger I was moving. The implementation is not perfect, as it only uses simple threshold-based signal processing, but it shows the great potential EMG has for deciphering our movements in order to intuitively control robotics, prostheses, or any other electrical device.
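The threshold logic described above can be sketched like this (the electrode-to-finger ordering, window size, and threshold are my assumptions, not the project's actual values): each of the 5 electrodes yields a rectified moving-average level, and the most active channel above the threshold indicates the moving finger.

```python
# Hedged reconstruction of simple threshold-based EMG finger detection.
# Channel order, window size, and threshold are assumed values.

FINGERS = ["thumb", "index", "middle", "ring", "little"]

def rectified_moving_average(samples, window=4):
    """Rectify the EMG samples and average the most recent window."""
    rectified = [abs(s) for s in samples]
    return sum(rectified[-window:]) / window

def active_finger(channels, threshold=0.3):
    """channels: 5 lists of samples, one per electrode. Returns the finger
    for the most active channel, or None if nothing crosses the threshold."""
    levels = [rectified_moving_average(ch) for ch in channels]
    best = max(range(len(levels)), key=lambda i: levels[i])
    return FINGERS[best] if levels[best] > threshold else None

quiet = [0.02, -0.03, 0.02, -0.02]
busy = [0.8, -0.7, 0.9, -0.75]
print(active_finger([quiet, busy, quiet, quiet, quiet]))  # index
print(active_finger([quiet] * 5))                         # None
```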


Pioneer Robot Assignment

This project was completed as part of a 3rd-year University of Essex (UoE) robotics assignment. It required using Pioneer robots to move around an arena, with obstacle avoidance using a laser and 8 sonars, and color tracking to detect a football and push it into a goal.
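A minimal reactive avoidance rule over 8 sonar readings might look like the sketch below (this is an illustration of the general idea, not the assignment's actual code; the safe distance and sonar indexing are assumptions): turn away from whichever side the closest reading is on.

```python
# Illustrative reactive obstacle avoidance over an 8-sonar ring.
# Assumed layout: indices 0-3 sweep the left side, 4-7 the right side.

SAFE_DISTANCE = 0.5  # metres - an assumed threshold

def avoid(sonar_ranges):
    """Return a motion command given 8 sonar range readings in metres."""
    closest = min(range(8), key=lambda i: sonar_ranges[i])
    if sonar_ranges[closest] >= SAFE_DISTANCE:
        return "forward"  # nothing dangerously close
    return "turn right" if closest < 4 else "turn left"

print(avoid([2.0] * 8))                     # forward
print(avoid([0.2, 2, 2, 2, 2, 2, 2, 2]))    # turn right (obstacle on left)
print(avoid([2, 2, 2, 2, 2, 2, 2, 0.3]))    # turn left  (obstacle on right)
```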