Scientific and Commercial Events
Wednesday 15.00-16.00 – Events are listed below.
SE 1: SmartHand - Control of Robotic and Prosthetic Arms and Hands Implementing a Tongue Control System
Daniel Johansen, Aalborg University
at booth 1
A demonstration will be given of the SmartHand system and its tongue-based control scheme for robotic/prosthetic arms and hands. The SmartHand supports five different types of grasp, and the control scheme allows direct activation of a specific grasp type using the tongue control system.
SE 2: The iTongue and iHandle control systems - Driving wheelchairs and controlling computers, phones and tablets
Gert Spender, TKS A/S
in the lobby (near booth 1)
This event will demonstrate the iTongue and the iHandle used for driving a wheelchair and for text and mouse input to computers, phones, and tablets. There will also be a competition: visitors to ICNR 2014 can take control of the wheelchair themselves, racing each other to complete a short track marked on the floor as fast as possible, using only the iHandle to control the wheelchair.
SE 3: Kinect markerless kinematic acquisition
John Hansen, Aalborg University
at booth 15
Full-body acquisitions and automatic skeleton detection.
SE 4: The actuated guitar
Jeppe Veirum Larsen, Aalborg University
in the poster area
Rehabilitation for people with hemiplegia, such as stroke victims, using a real electric guitar. The actuated guitar, which will be demonstrated, is designed to give people who have become paralysed on one side of the body after a stroke or similar the ability to play a real electric guitar.
SE 6: Robotic rehabilitation of immobilized or weak patients
Joachim Kristensen, LT Automation
at booth 13
We will demonstrate the robot: how it works, how it can help reduce the physical load on hospital personnel, and some of the opportunities that lie in using a robot for rehabilitation. Visitors will also be able to ask questions about the robot.
SE 7: Brain-computer interfaces for communication and rehabilitation
Alexander Lechner, G-tec
at booth 7
For more than 25 years, researchers all over the world have been working on the development of the Brain-Computer Interface (BCI). In recent years, some patients have used such BCI systems in daily life under the supervision of the researchers themselves. The Austrian company g.tec medical engineering GmbH now brings the first patient-ready BCI to market. The EEG-based spelling system is called intendiX® and enables the user to select keys from a matrix simply by paying attention to a target symbol on the screen. In this way the patient can write messages or commands. intendiX® can speak the written text, print it, or copy it into an e-mail message. The system is designed to be used without the assistance of a technician and can be installed and operated by the caregiver. For most users, intendiX® works well after only a few minutes of training; for paralyzed patients, the system has to be tried and evaluated in each specific case. This demonstration should give you an insight into the power of this system.
Another new development is recoveriX®, a motor-recovery neurofeedback trainer that uses Motor-Imagery (MI) based BCI technology. MI can induce neural plasticity and thus serve as an important tool to enhance motor rehabilitation for stroke patients. Rehabilitation is most effective when users receive immersive feedback that relates to the activities they imagine or perform. If a stroke patient keeps trying to imagine or perform the same movement while receiving feedback that helps to guide it, they might regain the ability to grasp, or at least recover partial grasp function. recoveriX includes several feedback methods to assist rehabilitation. A prototype of the system will be presented at the exhibition during the conference.
SE 8: Stretch sensor
Line A. Rode and Valérie Daussin Laurent, Idéklinikken, Aalborg University Hospital
at booth 4
Volunteers will wear a sensor on the foot for 3 minutes, and the results will be presented to them: are you at risk of hyperpronation?
SE 9: Kinect-based tele-rehabilitation system for hand function
Daniel Simonsen, Aalborg University
at booth 15
This exhibition presents a closed-loop rehabilitation system based on functional electrical stimulation driven by a Microsoft Kinect sensor for assisting rehabilitation of hand function. The system assists hand opening and grasping during training. The system is capable of detecting hand postures and objects and automatically controlling electrical stimulation during a hand function exercise.