1 Modeling & simulation of a reconfigurable wheelchair with a sit-to-stand facility for a disabled child
This paper discusses the modelling and simulation of a reconfigurable wheelchair with a sit-to-stand facility for a disabled 25 kg child. The prototype is designed in SolidWorks® and simulated in MATLAB® to determine the forces the motors must provide to actuate the system. Three techniques are used to model the wheelchair. SolidWorks® is used in the initial stage to obtain an estimate of the necessary motor power. A second model of the system is built using the MATLAB® SimMechanics toolbox. To cross-check the results obtained from both approaches, the SolidWorks model is then embedded in SimMechanics and the system is simulated a third time. In all three simulation stages, the system performs a complete cycle of motion: sit-to-stand followed by stand-to-sit. System performance is evaluated in an open-loop scenario, with no control implemented at this stage. Further analysis considers the energy consumption in two modes: sit-to-stand and forward straight-line motion over a prescribed distance.
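The abstract reports motor-force and energy estimates but not the linkage geometry. As a rough illustration of the quasi-static calculation behind such an estimate, the sketch below computes the torque needed at a seat pivot to lift a 25 kg load; the 0.4 m effective arm length and the angle convention are assumptions for illustration, not the paper's design.

```python
import math

# Quasi-static torque at a seat pivot supporting a load at angle theta
# (measured from horizontal). Illustrative only: the paper's linkage
# geometry is not given, so the arm length and angles are assumed.
G = 9.81  # gravitational acceleration, m/s^2

def pivot_torque(mass_kg, arm_m, theta_rad):
    """Torque (N*m) to support mass_kg acting at arm_m from the pivot."""
    return mass_kg * G * arm_m * math.cos(theta_rad)

# 25 kg child, assumed 0.4 m effective arm; seat horizontal is worst case
worst_case = pivot_torque(25.0, 0.4, 0.0)
near_stand = pivot_torque(25.0, 0.4, math.radians(80))
```

The worst-case figure (seat horizontal) is the kind of number a SolidWorks or SimMechanics pass would refine with the real linkage and friction included.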
2 Development of smart mobile walker for elderly and disabled
This paper introduces a smart mobile walker that supports walking-aid, sit-to-stand aid, and electric-scooter functions for the elderly and disabled. It is designed for indoor use, such as in a rehabilitation hospital. It helps users exercise the sit-to-stand motion, practice gaits, and move from place to place inside a facility. To support these functions, it changes its configuration. A smart device provides a GUI for communicating with users. The detailed implementation of each function is described.
3 Smart walker development based on experts' feedback for elderly and disabled
This paper introduces a smart mobile walker developed for the elderly and disabled. It supports three features: walking aid, sit-to-stand aid, and electric scooter. The implementation of each function is briefly described. The overall performance of the smart mobile walker has been evaluated by experts in rehabilitation hospitals and is continuously being improved.
4 Wearable multi-sensor gesture recognition for paralysis patients
Quadriplegia and paraplegia are disabilities that result from injuries to the spinal cord and from neuromuscular disorders such as cerebral palsy. Patients suffering from quadriplegia have varied levels of impaired motor movement; hence, performing quotidian tasks like controlling home appliances is challenging for quadriplegics. The use of hand and eye gestures to perform these tasks is a plausible remedy, but available solutions often assume considerable limb movement, are not fit for long-term use, and may not be applicable to quadriplegics with a varied range of motor impairments. To address this problem, we present the design, implementation, and evaluation of a multi-sensor gesture recognition system that uses comfortable, low-power wearable sensors. We have designed an EOG-based headband using textile electrodes, and a glove that uses flex sensors and an accelerometer, to detect eye and hand gestures. The gestures are used to control appliances remotely in a home setting, and we show that they have good accuracy, latency, and energy-consumption characteristics.
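The abstract does not state how the glove's flex and accelerometer readings are combined. A minimal sketch of the threshold-style sensor fusion such a glove might use is given below; the sensor normalization, thresholds, and gesture labels are assumptions, not the paper's actual classifier.

```python
# Sketch of threshold-based gesture detection from a flex-sensor glove
# plus an accelerometer. Thresholds and gesture names are assumed for
# illustration, not taken from the paper.

def classify_hand(flex, accel_x):
    """flex: normalized flex readings per finger (0=straight, 1=bent).
    accel_x: normalized lateral acceleration in [-1, 1]."""
    bent = sum(1 for f in flex if f > 0.6)
    if flex and bent == len(flex):
        return "fist"                  # all fingers curled
    if bent == 0 and abs(accel_x) > 0.8:
        return "swipe"                 # open hand moved sharply sideways
    if bent == 1:
        return "point"                 # single finger curled
    return "none"
```

A recognized label would then be mapped to an appliance command (e.g. "fist" toggles a lamp) by a separate, user-configurable layer.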
5 Gesture and Hand Activity Based Emergency Response Communication by Patients, Elderly and Disabled While Using Data Gloves
Monitoring patients in critical and emergency conditions in clinical settings requires mandatory, periodic, yet live communication between patient and caretakers. Miscommunication leads to adverse effects and failures in life saving. A simple wearable system can convey the implicit communication of users to caretakers or to an automated life-support device. Simple and obvious hand movements or activities may be used for this purpose, but the challenge lies in extracting clear and well-distinguished input classes from the wearable. The proposed system suggests well-distinguished classes of gestures suitable for such system development. It also gives clear insight into the combinations that are not suitable for such implicit communication. The experimental results show that the suggested hand-movement activity sets, compared with couplings of other movements, always have distinct thresholds that allow reliable, error-free communication.
6 Recognition of obstacle distribution via vibrotactile stimulation for the visually disabled
A tactile display can aid the walking support system of a visually disabled person. A walking guide system must detect the distribution of obstacles in the walking space, including hanging and/or protruding obstacles, and provide useful feedback for safe walking. In this study, we investigate the applicability of a tactile display to walking guidance. Obstacle information is transmitted to the palm by tactile stimulation. We propose three-dimensional (3D) detection of the obstacle distribution using ultrasonic sensors, the fabrication of a tactile stimulator using vibration motors, and the mapping of the detected results onto an array-type tactile stimulator. An experiment on the recognition of obstacle distributions via tactile stimulation is performed to evaluate the feasibility of walking guidance. In the experiment, the average recognition rate was 95.14% (standard error 5.73).
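The core mapping the abstract describes, from ultrasonic range readings to vibration-motor intensities on an array, can be sketched as below. The 3x3 grid size, the 300 cm maximum range, and the linear distance-to-duty-cycle mapping are assumptions for illustration, not the paper's parameters.

```python
# Sketch: map ultrasonic range readings from an assumed 3x3 sensing grid
# onto a matching 3x3 array of vibration motors, with nearer obstacles
# producing stronger vibration. Range limit and mapping are assumed.

MAX_RANGE_CM = 300.0  # assumed sensor range

def to_intensity(distance_cm):
    """Return a motor duty cycle in [0, 1]; nearer means stronger."""
    d = min(max(distance_cm, 0.0), MAX_RANGE_CM)
    return 1.0 - d / MAX_RANGE_CM

def map_grid(distances):
    """distances: 3x3 nested list of ranges (cm) -> 3x3 intensities."""
    return [[to_intensity(d) for d in row] for row in distances]
```

Hanging or protruding obstacles would show up as strong cells in the top or middle rows of the array, which is what lets the palm "read" the 3D distribution.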
7 Optical Interface prototype to operate computer for Disabled Personnel
The proposed prototype, the “Optical Interface System (OIS)”, provides computer accessibility and a novel solution for disabled personnel to control home appliances. The prototype consists of an optical sensor network interfaced to a microcontroller that detects the user's intended direction for cursor movement on the computer screen and translates it into 2D motion using stepper motors. The stepper motors track the ball position of a modified ball mouse. An optical sensor is activated by focusing a light beam (i.e., a laser) attached to a head strap. A graphical user interface allows the user to control the devices interfaced to the controller. The results showed that focusing the light beam onto an optical sensor through head movement was achieved successfully, and precise cursor movement could be controlled. Control of home appliances was demonstrated by interfacing an LED and a buzzer to the controller. The designed prototype is physically simple and very cost-effective, thus fulfilling the goal of providing an affordable human-computer interface (HCI) solution for disabled personnel.
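The steering idea above, where a head-mounted laser lights one sensor and the cursor steps in that direction, can be sketched as follows. The four-sensor layout, direction names, and 5-pixel step size are assumptions for illustration; the prototype's actual sensor count and stepper gearing are not given.

```python
# Sketch of laser-to-cursor steering: each activation of a directional
# optical sensor nudges the cursor a fixed step. Layout and step size
# are assumed, not the prototype's actual parameters.

STEP = 5  # pixels per activation (assumed)

DELTAS = {"up": (0, -STEP), "down": (0, STEP),
          "left": (-STEP, 0), "right": (STEP, 0)}

def move_cursor(pos, active_sensor):
    """pos: (x, y) cursor position; active_sensor: a DELTAS key or None."""
    if active_sensor not in DELTAS:
        return pos                     # no sensor lit: cursor holds still
    dx, dy = DELTAS[active_sensor]
    return (pos[0] + dx, pos[1] + dy)
```

In the real prototype the step is realized mechanically, with the steppers rolling the modified mouse's ball rather than writing coordinates directly.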
8 A system for remote operation of Devices: Helpful for Elderly & Disabled people
This paper proposes an electronic system for the remote operation of various electrical devices. The range of the system is about 30 meters in an indoor environment. The system consists of a wireless remote handset and a fixed actuator unit, and can be easily integrated into existing electrical wiring. The system is very helpful for elderly and disabled people who have restricted movement or are bedridden; such people can operate many devices with the press of a single button on the remote. The device is omnidirectional and hence, unlike IR-based remote controllers, can control devices that are not in line of sight. The remote controller and the actuator communicate bidirectionally, which lets the user know which devices are ON at any point in time. The system is implemented using microcontrollers.
9 A Friendly Mobile Robot for Disabled Children
Disabled children, when compared to other children, have fewer opportunities for exploring and interacting with the world. Thus, they are exposed to the feeling that they are unable to do anything by themselves. In this sense, the use of mobile robots may help these children overcome their limitations and provide a means to develop social skills. This paper describes partial results of ongoing research on control architectures for mobile robots, covering hardware and software aspects. We propose a behavior-based architecture for the interaction between humans and robots, particularly children with severe motor disabilities. The main goal is to create a modular, flexible, and scalable development environment that motivates children to interact with the robot and the world.
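The abstract names a behavior-based architecture without detailing it. A minimal sketch of one common behavior-based pattern, fixed-priority arbitration, is shown below; the behavior names and commands are hypothetical, not the paper's actual modules.

```python
# Sketch of fixed-priority behavior arbitration: each behavior either
# returns a command or abstains (None), and the highest-priority
# non-abstaining behavior wins. Behaviors and commands are assumed.

def avoid(sensors):
    return "turn_away" if sensors.get("obstacle") else None

def approach_child(sensors):
    return "approach" if sensors.get("child_seen") else None

def wander(sensors):
    return "wander"  # default behavior, always applicable

BEHAVIORS = [avoid, approach_child, wander]  # highest priority first

def arbitrate(sensors):
    for behavior in BEHAVIORS:
        command = behavior(sensors)
        if command is not None:
            return command
```

Modularity falls out of the list: adding a new interaction behavior means inserting one function at the right priority, without touching the others.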
10 Digital Library for Disabled Persons
This report examines ways to improve the efficiency of information delivery using multimedia technology, thereby providing a powerful new tool for the perception of information by disabled persons. The peculiarities of forming multimedia information content for disabled persons are analyzed. The paper considers a digital library as an information system in which information is gathered from different sources, sorted, structured, and intelligently processed, and it proposes a set of information-technology services that make multimedia content accessible to disabled persons.
11 Design & Implementation of robotic system to transport disabled people
The use of robotics in human healthcare is attracting growing interest. In most institutions, the services provided to the elderly and disabled are unsatisfactory, mainly due to their dependence on human assistance. Utilizing robots will spare the human workforce and help decrease the risk of infection, thereby economizing on personnel and hygiene expenses. This paper discusses these issues in three parts, in order to present a low-cost autonomous robotic wheelchair design for transporting the elderly and disabled within hospitals. First, a robot design was implemented as a small prototype. The robot navigated using a line-following technique with obstacle detection, which is economical, simple, and easily executed. Second, a design for a robotic wheelchair was presented. The wheelchair was developed from manual wheelchairs used in hospitals after proper adjustments, and a design for two gearboxes was proposed. Third, the advantages of using robotic wheelchairs inside hospitals were discussed using findings from previous studies. The robot prototype circuit was built using optical and ultrasonic sensors, two DC motors, and an L293D motor driver, and was programmed with an ATmega16L microcontroller. The robot was able to follow a predetermined line path and detect obstacles within an acceptable range.
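The control loop the prototype uses, line following with an obstacle stop, can be sketched in a few lines. The two-sensor layout, the 30 cm stop distance, and the wheel-speed values are assumptions for illustration, not the ATmega16L firmware's actual constants.

```python
# Sketch of a line-following loop with obstacle detection: two optical
# sensors steer a differential drive, and an ultrasonic range reading
# halts the robot when an obstacle is near. Values are assumed.

OBSTACLE_CM = 30.0  # stop distance (assumed)

def drive_command(left_on_line, right_on_line, range_cm):
    """Return a (left_motor, right_motor) speed pair in [-1, 1]."""
    if range_cm < OBSTACLE_CM:
        return (0.0, 0.0)              # obstacle ahead: stop
    if left_on_line and right_on_line:
        return (1.0, 1.0)              # centered on line: go straight
    if left_on_line:
        return (0.2, 1.0)              # line is to the left: steer left
    if right_on_line:
        return (1.0, 0.2)              # line is to the right: steer right
    return (-0.5, 0.5)                 # line lost: rotate to search
```

The returned pair would be fed to the L293D driver as PWM duty cycles, with sign selecting the H-bridge direction.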
12 A supportive Friend at work: Robotic workplace Assistance for the Disabled
This article presents the evolution of an assistive robotic system, the Functional Robot with Dexterous Arm and User-Friendly Interface for Disabled People (FRIEND), from a robot supporting disabled people in their activities of daily living (ADL) into a robot supporting people with disabilities in real workplaces. In its fourth generation, FRIEND supports the end user, a quadriplegic individual, to work as a librarian with the task of retrospectively cataloging collections of old books. All of the book manipulation tasks, such as grasping the book from the book cart and placing it on the specially designed book holder for reading by the end user, are carried out autonomously by the FRIEND system. The retrospective cataloging itself is done by the end user. This article discusses all of the technical adjustments and improvements to the FRIEND system that are necessary to meet the challenges of a robot supporting a disabled person working on a regular basis. These challenges concern the shared autonomy between system and user, system effectiveness, safety in interaction with the user, and user acceptability. The focus is on both the vision-based control of book manipulation as a key factor for autonomous robot functioning and on an advanced human-machine interface (HMI), which enables the end user to intervene if the autonomous book manipulation fails. The experimental results of an in-depth evaluation of the system performance in supporting the end user to perform the librarian task are presented. It has been shown that working together, the FRIEND system and the end user had an overall success rate of 95%. These results may help to raise interest in the research field of workplace assistive robotics, establish new projects, and, eventually, supply such systems to the people whose working lives they could greatly improve.
13 Wearable wireless tongue controlled assistive device using optical sensors
Studies show that, globally, about 1 in 50 people are living with paralysis, and the number is increasing year by year. Paralysis can leave these people with no interest in life, leading to mental depression. An assistive device can support a disabled person in leading an easier life, and it should be able to efficiently discern the user's intention while remaining easy to use. This paper proposes a wireless, wearable, tongue-controlled assistive device called the Optical Sensor Based Tongue Controlled Assistive Device (OTCAD). Tongue movement is sensed using small, low-cost infrared (IR) sensors mounted bilaterally near the user's cheeks. The user moves the tongue to change the reflection intensity at the sensor, and this signal change is converted into user commands through signal processing. These commands are then wirelessly transmitted to a PC/smartphone, through which the user can control the home environment, move a wheelchair, or pass information to a caretaker.
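The signal-to-command step described above can be sketched as simple thresholding of the two cheek-mounted IR intensities. The 0.7 threshold and the command names are assumptions for illustration; the paper's actual signal processing is not specified in the abstract.

```python
# Sketch of OTCAD-style command extraction: pushing the tongue toward a
# cheek raises that side's IR reflection intensity past a threshold.
# Threshold and command names are assumed.

THRESHOLD = 0.7  # normalized reflection intensity (assumed)

def tongue_command(left, right):
    """left/right: normalized IR intensities in [0, 1]."""
    if left > THRESHOLD and right > THRESHOLD:
        return "select"        # both sides raised: confirm action
    if left > THRESHOLD:
        return "left"
    if right > THRESHOLD:
        return "right"
    return "idle"
```

A debouncing stage (requiring the same command over several consecutive samples) would normally follow, so that ordinary speech or swallowing does not trigger commands.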
14 Adapted interfaces and interactive electronic devices for the smart home
This paper discusses a novel TV-based home platform that provides multimedia interactive services and tele-assistance. The system is based on a low-cost architecture that integrates a wide range of interaction and control devices to dramatically improve the user experience. Accessibility has also been a major concern, as this proposal is intended to facilitate access to new information and communication technologies for elderly and disabled people, who spend most of their time at home. The proposed solution is open and independent of specific hardware and software providers, and implements a modular add-on strategy that facilitates customization and the inclusion of new functionalities.
15 Brain controlled multiagent aerial vehicles system
This paper describes a control system designed for a group of quadcopters connected to a brain-computer interface, so that the quadcopters can be controlled mentally. Electroencephalogram signals are used as control commands. The system is able to keep formation during movement along the optimal trajectory for the fastest exploration of a locality. The application provides both manual and fully automatic control with streaming video capture. The realized system may be used by disabled people for drug delivery and observation of their neighborhood, by operators in industry, and for exploration of a locality in case of a technogenic accident, catastrophe, or natural calamity.
16 Development of a network system combined with ambulatory and non-conscious physiological measurements for supporting challenged kids
Various physiological measurement techniques have been developed to support the healthcare and daily living of adults, including the elderly. However, in light of the rapidly declining birth rate, care and life support for children have not been sufficiently promoted. Especially in rehabilitation for disabled children, i.e., challenged kids, it is important for therapists to evaluate the efficacy of rehabilitation and the children's health condition. Sharing this information with educational, welfare, and government institutions is also needed for adequate life support. Therefore, quantitative data on activities and daily health status are helpful. From these viewpoints, we are developing a new network system for monitoring the activities and health status of children using ambulatory and non-conscious physiological measurements, with data browsing anytime and anywhere. First, we propose a wearable gait monitoring system to support evaluation of the efficacy of rehabilitation. In this study, the present system successfully detected the characteristics of postural changes in children with movement disorders, demonstrating its usefulness and applicability to evaluating the effect of a brace attached to the subject's lower limb.
17 Smart home system using android application
This paper presents the overall design of a low-cost Home Automation System (HAS) with wireless remote control. The system is designed to assist and support the elderly and disabled in fulfilling their needs at home, and its smart home concept improves the standard of living at home. The main control system implements wireless Bluetooth technology to provide remote access from a PC/laptop or smartphone. The design retains the existing electrical switches and provides safer control over them through a low-voltage activation method. The switch status is synchronized across the whole control system, so every user interface indicates the real-time switch status. The system is intended to control electrical appliances and devices in the house with a relatively low-cost design, a user-friendly interface, and ease of installation.
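The state synchronization the abstract describes, where every interface shows the real-time switch status, requires a single source of truth that applies each command and reports the resulting state to all clients. A minimal sketch follows; the command format and switch names are assumptions, not the paper's protocol.

```python
# Sketch of switch-state bookkeeping for a Bluetooth home automation
# system: one controller object applies commands and returns a status
# snapshot for every connected interface. Command strings are assumed.

class SwitchBank:
    def __init__(self, names):
        self.state = {name: False for name in names}  # all OFF initially

    def handle(self, command):
        """Parse commands like 'lamp ON' / 'fan OFF'; return the new
        status snapshot that would be broadcast to every client."""
        name, _, action = command.partition(" ")
        if name in self.state and action in ("ON", "OFF"):
            self.state[name] = (action == "ON")
        return dict(self.state)
```

Because every command returns the full snapshot, a phone app and a wall panel that both receive it will always render the same switch states.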
18 Speed based classification of mechanomyogram using fuzzy logic
Mechanomyogram (MMG) signals are the mechanical signals obtained from muscles during contractions. They are less sensitive to skin impedance and sensor placement, and require only low-cost hardware to process. To date there are only very few applications in which MMG signals are used. This work aims at the development of a standalone system for generating the control signals required to drive assistive devices that support disabled and elderly people. This paper presents the initial phase of the work, which focuses on the development of a fuzzy classifier. The classifier categorizes different speeds of elbow movement into rest, slow, and fast. For this, MMG signals from the biceps brachii are acquired and processed. Two time-domain features, namely mean absolute value and variance, are extracted from the segmented data and given to the fuzzy inference system. The average accuracy of the classifier is found to be 72.72%.
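The two time-domain features named in the abstract, mean absolute value (MAV) and variance, and a fuzzy mapping to rest/slow/fast could look roughly like this. The triangular membership functions and all thresholds are invented for illustration; the paper's actual fuzzy sets are not given.

```python
# Hypothetical sketch of MAV/variance feature extraction and a simple
# fuzzy classification into rest/slow/fast.  Thresholds are assumptions.

def mav(window):
    return sum(abs(x) for x in window) / len(window)

def variance(window):
    m = sum(window) / len(window)
    return sum((x - m) ** 2 for x in window) / len(window)

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify(window):
    f = mav(window)
    # Illustrative fuzzy sets over the MAV feature only.
    degrees = {
        "rest": tri(f, -0.1, 0.0, 0.2),
        "slow": tri(f, 0.1, 0.4, 0.7),
        "fast": tri(f, 0.6, 1.0, 1.5),
    }
    return max(degrees, key=degrees.get)   # defuzzify by maximum membership

print(classify([0.02, -0.01, 0.03, -0.02]))  # low activity -> rest
```

A real system would combine both features through fuzzy rules rather than use MAV alone; this sketch only shows the feature-to-class pipeline.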
18 A combined Cognitive Multimedia Model for children with intellectual disabilities
A combined Cognitive Multimedia Model for children with intellectual disabilities
Children with Down syndrome and other developmental disabilities often face intellectual disorders ranging from mild to severe. This study proposes a multimedia-based learning model which combines Mayer's Cognitive Multimedia Learning Model with Skinner's Operant Conditioning, and involves implementing phonological awareness in the learning process. It is implemented with animated multimedia tutorials and exercises that give children with Down syndrome and other disabilities an improved learning opportunity. The system was tested on a group of 100 disabled children, and results indicate increased levels of motivation and high relative performance scores.
19 Context-aware elderly entertainment support system in assisted living environment
Context-aware elderly entertainment support system in assisted living environment
The worldwide elderly population is increasing. Many of these elderly are placed in assisted living environments, and they often need cognitive or physical support in their daily lives. To address this need, much attention has been given to perspectives such as health monitoring, medication adherence, body-sensor networks, and smart homes for the elderly and disabled. However, far less focus has been directed at the entertainment needs of the elderly, an important aspect of enjoyable living. This paper proposes the design of a context-aware elderly entertainment support system (CAEESS) by investigating entertainment requirements from the perspectives of the elderly and their caregivers. Our focus is on the technical platform to fulfill the identified requirements. The proposed system enables the elderly to access various entertainment services through different modes of interaction, and also empowers the caregiver to select entertainment media or resources for the elderly when needed. We believe the proposed approach will provide a clear understanding of the requirements of CAEESS and enable us to effectively address problems related to the independent and happy living of the elderly.
20 Expert system for the decision on the ability to drive power wheelchair based on fuzzy logic
Expert system for the decision on the ability to drive power wheelchair based on fuzzy logic
This paper presents an expert system based on fuzzy logic. The main idea is to study and realize a tool that helps the ergotherapist decide whether a disabled person is capable of safely driving a power wheelchair. The decision also concerns the optimal number of sensors and the suitable type of assistance (light, sound, ...) to integrate on the wheelchair in order to assist the user during navigation. To test the efficiency of this fuzzy-logic expert system, a 3D wheelchair-driving simulator was created.
21 The use of NFC and Android technologies to enable a KNX-based smart home
The use of NFC and Android technologies to enable a KNX-based smart home
In recent years, due to the improvement of living standards, smart homes have been receiving increasing interest. They can provide several useful services such as support for elderly and disabled people, access control, environmental monitoring, and home automation. Furthermore, with the widespread diffusion of mobile devices (i.e., smartphones, tablets) and their integration with new auto-identification technologies (such as NFC), the need to control and manage the smart home through these devices is increasing. In this context, the main goal of this work is to develop and validate an architecture, both hardware and software, able to monitor and manage a KNX-based home automation system through an Android mobile device in an efficient and safe way. In more detail, a software system able to configure an Android application consistently with the home automation installation was designed and implemented, as well as an Android application able to manage the entire home automation system based on the KNX standard. A further Android module, which exploits NFC technology, was developed to address the access control issue. A real use case is presented, which demonstrates the effectiveness of the proposed software system.
22 Silicon eyes: navigation assistant for visually impaired using Braille keypad and SMS
Silicon eyes: GPS-GSM based navigation assistant for visually impaired using capacitive touch Braille keypad and smart SMS facility
This paper aims to provide blind-navigation information via audible messages and haptic feedback, helping the visually disabled localize where they are and their mobility pathway. It also proposes a method that allows blind people to enter notes and control device operation via a capacitive-touch Braille keypad instead of sending an SMS by typing the number and text. An emergency button triggers an SMS from the GSM module that sends the present location (GPS coordinates) of the user to a remote phone number, asking for help. In addition, the device provides the information needed by the user in audio format using an audio codec, including time, calendar, object color using a 24-bit color sensor, obstacle distance using SONAR, navigation direction, and ambient light and temperature conditions.
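The emergency-SMS step described in the abstract can be sketched as the sequence of AT commands a microcontroller would write to a GSM module's serial port. The command strings follow the standard 3GPP SMS text-mode convention (`AT+CMGF`, `AT+CMGS`); the exact module, pinout, and message text used in the paper are not specified, so this is illustrative only.

```python
# Illustrative sketch: frames for a standard text-mode SMS carrying the
# user's GPS coordinates.  The phone number and message body are placeholders.

def emergency_sms(lat, lon, helper_number):
    body = f"HELP! I am at https://maps.google.com/?q={lat},{lon}"
    # Lines that would be written to the GSM module's serial port:
    return [
        "AT+CMGF=1",                   # select SMS text mode
        f'AT+CMGS="{helper_number}"',  # destination number
        body + "\x1a",                 # message body terminated by Ctrl-Z
    ]

frames = emergency_sms(13.0827, 80.2707, "+10000000000")
```

On real hardware each frame is followed by a wait for the module's `>` or `OK` response before sending the next.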
23 Delivering educational services using home theatre personal computers
Delivering educational services using home theatre personal computers - A solution for people with special needs
We introduce an e-learning system specifically designed for disabled people. The system is implemented on top of a home theatre personal computer (HTPC), a small computer connected to a TV set to offer interactive services. This hardware platform also facilitates the integration of control devices designed for disabled people. An adapted version of Moodle, one of the most popular e-learning management solutions, provides the required e-learning functionalities and access to educational content.
24 Social network framework for deaf and blind people based on cloud computing
Social network framework for deaf and blind people based on cloud computing
Most governments and civil society organizations work hard to help disabled people, especially blind and deaf persons, join the wider community and practice regular daily-life activities. Indeed, Information Technology, with modern methodologies such as mobile and Cloud computing, has an impressive role in enhancing communication between people with different disabilities and non-disabled people on one side, and among disabled people themselves who have the same or different impairments on the other. However, the few systems suggested for the Arabic region are quite limited. Additionally, to our knowledge, there is no proposed system for connecting blind and deaf people in direct Arabic-language conversations. In this paper, we propose a comprehensive framework built upon three modern technologies: mobile devices, Cloud resources, and social networks, to provide seamless communication between blind and deaf people, especially those living in Arabic countries. Moreover, it is designed to facilitate communication with non-disabled people in various directions by using recent methodologies such as time-of-flight cameras and social networks. The main modules and components of the suggested framework and its possible scenarios are fully analyzed and described.
25 A Remote Computer Control System Using Speech Recognition Technologies of Mobile Devices
A Remote Computer Control System Using Speech Recognition Technologies of Mobile Devices
This paper presents a remote computer control system using the speech recognition technologies of mobile devices for the blind and physically disabled population. These people experience difficulty and inconvenience using computers through a keyboard and/or mouse. The purpose of this system is to provide a way for the blind and physically disabled to easily control many functions of a computer via speech. The system consists of a mobile device such as a smartphone, a PC server, and a Google server connected to each other. Users can command the mobile device via speech to directly control computers, write emails and documents, calculate numbers, check the weather forecast, or manage a schedule; the commands are then immediately executed. The proposed system also provides blind people with a TTS (text-to-speech) function via the Google server if they want to hear the contents of a document stored on a computer.
26 Kinect-based Powered Wheelchair Control System
Kinect-based Powered Wheelchair Control System
With the trend toward an aging society, how to take good care of elders has become a significant issue. Based on Kinect-interface and indoor-positioning techniques, this paper develops a power wheelchair control system to improve the living quality of disabled elders. Using hand gesture recognition through the Kinect interface, the disabled elder makes a hand gesture to call the wheelchair, and the power wheelchair moves to the elder's location automatically. The elder can also send the power wheelchair to any place via a touch panel. When the disabled elder no longer needs the wheelchair, it returns to its parking location.
27 Design of a novel wearable human computer interface based on electrooculography
Design of a novel wearable human computer interface based on electrooculography
People with severe motor disabilities are unable to move their limbs voluntarily or speak overtly, though the cognitive parts of their brain are intact. Human computer interfaces, as an assistive technology, provide a new channel of communication to help these people. In this study, a novel wearable, miniaturized human computer interface system was designed and tested, allowing these people to state their intentions and feelings using only their eyes. The system, which can be installed on glasses, records the electrooculogram signal and transfers the digitized data wirelessly to a laptop. By analyzing the signals, eight directions of eye movement, consisting of up, down, right, left, and the four diagonals, as well as voluntary blinking, were recognized and used in a high-performance graphical user interface to type alphabetical letters and numbers with just two moves and two selections. Two experiments were conducted to evaluate the performance of the developed system. The precision, sensitivity, and accuracy of recognizing the user's intention were 95%, 98%, and 93%, respectively, and the average communication rate was 5.88 characters per minute. This low-cost wearable device is lightweight and small, which ensures a high level of mobility and comfort. Users could learn to type with the system in a short time, and easily work with it without fatigue.
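The eight-direction decoding described in the abstract can be sketched from two EOG channels, one horizontal (electrodes beside the eyes) and one vertical (above/below an eye). This is an assumed, minimal threshold scheme, not the paper's algorithm; the threshold value is a placeholder.

```python
# Minimal sketch (assumed): map horizontal/vertical EOG amplitudes to one of
# the four cardinal directions, four diagonals, or rest.

TH = 50.0  # microvolt threshold separating a saccade from drift (assumption)

def eog_direction(h, v):
    """Classify one (horizontal, vertical) EOG amplitude pair."""
    horiz = "right" if h > TH else "left" if h < -TH else ""
    vert = "up" if v > TH else "down" if v < -TH else ""
    if horiz and vert:
        return f"{vert}-{horiz}"      # one of the four diagonals
    return horiz or vert or "rest"

print(eog_direction(120, -5))   # -> right
print(eog_direction(80, 90))    # -> up-right
```

A practical system would detect blinks separately (a large vertical transient) and debounce over several samples before accepting a selection.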
28 A novel approach: intelligent voice response system to navigate mobile devices for blind
A novel approach: Voice enabled interface with intelligent voice response system to navigate mobile devices for visually challenged people
Nowadays, the use of mobile devices is increasing. They can provide a foundation for improving communication, learning, and teaching environments. However, not all potential users have the capabilities that allow them to use the existing methodologies. To improve the search and navigation of the various applications and features of mobile devices, we propose a model for a hands-free voice navigation system that will be helpful for visually challenged people. This model provides the opportunity to operate mobile devices without using the keypad.
29 Information delivery system in a major disaster for deaf people based on embedded web system
Performance evaluation of information delivery system in a major disaster for deaf people based on embedded web system
In this paper, we present an outline of the new Information Delivery System During a Major Disaster for People Who are Deaf (IDDD), designed using a web platform such as node.js and GCM (Google Cloud Messaging), and explain the performance measurement results. In particular, a new implementation method using a web system for embedded and M2M systems is explained. We also show that the delays in providing disaster information to a user and in spreading that information to multiple displays are shorter than in the previous IDDD.
30 High level functions for the intuitive use of an assistive robot
High level functions for the intuitive use of an assistive robot
This document presents the research project ARMEN (Assistive Robotics to Maintain Elderly People in a Natural environment), aimed at the development of a user-friendly robot with advanced functions for assisting elderly or disabled persons at home. Focus is given to the robot SAM (Smart Autonomous Majordomo) and its new features of navigation, manipulation, object recognition, and knowledge representation, developed for intuitive supervision of the robot. The results of the technical evaluations show the value and potential of these functions for practical applications. The paper also documents the clinical evaluations carried out with elderly and disabled persons in a therapeutic setting to validate the project.
31 An in-home medication management solution based on intelligent packaging and ubiquitous sensing
An in-home medication management solution based on intelligent packaging and ubiquitous sensing
A healthcare solution to the medication noncompliance problem would help save $177 billion annually in the United States. In addition, an in-home healthcare station (IHHS) is needed to meet the rapidly increasing demand for daily monitoring with on-site diagnosis and prognosis. In this paper, an intelligent medication management system is proposed based on intelligent packaging and ubiquitous sensing technologies. Preventive medication management is enabled by an intelligent package sealed with Controlled Delamination Material (CDM) and controlled over an RFID link. Various vital parameters are collected by wearable biomedical sensors through a short-range wireless link. On-site diagnosis and prognosis based on these health parameters are supported by the scalable architecture. Additionally, a friendly human-machine interface is emphasized to make the system convenient for elderly or disabled patients. A prototype system including the hardware, embedded software, user interface, database, and some intelligent packages was implemented to verify the concepts.
32 An “eyes-closed” brain-computer interface system for communication of patients with oculomotor impairment
Development of an “eyes-closed” brain-computer interface system for communication of patients with oculomotor impairment
The goal of this study was to develop a new steady-state visual evoked potential (SSVEP)-based BCI system that can be applied to disabled individuals with impaired oculomotor function. The developed BCI system allows users to express binary intentions without needing to open their eyes. To present visual stimuli, we used a pair of glasses with two LEDs flickering at different frequencies. EEG spectral patterns were classified in real time while participants attended to one of the presented visual stimuli with their eyes closed. Through offline experiments performed with 11 healthy participants, we confirmed that SSVEP responses could be modulated by visual selective attention to a specific light stimulus penetrating the eyelids, and could be classified with accuracy high enough for use in a practical BCI system. After customizing the parameters of the proposed SSVEP-based BCI paradigm based on the offline analysis results, the binary intentions of five healthy participants and one locked-in-state patient were classified online. The average ITR of the online experiments reached 10.83 bits/min with an average accuracy of 95.3%. An online experiment with an ALS patient showed a classification accuracy of 80% and an ITR of 2.78 bits/min, demonstrating the practical feasibility of our BCI paradigm.
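The core binary decision in an SSVEP paradigm like the one in the abstract is choosing which flicker frequency dominates the EEG spectrum. The sketch below compares FFT power at the two LED frequencies; the sampling rate, the 8/13 Hz frequencies, and the function name are assumptions for illustration, as the paper's actual classifier and parameters are not given here.

```python
# Hedged sketch of two-class SSVEP classification by spectral power comparison.
import numpy as np

FS = 256                       # sampling rate in Hz (assumed)
F_LEFT, F_RIGHT = 8.0, 13.0    # flicker frequencies of the two LEDs (assumed)

def classify_ssvep(eeg):
    """Return 'left' or 'right' from a 1-D EEG window via FFT power."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
    p_left = spectrum[np.argmin(np.abs(freqs - F_LEFT))]   # power at 8 Hz bin
    p_right = spectrum[np.argmin(np.abs(freqs - F_RIGHT))] # power at 13 Hz bin
    return "left" if p_left > p_right else "right"

t = np.arange(FS * 2) / FS                        # 2 s synthetic window
eeg = np.sin(2 * np.pi * F_LEFT * t) + 0.2 * np.random.randn(len(t))
print(classify_ssvep(eeg))                        # attends the 8 Hz LED
```

A 2 s window gives 0.5 Hz frequency resolution, so both stimulus frequencies fall on exact FFT bins; real systems typically also pool power over harmonics and neighboring bins.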
33 A hybrid BCI for enhanced control of a telepresence robot
A hybrid BCI for enhanced control of a telepresence robot
Motor-disabled end users have successfully driven a telepresence robot in a complex environment using a Brain-Computer Interface (BCI). However, to facilitate the interaction aspect that underpins the notion of telepresence, users must be able to voluntarily and reliably stop the robot at any moment, not just drive from point to point. In this work, we propose to exploit the user's residual muscular activity to provide a fast and reliable control channel, which can start/stop the telepresence robot at any moment. Our preliminary results show that not only does this hybrid approach increase the accuracy, but it also helps to reduce the workload and was the preferred control paradigm of all the participants.
34 EEG-Based Brain-Controlled Mobile Robots: A Survey
EEG-Based Brain-Controlled Mobile Robots: A Survey
EEG-based brain-controlled mobile robots can serve as powerful aids for severely disabled people in their daily life, especially to help them move voluntarily. In this paper, we provide a comprehensive review of the complete systems, key techniques, and evaluation issues of brain-controlled mobile robots along with some insights into related future research and development issues. We first review and classify various complete systems of brain-controlled mobile robots into two categories from the perspective of their operational modes. We then describe key techniques that are used in these brain-controlled mobile robots including the brain-computer interface techniques and shared control techniques. This description is followed by an analysis of the evaluation issues of brain-controlled mobile robots including participants, tasks and environments, and evaluation metrics. We conclude this paper with a discussion of the current challenges and future research directions.
35 Wheelchair obstacle avoidance based on fuzzy controller and ultrasonic sensors
Wheelchair obstacle avoidance based on fuzzy controller and ultrasonic sensors
The electric wheelchair is one of the most widely used mobility aids for disabled and elderly people. This work introduces an obstacle avoidance system aimed at providing more autonomous navigation of an electric wheelchair (EW) in unknown indoor environments. These technologies seek to increase the independence of people with disabilities and improve their quality of life by making the most of each individual's abilities. The system integrates ultrasonic sensors for obstacle avoidance with a fuzzy controller that generates the velocities needed to reach the target position. A prototype EW has been equipped with a control unit based on two microcontrollers: the first manages the motor velocities and the second polls the ultrasonic sensors. The two microcontrollers exchange information with a PC/104 board, which is used to process data from the sensors and encoders. The information is processed by a control algorithm based on fuzzy logic, optimized using the gradient method to minimize the path traveled to reach the desired position. The practical implementation demonstrates the algorithm's validity for obstacle avoidance and goal achievement with a minimal path and greater safety.
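As an illustration of the kind of fuzzy velocity controller described above, here is a minimal sketch using triangular membership functions over the front ultrasonic distance and weighted-average defuzzification. The membership ranges and output speeds are invented for illustration; the paper's actual rule base is not given in the abstract:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_velocity(front_dist_m):
    """Map the front ultrasonic distance reading to a forward speed (m/s)
    via simple fuzzy rules and weighted-average defuzzification."""
    near = tri(front_dist_m, -0.5, 0.0, 1.0)   # obstacle close -> slow
    mid  = tri(front_dist_m, 0.5, 1.5, 2.5)    # moderate clearance
    far  = tri(front_dist_m, 1.5, 3.0, 10.0)   # clear path -> fast
    weights = [near, mid, far]
    speeds  = [0.1, 0.5, 1.0]                  # singleton rule consequents
    return sum(w * v for w, v in zip(weights, speeds)) / (sum(weights) or 1.0)
```

For instance, a reading of 0.2 m yields the slow speed, while anything beyond 3 m saturates at the fast speed.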
36 A Dual-Mode Human Computer Interface Combining Speech and Tongue Motion for People with Severe Disabilities
A Dual-Mode Human Computer Interface Combining Speech and Tongue Motion for People with Severe Disabilities
We present a new wireless and wearable human computer interface called the dual-mode Tongue Drive System (dTDS), designed to allow people with severe disabilities to use computers more effectively, with increased speed, flexibility, usability and independence, through their tongue motion and speech. The dTDS detects the user's tongue motion using a magnetic tracer and an array of magnetic sensors embedded in a compact and ergonomic wireless headset. It also captures the user's voice wirelessly using a small microphone embedded in the same headset. Preliminary evaluation results based on 14 able-bodied subjects and three individuals with high-level spinal cord injuries at levels C3-C5 indicated that the dTDS headset, combined with commercially available speech recognition (SR) software, can provide end users with significantly higher performance than either unimodal form based on tongue motion or speech alone, particularly in completing tasks that require both pointing and text entry.
37 Video Demo: An Egocentric Vision Based Assistive Co-robot
Video Demo: An Egocentric Vision Based Assistive Co-robot
We present a video demo of the prototype of an egocentric-vision-based assistive co-robot system. In this co-robot system, the user wears a pair of glasses with a forward-looking camera and is actively engaged in the control loop of the robot during navigational tasks. The egocentric vision glasses serve two purposes. First, they serve as a source of visual input to request the robot to find a certain object in the environment. Second, the motion patterns computed from the egocentric video, associated with a specific set of head movements, are exploited to guide the robot to find the object. These are especially helpful for quadriplegic individuals who do not have the hand functionality needed for control with other modalities (e.g., a joystick). In our co-robot system, when the robot does not fulfill the object-finding task within a pre-specified time window, it actively solicits user control for guidance. The user can then use the egocentric-vision-based gesture interface to orient the robot towards the direction of the object, after which the robot automatically navigates towards the object until it finds it. Our experiments validated the efficacy of the closed-loop design in engaging the human in the loop.
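The timeout-and-solicit control loop described above can be sketched as follows; `robot`, `detector`, and `ask_user` are hypothetical stand-ins for the actual object detector and gesture-interface components, which the abstract does not specify:

```python
import time

def find_object(robot, detector, ask_user, timeout_s=30.0):
    """Search autonomously until a deadline, then solicit a user gesture
    for a heading hint and resume. All collaborator names are illustrative:
    detector() -> bool (object found), ask_user() -> heading from gestures."""
    deadline = time.monotonic() + timeout_s
    while True:
        if detector():
            return True                       # object found, task complete
        if time.monotonic() > deadline:
            heading = ask_user()              # gesture-based direction hint
            robot.turn_to(heading)
            deadline = time.monotonic() + timeout_s
        robot.step()                          # advance the navigation loop
```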
38 Implementation of a real-time human movements classifier by using mobile equipment
Implementation of a real-time human movements classifier by using mobile equipment
This paper deals with the implementation of a real-time system for classification of human movements based on information recorded through an Android smartphone, namely acceleration and heart rate signals. One of the main features of the application is that the overall equipment is cost-effective, since triaxial accelerometers are currently widely available on smartphones at no additional cost. Similar applications that previously appeared in the literature relied on ad-hoc devices both to measure and to communicate data. The classification is carried out using 5 classification indexes, appropriately computed from the recorded signals. The present version of the Android application can discriminate among 9 classes of movement and can classify individual movements or operate in continuous mode. Results show that classification based strictly on acceleration signals is extremely effective, while discrimination between pairs of movements that can be distinguished only by heart rate is slightly less accurate. The continuous operation mode could be interesting for the remote surveillance of individuals (e.g. elderly and/or disabled people).
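A window-based classifier of this kind might look like the following sketch. The paper's five indexes are not specified in the abstract, so the features here (signal magnitude area, mean magnitude, variance) are illustrative substitutes, and a nearest-centroid decision stands in for the actual classifier:

```python
import math

def feature_vector(ax, ay, az):
    """Illustrative indexes computed from one window of triaxial
    accelerometer samples (three lists of equal length)."""
    n = len(ax)
    sma = sum(abs(x) + abs(y) + abs(z) for x, y, z in zip(ax, ay, az)) / n
    mags = [math.sqrt(x*x + y*y + z*z) for x, y, z in zip(ax, ay, az)]
    mean_mag = sum(mags) / n
    var = sum((m - mean_mag) ** 2 for m in mags) / n
    return (sma, mean_mag, var)

def classify(features, centroids):
    """Nearest-centroid decision over labeled class centroids
    (dict: class label -> feature tuple)."""
    return min(centroids, key=lambda label: math.dist(features, centroids[label]))
```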
39 Design and development of navigation system by using RFID technology
Design and development of navigation system by using RFID technology
In this modern era, independent mobility for blind and partially sighted people is an important objective to achieve. There are many assistive aids for visually impaired people, namely the guide dog, the white cane, and tactile paving, a very common assistive tool throughout the world that helps visually disabled people walk the correct path from one place to another. RFID technology is therefore introduced in this project to support visually disabled people more efficiently in outdoor activities. The system has been developed based on the integration of RFID wireless technology and a voice system, assembled on the traditional white cane, to help visually impaired users identify surrounding landmarks via verbal notification. It comprises an RFID reader integrated into the traditional white cane and RFID tags installed in the tactile paving; each tag stores unique information used to navigate and notify the user once the paving is scanned/tapped with the designed cane. The proposed RFID-integrated white cane was successfully designed, and the detection range of the RFID tags was evaluated.
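The tag-to-landmark notification logic reduces to a lookup from tag ID to a spoken message. A minimal sketch, in which the tag UIDs, messages, and `speak` hook are all hypothetical:

```python
# Hypothetical tag database: tag UID -> landmark description spoken to the user.
LANDMARKS = {
    "04A2B9": "Pedestrian crossing ahead",
    "04A2BA": "Bus stop on the right",
}

def on_tag_read(uid, speak=print):
    """Called when the cane's RFID reader detects a tag in the tactile paving.
    `speak` stands in for the voice-synthesis output channel."""
    message = LANDMARKS.get(uid, "Unknown landmark")
    speak(message)
    return message
```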
40 Technology in Locomotion and Domotic Control for Quadriplegic
Technology in Locomotion and Domotic Control for Quadriplegic
Electronic control technology for mobility and domotic control (home automation systems) can be a great help to people with spinal injuries, who have major limitations in mobility and in the use of devices for normal daily activity. Designing different types of technologies that provide aids to the patient can increase their quality of life. A spinal cord injury (SCI) is typically defined as damage or trauma to the spinal cord that results in a loss or impairment of function, producing reduced mobility or sensation. Equipment manufacturers say that designing applications for users with disabilities is not cost-effective. Most of these systems are designed for users who are not disabled; therefore, systems that address disabled users need special interfaces in order to be accessible. In this paper we present a method for developing an electric and mechanical prototype for quadriplegic people, provided that they retain a specific degree of mobility. Using an infrared technique, computer vision technology and mechanical design, users can perform activities that improve their quality of life and gain a degree of independence.
41 A Voice-Input Voice-Output Communication Aid for People with Severe Speech Impairment
A Voice-Input Voice-Output Communication Aid for People with Severe Speech Impairment
A new form of augmentative and alternative communication (AAC) device for people with severe speech impairment, the voice-input voice-output communication aid (VIVOCA), is described. The VIVOCA recognizes the disordered speech of the user and builds messages, which are converted into synthetic speech. System development was carried out employing user-centered design and development methods, which identified and refined key requirements for the device. A novel methodology for building small-vocabulary, speaker-dependent automatic speech recognizers with reduced amounts of training data was applied. Experiments showed that this method is successful in generating good recognition performance (mean accuracy 96%) on highly disordered speech, even when recognition perplexity is increased. The selected message-building technique traded off various factors, including speed of message construction and range of available message outputs. The VIVOCA was evaluated in a field trial by individuals with moderate to severe dysarthria, which confirmed that they can use the device to produce intelligible speech output from disordered speech input. The trial highlighted some issues that limit the performance and usability of the device in real usage situations, with a mean recognition accuracy of 67% in these circumstances. These limitations will be addressed in future work.
42 Design and implementation of Home Automated Telemanagement platform for interactive biking exercise
Design and implementation of Home Automated Telemanagement platform for interactive biking exercise
Geriatric rehabilitation facilitates therapeutic interventions whose purpose is to restore functional ability or enhance residual functional capability in elderly people with disabling impairments. Despite great demand for such interventions, limited research has been conducted on utilizing telemedicine to promote geriatric rehabilitation. We previously reported the successful implementation of physical telerehabilitation in patients with multiple sclerosis. In this article, we seek to extend our experience in implementing physical telerehabilitation systems to supporting home-based exercise in older adults. The goal of this report is to describe the design and implementation of a geriatric telerehabilitation system facilitating safe cycling exercise at senior citizens' homes.
43 Improved biomedical device for spasticity quantification
Improved biomedical device for spasticity quantification
Spasticity is a disabling motor disorder resulting from neurological impairments. This disorder now has a considerable social and economic impact, affecting approximately 1.9% of the world population. Methods for spasticity quantification play an important role in the success of medical treatments and may reduce the associated costs. The Modified Ashworth Scale is the most commonly used method for the evaluation of this disorder, but experts agree that it is not a precise and reproducible method. Although the literature presents several innovative methods, there is still a need for improvements. This paper presents the development of a biomedical device for spasticity quantification, based on the velocity-dependent increase of the stretch reflex threshold. The proposed approach was tested in a clinical environment with the collaboration of patients with spasticity. The experimental trials confirmed the sensitivity, reproducibility and reliability of the proposed approach.
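One common way to operationalize a velocity-dependent stretch reflex threshold is to regress the joint angle at EMG onset against stretch velocity and extrapolate to zero velocity (the tonic stretch reflex threshold). The following is an illustrative sketch of that idea, not the paper's specific procedure, which the abstract does not detail:

```python
def tsrt(velocities, onset_angles):
    """Least-squares fit of EMG-onset angle vs. stretch velocity;
    the intercept is the angle threshold extrapolated to zero velocity
    (tonic stretch reflex threshold), the slope its velocity sensitivity."""
    n = len(velocities)
    mean_v = sum(velocities) / n
    mean_a = sum(onset_angles) / n
    cov = sum((v - mean_v) * (a - mean_a) for v, a in zip(velocities, onset_angles))
    var = sum((v - mean_v) ** 2 for v in velocities)
    slope = cov / var
    intercept = mean_a - slope * mean_v
    return intercept, slope
```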
44 Mind-controlled augmentative and alternative communication for people with severe motor disabilities
Mind-controlled augmentative and alternative communication for people with severe motor disabilities
This paper describes the research conducted to design the Arabic Brain Communicator (ABC), which is a brain-controlled typing system designed to facilitate communication for people with severe motor disabilities in Arabic. A user centered design was adopted; it included empirical investigations and meetings with Subject-Matter Experts and possible users. Activities conducted in the analysis and design of the system are discussed.
45 Haptic hand-tremor simulation for enhancing empathy with disabled users
Haptic hand-tremor simulation for enhancing empathy with disabled users
This paper presents a system designed to induce, in healthy subjects, the artificial hand tremor that is observed in persons affected by neurological impairments. The objective is to allow a healthy user to feel, first-hand, the effect of the impairment while performing common manipulative tasks, in order to understand and gain empathy with the impaired person. The developed tool is based on a wrist-attached desktop haptic interface with a workspace comparable to that of the user's arm. Such a device is able to exert controlled forces on the user's wrist and induce hand tremor whose frequency and amplitude are correlated with different pathologies. The control of the device is based on the recording and playback of tremor signals acquired by a motion tracker. In this paper, we present the system with its dynamic characteristics, and three different types of controllers are experimentally tested and compared.
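The playback side reduces to generating a force command at a pathology-specific frequency. A minimal sketch, assuming a sinusoidal 5 Hz tremor and a 1 kHz haptic control loop (both values are illustrative; the actual system plays back recorded tremor signals):

```python
import math

def tremor_force(t, freq_hz=5.0, amplitude_n=1.5):
    """Synthetic tremor force command (newtons) at time t (seconds).
    A 4-6 Hz band is typical of parkinsonian rest tremor; the amplitude
    here is an assumption, not a value from the paper."""
    return amplitude_n * math.sin(2 * math.pi * freq_hz * t)

# One second of force commands for a 1 kHz haptic control loop.
samples = [tremor_force(k / 1000.0) for k in range(1000)]
```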
46 Improving the quality of life of dependent and disabled people through home automation and tele-assistance
Improving the quality of life of dependent and disabled people through home automation and tele-assistance
Lack of mobility forces certain groups of dependent people to spend a lot of time at home. In many cases, this limitation makes these people stay most of the time in a specific room of their house, such as the bedroom or living room, where the only means of entertainment and information gathering is the TV set. Most present-day households have a personal computer, but the digital divide and the lack of adaptation produce a certain rejection in this population group. This paper discusses a proposal that leverages the familiar TV set as the user interface for a complete tele-assistance system and control centre for home automation devices. For this, the system makes use of a Home Theatre Personal Computer (HTPC) connected to the TV and offers features such as local and remote monitoring of a wide range of vital signs, intelligent adaptation of services and interfaces according to the level and type of disability, and centralized control of the home automation devices installed at home.
47 Email Access by Visually Impaired
Email Access by Visually Impaired
Web accessibility refers to the inclusive practice of making web-based applications usable by people of all abilities and disabilities. When web applications are correctly designed, developed and edited, all users can have equal access to information and functionality, and they can be accommodated without decreasing the usability of the application for non-disabled users. The most common and essential use of the internet is accessing email. Little systematic applied research has been conducted on how a vision-impaired user can access their email, and this paper aims to fill some of that gap.
48 Adaptive architecture for assisted living systems
Adaptive architecture for assisted living systems
Recent achievements in telemedicine and surveillance techniques open new challenges for the development of assisted living systems for the elderly or disabled. This paper presents a universal approach to the design of multimodal health monitoring systems with regard to the paradigm of ubiquitous and personalized medicine. The design combines the advantages of intelligent reprogrammable sensors, the flexibility of reconfigurable networks built on the human body area or embedded in building infrastructures, and automatic, person-dependent decision making based on presumptions and experience represented in artificial intelligence. Considering these key features leads to a system design suitable for the majority of human surveillance purposes, including home care, hospices, rehabilitation and sport training. The paper also presents a prototype system designed according to the proposed rules and tested in experimental setups intended to simulate volunteers' homes. The results confirm that the system adapts to environment-specific relations, provides seamless monitoring with no limit on indoor and outdoor mobility, and adapts to the subject's habits in recognizing normal, suspected and dangerous events.
49 Handicapped assisting robot
This article presents the robotic system developed at the University of Bremen's Institute of Automation (IAT). The system offers increased control functionality for disabled users. The robot consists of an electric wheelchair equipped with the MANUS robot arm. A computer controls both devices; the man-machine interface (MMI) consists of a flat screen and a speech interface. FRIEND's hardware and software are described, and the current state of development is then presented, as well as research results that will soon be integrated. After a short explanation of the speech interface, the methods developed for semi-autonomous control are described. These are programming by demonstration, visual servoing, and configuration planning based on the method of imaginary links. We also describe the state of integration and our experience to date.
Wednesday, January 8, 2014