Visual-to-tactile interface to detect motions in real-time for persons with visual impairments
Document Type
Conference Proceeding
Date of Original Version
1-1-2006
Abstract
In this paper we propose an assistive technology (AT) device for individuals with visual impairments. The device extracts motion information from images acquired by a capture device and presents that information through a tactile interface. The algorithm provides the amount of motion present in the field of view and the centroid of that motion. Depending on the number of stimulators and the locations of the active stimulators, the direction and size of moving objects can be determined. Currently, a simulation of the operation is performed using a computer to process images captured from an inexpensive camera. The primary goal of this design is to provide an AT device that is practical, cost-effective, and useful for individuals with visual impairments. © 2006 IEEE.
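The sketch below illustrates, under stated assumptions, the kind of processing pipeline the abstract describes: estimating the amount of motion and its centroid from consecutive camera frames and mapping the centroid onto a small grid of tactile stimulators. It is not the authors' implementation; frame differencing, the 4x4 stimulator grid, the threshold value, and the function name motion_to_stimulators are all illustrative assumptions.

# Illustrative sketch (not the authors' method): frame differencing to estimate
# the amount of motion and its centroid, mapped to an assumed 4x4 stimulator grid.
import cv2
import numpy as np

GRID_ROWS, GRID_COLS = 4, 4   # assumed stimulator array size
DIFF_THRESHOLD = 25           # assumed per-pixel intensity-change threshold

def motion_to_stimulators(prev_gray, curr_gray):
    """Return (motion_amount, active_grid) for two consecutive grayscale frames."""
    # Pixels whose intensity changed by more than the threshold are treated as moving.
    diff = cv2.absdiff(curr_gray, prev_gray)
    moving = diff > DIFF_THRESHOLD

    motion_amount = moving.sum() / moving.size   # fraction of pixels in motion
    grid = np.zeros((GRID_ROWS, GRID_COLS), dtype=bool)

    if moving.any():
        # Centroid of the moving pixels (image row/column coordinates).
        rows, cols = np.nonzero(moving)
        cy, cx = rows.mean(), cols.mean()

        # Map the centroid to one stimulator cell; the amount of motion could be
        # used to activate neighbouring cells and convey object size.
        r = min(int(cy / curr_gray.shape[0] * GRID_ROWS), GRID_ROWS - 1)
        c = min(int(cx / curr_gray.shape[1] * GRID_COLS), GRID_COLS - 1)
        grid[r, c] = True

    return motion_amount, grid

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                    # inexpensive USB camera
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        amount, grid = motion_to_stimulators(prev, curr)
        # In a real device the grid would drive the tactile stimulators;
        # here the simulation simply prints it.
        print(f"motion={amount:.3f}")
        print(grid.astype(int))
        prev = curr

In this sketch, the printed boolean grid stands in for the stimulator array; the fraction of moving pixels could also modulate stimulation intensity, which the abstract leaves unspecified.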
Publication Title
Proceedings of the IEEE Annual Northeast Bioengineering Conference, NEBEC
Volume
2006
Citation/Publisher Attribution
Chabot, Eugene, and Ying Sun. "Visual-to-tactile interface to detect motions in real-time for persons with visual impairments." Proceedings of the IEEE Annual Northeast Bioengineering Conference, NEBEC 2006, (2006): 63-64. https://digitalcommons.uri.edu/ele_facpubs/788