TY - GEN
T1 - Conceptual design of a robotic assistant system based on projections onto flat surfaces for human-machine interaction applications
AU - Yepez, J. G. Vargas
AU - Moreno, A. Yepes
AU - Prada, S. Roa
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2017/1/9
Y1 - 2017/1/9
N2 - The possibility of more intuitive human-machine interfaces has sparked the development of new visual technologies. The way humans interact with elements of their environment should not be limited to the screens of phones or computers. Other alternatives, which provide a sensation of spatial freedom, are under development. Projection systems, which use continuous light on surrounding surfaces, represent a major area of exploration. The advanced state of artificial vision hardware and software tools enables the acquisition of data from the user and his/her environment, while in the background, software can analyze in real time the variations of the scene without user intervention. This kind of data processing makes possible the integration between what the user is doing and what the user is seeing. The device proposed in this paper uses an arrangement of infrared sensors that captures the hand gestures of the user and then points towards a projection surface in the user's workspace. A gesture recognition software platform recreates the 3D environment of the user and analyzes the motion of the key points of the user's hands. After obtaining these data, a comparison against previously established patterns determines whether the user is performing a preset command with his/her hands. If so, the system immediately converts this signal into a command that is executed with the assistance of an Arduino platform. The embedded platform carries out a previously established protocol in which a mechatronic system with 2 degrees of freedom, supporting a micro projector, lets the user adjust the position of the projected image over a surface, giving access to a 360° virtual space. The main goal of this system is to generate an interface where the hand gestures of the user not only allow him/her to interact with software-level elements, but also with mechatronic components that may be physically present, such as robots or home automation devices. Once the proposed robotic assistant system is in operation, its user will not have to refer to a physical screen to enter control commands; instead, the robotic assistant system, with more intuitive and natural motions, will facilitate the execution of the user's commands.
AB - The possibility of more intuitive human-machine interfaces has sparked the development of new visual technologies. The way humans interact with elements of their environment should not be limited to the screens of phones or computers. Other alternatives, which provide a sensation of spatial freedom, are under development. Projection systems, which use continuous light on surrounding surfaces, represent a major area of exploration. The advanced state of artificial vision hardware and software tools enables the acquisition of data from the user and his/her environment, while in the background, software can analyze in real time the variations of the scene without user intervention. This kind of data processing makes possible the integration between what the user is doing and what the user is seeing. The device proposed in this paper uses an arrangement of infrared sensors that captures the hand gestures of the user and then points towards a projection surface in the user's workspace. A gesture recognition software platform recreates the 3D environment of the user and analyzes the motion of the key points of the user's hands. After obtaining these data, a comparison against previously established patterns determines whether the user is performing a preset command with his/her hands. If so, the system immediately converts this signal into a command that is executed with the assistance of an Arduino platform. The embedded platform carries out a previously established protocol in which a mechatronic system with 2 degrees of freedom, supporting a micro projector, lets the user adjust the position of the projected image over a surface, giving access to a 360° virtual space. The main goal of this system is to generate an interface where the hand gestures of the user not only allow him/her to interact with software-level elements, but also with mechatronic components that may be physically present, such as robots or home automation devices. Once the proposed robotic assistant system is in operation, its user will not have to refer to a physical screen to enter control commands; instead, the robotic assistant system, with more intuitive and natural motions, will facilitate the execution of the user's commands.
UR - http://www.scopus.com/inward/record.url?scp=85011983767&partnerID=8YFLogxK
U2 - 10.1109/CCRA.2016.7811424
DO - 10.1109/CCRA.2016.7811424
M3 - Conference contribution
AN - SCOPUS:85011983767
T3 - 2016 IEEE Colombian Conference on Robotics and Automation, CCRA 2016 - Conference Proceedings
BT - 2016 IEEE Colombian Conference on Robotics and Automation, CCRA 2016 - Conference Proceedings
A2 - Lindado, Henry Carrillo
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 1st IEEE Colombian Conference on Robotics and Automation, CCRA 2016
Y2 - 29 September 2016 through 30 September 2016
ER -