Miehlbradt et al. suggest an alternative to current control interfaces (frequently employing joysticks) that require intensive practice. They have developed an intuitive gesture-based interface for real and simulated drones. They recorded upper-body kinematics and muscle activities while participants generated movements meant to imitate the behavior of a flying drone. After identifying two main interaction strategies used by the participants, they assessed the capacity of potential users to actively steer the path of a virtual drone using these strategies. Finally, they evaluated how well the skills acquired during simulation training transferred to the control of a real drone. Their abstract, and a video:
The accurate teleoperation of robotic devices requires simple, yet intuitive and reliable control interfaces. However, current human–machine interfaces (HMIs) often fail to fulfill these characteristics, leading to systems requiring an intensive practice to reach a sufficient operation expertise. Here, we present a systematic methodology to identify the spontaneous gesture-based interaction strategies of naive individuals with a distant device, and to exploit this information to develop a data-driven body–machine interface (BoMI) to efficiently control this device. We applied this approach to the specific case of drone steering and derived a simple control method relying on upper-body motion. The identified BoMI allowed participants with no prior experience to rapidly master the control of both simulated and real drones, outperforming joystick users, and comparing with the control ability reached by participants using the bird-like flight simulator Birdly.
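To make the idea of a data-driven body–machine interface a bit more concrete, here is a minimal sketch of one common way such a mapping could be built; this is an illustration under my own assumptions, not the authors' actual pipeline. The idea: extract the dominant movement directions from a calibration recording of upper-body motion (here via PCA through an SVD), then read the projection of a live pose onto those directions as roll and pitch steering commands. The marker layout, the placeholder data, and the function names are all hypothetical.

```python
import numpy as np

# Hypothetical calibration recording: N time samples of upper-body
# kinematics (e.g., 3 torso/shoulder markers x 3 axes, flattened).
rng = np.random.default_rng(0)
recording = rng.standard_normal((500, 9))  # placeholder data

# Center the data and extract its principal components: the dominant,
# correlated body movements that users spontaneously produce.
mean = recording.mean(axis=0)
centered = recording - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:2]  # keep the two strongest movement directions

def body_to_drone_command(pose, gain=1.0):
    """Project a new body pose onto the learned components and
    interpret the two scores as (roll, pitch) steering commands."""
    scores = components @ (np.asarray(pose) - mean)
    roll_cmd, pitch_cmd = gain * scores
    return roll_cmd, pitch_cmd

# Example: steer with the latest pose sample.
roll, pitch = body_to_drone_command(recording[-1])
print(f"roll={roll:+.2f}, pitch={pitch:+.2f}")
```

The appeal of this kind of approach is that the control axes are derived from whatever movements naive users actually make, rather than forcing users to learn an arbitrary joystick mapping.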