Vision-driven tele-operation for robot manipulation
Material type: Thesis
Call number: 600 HIM
| Item type | Current library | Call number | Status | Date due | Barcode |
|---|---|---|---|---|---|
| | JRD Tata Memorial Library | 600 HIM | Available | | ET00155 |
Includes bibliographical references and index
MTech (Res); 2023; Electrical Communication Engineering
It is worth taking a moment to acknowledge how remarkably well we humans perform tasks with our hands, from picking up a coin to buttoning a shirt. For robots, all of these tasks remain at the very forefront of robotics research and require significant interaction between vision, perception, planning, and control; becoming expert in all of them is a considerable challenge. Tele-operation augments a robot's capability to perform complex tasks in unstructured environments and with unfamiliar objects by keeping a human in the loop, lending the robot the operator's reasoning, intuition, and creativity. However, most tele-operation techniques rely on wearable sensors or gloves, or on expensive camera setups, to capture the human's gestures, making the system both bulky and costly. We present a vision-based tele-operation system for the KUKA IIWA industrial robot arm that imitates, in real time, the natural motion of a human operator observed by a depth camera.
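The record gives no implementation details, so the sketch below only illustrates the general shape of such a depth-camera-driven imitation loop under assumed placeholders: `get_depth_frame`, `estimate_hand_pose`, and `send_cartesian_target` are hypothetical stubs, and the camera-to-robot transform and motion scale are invented values, not the thesis method or the KUKA API.

```python
# Illustrative vision-based teleoperation loop (assumptions only; not the thesis code).
import time
import numpy as np

# Assumed fixed mapping from the camera frame to the robot base frame.
R_CAM_TO_BASE = np.eye(3)                    # assumed camera-to-base rotation
T_CAM_TO_BASE = np.array([0.6, 0.0, 0.2])    # assumed offset in metres
MOTION_SCALE = 0.8                           # shrink human motion to robot reach

def get_depth_frame():
    """Placeholder: grab one depth frame from the depth camera."""
    return np.zeros((480, 640), dtype=np.float32)

def estimate_hand_pose(depth_frame):
    """Placeholder: return the operator's hand position in the camera frame."""
    return np.array([0.0, 0.0, 0.5])

def send_cartesian_target(position):
    """Placeholder: stream a Cartesian target to the robot controller."""
    print("target:", np.round(position, 3))

def retarget(hand_cam):
    """Map a hand position in the camera frame to an end-effector target."""
    return MOTION_SCALE * (R_CAM_TO_BASE @ hand_cam) + T_CAM_TO_BASE

if __name__ == "__main__":
    # Run a short imitation loop at roughly 30 Hz.
    for _ in range(90):
        frame = get_depth_frame()
        hand = estimate_hand_pose(frame)
        send_cartesian_target(retarget(hand))
        time.sleep(1.0 / 30.0)
```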