XR and AI based human in the loop assistance for industry X.0 and above (Record no. 433028)

MARC details
000 -LEADER
fixed length control field 03783nam a22002537a 4500
008 - FIXED-LENGTH DATA ELEMENTS--GENERAL INFORMATION
fixed length control field 250116b |||||||| |||| 00| 0 eng d
041 ## - LANGUAGE CODE
Language code of text/sound track or separate title eng
082 ## - DEWEY DECIMAL CLASSIFICATION NUMBER
Classification number 006.3
Item number RAJ
100 ## - MAIN ENTRY--PERSONAL NAME
Personal name Raj, Subin
245 ## - TITLE STATEMENT
Title XR and AI based human in the loop assistance for industry X.0 and above
260 ## - PUBLICATION, DISTRIBUTION, ETC. (IMPRINT)
Place of publication, distribution, etc Bangalore :
Name of publisher, distributor, etc Indian Institute of Science,
Date of publication, distribution, etc 2024.
300 ## - PHYSICAL DESCRIPTION
Extent xv, 240 p. :
Other physical details col. ill.
Accompanying material e-Thesis
Size of unit 25.92 MB
500 ## - GENERAL NOTE
General note Includes bibliographical references.
502 ## - DISSERTATION NOTE
Dissertation note PhD; 2024; Department of Design and Manufacturing.
520 ## - SUMMARY, ETC.
Summary, etc Industry 4.0, the current generation of manufacturing, integrates automation and data, while the upcoming generation, Industry 5.0, aims to enable seamless interaction between humans, robots, and machines for sustainable and personalized product and service creation. This requires manufacturing components according to customer requirements, leading to frequent changes in the production line. In this context, full automation proves unsustainable, and the problem can be addressed through a human-in-the-loop system. The effectiveness of a human-in-the-loop system depends on how well information is transferred to the human operator. The recent evolution of eXtended Reality (XR) technology enables a new way of exchanging information.

In this thesis, we focus on developing an XR-based human-in-the-loop system for various complex industrial tasks. We introduce a new way of rendering information using deep learning and machine learning algorithms. Additionally, we propose an effective method of data generation for training deep learning models, which reduces the time and manpower required to build the dataset and improves the models' performance. We develop a Mixed Reality (MR) environment for manual assembly guidance and propose a machine learning and deep learning-based markerless instruction rendering method, which we analyze on simple and complex assembly tasks. Assembly instructions are delivered through a multimodal user interface that adapts to the user's hand and eye position, and this interface effectively guides the user to complete the assembly task.

However, some assembly tasks require human-robot cooperation due to human limitations, and when humans and robots work cooperatively, the effectiveness of task completion depends on the exchange of information. We therefore develop an XR-based multimodal user interaction to improve Human-Robot Interaction (HRI) with haptic feedback, which assists users in performing tasks remotely. The haptic feedback, generated using the potential field method, helps users control robots without collisions. We introduce a multimodal XR-based framework for effective communication between humans and robots, using vision models to provide local users with information about the remote side. Seamless communication is facilitated through multiple modalities such as haptics, speech, visuals, and text. The system's effectiveness in pick-and-place activities and its adaptation to complex industrial use cases such as peg-in-hole insertion and welding are further evaluated. The results demonstrate the system's efficacy in two specific human-robot collaborative scenarios in industry: arc welding and assembly tasks. The XR-based multimodal instruction method effectively guides the user in performing manual assembly tasks and enables a new way of human-robot cooperation.
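Note: the abstract states only that the haptic feedback for collision-free teleoperation is generated using the potential field method; the thesis implementation is not reproduced in this record. The following is a minimal illustrative sketch of the classical repulsive potential field under that assumption. All names, parameter values, and the NumPy-based implementation are hypothetical and are not the author's code.

    import numpy as np

    def repulsive_force(ee_pos, obstacle_pos, influence_radius=0.2, gain=1.0):
        # Gradient of the classical repulsive potential
        #   U_rep(d) = 0.5 * gain * (1/d - 1/influence_radius)^2  for d < influence_radius,
        # which grows without bound as the end effector approaches the obstacle.
        diff = ee_pos - obstacle_pos
        dist = np.linalg.norm(diff)
        if dist == 0.0 or dist >= influence_radius:
            return np.zeros(3)            # outside the influence region: no feedback
        magnitude = gain * (1.0 / dist - 1.0 / influence_radius) / dist**2
        return magnitude * (diff / dist)  # directed away from the obstacle

    # Hypothetical usage: the resulting vector could be scaled and rendered on a
    # haptic device so the operator feels resistance as the arm nears a collision.
    force = repulsive_force(np.array([0.50, 0.10, 0.30]),
                            np.array([0.55, 0.10, 0.30]))
    print(force)

In practice the gain and influence radius would need per-task tuning, and the repulsive term can be summed over multiple obstacles and combined with an attractive term toward the goal; the thesis may use a different formulation of the same idea.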
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Mixed Reality
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Virtual Reality
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Human Robot Cooperation
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Teleoperation
650 ## - SUBJECT ADDED ENTRY--TOPICAL TERM
Topical term or geographic name as entry element Multimodal User Interface
700 ## - ADDED ENTRY--PERSONAL NAME
Personal name Advised by Biswas, Pradipta; Chakrabarti, Amaresh
856 ## - ELECTRONIC LOCATION AND ACCESS
Uniform Resource Identifier https://etd.iisc.ac.in/handle/2005/6771
942 ## - ADDED ENTRY ELEMENTS (KOHA)
Koha item type Thesis

