Novel Mixed Reality Interface for Effective and Efficient Human Robot Interaction with Unique Mobility Platforms

Title: A Novel Mixed Reality Interface for Effective and Efficient Human Robot Interaction with Unique Mobility Platforms.

Inaccessible until May 8, 2020 due to copyright restrictions.

Name(s): Kopinsky, Ryan J., author
Collins, Emmanuel G., professor directing dissertation
Roberts, Rodney G., university representative
Clark, Jonathan E., committee member
Oates, William, committee member
Barber, Daniel J., committee member
Florida State University, degree granting institution
College of Engineering, degree granting college
Department of Mechanical Engineering, degree granting department
Type of Resource: text
Genre: Text; Doctoral Thesis
Issuance: monographic
Date Issued: 2017
Publisher: Florida State University
Place of Publication: Tallahassee, Florida
Physical Form: computer; online resource
Extent: 1 online resource (99 pages)
Language(s): English
Abstract/Description: Autonomous robots are increasingly working alongside humans in a variety of environments. While simple applications in controlled environments work well with fully autonomous robots and little interaction between human and robot, mission-critical applications in unstructured and uncertain environments require stronger collaboration between human and robot. An example of such an instance occurs in dismounted military operations, in which one or more autonomous robots act as part of a team of soldiers. The performance of the human-robot team depends largely on the interaction between human and robot, more specifically the communication interfaces between the two. Furthermore, due to the complex and unstructured environments in which dismounted military missions take place, robots need a diverse skill set. Therefore, a variety of sensors, robot platform types (e.g., wheeled vs. legged) and other capabilities are needed. The goal of this research was to understand how robot platform type and the visual complexity of the human-robot interface, in particular a Mixed Reality interface, affect cooperative human-robot teaming in dismounted military operations. More specifically, the research objectives were to understand how robot platform type (wheeled vs. legged) impacts the human's perception of robot capability and performance, and to assess how the visual complexity of a Mixed Reality interface affects accuracy and response time for an information reporting task and a signal detection task. The results of this study revealed that increased visual complexity of the Mixed Reality-based human-robot interface improved response time and accuracy for an information reporting task and resulted in a more usable interface.
Furthermore, the results indicated that response time and accuracy for a signal detection task did not differ between high and low visual complexity modes of the human-robot interface, likely due to a low task load. Users of the interface in high visual complexity mode reported lower perceived workload and better perceived performance compared to users of the interface in low visual complexity mode. Moreover, the findings of this study demonstrated that the unique appearance of a biologically-inspired legged robot was not enough to produce a difference in perceived performance and trust compared to a more traditional-looking wheeled robot. Therefore, there was no basis to conclude that the unique appearance of the legged robot led users to anthropomorphize the legged robot more than the wheeled robot. Additionally, free-response feedback from users revealed that Mixed Reality-based head-mounted displays have the potential to overcome the shortcomings of Augmented Reality-based head-mounted displays and offer a suitable alternative to hand-held displays in dismounted military operations. Finally, this study demonstrated that an increase in the visual complexity of a Mixed Reality-based human-robot interface improves the effectiveness of human-robot interaction and ultimately human-robot team performance, as long as the additional complexity supports the tasks of the human.
Identifier: FSU_SUMMER2017_Kopinsky_fsu_0071E_14062 (IID)
Submitted Note: A Dissertation submitted to the Department of Mechanical Engineering in partial fulfillment of the requirements for the degree of Doctor of Philosophy.
Degree Awarded: Summer Semester 2017.
Date of Defense: July 18, 2017.
Keywords: Autonomous Robots, Dismounted Military, Human Robot Interaction, Legged Robots, Mixed Reality, Virtual Reality
Bibliography Note: Includes bibliographical references.
Advisory Committee: Emmanuel G. Collins, Professor Directing Dissertation; Rodney G. Roberts, University Representative; Jonathan E. Clark, Committee Member; William S. Oates, Committee Member; Daniel J. Barber, Committee Member.
Subject(s): Mechanical engineering
Computer science
Persistent Link to This Record:
Owner Institution: FSU

Kopinsky, R. J. (2017). A Novel Mixed Reality Interface for Effective and Efficient Human Robot Interaction with Unique Mobility Platforms. Retrieved from