I want what you've got: Cross platform portability and human-robot interaction assessment.

PDF Version Also Available for Download.

Creation Information

Marble, Julie L.; Few, Douglas A. & Bruemmer, David J. August 1, 2005.

Context

This article is part of the collection entitled: Office of Scientific & Technical Information Technical Reports and was provided by the UNT Libraries Government Documents Department to the UNT Digital Library, a digital repository hosted by the UNT Libraries. More information about this article can be viewed below.

Who

People and organizations associated with either the creation of this article or its content.

Publisher

Provided By

UNT Libraries Government Documents Department

Serving as both a federal and a state depository library, the UNT Libraries Government Documents Department maintains millions of items in a variety of formats. The department is a member of the FDLP Content Partnerships Program and an Affiliated Archive of the National Archives.


What

Descriptive information to help identify this article. Follow the links below to find similar items on the Digital Library.

Description

Human-robot interaction is a subtle yet critical aspect of design that must be assessed during the development of both the human-robot interface and robot behaviors if the human-robot team is to effectively meet the complexities of the task environment. Testing not only ensures that the system can successfully achieve the tasks for which it was designed, but more importantly, usability testing allows the designers to understand how humans and robots can, will, and should work together to optimize workload distribution. A lack of human-centered robot interface design, the rigidity of sensor configuration, and the platform-specific nature of research robot development environments are a few of the factors preventing robotic solutions from reaching functional utility in real-world environments. Often the difficult engineering challenge of implementing adroit reactive behavior, reliable communication, and trustworthy autonomy, combined with system transparency and usable interfaces, is overlooked in favor of other research aims. The result is that many robotic systems never reach the level of functional utility necessary even to evaluate the efficacy of the basic system, much less result in a system that can be used in a critical, real-world environment. Further, because control architectures and interfaces are often platform specific, it is difficult or even impossible to make usability comparisons between them. This paper discusses the challenges inherent in conducting human factors testing of variable autonomy control architectures across platforms within a complex, real-world environment. It discusses the need to compare behaviors, architectures, and interfaces within a structured environment that contains challenging real-world tasks, and the implications of how humans and robots interact in true interactive teams for system acceptance of, and trust in, autonomous robotic systems.

Source

  • Performance Metrics for Intelligent Systems, Gaithersburg, MD, August 24–26, 2004

Language

  • English

Item Type

  • Article

Identifier

Unique identifying numbers for this article in the Digital Library or other systems.

  • Report No.: INEEL/CON-04-02277
  • Grant Number: DE-AC07-99ID-13727
  • Office of Scientific & Technical Information Report Number: 911044
  • Archival Resource Key: ark:/67531/metadc882324

Collections

This article is part of the following collection of related materials.

Office of Scientific & Technical Information Technical Reports

Reports, articles and other documents harvested from the Office of Scientific and Technical Information.

Office of Scientific and Technical Information (OSTI) is the Department of Energy (DOE) office that collects, preserves, and disseminates DOE-sponsored research and development (R&D) results that are the outcomes of R&D projects or other funded activities at DOE labs and facilities nationwide and grantees at universities and other institutions.

When

Dates and time periods associated with this article.

Creation Date

  • August 1, 2005

Added to The UNT Digital Library

  • Sept. 22, 2016, 2:13 a.m.

Description Last Updated

  • Nov. 7, 2016, 7:51 p.m.

Marble, Julie L.; Few, Douglas A. & Bruemmer, David J. I want what you've got: Cross platform portability and human-robot interaction assessment, article, August 1, 2005; [Idaho Falls, Idaho]. (digital.library.unt.edu/ark:/67531/metadc882324/: accessed June 23, 2018), University of North Texas Libraries, Digital Library, digital.library.unt.edu; crediting UNT Libraries Government Documents Department.