Visual accuracy dominates over haptic speed for state estimation of a partner during collaborative sensorimotor interactions

Author(s): Lokesh, Rakshith
Author(s): Sullivan, Seth R.
Author(s): Germain, Laura St.
Author(s): Roth, Adam M.
Author(s): Calalo, Jan A.
Author(s): Buggeln, John
Author(s): Ngo, Truc
Author(s): Marchhart, Vanessa R. F.
Author(s): Carter, Michael J.
Author(s): Cashaback, Joshua G. A.
Date Accessioned: 2023-08-30T19:17:32Z
Date Available: 2023-08-30T19:17:32Z
Publication Date: 2023-07-01
Description: This article was originally published in Journal of Neurophysiology. The version of record is available at: https://doi.org/10.1152/jn.00053.2023. Copyright © 2023 the American Physiological Society. This article will be embargoed until July 1, 2024.
Abstract: We routinely have physical interactions with others, whether handing someone a glass of water or jointly moving a heavy object together. These sensorimotor interactions between humans typically rely on visual feedback and haptic feedback. Recent single-participant studies have highlighted that the unique noise and time delays of each sense must be considered to estimate the state, such as the position and velocity, of one's own movement. However, we know little about how visual feedback and haptic feedback are used to estimate the state of another person. Here, we tested how humans utilize visual feedback and haptic feedback to estimate the state of their partner during a collaborative sensorimotor task. Across two experiments, we show that visual feedback dominated haptic feedback during collaboration. Specifically, we found that visual feedback led to comparatively lower task-relevant movement variability, smoother collaborative movements, and faster trial completion times. We also developed an optimal feedback controller that considered the noise and time delays of both visual feedback and haptic feedback to estimate the state of a partner. This model was able to capture both lower task-relevant movement variability and smoother collaborative movements. Taken together, our empirical and modeling results support the idea that visual accuracy is more important than haptic speed to perform state estimation of a partner during collaboration. NEW & NOTEWORTHY Physical collaboration between two or more individuals involves both visual and haptic feedback. Here, we investigated how visual and haptic feedback are used to estimate the movements of a partner during a collaboration task. Our experimental and computational modeling results parsimoniously support the notion that greater visual accuracy is more important than faster yet noisier haptic feedback when estimating the state of a partner.
Sponsor: This study was supported by National Science Foundation (NSF) Grant 2146888 (to J.G.A.C.) and Natural Sciences and Engineering Research Council (NSERC) of Canada RGPIN-2018-05589 (to M.J.C.).
Citation: Lokesh, Rakshith, Seth R. Sullivan, Laura St. Germain, Adam M. Roth, Jan A. Calalo, John Buggeln, Truc Ngo, Vanessa R. F. Marchhart, Michael J. Carter, and Joshua G. A. Cashaback. "Visual Accuracy Dominates over Haptic Speed for State Estimation of a Partner during Collaborative Sensorimotor Interactions." Journal of Neurophysiology 130, no. 1 (2023): 23–42. https://doi.org/10.1152/jn.00053.2023.
ISSN: 1522-1598
URL: https://udspace.udel.edu/handle/19716/33280
Language: en_US
Publisher: Journal of Neurophysiology
Keywords: collaboration; human-human interaction; optimal feedback control; sensorimotor; uncontrolled manifold
Title: Visual accuracy dominates over haptic speed for state estimation of a partner during collaborative sensorimotor interactions
Type: Article
Files
Original bundle
Name: Visual Accuracy Dominates Over Haptic Speed for State Estimation of a Partner During Collaborative Sensorimotor Interactions.pdf
Size: 9.39 MB
Format: Adobe Portable Document Format
Description: Main article
License bundle
Name: license.txt
Size: 2.22 KB
Description: Item-specific license agreed to upon submission