Background: Assistive robotic manipulators (ARMs) have been developed to provide enhanced assistance and independence in performing daily activities for people with spinal cord injury when a caregiver is not on site. However, current commercial ARM user interfaces (UIs) may be difficult to learn and control. A touchscreen mobile UI was developed to overcome these challenges. Objective: The objective of this study was to compare the performance of 2 ARM UIs, a touchscreen and the original joystick, using an ARM evaluation tool (ARMET). Methods: This is a pilot study of people with upper extremity impairments (N = 8). Participants were trained on both UIs and then chose one to use when performing 3 tasks on the ARMET: flipping a toggle switch, pushing down a door handle, and turning a knob. Task completion time, mean velocity, and open-ended interviews were the main outcome measures. Results: Among the 8 novice participants, 7 chose the touchscreen UI and 1 chose the joystick UI. All participants completed the ARMET tasks independently. Use of the touchscreen UI resulted in better ARMET performance (higher mean moving speed and faster task completion). Conclusions: The mobile touchscreen UI demonstrated an easier learning experience, less physical effort, and better ARMET performance. The improved performance, accessibility, and lower physical effort suggest that the touchscreen UI may be an efficient tool for ARM users.

People with higher-level spinal cord injury (SCI) often have difficulty manipulating everyday objects during activities of daily living (ADLs). In the United States, about 19.9 million people have difficulty with physical tasks involving upper extremity function, including lifting, grasping, pushing/pulling, reaching, dressing, and eating.1 At the same time, the availability of assistance from caregivers is decreasing.2,3 This growing demand for ADL assistance, combined with the shrinking attendant-care workforce, will compromise the quantity and quality of personal care and reduce quality of life for people with disabilities and older adults.

Assistive robotic manipulators (ARMs) have been developed to provide enhanced assistance in completing ADLs when a caregiver is not on site.4,5 An estimated 150,000 people in the United States could benefit from using ARMs.6 The ARM manual user interface (UI) plays an important role in conveying the user's manipulation intentions to the ARM. The primary physical UIs of commercial ARMs, a 4×4 keypad7 and a 3-axis joystick,8,9 provide robotic control of Cartesian, joint, and finger movements.10 Previous studies have demonstrated their cost-effectiveness and long-term usage.4,11,12 The joystick has 3 degrees of freedom, allowing horizontal movement and twisting. Two buttons on the joystick knob switch among translation, rotation, and finger modes. Use of the joystick results in efficient performance, but the mode changes and the twisting motion may be difficult or even impossible for some users to perform.13

A review article compared 19 commercial and developing ARMs using 5 criteria: interaction safety, robustness, adaptability, energy, and position control.14 However, their performance and control accessibility were not discussed. In addition, a review10 of clinical ARM performance evaluation since 1970 indicated that earlier ARM/UI performance results were difficult to compare systematically because testing was neither standardized nor validated to allow comparison with improved UIs. Thus, a standardized ARM evaluation tool (ARMET)15 was developed to standardize testing protocols so that performance results between UIs can be compared with minimal procedural and environmental variation.

To improve ARM UIs, the Personal Mobility and Manipulation Appliance (PerMMA) was developed with 4 alternative UIs: a touchpad,16 a 6-axis joystick,17,18 a mobile touchscreen,19 and autonomous control.18 The PerMMA also established a framework for shared autonomy.20 The PerMMA mobile touchscreen UI19 was developed to fit the needs of smartphone users with disabilities.21,22 With built-in Bluetooth and Wi-Fi connectivity, the mobile touchscreen UI has the potential to overcome the limitations of current control interfaces and provide wire-free manipulation assistance.

To validate the interface improvement offered by the mobile touchscreen UI, this study compared ARM performance between the touchscreen UI and the commercial joystick UI using the ARMET.15

PerMMA-touchscreen UI

The PerMMA mobile touchscreen UI has 3 components: a mobile application on a tablet or smartphone, a personal computer running a control program, and an ARM (Figure 1). The ARM used in the study was a right-handed JACO robotic manipulator made by Kinova (Boisbriand, Quebec, Canada). The touchscreen UI on the mobile device, called Jacontrol (Figure 2c), requires only 2 accessible finger gestures, tapping and one-finger sliding, which a previous study23 identified as the only 2 actions accessible to all 16 participants with dexterity impairments, including 8 powered wheelchair users. In the Jacontrol app UI19 (Figure 2c), the central circle controls horizontal robotic movement, and the circle on the right-hand side controls vertical motion. The buttons on the left-hand side control finger opening/closing and the mode change between translation and wrist rotation. This arrangement reduces mode changes and confusion about up/down motion, improving performance. The JACO control program (Figure 1) communicates with the mobile app through Bluetooth or Wi-Fi and translates mobile app commands into robotic movements through the JACO driver.
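To make the command flow concrete, the sketch below shows one plausible way a slide gesture could be mapped to a Cartesian velocity command and relayed from the mobile app to the control program. This is a minimal illustration only: the message format, scale factors, and all function names are assumptions, as the article does not publish the Jacontrol protocol or the JACO driver API.

```python
# Hypothetical sketch of a Jacontrol-style gesture-to-command pipeline.
# All names, message formats, and scale factors are illustrative assumptions;
# the actual Jacontrol protocol and JACO driver API are not given in the article.
import json
import socket
from dataclasses import asdict, dataclass

MAX_SPEED = 0.15   # assumed Cartesian speed cap (m/s) for safe tabletop use
DEADZONE = 0.10    # ignore small finger drift near a control circle's center


@dataclass
class VelocityCommand:
    vx: float  # forward/backward, from vertical slide on the central circle
    vy: float  # left/right, from horizontal slide on the central circle
    vz: float  # up/down, from the right-hand vertical slider


def _scale(offset: float) -> float:
    """Clamp a normalized touch offset (-1..1), apply deadzone and speed cap."""
    if abs(offset) < DEADZONE:
        return 0.0
    return max(-1.0, min(1.0, offset)) * MAX_SPEED


def gesture_to_command(dx: float, dy: float, dz: float) -> VelocityCommand:
    """Map normalized slide offsets from the two on-screen circles to velocities."""
    return VelocityCommand(vx=_scale(dy), vy=_scale(dx), vz=_scale(dz))


def send_command(sock: socket.socket, cmd: VelocityCommand) -> None:
    """Serialize one command as a JSON line, as a Wi-Fi link to the PC might."""
    sock.sendall((json.dumps(asdict(cmd)) + "\n").encode())
```

On the receiving side, the control program would decode each line and forward the velocities to the robot driver; a deadzone and speed cap of this kind are common safeguards for novice users, though the article does not state which safeguards Jacontrol uses.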

ARMET

The ARMET15 system, shown in Figures 2a and 2b, consists of 6 electronic components designed to measure task completion time and ISO 9241-9 throughput,24,25 a measure widely used to describe the performance of physical input devices based on Fitts' Law.26 The 6 components simulate commonly performed ADL tasks: pushing large and small circular buttons such as door openers and elevator buttons, turning on a rectangular rocker light switch, flipping a toggle switch, pushing down a door handle, and turning a knob. Task completion time was computed from the release of the horizontal start pad to activation of the target button or achievement of the desired knob/handle angle. A user-friendly data acquisition interface was integrated into the system.
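For reference, the throughput defined in ISO 9241-9 combines task difficulty and movement time in the standard Fitts'-law form (the ARMET's internal logging details are not given in the article):

```latex
\mathrm{TP} = \frac{\mathrm{ID}_e}{\mathrm{MT}}, \qquad
\mathrm{ID}_e = \log_2\!\left(\frac{D_e}{W_e} + 1\right), \qquad
W_e = 4.133\,\sigma,
```

where $\mathrm{MT}$ is the mean movement time, $D_e$ is the effective distance to the target, and $W_e$ is the effective target width computed from the standard deviation $\sigma$ of the observed endpoint positions.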

Evaluation protocol

The study was conducted at the Human Engineering Research Laboratories (HERL) and approved by the Institutional Review Board (IRB) of the University of Pittsburgh. Participants aged 18 years and older with upper limb impairments and limitations in ADLs were recruited into the study. After informed consent was obtained, participants were introduced to both UIs, the touchscreen and the joystick, and had hands-on practice with both. Because of individual variability in learning, each participant was given at least 20 minutes of hands-on practice with both UIs, until they could pass the basic tests and 3 simple tasks on the ARMET: big button, elevator button, and toggle switch. Each participant was first introduced to the joystick UI using the official training materials. After the introduction, the joystick was mounted at a location comfortable for the participant to control. The participant was given enough time to practice and attempt the basic test; this usually took 5 to 10 minutes for each UI, but the participant could request more time. The basic test included translational and rotational motions, such as up/down, forward/backward, and left/right, to reflect comprehension of the mapping from the UI to the robotic motion. After the basic test with the joystick UI, the touchscreen was introduced and installed, and the participant went through the same process as for the joystick UI. The participant could switch back to the joystick UI for comparison. Training ended when the participant felt comfortable using one UI and passed the basic test. The goals of the training were (a) to help the participant become familiar with the ARM and both UIs and (b) to select the most comfortable UI for the performance evaluation test.

Afterward, the participant chose one UI for the ARMET performance evaluation. During the evaluation, each participant performed the 3 tasks (flipping a toggle switch, pushing down a door handle, and turning a knob) on the ARMET 3 times each. The order of the tasks was randomized to minimize learning effects between tasks. The time to complete each task was recorded during testing, and the quickest of each participant's 3 trials was used for comparison and data analysis. Following completion of ARMET testing, each participant answered open-ended interview questions.

Outcome measurements

The ARMET records the task completion time and the ISO 9241-9 throughput24 for the pushbutton tasks. Task completion time was measured from the release of the start button to the on/off state change of the target button. ISO 9241-9 throughput is based on Fitts' Law. In this study, we used task completion time as the primary outcome measure because throughput cannot be applied to the more complex tasks of flipping a toggle switch, pushing down a door handle, or turning a knob. In addition, the mean velocity of each task was recorded to investigate differences in speed control between the UIs.
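The article does not specify how mean velocity was computed. A plausible sketch, under the assumption that the system logs timestamped end-effector positions between start-pad release and task completion, is shown below; the log format and function names are hypothetical.

```python
# Illustrative computation of the two outcome measures from a hypothetical
# log of timestamped end-effector positions; the ARMET's actual internal
# implementation is not described in the article.
from math import dist

# Each sample is (t_seconds, x_m, y_m, z_m).
Sample = tuple[float, float, float, float]


def task_completion_time(samples: list[Sample]) -> float:
    """Time from start-pad release (first sample) to task completion (last)."""
    return samples[-1][0] - samples[0][0]


def mean_velocity(samples: list[Sample]) -> float:
    """Total path length traveled divided by elapsed time (m/s)."""
    path = sum(dist(a[1:], b[1:]) for a, b in zip(samples, samples[1:]))
    elapsed = task_completion_time(samples)
    return path / elapsed if elapsed > 0 else 0.0


log = [(0.0, 0.00, 0.00, 0.20), (1.0, 0.05, 0.00, 0.20), (2.5, 0.10, 0.05, 0.25)]
print(task_completion_time(log), round(mean_velocity(log), 3))
```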

Participants

Eight participants with disabilities (age range, 22-71 years; 7 men; 5 with SCI, 1 with muscular dystrophy, 1 with stroke, 1 post neck surgery) were enrolled in the study from 2014 to 2016. All 8 participants were first-time ARM users; 7 preferred the touchscreen UI, and the participant with stroke chose the joystick. Participants' upper extremity function was first evaluated using a valid and reliable clinical measure, the QuickDASH,27 and an interview questionnaire about activities for which they needed help, as shown in Table 1. All participants completed the basic movement test, the 3 simple tasks, and all trials on the ARMET successfully. No participant showed difficulty using the touchscreen UI.

ARMET

The results of the ARMET measurements are shown in Table 2. Because only 1 participant used the joystick UI, no statistical comparison was performed. Comparing the fastest trial of each participant in each task, the touchscreen UI showed faster average performance than the joystick UI, especially in the door handle and toggle switch tasks. However, the knob-turning task with the joystick UI took about the same time as the touchscreen UI average (Figure 3A). The mean velocity with the joystick UI was slower (Figure 3B).

Participants using powered mobility devices could control the touchscreen UI even with limited hand and wrist function. One participant with SCI used the right thumb to touch the screen and the left thumb to guide the arm's direction (Figure 4). Another participant used a stylus pen attached with Velcro to a hand holder. The other participants used fingers or knuckles in combination with arm and shoulder movements to perform the sliding and tapping actions.

Open interview

In the open-ended interview, the participant with stroke chose the joystick UI for its familiarity and intuitiveness. The other participants chose the touchscreen UI because it was "easier to use," required "less physical effort," and provided "more control," with a user experience similar to the touchscreen on an iPhone or a touchscreen computer.

As shown in Table 1, 7 of the 8 participants with upper extremity disabilities selected the touchscreen UI for ARMET testing after less than half an hour of practice. This high adoption rate suggests that the touchscreen UI was easier to learn and control, so participants felt more ready for the ARMET test. Notably, the participants with SCI and muscular dystrophy, all of whom had impaired arm and hand function, selected the touchscreen UI even though they use joysticks for wheelchair driving. This indicates that the touchscreen UI was accessible and required less physical effort than the joystick UI. The joystick UI was used only by the participant with stroke, who controlled it with the unaffected arm and hand.

As shown in Figure 3, the ARMET results were compared in terms of task completion time and mean moving velocity. Because of the unbalanced sample size, we were unable to conduct a statistical comparison between the UIs. Nevertheless, the average task completion time of the touchscreen UI users, all of whom had impaired upper extremities, was shorter than that of the joystick UI user, who used an unaffected limb; the mean velocity with the touchscreen UI was also higher. All 3 tasks require fine movement control, such as turning and alignment. Because the targets on the ARMET are small, participants also had to avoid colliding with the box or its corners. This complexity made participants move more carefully to prevent overshoot or damage to the ARM.

During testing, one common error with the joystick UI involved up/down motion, which is controlled by spinning the joystick knob clockwise/counterclockwise. This control method is confusing and may damage the ARM when it is close to the tabletop. The touchscreen UI minimizes this error with clear up/down labels on its slider. Another common error involved mode switching: two small buttons on top of the knob switch among the translation, rotation, and finger control modes, and selecting the desired mode took more attempts than with the touchscreen. In addition, the touchscreen UI places finger control on the side of the screen, which reduced the time spent switching to finger mode.

The higher average mean velocity and lower completion times with the touchscreen UI suggest that participants felt more comfortable controlling the ARM and were quicker at completing complex tasks requiring finer movements. The faster performance with the touchscreen mobile UI may be due to fewer errors during up/down robotic motion, fewer mode changes, and less physical effort.

The open-ended interviews revealed that participants viewed both UIs as easy to use. The touchscreen provided better visual-spatial mapping and a clearer indication of the robotic action; this helped reduce errors in up/down movements caused by the confusing clockwise/counterclockwise knob-twisting control. The on-screen hand open/close buttons reduced the time needed to switch to finger mode. The touchscreen sliding gesture required less physical effort than pushing against the spring of the joystick. The wireless touchscreen UI also had the advantage that it could be placed at the most comfortable location for the user, maximizing precision and range of motion (Figure 4).

Study limitation

All participants were first-time users, so their performance was not optimized; however, this helped identify the initial barriers to learning and using each UI. All participants could complete the tasks on the ARMET, although manipulation strategies differed among participants for the more complex tasks, such as turning the knob. We recruited only 8 people, all of whom had no difficulty using the touchscreen or joystick, and excluded people with weaker motor function. Only one participant chose the joystick, which may not reflect the average performance of all first-time users. The ISO 9241-9 throughput could not be compared, but the results confirmed the findings of a previous study.19

As the use of mobile devices becomes more common among people with disabilities,28,29 accessible mobile applications enable people with disabilities to be independent and live productive lives. This study evaluated a mobile option for ARM UIs. The PerMMA touchscreen UI for mobile devices demonstrated an easier learning experience, less physical effort, and better ARMET performance, with lower task completion times and higher mean moving speed. Seven of the 8 participants chose the touchscreen UI and reported that it required less physical effort to use. The improved task performance, the accessibility to people with SCI and muscular dystrophy, and the lower physical effort required suggest that the touchscreen UI may be more efficient. Future studies will include more people with various disabilities and compare ARM performance on more complicated ADLs.

This material is based upon work supported by the Quality of Life Technology Engineering Research Center, National Science Foundation (grant 0540865); the Craig H. Neilsen Foundation (grant 338434); the Advanced Rehabilitation Research Training program of the National Institute on Disability, Independent Living, and Rehabilitation Research (grant 84.133P-1); and the US Department of Veterans Affairs (grant B9250C), with resources and use of facilities at the Human Engineering Research Laboratories (HERL). The contents of this article do not represent the views of the US Department of Veterans Affairs or the United States government.

1. Brault MW. Americans with disabilities: 2010 household economic studies. July 27, 2012.
2. Wing P, Langelier M, Yamada Y, Poonthota A, Kumar P. Nursing aides, home health aides, and related health care occupations: National and local workforce shortages and associated data needs. 2004.
3. Stone R. Who will care for us? Addressing the long-term care workforce crisis. 2001.
4. Bach JR, Zeelenberg AP, Winter C. Wheelchair-mounted robot manipulators. Long term use by patients with Duchenne muscular dystrophy. Am J Phys Med Rehabil. 1990;69(2):55-59. http://www.ncbi.nlm.nih.gov/pubmed/2331340.
5. Prior SD, Warner PR. Wheelchair-mounted robots for the home environment. In: Intelligent Robots and Systems (IROS), IEEE International Workshop; 1993:1194-1200.
6. Laffont I, Biard N, Chalubert G, et al. Evaluation of a graphic interface to control a robotic grasping arm: A multicenter study. Arch Phys Med Rehabil. 2009;90(10):1740-1748.
7. Wakita Y, Yamanobe N, Nagata K, Clerc M. Evaluation of single switch interface with robot arm to help disabled people daily life. In: ROMAN 2009, The 18th IEEE International Symposium on Robot and Human Interactive Communication; 2009:429. doi:10.1109/ROMAN.2009.5326046.
8. Maheu V, Archambault PS, Frappier J, Routhier F. Evaluation of the JACO robotic arm: Clinico-economic study for powered wheelchair users with upper-extremity disabilities. In: 2011 IEEE International Conference on Rehabilitation Robotics; 2011:1-5. http://www.ncbi.nlm.nih.gov/pubmed/22275600.
9. Routhier F, Archambault PS. Usability of a joystick-controlled six degree-of-freedom robotic manipulator. In: RESNA Annual Conference; Las Vegas, Nevada; 2010:1-7.
10. Chung C-S, Wang H, Cooper RA. Functional assessment and performance evaluation for assistive robotic manipulators: Literature review. J Spinal Cord Med. 2013;36(4):273-289.
11. Oess NP, Wanek J, Curt A. Design and evaluation of a low-cost instrumented glove for hand function assessment. J Neuroeng Rehabil. 2012;9(1):2.
12. Römer GRBE, Stuyt HJA, Peters A. Cost-savings and economic benefits due to the assistive robotic manipulator (ARM). In: 9th International Conference on Rehabilitation Robotics; 2005:201-204. doi:10.1109/ICORR.2005.1501084.
13. Chung C-S, Hannan MJ, Wang H, Kelleher AR, Cooper RA. Adapted Wolf Motor Function Test for assistive robotic manipulators user interfaces: A pilot study. Presented at: RESNA Annual Conference; 2014.
14. Groothuis SS, Stramigioli S, Carloni R. Lending a helping hand: Toward novel assistive robotic arms. IEEE Robot Autom Mag. 2013;20(1):20-29.
15. Chung C-S, Wang H, Kelleher A, Cooper RA. Development of a standardized performance evaluation ADL task board for assistive robotic manipulators. Presented at: Rehabilitation Engineering and Assistive Technology Society of North America Conference; 2013; Seattle, Washington.
16. Wang H, Grindle GG, Candiotti J, et al. The Personal Mobility and Manipulation Appliance (PerMMA): A robotic wheelchair with advanced mobility and manipulation. In: 34th Annual International Conference of the IEEE EMBS; 2012:3324-3327.
17. Cooper R, Grindle G, Vazquez J, et al. Personal Mobility and Manipulation Appliance—design, development, and initial testing. Proc IEEE. 2012;100(8):2505-2511.
18. Chung C-S, Wang H, Cooper RA. Autonomous function of wheelchair-mounted robotic manipulators to perform daily activities. IEEE Int Conf Rehabil Robot. 2013;2013:1-6.
19. Chung C-S, Boninger J, Wang H, Cooper RA. The Jacontrol: Development of a smartphone interface for the assistive robotic manipulator. Presented at: RESNA Annual Conference; 2015; Denver, Colorado.
20. Chung C-S. Development and assessment of advanced assistive robotic manipulator user interfaces (doctoral dissertation). University of Pittsburgh; 2015. http://d-scholarship.pitt.edu/25771/
21. Wu Y, Liu H, Brown J, Kelleher A, Cooper RA. A smartphone application for improving powered seat functions usage: A preliminary test. Presented at: RESNA Annual Conference; 2013:2-5.
22. Dicianno BE, Parmanto B, Fairman AD, et al. Perspectives on the evolution of mobile (mHealth) technologies and application to rehabilitation [published online ahead of print June 12, 2014]. Phys Ther. doi:10.2522/ptj.20130534.
23. Trewin S, Swart C, Pettick D. Physical accessibility of touchscreen smartphones. In: ASSETS '13 Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility; 2013:1-8. doi:10.1145/2513383.2513446.
24. Douglas S, Kirkpatrick A, MacKenzie I. Testing pointing device performance and user assessment with the ISO 9241, Part 9 standard. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: The CHI Is the Limit; 1999:215-222. http://dl.acm.org/citation.cfm?id=303042. Accessed December 5, 2012.
25. Teather RJ, Natapov D, Jenkin M. Evaluating haptic feedback in virtual environments using ISO 9241-9. In: 2010 IEEE Virtual Reality Conference (VR); 2010:307-308. doi:10.1109/VR.2010.5444753.
26. MacKenzie I. Fitts' Law as a research and design tool in human-computer interaction. Human-Computer Interact. 1992;7(1):91-139.
27. Beaton DE, Wright JG, Katz JN. Development of the QuickDASH: Comparison of three item-reduction approaches. J Bone Joint Surg Am. 2005;87(5):1038-1046.
28. Wu Y-K, Liu H-Y, Kelleher A, Pearlman J, Cooper RA. Evaluating the usability of a smartphone virtual seating coach application for powered wheelchair users [published online ahead of print April 11, 2016]. Med Eng Phys. doi:10.1016/j.medengphy.2016.03.001.
29. Trewin S, Swart C, Pettick D. Physical accessibility of touchscreen smartphones. In: ASSETS '13 Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility; 2013. doi:10.1145/2513383.2513446.