A COVID-19 emergency response for remote control of a dialysis machine with mobile HRI


This research is led by Vikram Kapila, professor of mechanical and aerospace engineering. Principal authors are Ph.D. students Hassam Khan Wazir and Christian Lourido, and Sonia Mary Chacko, a researcher and recent Ph.D. graduate under Kapila.

Healthcare workers risk contracting COVID-19 when in close proximity to infected patients and may then transmit the virus to other hospitalized patients, including those on dialysis. To mitigate this risk, the researchers proposed a remote control system for dialysis machines. The proposed setup fits dialysis machines with robotic manipulators that connect wirelessly to tablets, allowing healthcare workers to operate the machines from outside patients’ rooms.

The system comprises an off-the-shelf four-degree-of-freedom (DoF) robotic manipulator equipped with a USB camera. The robot base and camera stand are fixed on a common platform, which keeps installation and operation simple: the user only needs to place the robot in front of its workspace and point the camera at a touchscreen (representing a dialysis machine’s instrument control panel) with which the manipulator is to interact. The user interface, a mobile app, connects to the same wireless network as the manipulator system. To identify the surface plane in which the robot acts, the app uses the camera’s video feed, which includes a 2D image marker placed in the plane of the control panel touchscreen, in front of the manipulator. The arm can carry out complicated sequences of button and slider manipulations, with supporting algorithms automating some functions of the machine.
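The marker-based step above amounts to mapping a tap on the camera image to a point on the panel plane. A minimal sketch of that mapping is shown below; it assumes the camera views the panel nearly head-on (so the mapping reduces to scale and translation rather than a full perspective homography), and the marker size and function names are illustrative, not the authors’ implementation.

```python
# Hypothetical sketch: convert a tap on the camera video feed into a 2D
# target on the control-panel plane, using a printed fiducial marker of
# known physical size that is visible in the same plane.

MARKER_SIZE_MM = 50.0  # assumed side length of the square marker


def tap_to_panel_mm(tap_px, marker_top_left_px, marker_side_px):
    """Convert a tap (x, y) in image pixels to millimetres on the panel,
    using the marker's detected size to recover the pixel-to-mm scale.
    The marker's top-left corner serves as the panel-plane origin."""
    mm_per_px = MARKER_SIZE_MM / marker_side_px
    dx = (tap_px[0] - marker_top_left_px[0]) * mm_per_px
    dy = (tap_px[1] - marker_top_left_px[1]) * mm_per_px
    return (dx, dy)


# Example: marker detected at (100, 80) spanning 200 px; user taps (300, 180).
target = tap_to_panel_mm((300, 180), (100, 80), 200.0)
# target is (50.0, 25.0) mm from the marker origin
```

A full system would estimate a perspective homography from all four marker corners, which handles oblique camera angles; the scale-and-translate case above is the simplest instance of that idea.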

Attached to an in-use dialysis machine, the system livestreams data and results directly to a tablet or computer operated by a remote user in another room. Users tend to report more consistent performance when interacting through a computer rather than a tablet, though no significant difference between the two modes of operation has been recorded.

One of the most significant features of this technology is that no custom user interface needs to be created to operate it: the user interacts directly with the video feed of the instrument control panel touchscreen, so the system works on any device with a touchscreen. The proposed device can be deployed almost instantly, making it useful in emergency situations. In the future, the researchers may explore adding augmented reality (AR) features to the system to improve user experience and efficiency.
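Because the interface is simply the video feed, any touchscreen device can drive the robot by forwarding where the user tapped. One way this device independence could work is to normalise tap coordinates by screen resolution before sending them over the shared wireless network; the sketch below assumes a JSON message format whose fields and transport are hypothetical, not the authors’ protocol.

```python
import json


def build_tap_message(x_px, y_px, width, height):
    """Encode a tap as resolution-independent normalised coordinates,
    so tablets and computers of any screen size send the same command."""
    return json.dumps({"type": "tap",
                       "x": x_px / width,
                       "y": y_px / height})


# A tap at the centre of a 1920x1080 tablet and at the centre of a
# 1280x720 laptop screen both encode as x = 0.5, y = 0.5, so no
# per-device user interface is required.
```

The receiving controller would then map the normalised point back onto the video frame and apply the marker-based plane mapping to command the arm.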

This work is supported in part by the National Science Foundation under ITEST grant DRL-1614085, RET Site grant EEC-1542286, and DRK-12 grant DRL-1417769.