Social robots can support decision-making in primary care as well as outreach within the community, doctors report. The robots can act as guides, suggest changes, and enhance readiness to manage patients’ needs and schedule appointments. As a result, they help primary care navigators stay in touch with patients’ needs and better manage their care. This was the focus of a study by Mayo Clinic researchers, which included testing at Mayo Clinic campuses. Results of the study were published in the Medical Human-Analytical Computing journal.
For these experiments, the providers involved developed a platform for collaborative interactions with the robots, a first for academic settings. Using a computerized investigation platform built around a tablet, the research team controlled the robot hands through an ActiveTrack medical tracking system. Their interest was piqued by the robot’s capability to swiftly and accurately detect adverse events (AEs) of primary care over phases ranging from four weeks to four months. Moreover, all participants understood the robot’s gesture and communication controls.
The study was conducted at Mayo Clinic campuses over the course of four weeks and comprised four computer-assisted, patient-centered testing sessions and one nurse-led study. Participants received instruction on understanding and managing AEs of primary care, as well as on the robot’s communication controls.
The research team expected participants to input three actions: a finger stick, a thumb pull, and an increase in the AE rating whenever the robot detected a simple AE, such as blood clots on the fingers or blood-containing whole-blood transfusions. Participants used the robot hand to immobilize the tablet, which required them to perform four hand motions per day.
Participants completed four fully participant-controlled trials lasting four weeks. The robot’s model, height and weight, body mass, skin colour, gender, and starting age, along with the age of the participant, were selected using a computer-analytic approach.
In practice, the robot used in the studies was configured to communicate with each participant through an interactive voice. While walking, the robot automatically presented a thumbs-up sign, a thumbs-down sign, a key card, barcode, or QR code, an arm button, time increments, and multiple end-to-end gestures, according to the participant’s needs.
The library of social robots developed by the project can readily be adapted for use at community hospitals with limited communication capacity. Because their patient populations and warm climates make such hospitals well suited to these robots, the robots could be deployed across the community and remain accessible to patients. With software and protocols that are easy to use and maintain at home, the robots could be ideal tools for patient onboarding, signal maintenance, and monitoring of patient- and provider-related needs.