The goal of this project is to have a robot work collaboratively with a person in a patient-care setting without explicit electronic commands (e.g., a remote control). With a robotic bedside assistant available to handle basic tasks in hospitals or nursing homes, staff can focus on patients in acute need while stable patients still receive care. To accomplish this, I broke the project into several tasks: Face Tracking, Impedance Control, and a vision system that detects angles and drives a robotic response (Angular Response).
The Face Tracking system has the robotic arm follow the user’s face within a 2D plane (up, down, left, right). This task builds and validates the algorithms the robot needs to respond to user feedback via vision. In deployment, instead of looking for a “face,” the same system could look for a “cup,” “table,” or “phone.”
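The 2D tracking behavior can be sketched as a simple proportional visual-servoing step. The face detector itself (e.g., an OpenCV Haar cascade) is assumed to supply a bounding box; the function name, interface, and gain below are illustrative, not part of the actual system.

```python
# Proportional visual servoing for 2D face tracking (illustrative sketch).
# The detector is assumed to return a pixel bounding box (x, y, w, h),
# or None when no face is found.

KP = 0.002  # proportional gain (assumed): pixel error -> m/s

def face_velocity(face_box, frame_w, frame_h, kp=KP):
    """Return an (vx, vy) arm velocity command that re-centres the face.

    face_box: (x, y, w, h) in pixels, or None if no face was detected.
    """
    if face_box is None:
        return (0.0, 0.0)                        # no face: hold position
    x, y, w, h = face_box
    ex = (x + w / 2) - frame_w / 2               # horizontal pixel error
    ey = (y + h / 2) - frame_h / 2               # vertical pixel error
    return (-kp * ex, -kp * ey)                  # move to reduce the error
```

Each camera frame yields one velocity command, so the arm continuously drifts toward keeping the face centered in the image.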
The Impedance Control system changes the robotic arm’s position based on the forces applied to the end effector. To preserve the patient’s independence and respond naturally to people, the robot’s motion needs fluidity, and it must react if it bumps into someone. Humans aren’t always predictable and can be clumsy. If the robot is bringing the patient his/her drink and the patient reaches for it before the robot stops, the robot must know to stop, or be compliant enough to let the patient grab the cup before the arm reaches its calculated position.
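One common way to get this compliant behavior is to make the arm emulate a mass-spring-damper: an external force at the end effector deflects the commanded position instead of being resisted rigidly. The sketch below shows a one-dimensional version of that law; the gains are illustrative, not tuned values from the actual system.

```python
# 1-D impedance control sketch:  M*a + B*v + K*(x - x_ref) = f_ext
# An external force f_ext (e.g. a patient grabbing the cup) pushes the
# arm away from its reference position; it settles at a compliant offset
# instead of fighting the contact.

M, B, K = 2.0, 20.0, 100.0   # virtual mass [kg], damping [N*s/m], stiffness [N/m]

def impedance_step(x, v, x_ref, f_ext, dt=0.001):
    """One Euler integration step of the impedance dynamics."""
    a = (f_ext - B * v - K * (x - x_ref)) / M    # acceleration from the law
    v_new = v + a * dt
    x_new = x + v_new * dt
    return x_new, v_new
```

With no applied force the arm settles at `x_ref`; under a constant force it settles at a steady offset of `f_ext / K`, so a gentle pull of 10 N moves the end effector about 0.1 m with these example gains.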
The Angular Response system is similar to the Face Tracking system. The robot follows the patient’s head in an arcing motion around his/her body and identifies where in the room the patient is looking. Since we operate in a 3D world, the robotic system needs to determine where the user is looking based on the user’s head angle in order to understand the task it is asked to perform. The camera system identifies unique points on the object or on the user’s face and body to calculate the angle of offset relative to the camera.
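A minimal sketch of the head-angle estimate, assuming the vision system supplies a few 2D landmark points (eye corners and nose tip): the yaw of the head relative to the camera can be approximated from the left/right asymmetry of the nose between the eyes. This coarse geometric mapping is a stand-in for a full pose solver (e.g., a PnP fit against a 3D face model), and the landmark names are assumptions.

```python
# Coarse head-yaw estimate from 2D facial landmarks (illustrative).
# When the head turns, the nose tip shifts toward one eye; normalising
# that shift by half the inter-eye distance gives a rough sine of the yaw.
import math

def head_yaw_deg(left_eye, right_eye, nose_tip):
    """Approximate head yaw in degrees; 0 means facing the camera."""
    mid_x = (left_eye[0] + right_eye[0]) / 2.0       # point between the eyes
    half_eye_dist = (right_eye[0] - left_eye[0]) / 2.0
    offset = (nose_tip[0] - mid_x) / half_eye_dist   # roughly -1 .. 1
    offset = max(-1.0, min(1.0, offset))             # clamp for safety
    return math.degrees(math.asin(offset))
```

A nose tip centered between the eyes gives 0 degrees; a shift of half the half-eye distance maps to about 30 degrees, which the robot could combine with room geometry to infer a gaze target.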
The overall system consists of a smaller collaborative robotic arm on a mobile platform, multiple high-resolution cameras, a powerful computing system, and a collaborative robot gripper. The system will need to meet ISO standards for collaborative robotics and any further regulations the healthcare facility has in place.
The design has great market potential in the healthcare field, and as manufacturing costs fall, it may become feasible for some people to have one at home for personal care. The hardware is minimal; most of the design time goes into programming the system so it can be deployed anywhere with very little setup. Because the system is unique, it is hard to compare its cost to other similar systems. Rounding liberally, the estimated hardware cost is ~$350,000 for the entire robotic system.
ABOUT THE ENTRANT
Type of entry: individual
Nathan's hobbies and activities: Robotics & Automation, 3D Printing, CNC
Nathan is inspired by: Being able to help others with the rapidly advancing technology we have today definitely inspires me. However, the biggest inspiration for this project comes from my grandmother. Grandma Marie was a very influential person in my life, so when she began showing signs of dementia 10 years ago, life began to change within my family. As the disease progressed into Alzheimer’s, my fiercely independent grandmother slowly changed into a completely different person with enormous needs that grew exceedingly difficult to meet at home. After much thought and consideration, Grandma had to be admitted into a full-care nursing facility. She struggled greatly during her 6-month stay at the nursing home. I found it heart-wrenching to watch her and other patients struggle to reach for the simplest things or futilely attempt other small tasks that they used to be able to do quite easily. Being able to make a change in someone’s life is great, but being able to design and build something that can help change the lives of many is even greater. I would not have been able to further this project without the help of Dr. Ou Ma’s lab at the University of Cincinnati. I am grateful to him for allowing me to utilize his robotic equipment.