Engineers at Purdue University are combining technologies to create a system that lets remote physicians assist first responders in an intuitive way. The remote physician uses a large touchscreen TV to point to different parts of the body and describe how to perform lifesaving procedures. The live video stream from the scene can come from a body-worn camera or another source, such as a drone hovering overhead. The first responder at the scene simply wears an augmented reality headset, and all the annotations appear within the field of view, aligned with exactly what the remote physician intended to mark.
A big part of the project, of course, was translating the digital annotations from one person's point of view to another's, even though the two may be viewing the scene from different heights and angles. The mapping happens automatically, with no input from either user, so both sides feel as though the other can see and understand what they are doing.
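The core geometric idea behind that viewpoint translation can be sketched with a standard pinhole-camera model: if an annotation is anchored to a 3D point on the patient, it can be reprojected into each viewer's image using that viewer's camera pose. The sketch below is illustrative only; it is not the Purdue system's actual method, and all camera parameters, poses, and the `project` helper are hypothetical values chosen for the example.

```python
import numpy as np

def project(point_world, K, R, t):
    """Project a 3D world point to pixel coordinates for a camera with
    intrinsics K and pose (R, t) mapping world frame -> camera frame.
    (Hypothetical helper for illustration.)"""
    p_cam = R @ point_world + t      # world -> camera coordinates
    u, v, w = K @ p_cam              # apply pinhole intrinsics
    return np.array([u / w, v / w])  # perspective divide

# Assumed shared intrinsics for both cameras (made-up 640x480 sensor).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Remote physician's reference view: camera at origin, looking down +Z.
R_remote, t_remote = np.eye(3), np.zeros(3)

# Responder's headset: shifted 0.5 m to the right, yawed 10 degrees.
theta = np.radians(10.0)
R_resp = np.array([[np.cos(theta), 0.0, -np.sin(theta)],
                   [0.0,           1.0,  0.0],
                   [np.sin(theta), 0.0,  np.cos(theta)]])
t_resp = R_resp @ -np.array([0.5, 0.0, 0.0])  # t = -R * camera_center

# An annotation anchored to a 3D point on the patient, about 2 m away.
annotation = np.array([0.1, -0.2, 2.0])

px_remote = project(annotation, K, R_remote, t_remote)
px_resp = project(annotation, K, R_resp, t_resp)
print(px_remote)  # where the physician drew it in their view
print(px_resp)    # where the same anchor lands in the headset view
```

The same anchored point lands at different pixel locations in the two views, which is why the headset can keep the annotation "stuck" to the correct anatomy as the responder moves.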