Engineers from the University of Michigan have developed a new camera that replaces the people it records with stick figure avatars. Among other use cases, it could keep you from going viral after an embarrassing fail is caught on film.

They call it the PrivacyLens. Its two-camera setup pairs a standard video camera with a thermal imaging system. The two images are overlaid and aligned before its hybrid "Thermal Subtraction" mode uses object-detection machine learning techniques to draw bounding boxes around the subject's thermal outline. These "define the pixels to be sanitized from the RGB image," per the recently released study, which are then removed from the video feed and replaced with a real-time animated stick figure that moves in sync with the subject. It's essentially the same idea as the Glowstick Stickman dancing meme that sprang up on social media a few years back.

The stickman effect from PrivacyLens
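The team hasn't published its implementation here, but a heavily simplified sketch of the basic idea in Python with OpenCV might look like the following. The fixed thermal threshold, the blob-size cutoff and the crude hand-drawn stick figure are all illustrative assumptions; PrivacyLens itself relies on machine-learning object detection and a pose-matched animation rather than simple thresholding.

```python
import cv2
import numpy as np

def sanitize_frame(rgb, thermal, temp_threshold=200):
    """Blank out warm (person-shaped) regions in an RGB frame and draw a stand-in.

    `rgb` is an HxWx3 uint8 frame and `thermal` an HxW uint8 frame already
    aligned to the RGB view. Threshold and drawing details are illustrative,
    not the values used by PrivacyLens.
    """
    # Segment hot regions in the thermal image (the subject's heat signature).
    _, mask = cv2.threshold(thermal, temp_threshold, 255, cv2.THRESH_BINARY)

    # Bounding boxes around the hot blobs define the pixels to remove
    # from the RGB frame.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    out = rgb.copy()
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 500:           # ignore small warm objects (lamps, pets, etc.)
            continue
        out[y:y + h, x:x + w] = 0  # blank out the subject's pixels

        # Draw a crude static stick figure inside the vacated box.
        cx = x + w // 2
        head_r = max(h // 8, 2)
        white = (255, 255, 255)
        cv2.circle(out, (cx, y + head_r), head_r, white, 2)                        # head
        cv2.line(out, (cx, y + 2 * head_r), (cx, y + int(0.7 * h)), white, 2)      # torso
        cv2.line(out, (x, y + int(0.35 * h)), (x + w, y + int(0.35 * h)), white, 2)  # arms
        cv2.line(out, (cx, y + int(0.7 * h)), (x, y + h), white, 2)                # left leg
        cv2.line(out, (cx, y + int(0.7 * h)), (x + w, y + h), white, 2)            # right leg
    return out

if __name__ == "__main__":
    # Synthetic stand-ins for a pair of aligned RGB and thermal frames.
    rgb = np.full((240, 320, 3), 90, dtype=np.uint8)
    thermal = np.zeros((240, 320), dtype=np.uint8)
    thermal[60:200, 130:190] = 230   # a warm, person-sized blob
    cv2.imwrite("sanitized.png", sanitize_frame(rgb, thermal))
```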

The internet is overflowing with fail videos depicting people screwing up, falling down, stepping on rakes and generally having days go sideways on them. Those clips are most often uploaded intentionally to share on social media. The shots of you on the toilet, inadvertently taken by your passing Roomba and then leaked online, or the video of you falling down a subway escalator and the footage being uploaded by a Massachusetts Bay Transportation Authority employee, most likely were not. Yet as cameras become more ubiquitous in both our public and private spaces, such mishaps are going to happen with increasing regularity.

The research team identified a number of threats that modern smart home monitoring devices might pose. One, as in the Roomba incident, is devices over-collecting data that you as the user never authorized, or that data being accessed and shared without authorization by the Venezuelan workers tasked with annotating the images to better train the robot vacuum's algorithms. The research team also notes that even supposedly "encrypted" data streams, such as those from doorbell cameras, remain accessible to Ring employees, law enforcement, and any hacker proficient enough to gain access.

“Most consumers do not think about what happens to the data collected by their favorite smart home devices. In most cases, raw audio, images and videos are being streamed off these devices to the manufacturers’ cloud-based servers, regardless of whether the data is actually needed for the end application,” Alanson Sample, U-M associate professor of computer science and engineering, explained in a press release. “A smart device that removes personally identifiable information before sensitive data is sent to private servers will be a far safer product than what we currently have.”