Smart home cameras bring facial recognition ethics to your front door

Commentary: Capturing biometric data is questionable and largely unregulated. We should think twice before adding it to our smart homes.

Molly Price, Former Editor

Facial recognition isn't a futuristic dream; it's already here in a big way. It's in your hand when you use Apple's Face ID to unlock your iPhone, it's in the airport when you smile at a camera to board your flight and, thanks to a growing number of smart home products, it can even be in your home.

This is part of a CNET special report exploring the benefits and pitfalls of facial recognition.

The Nest Hello doorbell recognizes familiar faces to tell you who's come calling, and the Nest Cam IQ Indoor and Nest Cam IQ Outdoor both use it to keep tabs on who's at home or just outside. Knowing who's offering a bone is even one of the key features of Aibo, Sony's $2,900 robot dog.

It's a trend that shows no signs of slowing -- the Kasa smart home products TP-Link announced at CES in January include cameras with facial recognition capability -- and manufacturers are proudly touting facial recognition as a helpful, cutting-edge feature. But should we be rushing to embrace facial recognition in our gardens, kitchens and bedrooms?

Though we can't escape the technology when it's used in public settings such as airports and retail stores, you walk a precarious ethical line when you start collecting biometric data on family, friends and strangers. You might gain the peace of mind that comes with knowing who's at the door, but it could come at the cost of compromising your loved ones' privacy by sending their biometric data back to manufacturers or even hackers.

What's legal

Before considering the ethics, it's important to consider the law. Data privacy is an increasingly hot topic on Capitol Hill, even if there isn't a federal law that governs facial recognition yet (two US senators proposed one last month). Three states, however, haven't waited for Congress to act.

Sony's aibo learns individual faces to react to different people in different ways. 

Sarah Tew/CNET

Passed in 2008, Illinois' Biometric Information Privacy Act (BIPA) is the oldest and the strictest of these laws. It regulates how biometric information is collected, stored, used and even destroyed. Texas followed a year later with its own biometric privacy law, and Washington passed House Bill 1493 two years ago.

A judge rules police can't force people to unlock phones with their faces or fingerprints.

Elijah Nouvelage / AFP/Getty Images

These laws are aimed largely at commercial implementations and don't extend to protecting biometric data on residential or private property. Eight other states, including New York, have attempted to pass laws protecting biometric information, but ultimately failed.

Betsy Cooper is director of the Aspen Policy Hub, which models itself after tech incubators such as Y Combinator and teaches technology experts the ins and outs of policymaking. She says the legal landscape is uncertain.

"There are growing interests in your biometric identity and how to regulate that," she said. "My research suggests that this is focused more on private entities, so on companies' use of this data rather than private consumer's use of this data, and so that sort of creates a space of uncertainty as to how consumers would be affected."

Smile, you're on camera

Cooper says capturing other people's biometric information isn't just a question of what's legal or illegal. When a homeowner adds facial recognition technology, multiple relationships come into play.

"There are deep ethical questions," she said. "Because while the relationship between the individual and the person crossing their threshold is clear, the relationship between the person crossing the threshold and all those other companies and actors is less clear."

For Cooper, the issue extends to what the device manufacturer does with the biometric data of your loved ones and guests. That means it's up to you to do your due diligence when installing cameras, doorbells and associated apps.

When considering a product, carefully read the terms and conditions and get a grasp of what happens to the video, images and facial data your device captures so you can use it responsibly. That's not an easy task if you don't speak legalese. There are steps you can take to improve your privacy, but ultimately you're trusting what the company discloses about how it uses your data.

The Nest Hello Doorbell can recognize familiar faces and let you know when an unfamiliar face is at the door. 

Tyler Lizenby/CNET

Nest, for example, warns in its privacy policy that facial recognition responsibility lies squarely with the consumer:

"Depending on where you live and how you configure the Products and Services, you may need to get explicit consent to scan the faces of people visiting your home."

Sony has similar terms and conditions for Aibo. In addition to requiring owners to consent to the collection of facial recognition data, Sony says:

"Each Aibo Product owner further agrees that (s)he will obtain a similar consent from any person who (s)he allows in proximity to or to interact with his or her Aibo Product."

In other words, these companies are covering their bases by putting the onus on you to make sure everyone coming in contact with your purchased devices consents to their data being collected.

But you can't simply post a "Smile, you're on camera" sign at your front door and call it consent. It seems nearly impossible (or at least incredibly inconvenient) to obtain explicit consent at every interaction.

With devices able to record us or capture our image everywhere we turn when we're out in public, we've largely given up on expecting any sort of notification.

"As a society, we've sort of gotten to the point where it is accepted that somebody can be recording you on Instagram, just if you walk across the street or any time that you attend sporting events, there are cameras everywhere that may put you on the big screen," Cooper says. 

People and private property

As a property owner, you could argue that you have a right to know what's going on in and around your home. And because the person is on your property, you have the right to gather facial recognition data about them. Renters, on the other hand, raise a whole other set of questions.

When the IQ Outdoor sees a new face, it asks if you know the person. If so, you can name them and add them to your database of "familiar faces" in the app.

Screenshots by Megan Wollerton/CNET

That's legally true in most cases, since there aren't any laws directly addressing facial recognition on private property. Laws that address recording video footage (with devices that can't recognize faces) prohibit it only in spaces where someone has a reasonable expectation of privacy, such as a bathroom.

While your own private property is certainly a place where most people feel they should be able to decide how much or how little tech is involved, it's important to remember the members of the public who may find themselves on your doorstep by way of necessity.

Consider your mail carrier. My house has loads of charm and no mailbox. Instead, we have a slot in our front door. That means our mail carrier must walk up to our door, open the slot and throw a stack of letters through.

My smart camera might collect that mail carrier's facial recognition data as they approach and send it off into the ether. Does that person deserve to be notified? They aren't choosing to subject themselves to it. It's a necessary part of their job to come in such close proximity to my camera or doorbell.

Fortunately, current cameras and doorbells can only report that they see an unfamiliar face. In order for a camera to tell you who is at the door, you'll need to train it to associate the face with a name. That takes multiple appearances, so one-time visitors won't give much away. 
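
Nest, Sony and their competitors don't publish exactly how that matching works, but the general approach in consumer devices is to turn each detected face into a numeric "embedding" and compare it against the embeddings you've already labeled. The Python sketch below illustrates the idea using the open-source face_recognition library; the image file names and the 0.6 match tolerance are placeholder assumptions for illustration, not anything these products actually use.

```python
# Illustrative sketch only -- consumer cameras keep their models private.
# "Familiar" faces are simply embeddings you have labeled before; anything
# that doesn't match closely enough is reported as unfamiliar.
import face_recognition

# Step 1: enrollment. In a doorbell app this happens when you name a face,
# and accuracy improves as more appearances of that person are added.
known_encodings, known_names = [], []
for name, path in [("Alex", "alex_front_door_1.jpg"),   # placeholder images
                   ("Alex", "alex_front_door_2.jpg")]:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        known_encodings.append(encodings[0])
        known_names.append(name)

# Step 2: a new frame arrives from the camera.
frame = face_recognition.load_image_file("doorbell_frame.jpg")  # placeholder capture
for encoding in face_recognition.face_encodings(frame):
    matches = face_recognition.compare_faces(known_encodings, encoding, tolerance=0.6)
    if True in matches:
        print("Familiar face:", known_names[matches.index(True)])
    else:
        print("Unfamiliar face at the door")
```

That's why a one-time visitor doesn't give much away: a face the system has never been shown and labeled can only ever come back as unfamiliar.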

How did it get this far?

If we really wanted to get a handle on this privacy thing, we should've started when the internet was born. That ship has sailed, but it's clear privacy is starting to bubble up as an issue with the public. 

Facebook's popularity is declining in the midst of its privacy issues, most notably among younger users. While that signals the public's demand for more responsibility from tech companies, it's still quite likely that most consumers will continue to ignore these concerns for the sake of having the smartest home on the block.

In the US, cameras, doorbells and even phones with facial recognition capabilities have to be trained to learn a face. There's no nationwide database of faces a device can pull from when it captures an image, but such databases do exist in other countries, including China.

While the Department of Homeland Security insists the new REAL ID system is not creating a national database, it probably isn't too difficult for the federal government to access that state-level data. What if one day criminals or missing persons could be identified on the street by a facial recognition camera? Is that worth the tradeoff of personal data? That's what we'll need to decide as a society.

Where that leaves us

At a minimum, we need to make smart purchasing decisions when it comes to data collecting devices. Yes, that will likely take extra time and research on the part of the consumer. Is that so different than purchasing any other product? Some people carefully read food ingredient lists and nutrition labels. Others research clothing or home products to be sure materials are eco-friendly, fair trade or locally sourced. These same attitudes can be applied to consuming technology.

Maybe it's just my belief in the golden rule, or maybe it's a sense of foreboding about what the future might hold, but I think we have a responsibility to ourselves and the people interacting with our devices to know what's happening to our data and be clear about how we're using it in our daily lives.