What Tesla's Sentry Mode Can Teach Us About The Privacy Versus Security Debate
One of the oldest debates in history, privacy versus security, is getting a postmodern update: privacy advocates are accusing Tesla of enabling invasions of privacy through its built-in camera-based security system.
At issue is Tesla’s Sentry Mode, a camera-based security system available in many of the company’s models that records attempts to break into or vandalize vehicles. But as an extra safety feature, Tesla designed the system to record not just activity that could damage the vehicle itself, but also events, people and objects that get too close; simply passing near the vehicle can trigger a Sentry Mode recording.
As such, Sentry Mode is more than a security system; many experts consider it a surveillance system, and countries around the world are debating whether Sentry Mode should be limited or even banned, especially given that many Sentry videos end up on social media.
Its capabilities, and the way it is already being used, raise significant questions about security versus privacy. Are privacy rights being violated here, or do Tesla owners have a right, in the name of security, to monitor events and people who do not pose a threat to their vehicles? And what if it catches individuals engaged in a crime? Could, and should, that video be used as evidence in an investigation?
This isn’t just a "Tesla issue." The success of Sentry Mode all but guarantees that other original equipment manufacturers (OEMs) and aftermarket suppliers will build similar systems. Legislators are going to want to develop policies on this matter before these systems become ubiquitous. Which side, privacy or security, should prevail? Is there a middle ground that could satisfy both needs?
If Sentry merely recorded break-in attempts or the faces of vehicle vandals, it's unlikely anyone would have an issue. But Sentry records everything that goes on in the close vicinity of the vehicle. The system then stores the videos on a local USB drive or uploads them to a central server controlled by Tesla.
For some jurisdictions, that’s too far. Israel has banned use of Sentry as a security tool, as has China in some areas. While the matter has yet to be tested, some people have raised concerns that the recording of passersby violates GDPR rules in the EU, and several countries have restricted or are likely to restrict certain uses of the security system.
A report by the Bavarian Data Protection Authority for the private sector (BayLDA) on Tesla said that under GDPR rules, the owner of a Tesla vehicle is a "data controller" and must "be able to prove a legal basis" for random video recordings of individuals who pass near the vehicle. As such, the Tesla owner must get permission from those individuals before recording them and uploading the video to social media or the Tesla server.
The U.S. does not have a sweeping privacy law like the GDPR, but the California Consumer Privacy Act (CCPA) says that individuals "have a right to disclosure of the private information collected (on them) by a business in the last 12 months" as well as a “right to deletion of this PI.” So far, no one has challenged Sentry under the CCPA’s privacy requirements, and police in California have used Sentry video to arrest individuals accused of crimes.
In addition, Tesla cameras have long been known to pick up interesting (even weird) incidents—as well as potentially embarrassing ones, whether of the driver or an individual in a vehicle stuck in traffic alongside a Tesla (objects passing near a Tesla set off Sentry Mode recording, too). If those potentially embarrassing videos get uploaded to social media, potential legal issues or even lawsuits could follow.
On the other hand, some Tesla users swear by the system. They say that it has helped them out, foiling carjackings, recording crashes and deterring attempted thefts and vandalism. In any event, Tesla CEO Elon Musk apparently has no plans to shelve the system anytime soon—although he does recognize the need for security and confidentiality in recordings. Whether that will be enough for the EU or other regulatory bodies that may follow in its wake remains to be seen—because of the very real possibility of private individuals getting caught up in Sentry’s net. Tesla’s Customer Privacy Notice does not address the question of Sentry privacy.
What's the answer?
One possible solution to privacy concerns could be to use artificial intelligence to interpret videos before they are uploaded. Instead of simply uploading everything, an algorithm could determine whether the actions being recorded constitute a “danger.” Trained on already-uploaded videos the security system “encounters,” a machine-learning algorithm could determine which actions taken by individuals are likely to lead to a damaging action—and which can be ignored. Thus, a video that shows an individual lingering near the vehicle might be uploaded, but one in which the passerby walks past the car without stopping would be discarded.
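The filtering idea described above can be sketched in a few lines. This is a minimal, hypothetical illustration: the `ClipEvent` fields and the rule thresholds are assumptions standing in for what a trained model would learn, not Tesla's actual data model or algorithm.

```python
from dataclasses import dataclass

# Hypothetical summary of a Sentry-style clip; every field here is an
# illustrative assumption, not Tesla's real telemetry.
@dataclass
class ClipEvent:
    min_distance_m: float   # closest approach to the vehicle, in meters
    loiter_seconds: float   # time spent within the trigger zone
    contact: bool           # whether the vehicle registered physical contact

def should_upload(event: ClipEvent) -> bool:
    """Decide whether a clip is worth keeping.

    A learned classifier would score the clip; these simple rules stand in
    for it: keep clips showing contact or prolonged lingering close to the
    car, discard brief walk-pasts.
    """
    if event.contact:
        return True
    if event.min_distance_m < 1.0 and event.loiter_seconds > 10:
        return True
    return False

# A passerby who keeps walking is discarded; someone lingering is kept.
walk_past = ClipEvent(min_distance_m=0.8, loiter_seconds=2, contact=False)
lingering = ClipEvent(min_distance_m=0.5, loiter_seconds=30, contact=False)
print(should_upload(walk_past), should_upload(lingering))  # False True
```

The privacy payoff is in the default: footage of uninvolved passersby never leaves the car unless their behavior crosses a threshold.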
AI is just one way such security systems can be tweaked to ensure both security and privacy. Other methods could include adding sensors to the vehicle and uploading a video only when those sensors register contact, or reducing the range that sets off the system. (Musk has said several times that a sensitivity adjustment capability is on the way.)
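A trigger-radius or contact-only setting of the kind described above amounts to a simple gate in front of the recorder. The sketch below is an assumption about how such a control might look; the parameter names and defaults are invented for illustration and are not Tesla's API.

```python
# Illustrative sketch of an adjustable trigger, the kind of sensitivity
# control the article says Musk has promised. All names and defaults are
# hypothetical.
def triggers_recording(distance_m: float, touched: bool = False,
                       trigger_radius_m: float = 1.5,
                       require_contact: bool = False) -> bool:
    """Return True if an approach should start a recording."""
    if touched:
        return True          # physical contact always records
    if require_contact:
        return False         # strict privacy mode: record on contact only
    return distance_m <= trigger_radius_m

# Shrinking the radius, or requiring contact, narrows what gets recorded.
print(triggers_recording(2.0))                        # False: outside radius
print(triggers_recording(1.0))                        # True: inside radius
print(triggers_recording(1.0, require_contact=True))  # False: no contact yet
```

Either knob trades coverage for privacy: a contact-only mode would miss someone casing the car, but it would also stop recording every pedestrian on a narrow sidewalk.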
If they want to avoid excess regulation, OEMs need to address this issue as soon as possible. Sentry Mode is a selling point for Teslas, at least according to some owners, and other carmakers are already installing systems with the same potential capabilities. These makers, too, will have to contend with privacy issues, which are already on regulators' radar in Europe and elsewhere.
The industry should acknowledge the issue and act on it before the regulators do—for its own good, and for the good of its customers.