The COVID-19 pandemic, along with the resulting restriction measures, has led to a significant spike in popularity for many remote working tools. In the case of video conferencing app Zoom, however, this rise in popularity has also brought a number of privacy issues to light. The app, which is available on browsers and mobile devices, has come under the proverbial spotlight, with many calling for its use to be reconsidered amid the recent criticism.
So if you’re planning to use Zoom for your remote working communication needs, here are a few things to consider first.
Lack of true “end-to-end” encryption
End-to-end encryption basically means that communication (and content) sent between users is fully protected: even the provider (Zoom, in this case) doesn't have access to it. And while Zoom says on its security page that meetings are secured with end-to-end encryption, The Intercept reports that this isn't actually the case.
Instead, video meetings on Zoom use TLS encryption, which means that data sent between users and Zoom's servers is encrypted in transit. This is the same encryption that secure websites utilise with HTTPS; crucially, it is not the same "end-to-end" encryption that providers like WhatsApp offer, although Zoom says that it does not access user data on its servers.
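To make the distinction concrete, here's a minimal Python sketch. It's a toy illustration only, using symmetric keys from the third-party cryptography package as a stand-in for real TLS and end-to-end key exchange, and it doesn't reflect Zoom's actual implementation. The point it shows: with transport-only encryption the provider's server holds the key and can read content mid-route, while with end-to-end encryption only the participants hold the key and the server merely relays ciphertext.

```python
# Toy illustration (not Zoom's actual code): transport encryption vs
# end-to-end encryption, using symmetric Fernet keys from the third-party
# "cryptography" package as a stand-in for real TLS/E2E key exchange.
from cryptography.fernet import Fernet

message = b"confidential meeting audio/video"

# --- Transport encryption (TLS-style) ---
# The server holds the key for each leg, so it can decrypt, read,
# and re-encrypt the content while relaying it between participants.
server = Fernet(Fernet.generate_key())
in_transit = server.encrypt(message)            # participant -> server leg
visible_to_server = server.decrypt(in_transit)  # provider sees plaintext here
relayed = server.encrypt(visible_to_server)     # server -> other participant leg

# --- End-to-end encryption ---
# Only the participants share the key; the server relays ciphertext
# it has no way to decrypt.
participants = Fernet(Fernet.generate_key())
ciphertext = participants.encrypt(message)
# ...the server stores/forwards `ciphertext` without being able to read it...
received = participants.decrypt(ciphertext)

assert visible_to_server == message  # TLS-only: the provider can read content
assert received == message           # E2E: recipients can still read it
```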
Video calls with strangers
Vice also reports that the Zoom platform has had issues with leaked details, including email addresses, photos, and contacts. Apparently, the contact issues have even allowed certain users to initiate video calls with strangers, which is not exactly what you'd want on a professional conferencing app.
This is because the app has mixed up contact groupings for certain users: while you're only supposed to see contacts within your "Company Directory", Zoom reportedly groups some users who have signed up with personal email addresses together with strangers who have done the same.
Legal issues
According to Vox, Zoom has also been the subject of a class action lawsuit in the U.S., with the company's privacy issues coming to the fore. A security researcher revealed that a third party could turn on the webcams of Mac users and force them to join video calls or conferences without permission. Again, this has led to a host of complaints, with many arguing that such security vulnerabilities should have been discovered sooner (Zoom was launched in 2013).
Meanwhile, there has also been the rise of something called "Zoombombing", which is when random accounts join video conferences to broadcast unsuitable content (pornography, Nazi imagery) to the rest of the room. Zoom says that you can avoid this by password-protecting your meetings (or limiting screensharing settings), but it's certainly still a worry for users on the platform.
Users have also complained about the "attention tracking" feature on the app, with many alleging that this allows "hosts" (in other words, bosses) to monitor users' activities. However, this only applies when the host is in screensharing mode; the host is notified when participants in a conference don't have the Zoom window in focus for more than 30 seconds.
There have also been other issues with how Zoom's mobile apps manage data, with reports suggesting that the iOS app had been sending data back to Facebook through an SDK. This contributed to the aforementioned lawsuit, which calls for more transparency from Zoom.
In general, it appears that the underlying problem behind many of Zoom's issues is transparency, or the lack thereof. As more remote workers, and even students, begin to utilise tools like Zoom, ethical practices by providers are arguably more important now than ever.
You can find out more about Zoom on its official website here, while mobile apps are also available on the App Store for iOS devices and the Play Store for Android devices.