Organizations must fully understand the regulatory guidance on collaboration security and privacy so they can continue to implement and expand their use of tools such as Zoom and Teams.
As “work from anywhere” continues its ascent because of the pandemic, the use of collaboration tools such as Zoom, Cisco Webex, and Microsoft Teams has correspondingly skyrocketed. According to a recent Conference Board survey, 88% of companies are now willing to hire remote workers, compared with a pre-pandemic rate of 52%, signaling a meaningful change in the modern office environment. The increase in remote work and reliance on collaboration tools for every aspect of day-to-day business has led to increased scrutiny of their privacy and security as headlines buzz with examples of insecure or inappropriate use of collaboration applications.
Global privacy regulators such as the US Federal Trade Commission (FTC), the UK’s Information Commissioner’s Office (ICO), and Canada’s Office of the Privacy Commissioner (OPC) have identified privacy and security threats stemming from the use of collaboration tools, such as Zoom bombing and the inadvertent sharing of content through webcams, screen shares, whiteboards, and other dynamic communication features. Most recently, on Oct. 21, 2020, the Gibraltar Regulatory Authority (GRA) issued its Video Conferencing Guidance on the EU General Data Protection Regulation, a set of privacy and security best practices to consider when using videoconferencing platforms.
Here are a few of the most common privacy and security risks the regulators highlighted, along with tips on how organizations can reduce their exposure with supporting technologies.
Set Policies and Train Staff on How to Safeguard Collaboration Tools
A common theme across regulators is an emphasis on policy and training to inform employees about risks and instruct them on how to use collaboration tools appropriately. The GRA stresses that organizations should consider how the use of videoconferencing applications (VCAs) intersects with their privacy policies, stating they should “consider establishing their own policies and/or procedures so that their staff only use VCAs in a data protection compliant manner.” Moreover, the GRA emphasizes that “staff should be trained and educated on the security measures adopted by the organization to ensure the integrity and confidentiality of personal data (i.e., Article 5(1)(f) of the GDPR) when using VCAs.” Similarly, the FTC and ICO have posted about the importance of having policies outlining how employees can use collaboration platforms while remaining mindful of requirements to protect sensitive data.
Screen sharing, for example, is a core feature of collaboration tools that facilitates the kind of information sharing and collective experience that make these platforms so powerful. Teams can edit documents together, give interactive presentations, and demonstrate processes or problems with the webcam, screen share, and whiteboard features.
However, the risk of exposing sensitive personal data, confidential corporate data, or other protected information is magnified by the seamless sharing capabilities of collaboration apps. As a result, this is a prominent topic in all of the regulators' pronouncements. The GRA tackles the issue head on, stating “[o]rganisations may deem the use of screen sharing to be essential in facilitating workflow and efficiency. Before doing so however, staff members should be aware that open documents, browser windows, desktop backgrounds or icons may be viewed by others when not intended, potentially in breach of the GDPR and/or the DPA.”
Keep Privacy Top of Mind
The GRA’s sentiments are echoed by the other regulators and highlight a dilemma for organizations rolling out collaboration tools: How do you embrace new technologies while protecting privacy and security? Many organizations are turning to supporting technologies that use artificial intelligence and machine learning to detect the sharing of personally identifiable information, sensitive documents and applications, and other privacy risks across the video, audio, and chat components of collaboration tools. Given that potential fines under the General Data Protection Regulation can reach €20 million or 4% of global revenue, and that laws like the California Consumer Privacy Act include private rights of action, understanding and mitigating privacy risks must be an essential pillar of any remote work strategy, not an afterthought.
A year ago, “Zoom bombing” didn’t exist. Now, the issue of unauthorized access to meetings is a key privacy and security challenge confronting any company using collaboration tools. While there is no single setting or configuration that entirely eliminates the risk of Zoom bombing, properly managing the administrative controls on enterprise collaboration accounts can dramatically reduce the risk of unwanted meeting participants and leakage of sensitive information.
The FTC observes that administrators can “limit access by providing unique ID numbers for each meeting or for each participant” and also notes that “[t]hese features may not be enabled by default, so look carefully at what settings are available.” Canada’s OPC emphasizes that companies should “[p]rotect your video conferencing calls with a password, if possible, especially if you intend to discuss sensitive personal information such as health information.”
Third-party tools can address these Zoom-bombing risks by providing global, firmwide visibility into collaboration platform security settings, along with the ability to lock down and enforce settings across all accounts. Because technical controls are generally preferable to relying on manual oversight, privacy officers and compliance teams are embracing mechanisms for configuring and monitoring security settings through these new enterprise dashboards.
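To make the idea of enforcing settings across accounts concrete, here is a minimal sketch of a settings audit. The setting names (`require_meeting_password`, `waiting_room_enabled`, `unique_meeting_ids`) are illustrative assumptions, not any vendor's actual API fields; in practice an administrator would pull equivalent settings from the platform's admin API or dashboard.

```python
# Hypothetical security baseline reflecting the regulators' advice:
# meeting passwords, waiting rooms, and unique per-meeting IDs.
# These keys are illustrative, not a real platform's schema.
REQUIRED_BASELINE = {
    "require_meeting_password": True,
    "waiting_room_enabled": True,
    "unique_meeting_ids": True,
}

def audit_accounts(accounts):
    """Compare each user's settings against the baseline and return
    a dict mapping noncompliant users to their violated settings."""
    violations = {}
    for user, settings in accounts.items():
        missed = [key for key, required in REQUIRED_BASELINE.items()
                  if settings.get(key) != required]
        if missed:
            violations[user] = missed
    return violations

# Example: one compliant account, one allowing password-less meetings.
accounts = {
    "alice@example.com": {"require_meeting_password": True,
                          "waiting_room_enabled": True,
                          "unique_meeting_ids": True},
    "bob@example.com": {"require_meeting_password": False,
                        "waiting_room_enabled": True,
                        "unique_meeting_ids": True},
}
print(audit_accounts(accounts))
# → {'bob@example.com': ['require_meeting_password']}
```

A real enforcement tool would go one step further and push the baseline back to the platform rather than just reporting drift, but the audit-then-remediate loop shown here is the core pattern.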
Finally, most of the regulators flagged the physical risks of remote working environments. To translate from security-speak, “physical risks” are the risks of whiteboards, documents, people, or other viewable content in your home office. Canada’s OPC cautions “[b]e careful about where you sit during the call. Who and what is visible in the background can reveal a lot of information that you might not want to share; mirrors and other reflective objects can show people in the room that may not want to be in the video.”
So, while we’re all clamoring for Room Rater likes, the more secure approach is to use background blurs and similar techniques to obscure your home office. Innovative supporting tools that analyze videoconferences to detect problematic logos, images, or text in office backgrounds can further strengthen your privacy posture.
The commonalities in this set of regulatory guidance are perhaps what’s most striking. Read together, these updates outline the critical privacy and security risks of using collaboration platforms. Given the serious fines for GDPR or CCPA noncompliance as well as the potentially disastrous legal and reputational damage that could result from a data breach, organizations must analyze and proactively address collaboration risks as part of their privacy and security programs.
Marc Gilman is a technology attorney, compliance executive, and adjunct professor of compliance at Fordham Law, bringing 15 years of law, financial services, and IT experience to his leadership role at Theta Lake.