Summary
Microsoft has announced plans to release Copilot in OneDrive for organizations and schools in the summer of 2024. Understandably, though, potential users are questioning whether this cloud-based AI will be 100% safe.
Included in the $30 monthly fee per Copilot for Microsoft 365 user, Copilot will let customers ask questions about OneDrive files, locate files using natural language, extract information from OneDrive files, summarize details contained in files, and perform a host of other useful actions. The fact that you’ll be able to do all this without having to open the files is great news for those wishing to work quickly, even on a lower-powered device.
Given that Copilot will have access to all data stored in the OneDrive cloud, people are naturally concerned about whether their information will remain secure.
Microsoft Says Copilot Is Safe
Copilot, in its broader sense, is a generative AI that uses data from the public web to provide real-time productivity and creativity assistance. It relies on large language models (LLMs), including pre-trained models and generative pre-trained transformers (GPTs), to produce human-like text and responses. When you launch a query or create text or data using Copilot, that data is encrypted in transit and isn't used to train the LLMs already out there on the wider web. On top of that, Copilot cannot access an organization's resources or Microsoft 365 Graph content.
Copilot for Microsoft 365, which is already available to organizations, schools, and individuals, is compliant with Microsoft's existing privacy, security, and compliance commitments. It also offers various protections, including blocking harmful content and detecting protected material.
Will Copilot in OneDrive Be Secure?
Since Copilot in OneDrive is initially intended for businesses and academic institutions, security will be the primary talking point upon its release. And, as with other Microsoft initiatives, such as Teams and Forms, it wouldn’t be a surprise if the company were to roll out Copilot in OneDrive for individuals in the future. So, even though Copilot’s security in OneDrive might not impact you right now as a Microsoft 365 Personal user, it probably will at some point down the line.
Because Copilot is designed to scour only your OneDrive folders and files, it won't interact with LLMs on the wider web, which preserves the broader privacy of your documents. However, pairing AI with personal data always raises concerns. We don't have all the facts yet, but Copilot's powerful ability to browse folders, documents, spreadsheets, and other files in the OneDrive cloud could raise more of an internal security concern than an external one.
Its seamless data retrieval could lead to unauthorized access to sensitive information stored within OneDrive. Users can adjust file and OneDrive permissions to counteract this, as the sketch below illustrates, but this step is easily forgotten or might be inadvertently bypassed, given the novelty of the technology. Likewise, you can apply sensitivity labels to files, which helps mitigate the risk of them landing in the wrong hands, but this manual process is also prone to error, the labeling feature isn't compatible with all file types, and documents created through Copilot for Microsoft 365 don't always inherit the sensitivity labels of their source files.
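If you want to check how widely your OneDrive files are currently shared, here is a minimal sketch using the Microsoft Graph API's drive endpoints. It assumes you've already obtained an OAuth access token with the Files.Read.All scope (for example, via the MSAL library); the ACCESS_TOKEN value is a placeholder, and this is an illustrative outline rather than official Microsoft tooling.

```python
# Minimal sketch: list OneDrive root items via Microsoft Graph and flag any
# that carry permission entries beyond the owner (sharing links, invitees).
# Assumption: ACCESS_TOKEN holds a valid token with the Files.Read.All scope.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<your-access-token>"  # placeholder: acquire via MSAL, for example
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def list_root_items():
    """Return the first page of files and folders in the OneDrive root."""
    resp = requests.get(f"{GRAPH}/me/drive/root/children", headers=HEADERS)
    resp.raise_for_status()
    return resp.json().get("value", [])

def item_permissions(item_id):
    """Return the permission entries attached to a single drive item."""
    resp = requests.get(f"{GRAPH}/me/drive/items/{item_id}/permissions",
                        headers=HEADERS)
    resp.raise_for_status()
    return resp.json().get("value", [])

for item in list_root_items():
    # Entries whose role isn't "owner" represent sharing links or invited
    # users, so they're the ones worth a manual review.
    shared = [p for p in item_permissions(item["id"])
              if "owner" not in p.get("roles", [])]
    if shared:
        print(f"{item['name']}: {len(shared)} sharing entries - review access")
```

Running this periodically (or extending it to walk subfolders) gives a quick picture of which files Copilot could surface to someone other than you.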
Copilot aside, it's always worth remembering that OneDrive doesn't use end-to-end encryption (your files are only encrypted in transit and at rest) or zero-knowledge encryption (meaning Microsoft itself can access your files).
While Copilot's convenience undoubtedly improves productivity, it doesn't come without security complexities. Organizations and schools should regularly audit Copilot's activities as it acquires more information about their OneDrive environments (one starting point is sketched below), and individual users should be prepared to consider the security implications Copilot might have for personal OneDrive storage. Ultimately, it's Microsoft's responsibility to ensure its AI works safely, and the company should also make it clear to customers how Copilot might compromise security, whether internally or externally, and what security measures have been or will be taken.
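For admins who want a concrete starting point for such audits, the sketch below pulls recent SharePoint/OneDrive audit records from the Office 365 Management Activity API. Treat it as an illustrative outline, not official guidance: it assumes an Azure AD app registration with the ActivityFeed.Read permission and a client-credentials token for https://manage.office.com, and TENANT_ID and ACCESS_TOKEN are placeholders.

```python
# Rough sketch: fetch recent OneDrive/SharePoint audit records so an admin
# can review who accessed which files. Assumptions: an Azure AD app with the
# ActivityFeed.Read permission, and a client-credentials token for the
# https://manage.office.com resource in ACCESS_TOKEN.
import requests

TENANT_ID = "<your-tenant-id>"           # placeholder
ACCESS_TOKEN = "<management-api-token>"  # placeholder
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# One-time setup: enable the Audit.SharePoint subscription, which is the
# content type OneDrive file events are delivered under.
requests.post(f"{BASE}/subscriptions/start",
              params={"contentType": "Audit.SharePoint"},
              headers=HEADERS)

# List the available audit content blobs, then fetch each blob's records.
listing = requests.get(f"{BASE}/subscriptions/content",
                       params={"contentType": "Audit.SharePoint"},
                       headers=HEADERS)
listing.raise_for_status()

for blob in listing.json():
    records = requests.get(blob["contentUri"], headers=HEADERS).json()
    for rec in records:
        # Each record says who performed which operation on which object.
        print(rec.get("CreationTime"), rec.get("UserId"),
              rec.get("Operation"), rec.get("ObjectId"))
```

Filtering these records for unexpected users or unusually broad file access is one practical way to put the regular-auditing recommendation above into action.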