I took a shot at reading their PDF report, but honestly I gave up. I'm sure it's a fine report, but I really wanted some granular examples of active privacy problems, and I struggled to find them.
> It is not clear what personal data Microsoft collects and stores about the use of Microsoft 365 Copilot. Furthermore, the information users receive when they make a request for access is incomplete and incomprehensible.
I'm guessing this is the core of the issue? Simply that they do not know what is going on behind the scenes?
> Education organisations are advised not to use Microsoft 365 Copilot as long as Microsoft has not implemented adequate measures to mitigate the identified 4 high data protection risks.
These 4 are (see page 209):
> 17.2.1. Inability to exercise data subject access rights to Diagnostic Data
> 17.2.2. Significant economic or social disadvantage and loss of control due to use of generated texts with inaccurate personal data
> 17.2.3. Loss of control through lack of transparency Required Service Data, including Telemetry Events from Webapp clients.
> 17.2.4. Reidentification of pseudonymised data through unknown retention periods of Required Service Data (including both Content and Diagnostic Data)
Of these, only the second is an LLM-specific problem. The others are basic GDPR data-processing requirements.
This report is a waste of public money. If you don't already understand the basic privacy risks of using Microsoft products in learning institutions, this 213-page report won't help. The money spent producing it would have been better spent on privacy-awareness lessons for students, teachers, and staff.
https://www.surf.nl/files/2024-12/20241218-dpia-microsoft-36...