Part of using oil is the risk of oil spills. The industry tries to safeguard against disaster, but spills still happen, big and small. We accept that risk for the utility and enjoyment of the roar of the engine, like in my V8-powered truck. I love that truck.
The same goes for Artificial Intelligence, or as it is affectionately known, AI.
To make AI work, it must be trained. Training requires data. Lots of data. MASSIVE amounts of data. For example, if an AI is to predict what you are going to type next, it needs billions of chat messages to learn the right answer. That is just one small sample, but you see where I am going: anything AI is going to do, it needs a TON of data to train it. Where does most of the data for consumer AI come from? … wait for it … you!!!
All this training data needs to be gathered and stored. Did you know training data is also given to researchers to help make the training better? Not only does Big Tech have all that private information, it lends it out to research institutions. Now there are two organizations that can leak your data. (Actually, it is many more.)
Think of everything you do on a mobile device, and consider it in the hands of not just Big Tech and Big Government, but also many, many, many academic institutions, all guarding your most private conversations to the "best of their abilities". {Eye Roll}
Spills
Then the private data oil spills start! It has happened already with various data breaches, but those will be minor compared to what is coming. Historically, a data breach exposed your name, social insurance number, date of birth, and credit card; with the data AI needs, it will be thousands if not millions more data points about you. Really! Yes, really.
For example: Microsoft just had a minor AI data oil spill.
An AI researcher at Microsoft misconfigured cloud storage and leaked a whole bunch of data. Private data. The best part: it was mostly Microsoft's own material, but these types of mistakes could expose your data next time.
Highlights
Microsoft's AI research team, while publishing training data, accidentally exposed 38 terabytes of additional private data, including a disk backup of two employees' workstations. Oops!
The backup includes secrets, private keys, passwords, and over 30,000 internal Microsoft Teams messages. Double Oops!!
The access level could have been limited to specific files only; however, in this case, the link was misconfigured to share the entire storage account. Triple Oops!!!
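The root cause, per Microsoft's own report, was an overly permissive SAS (Shared Access Signature) link: instead of granting read access to specific files, the token covered the entire storage account. A SAS link encodes its scope and permissions right in the URL's query string, so overly broad tokens can be spotted mechanically. The sketch below is a hypothetical checker, not an official tool; the query keys (`sp` for permissions, `srt` for account-level resource types, `se` for expiry) are real Azure SAS parameters, but the audit rules and the example URL are my own illustration.

```python
from urllib.parse import urlparse, parse_qs

def audit_sas_url(sas_url: str) -> list[str]:
    """Flag overly permissive settings in an Azure SAS URL (illustrative sketch)."""
    params = parse_qs(urlparse(sas_url).query)
    warnings = []

    # 'sp' lists granted permissions: r=read, w=write, d=delete, l=list, etc.
    perms = params.get("sp", [""])[0]
    if set(perms) & {"w", "d"}:
        warnings.append("token grants write/delete, not just read")

    # 'srt' appears only on *account-level* SAS tokens; its presence means
    # the token covers the whole storage account, not one blob or container.
    if "srt" in params:
        warnings.append("account-scoped token: exposes the entire storage account")

    # 'se' is the expiry timestamp; a far-future date keeps a leak alive for decades.
    expiry = params.get("se", [""])[0]
    if expiry[:4].isdigit() and int(expiry[:4]) >= 2040:
        warnings.append(f"expiry {expiry} is decades away (effectively never expires)")

    return warnings

# Hypothetical URL shaped like the leaked token: account-wide scope,
# full permissions, expiry in 2051 (the actual leaked token's expiry year).
leaked = ("https://example.blob.core.windows.net/models?"
          "sv=2021-08-06&ss=b&srt=sco&sp=rwdlac&se=2051-10-06T00:00:00Z&sig=REDACTED")
for w in audit_sas_url(leaked):
    print("WARNING:", w)
```

A properly scoped link (read-only, single blob, short expiry) passes the same checks cleanly, which is exactly the "limited to specific files only" configuration the Highlights describe.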
This example is a sample of the new risks organizations face when starting to leverage the power of AI. As data scientists and engineers race to bring new AI solutions to production, the massive amounts of data they handle require additional security checks and safeguards, especially when it's private data. Your private data.
"It's important to raise awareness of relevant security risks at every step of the AI development process, and make sure the security team works closely with the data science and research teams to ensure proper guardrails are defined." – Wiz Blog
Maybe it's better not to give them your private data in the first place!
References:
Situation Summary on Digital Trends
https://www.digitaltrends.com/computing/microsoft-leaked-38tb-sensitive-data/
Good Techie Summary on Wiz Blog
https://www.wiz.io/blog/38-terabytes-of-private-data-accidentally-exposed-by-microsoft-ai-researchers
Microsoft’s Breach Report
https://msrc.microsoft.com/blog/2023/09/microsoft-mitigated-exposure-of-internal-information-in-a-storage-account-due-to-overly-permissive-sas-token/
"it's better not to give them your private data in the first place!"
Exactly. Who really needs to know your ‘government name’? Who really needs to know your physical address? Who really needs to know BOTH of these?
Why not start from the position that any personal data you divulge will end up in the public domain eventually? What benefit is it to you to put your personal data out there in the first place?
I use several alias names, addresses, emails, and phone numbers. As a disclaimer, I am neither involved in any criminal activity nor am I in a witness protection program - I just like my privacy.