Privacy in photos: GDPR’s largest class action in the making? Can we solve it?
The largest privacy violation happening across the globe.
The past few weeks have seen some interesting court cases relating to photos and privacy on the big social networks:
- A grandmother was ruled to have violated the GDPR by posting photos of her grandchildren without permission.
- Facebook was forced to pay out over privacy violations in Canada.
Facebook had previously been in hot water over the Cambridge Analytica scandal, but really any case where data is used for anything other than its original purpose falls in the same category.
Some of the most common forms of data theft that get little attention involve scammers and catfishers reusing other people's social media photos. Anyone who has ever used a dating site, or endured the barrage of social media bots selling something, knows how prevalent this is. You never gave them permission for that use, but what can you even do?
What about the photos in Google image search? If your picture comes up in image search, did you give Google permission to share it across sites for other uses? Google doesn’t even block copying of these photos, making it fully complicit in the violation of GDPR.
Whether your photos are stolen through Google image search or from social media profiles, each instance is a violation of the GDPR.
What is the potential impact of this?
GDPR Guidelines:
83(4) GDPR sets forth fines of up to 10 million euros, or, in the case of an undertaking, up to 2% of its entire global turnover of the preceding fiscal year, whichever is higher.
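To make “whichever is higher” concrete, here is a quick back-of-the-envelope check in Python. The turnover figures are illustrative assumptions only:

```python
# Article 83(4) cap: the flat 10 million euros, or 2% of global
# annual turnover, whichever is higher.
def max_fine_eur(annual_turnover_eur: float) -> float:
    return max(10_000_000, 0.02 * annual_turnover_eur)

# For a smaller undertaking, the flat amount dominates:
print(f"{max_fine_eur(50_000_000):,.0f} EUR")      # 10,000,000 EUR
# For an undertaking with ~70 billion EUR turnover, 2% dominates:
print(f"{max_fine_eur(70_000_000_000):,.0f} EUR")  # 1,400,000,000 EUR
```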
Given how prevalent the known occurrences are, and that each one counts as an independent violation, the impact could be even larger than Cambridge Analytica.
Catfishing alone could fuel the biggest class action of them all, on behalf of everyone who has ever had their photo stolen and reused. Technically, all secondary usage of photos, whether for fun or political commentary, violates GDPR.
The bigger question is what can be done?
Private Access: Photos that can only be accessed by having an account.
1. Invisible watermarking: allow real people's photos through, but use AI and metadata to watermark each photo with the original user's information and the identity of the user accessing it. This can be done in a way that is almost undetectable; see the sketch after this item.
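To make that concrete, here is a minimal sketch of invisible watermarking, assuming only Pillow. `embed_watermark` and `extract_watermark` are hypothetical names, and least-significant-bit embedding stands in for the AI-based approach described above:

```python
from PIL import Image

def embed_watermark(image_path: str, message: str, out_path: str) -> None:
    """Hide a short text message in the least significant bits of the pixels."""
    img = Image.open(image_path).convert("RGB")
    data = message.encode("utf-8")
    # Encode a 32-bit length prefix followed by the message bits.
    bits = [len(data) >> s & 1 for s in range(31, -1, -1)]
    for byte in data:
        bits.extend(byte >> s & 1 for s in range(7, -1, -1))
    flat = [c for px in img.getdata() for c in px]
    if len(bits) > len(flat):
        raise ValueError("image too small for this watermark")
    for i, bit in enumerate(bits):
        flat[i] = (flat[i] & ~1) | bit  # overwrite the least significant bit
    img.putdata([tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)])
    img.save(out_path, "PNG")  # lossless format, so the hidden bits survive

def extract_watermark(image_path: str) -> str:
    """Recover the hidden message from a watermarked image."""
    flat = [c for px in Image.open(image_path).convert("RGB").getdata() for c in px]
    bits = [c & 1 for c in flat]
    length = int("".join(map(str, bits[:32])), 2)
    payload = bits[32:32 + length * 8]
    data = bytes(int("".join(map(str, payload[i:i + 8])), 2)
                 for i in range(0, len(payload), 8))
    return data.decode("utf-8")

embed_watermark("photo.jpg", "uploader=alice;viewer=bob", "photo_marked.png")
print(extract_watermark("photo_marked.png"))  # uploader=alice;viewer=bob
```

Note that the hidden bits only survive lossless formats like PNG; re-encoding to JPEG would destroy them, which is exactly why a more robust AI-driven scheme is the end goal.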
Public Access: Photos available through Google search or other public mechanisms.
2. Hidden in plain sight: photos need to be scrubbed of identifiable faces, or even completely regenerated synthetically when people appear in them, so that they no longer represent the original user. A face-scrubbing sketch follows below.
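Here is a minimal face-scrubbing sketch, assuming OpenCV's bundled Haar cascade detector. Pixelating detected faces stands in for full synthetic regeneration, which would require a generative model:

```python
import cv2

def scrub_faces(in_path: str, out_path: str) -> int:
    """Detect faces and pixelate them beyond recognition; return the count."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    img = cv2.imread(in_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        face = img[y:y + h, x:x + w]
        # Pixelate: shrink the face region to 8x8, then scale it back up.
        small = cv2.resize(face, (8, 8), interpolation=cv2.INTER_LINEAR)
        img[y:y + h, x:x + w] = cv2.resize(
            small, (w, h), interpolation=cv2.INTER_NEAREST
        )
    cv2.imwrite(out_path, img)
    return len(faces)

print(scrub_faces("photo.jpg", "photo_scrubbed.jpg"), "faces scrubbed")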
Next, we will try to implement a few of these mechanisms as a library for privatizing photos. Stay tuned for that; let’s call it the GDPR Privacy Photo Filter. Here is the repo where we will be implementing it.