Poison my photos please?
May 30, 2024 11:19 AM

Is there a way to retroactively apply anti-AI software such as Glaze or Nightshade to your own social media photos?

I'm in the UK, and I just got the notification that Meta will be using all users' content in this region to train its AI from 26 June. I went to the "objection" link and filled out the form, but as of now they are not offering an opt-out for users.

I don't like Facebook, but I've had this account for 20 years and it's my only point of contact with some people. If I keep the account, I'd like to protect the contents as much as I can, and make them as AI-unfriendly/useless as possible.

So, I know people have written software that either makes your images unrecognisable to AI (eg Glaze) or "poisons" the AI's prompt response (eg Nightshade). I know that angry hackers are the mothers of invention. Someone has to have written a widget that you can apply to your Facebook or Instagram feed that will retroactively apply some sort of anti-AI filter to all your photos, right? Right??

I'm in the UK using a 2021 M1 Mac. If there's a solution that works for iPhone too, so much the better.

(Glaze will not run natively on a Mac; I've downloaded Nightshade and will use it going forward. I'm looking for solutions for photos I have uploaded to social media in the past.)
posted by Pallas Athena to Computers & Internet (1 answer total)
 
Unfortunately, I would say this is not possible. Meta (and any other service you might have used) will have the originals and dozens or hundreds of copies of your photos, and in all likelihood their features were already extracted, probably seconds after you posted them. Even if you delete your photos and replace them one by one, it is likely the originals or their metadata will be kept.
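
To make "features extracted" concrete, here is a minimal sketch of the kind of embedding pipeline involved, assuming an open-source CLIP model (via the open_clip library) as a stand-in for whatever proprietary system Meta actually runs; the filename and the feature-store comment are illustrative assumptions, not Meta's actual internals.

```python
# Sketch of upload-time feature extraction, using open_clip as a
# stand-in for a proprietary pipeline. Filename is hypothetical.
import torch
import open_clip
from PIL import Image

model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-B-32", pretrained="laion2b_s34b_b79k"
)
model.eval()

# Compute an embedding the moment the photo arrives.
image = preprocess(Image.open("uploaded_photo.jpg")).unsqueeze(0)
with torch.no_grad():
    embedding = model.encode_image(image)

print(embedding.shape)  # torch.Size([1, 512]) for ViT-B-32

# In a real service this vector would now be written to a feature store
# or training set; deleting or Nightshading uploaded_photo.jpg
# afterwards does not touch the stored vector.
```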

I'm a fan of Ben Zhao's work, but don't overestimate the power of image obfuscation; there are often trivial workarounds. If you can see the content of an image, an AI usually can too. It's an arms race, and these companies are experts at extracting meaningful (and profitable) data from imagery. And right now they are doing everything they can legally get away with before regulations come crashing down.
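
As one example of a trivial workaround: adversarial cloaks live in near-invisible pixel perturbations, and aggressive downscaling plus lossy re-encoding shifts pixels far more than the cloak does. Here is a hypothetical Pillow sketch (filenames invented; whether any particular cloak survives this kind of laundering is an empirical question the Glaze team actively contests):

```python
# Sketch of cloak "laundering" via re-encoding. Filenames hypothetical.
from PIL import Image

img = Image.open("glazed_photo.png").convert("RGB")

# Downscale, upscale back, then re-save as lossy JPEG. Each step
# perturbs pixel values far more than a near-invisible cloak does,
# which is why such perturbations often fail to survive re-encoding.
w, h = img.size
small = img.resize((w // 2, h // 2), Image.LANCZOS)
restored = small.resize((w, h), Image.LANCZOS)
restored.save("laundered_photo.jpg", quality=85)
```
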
posted by BlackLeotardFront at 11:58 AM on May 30

