Tool for identifying photos containing nudity?
December 3, 2021 12:25 AM Subscribe
I'm in the process of organizing tens of thousands of digital family photos from the last 15 years. I'm a bit worried that there may be some "adult" photos of me and my wife mixed in there. I'd like to remove those from being backed up. I'm wondering if there's any kind of (preferably free) tool available that can scan a set of images and (perhaps using machine learning or something) help identify any that contain nudity?
Response by poster: Unfortunately, there are so many photos that it would take too long to sort through them all manually.
posted by NoneOfTheAbove at 4:37 AM on December 3, 2021
Best answer: There are some open-source libraries for nudity detection that you can run on your own computer, but I haven't found any packaged in a form that is super easy to install and use. If you have some experience installing and running command-line programs in a shell/terminal, you can try nudepy, for example.
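A minimal sketch of the batch-scan idea (not from this thread): walk a folder and apply a per-image classifier. The `scan_folder` helper and the file-extension list are assumptions for illustration; nudepy does expose a `nude.is_nude(path)` function you could plug in once it is installed.

```python
# Sketch: scan a photo folder with any per-image classifier.
# The classifier is passed in so the skeleton stays library-agnostic;
# with nudepy installed (`pip install nudepy`) you would pass nude.is_nude.
import os

IMAGE_EXTS = ('.jpg', '.jpeg', '.png', '.gif')

def scan_folder(folder, classifier):
    """Return the paths of images that `classifier` flags."""
    flagged = []
    for root, _dirs, files in os.walk(folder):
        for name in files:
            if name.lower().endswith(IMAGE_EXTS):
                path = os.path.join(root, name)
                if classifier(path):
                    flagged.append(path)
    return flagged

# With nudepy installed, a run might look like:
#   from nude import is_nude
#   suspects = scan_folder('/path/to/photos', is_nude)
```

Expect false positives and negatives either way, so treat the output as a shortlist to review, not a verdict.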
posted by mbrubeck at 9:18 AM on December 3, 2021 [3 favorites]
Hi, I do not know of an ML-style tool other than Apple's face matching in Photos/iPhoto, which works entirely on the device. If you go that route, be sure to turn off iCloud Photo Sharing if you do not want any of the pictures uploaded and synced to other devices.
One useful workaround is to approach it from another angle: limit your search by camera or file type rather than by the actual content. Apple Photos on the Mac lets you set search and filter criteria, which it calls "Smart Albums"; if you have done any database work, you will recognize it as a search query with AND/OR conditions. If you know which camera was used for the possible images, or can infer it from the year, filtering by camera type could reduce tens of thousands of photos to a number you can scan manually. It can even help to rule out whole swaths of the library - for example, "I never took those photos on an iPhone 12 or later."
posted by sol at 9:23 AM on December 3, 2021 [1 favorite]
If you need this process to work perfectly (i.e. no false negatives, where nudity goes unflagged, and no false positives, where an innocent photo is flagged), I would not trust any sort of ML classification tool to do it; you will unfortunately need to do it manually.
posted by Aleyn at 1:12 PM on December 3, 2021 [2 favorites]
I’m not sure this will solve your problem for the reasons enumerated above, but a way overkill answer may be this module for the open-source digital forensics tool Autopsy.
posted by hollyholly at 7:02 PM on December 3, 2021
This thread is closed to new comments.