Embarrassing Google search result
April 10, 2021 8:24 PM   Subscribe

I'm job hunting, and googled myself, like you do. The results all seemed pretty normal except for an inexplicable "associated word." It's bothering me a lot and I'm trying to figure out why it's there and if it's possible to get rid of it.

I have a uniquely googlable name (parents, please don't do this to your kids). When you do a Google search for my name, a few pictures of my face show up, all banal, work-related results (none of them taken from websites I have control over, such as LinkedIn). Directly below the search bar but above the image results are "associated words" (I guess that's what they're called?). All of them, again, are related to my job and the city I'm in, except one of the words is that of a specific facial feature/body part that just so happens to be something I'm really self-conscious about.

Upon discovering this, I clicked to go to the image results for my name, and spanning across the top above the images are the associated words, here featuring tiny thumbnails next to each word. For the body-part word, the thumbnail is a close up of that feature on my face, from one of my inane photos.

Maybe it's been a long week, but this really bothers me. I'm unattractive, and just so happen to be self-conscious about this part of my face, and the photo in question is not flattering to it. I'm not famous, my Google presence is innocuous, and it doesn't feature anything containing this word that I know of.

I'm wondering: where do these associated words come from? Are they things that people have searched for when googling your name (or whatever)? Do they have a technical term? Ironically, I tried to Google this question before coming here and found it completely futile.

Should I be embarrassed by this, assuming other people will be googling me in my job search? Is there anything I can do to get rid of it?

Dammit, Google.
posted by cat friend to Computers & Internet (10 answers total) 6 users marked this as a favorite
 
I don't know the answer to most of your questions, but that row of words/thumbnails is an "image pack".
posted by inexorably_forward at 3:35 AM on April 11, 2021


Should I be embarrassed by this, assuming other people will be googling me in my job search?

That really is weird. Is there any chance you could ask someone else with a different IP address to look you up on their browser and see if they're getting the same result, or if possibly that keyword was a personalized result of some sort? And does the specific picture the thumbnail comes from appear on a page with any text or links related to that body part? Have you written about your feelings about that feature anywhere online at all?

Anyway, as a single datapoint in answer to your question: I pretty much never pay attention to the image packs (thanks inexorably for the term) unless I'm searching hard for something that I'm having trouble finding. I mean, I usually literally do not notice that row and couldn't tell you what the entries say, because who has the time and attention span to look carefully at all the random UI on the page; my eyes go straight to the bigger images.

For getting rid of it: this might be more effort than you want to put in, but you could try the old approach of flooding Google with other information. So maybe put up a lot of photos of you volunteering or Irish dancing or something, with plenty of text or tags containing new keywords, and hope that the results will update at some point.
posted by trig at 4:22 AM on April 11, 2021 [7 favorites]


If you are a citizen or resident of the EU, you could request that Google remove the URL from the search results under the EU's "right to be forgotten" laws.

Alternately, you could try to get the site hosting the offending image to remove it.
posted by justkevin at 5:55 AM on April 11, 2021 [1 favorite]


where do these associated words come from?

Probably from someone's site that has used your image, with text or tags pointing out the body part in question. That's how Google produces that "associated word" result. If you can find the site that's doing it, you may be able to ask them (rather than Google) for removal. (The "Google flooding" suggestion above is good, as well, but will take time to take effect.)
posted by beagle at 6:19 AM on April 11, 2021 [1 favorite]


Do not be embarrassed by this.

I am a super digital-savvy millennial who worked in tech for years and googles everyone, sometimes even clicking through to the "Images" results, and I have NEVER before noticed those "associated words." (I just googled myself to see what you were even talking about.)

People who google you in relation to your job hunt aren't even going to click over to the Images results tab -- they are going to click on your LinkedIn, and scroll through the first page of results looking for related things like your bio from a conference or former employer, or talks you've given or things you've written, and that's about it.
posted by amaire at 9:27 AM on April 11, 2021 [8 favorites]


I’m not going to pretend that I am knowledgeable enough to know how much truth this may hold - but what are the chances that Google has background facial recognition type stuff built in? Perhaps this is a case of a “benign” AI noticing a feature that you have, tagging it, and then for lack of any other pertinent information - said AI is serving you up a random feature it has recognized? It could have been anything in that case. Glasses, blue eyes, unibrow, short hair, mole, etc...whatever it can latch onto from the photo.
posted by nukacherry at 12:38 PM on April 11, 2021 [2 favorites]


OK, so this thing about your face is something you're self-conscious about... is it also something you search for a lot? Because you probably google your own name more than anyone else, and if facial feature word is also something you google more than other people do then it could be the algorithm going, "huh, people who search for 'Cat Friend McFlurgelsburg' also search for 'unibrow' a lot more than other people, guess those must be connected somehow" (failing to recognize that "Cat Friend McFlurgelsburg" is such a rare search that it doesn't have enough data to make any reasonable guesses at all).

But as amaire says - it's very unlikely that anyone else would see this. It's even entirely possible that google shows a different set of "related words" to other people.
posted by mskyle at 1:34 PM on April 11, 2021 [3 favorites]
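mskyle's guess above, that the suggestion comes from query co-occurrence on a name too rare to have meaningful data, can be sketched in a few lines. This is a toy illustration, not Google's actual algorithm; the session data and the `related_terms` helper are invented for the example. The point it demonstrates: when only one or two sessions contain a rare name, a single user's other searches dominate the "related" list.

```python
# Toy sketch (NOT Google's real system) of co-occurrence-based
# "related searches": suggest queries that appear in the same
# sessions as the target query.
from collections import Counter

def related_terms(sessions, name, top_n=3):
    """Return the queries that most often co-occur in a session with `name`."""
    co = Counter()
    for queries in sessions:
        if name in queries:
            co.update(q for q in queries if q != name)
    return [term for term, _ in co.most_common(top_n)]

# A rare name appearing in just two sessions: one user's habits
# become "what people search for alongside this name."
sessions = [
    ["cat friend mcflurgelsburg", "unibrow"],
    ["weather", "news"],
    ["cat friend mcflurgelsburg", "unibrow", "resume tips"],
]
print(related_terms(sessions, "cat friend mcflurgelsburg"))
# → ['unibrow', 'resume tips']
```

With millions of sessions for a common query, a signal like this averages out; with a handful, it amplifies noise, which matches mskyle's "not enough data to make any reasonable guesses" point.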


where do these associated words come from?

These days it's generally Machine Learning doing the suggested terms rather than human categorization.
posted by Candleman at 3:41 PM on April 11, 2021 [1 favorite]


Response by poster: Thanks for the insight and correct terminology...I still can't wrap my brain around why it's highlighting a facial feature of mine in such a bizarre and specific way. That photo definitely would not have been so tagged, at least I hope not, and I personally have never gone searching for this term. I'll try pretending the situation doesn't exist, I guess!
posted by cat friend at 8:16 PM on April 11, 2021 [1 favorite]


Like Candleman says, it looks like some ML bullshit from Google. Whether that's because someone tagged your photo with that tag on some platform you don't know about, or if the borg just decided that your photo was similar-enough to enough photos tagged with that tag is probably undiscoverable (probably by Google, definitely by you).

If you have any rights to your data/to be forgotten (ie: you live in the EU) you might ask Google to correct that, though if it's generated by them then maybe it's not your data (ie: can every Jane Doe/Tommy Atkins/Joe Bloggs represent for each of the others?). I tested for myself (very boring name with one specific way-more-important-than-me instance (and likely thousands of others)), and I just get things about what my namesake is famous for.

Realistically: you do not own your name. Your name is owned by the FAANG now.
posted by pompomtom at 6:18 AM on April 12, 2021 [1 favorite]

