How to host and display others' art on my website
July 30, 2010 8:37 AM
I would like to post pictures on my blog and prevent Google from indexing and linking to them, and prevent people from downloading them. How should I go about doing this?
Note, these photos are not strictly my own, though some will be.
Sometimes I might see an interesting photographer or artist and make a post about it on my blog (hosted on my own server with Laughing Squid, using ExpressionEngine). When doing this, I'd like to put up a photo of the art (with proper credit and link back to the original artist of course) but do so in a way that my own website isn't impacted by doing this.
For instance, most of my referrers come from hosting some of these images on my own server and people finding them via Google searches. That's swell and all, but I don't see any reason for my website to be getting hits about other artists work, you know?
So, 1) Is there particular code I should use to prevent Google from indexing the images while still indexing the text and posts? Or 2) should I host the pictures on some free or cheap service and then link to them on my blog?
The easiest way to remove all images from Google is to have a robots.txt file with the following lines:
User-agent: Googlebot-Image
Disallow: /
If you want to block all images from all search engines, you could put them all in a shared top-level directory like "images" and do:
User-agent: *
Disallow: /images/
posted by alkupe at 8:46 AM on July 30, 2010
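Combining both of alkupe's suggestions, a single robots.txt at the site root could read something like this (assuming the images all live under /images/):

```
# Keep Google Images away from everything on the site
User-agent: Googlebot-Image
Disallow: /

# Keep all compliant crawlers out of the image directory
User-agent: *
Disallow: /images/
```

Note this only affects well-behaved crawlers; it does nothing against a human visitor saving the file.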
I'm confused. You want to write about artists and work you like, but you don't want your website to get visited as a result? That seems very counterintuitive to me. Are you just worried about bandwidth? Or am I being particularly dense today?
posted by FlamingBore at 8:47 AM on July 30, 2010
Umm, I think the OP is talking about the images in an image search, not the writing on the website.
posted by alkupe at 8:49 AM on July 30, 2010
FlamingBore: I'm getting referrals for the IMAGES, not the writing, which I'd like to stop. Bandwidth is one of the issues, but the larger one is the referrals that come just from hosting the images myself.
posted by nomadicink at 8:52 AM on July 30, 2010
Depending on the underlying server (e.g. Apache, nginx, lighttpd, etc.) you can also prevent people from linking to the images directly by having the server check the referrer. This tutorial is a good place to start, I think, for Apache servers. As far as I can tell, you've got Apache through Laughing Squid, so you ought to be able to do this. Do both this and the robots.txt: not only will your images not show up in Google, but you'll also not have to worry as much about people linking directly to them on your site.
posted by mrg at 9:02 AM on July 30, 2010
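A minimal referrer check of the kind mrg describes might look like this in an .htaccess file, assuming Apache with mod_rewrite enabled and with example.com standing in for your own domain:

```
RewriteEngine On
# Let through requests with no referrer (direct visits, some privacy proxies)
RewriteCond %{HTTP_REFERER} !^$
# Let through requests referred from your own site (example.com is a placeholder)
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Refuse any other request for an image file
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```

This is only a sketch; referrer headers are easily spoofed and some browsers suppress them, so treat it as a deterrent rather than protection.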
The robots.txt file will prevent automated bots/spiders, sure, but to prevent human users from finding your images and linking to them regardless, you have to set up an htaccess file (*nix servers, all over Google) or a separate script (CGI Perl or PHP) that obfuscates the actual image directory location and has built-in hotlinking prevention.
posted by Ky at 9:25 AM on July 30, 2010
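A rough sketch of the script approach Ky mentions, with the image directory hidden behind opaque IDs and a referrer check. The directory, ID scheme, and domain here are all hypothetical, and a real version would also have to send the file with correct headers:

```python
# Serve images through a script so the real directory is never exposed,
# and refuse requests hotlinked from other sites. All names are placeholders.
import os
from urllib.parse import urlparse

IMAGE_DIR = "/var/www/private_images"        # not directly web-accessible
IMAGE_MAP = {"a1b2": "artist-photo.jpg"}     # opaque ID -> real filename
ALLOWED_HOSTS = {"example.com", "www.example.com"}

def resolve_image(image_id, referer):
    """Return the filesystem path to serve, or None to refuse the request."""
    # An empty referer is allowed: browsers omit it on direct visits.
    if referer:
        host = urlparse(referer).hostname
        if host not in ALLOWED_HOSTS:
            return None  # hotlinked from elsewhere
    name = IMAGE_MAP.get(image_id)
    if name is None:
        return None  # unknown ID
    return os.path.join(IMAGE_DIR, name)
```

As 256 notes further down, this still can't stop a visitor who sees the image from saving it; it only stops casual hotlinking and directory browsing.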
I have no tech background so I cannot begin to offer an answer, but I get the impression you are taking photos from someplace, putting them on your site, and then do not want others to take what you have taken (in addition to what is yours)... not sure what that suggests! Seems like you are taking and telling others not to do what you have done.
posted by Postroad at 9:40 AM on July 30, 2010
People have given you great advice on how to stop your images from being indexed and on how to stop people from hotlinking them (displaying them on their website with the files still loading from your server). There is no way, however, to prevent a user from downloading your images. The images are being sent over the Internet. When a user views the image, it's on their computer. If they want to, they can keep it and upload it to their own server and there's nothing you can do technologically to stop them. You can, of course, pursue legal action if you wish, but that's a different matter entirely.
For completion's sake, there is one thing you can sort of do. You can embed all of your images inside Flash files. But that will only really serve to make your website less usable, and people can still screenshot them or download the Flash file and unpack it using one of various tools.
posted by 256 at 9:41 AM on July 30, 2010
1. Use robots.txt
2. Don't make a copy of the image, just link to it in the original location. Include a visible link to view the original page it came from.
3. If you can see it, you can copy it (from a technical standpoint). Whether it's your photo or someone else's. You should accept this.
posted by blue_beetle at 9:49 AM on July 30, 2010
but I get the impression you are taking photos from some place , putting them at your site, and then do not want others to take what you have taken
I don't want Google searches for the images directing to my website, 'cause they aren't my images. Links to the actual post are fine of course, it's the Google referrals to the images that are annoying.
There is no way, however, to prevent a user from downloading your images
Yes, I understand that. At most I could prevent direct hotlinking for non-tech-savvy users, and people could still just take a screenshot.
posted by nomadicink at 10:25 AM on July 30, 2010
"just link to it in the original location"...
Please don't do that... you are stealing bandwidth. If you want people to view the image on the other site, link to the site, not the image.
posted by HuronBob at 11:17 AM on July 30, 2010
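The difference HuronBob is pointing at, with hypothetical URLs:

```
<!-- Hotlinking: the image file loads from (and bills) the other site's server -->
<img src="http://example.com/photos/art.jpg">

<!-- Linking: send your readers to the artist's page instead -->
<a href="http://example.com/gallery/art.html">See the photo on the artist's site</a>
```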
It depends on the other site, really. Some sites don't mind or actively prefer hotlinking.
Anyway, robots.txt plus a refer[r]er-based anti-hotlinking line in your Apache config seems like the sweet spot to me: it won't bother the users you want, but should prevent the larger abuses.
posted by hattifattener at 12:31 PM on July 30, 2010
This thread is closed to new comments.
You want a robots.txt file, per the Robots Exclusion Standard, placed at the root of your site, like http://www.nomadicink.com/robots.txt. If all of your images are in the subdirectory /images, your robots.txt file would look like:
User-agent: *
Disallow: /images
This tells all (*) of the Robots Exclusion Standard compliant web spiders to stay out (Disallow) of the path /images.
posted by adipocere at 8:42 AM on July 30, 2010