Look farther to get closer
June 16, 2008 10:30 PM   Subscribe

Photostellar Geolocation - Assuming a 10X optical zoom camera, comprehensive star charts and a carpenter's level to determine "straight up", what global positioning accuracy could I achieve?

Am I reinventing an existing technique? I know the scale of the answer has to be finer than "continental," because you can reckon that much with the naked eye.

Ideally this would wind up right in my Canon's firmware, where it could be coupled with the trick that builds high-res images from multiple short exposures, to avoid star streaking from long exposures. Then it could geo-tag my photos until the next time I point skyward.

Any related astronomy hacks welcome. Webcam and a compass?
posted by jayCampbell to Science & Nature (13 answers total) 3 users marked this as a favorite
 
You are recreating the sea navigation methods of our ancestors. I suggest reading up on how sextants and celestial navigation work.
posted by Ironmouth at 10:51 PM on June 16, 2008


Best answer: There are about 69 miles per degree of latitude on the earth's surface. So I'd think that if you can get the combined inaccuracy of your observations and charts below one degree (i.e., give or take half a degree), you've narrowed your position down to roughly 70 miles.

If you can get it to 2 degrees, you've got a 140-mile ballpark, and so on.
posted by -harlequin- at 11:16 PM on June 16, 2008


Assuming a 10X optical zoom camera, comprehensive star charts and a carpenter's level to determine "straight up", what global positioning accuracy could I achieve?

With much less equipment than that you could determine your latitude accurate to a fraction of a degree if you are in the northern hemisphere. In the southern hemisphere it's a bit more tricky but not impossible.

And with all of that you could not determine your longitude at all. Not possible without something more.

In order to determine your longitude you need a clock which keeps sidereal time which is set to the time at a location whose longitude is known accurately. Then you use your instruments to determine local sidereal time, and the difference between the two gives you your longitude.
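The time-difference-to-angle arithmetic described above is simple; here's a minimal Python sketch (the function name and example times are mine, not from the thread):

```python
def longitude_from_sidereal(local_lst_hours, greenwich_lst_hours):
    """Longitude in degrees (east positive) from the difference between
    local and Greenwich sidereal time; one hour of time is 15 degrees."""
    diff_deg = (local_lst_hours - greenwich_lst_hours) * 15.0
    # Wrap into [-180, 180) so 22 hours ahead reads as 2 hours behind.
    return (diff_deg + 180.0) % 360.0 - 180.0

print(longitude_from_sidereal(14.0, 12.0))  # 2 h ahead of Greenwich: 30.0 (east)
print(longitude_from_sidereal(23.0, 1.0))   # 2 h behind Greenwich: -30.0 (west)
```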

...or you could just buy a GPS receiver.

Determining longitude was the holy grail of astronomical navigation in the old days. And the reason it was difficult was that it could only be done with the aid of accurate clocks. The best clocks in the world used pendulums, but those couldn't be used on a pitching ship. So it wasn't until really good, uniform steel became available, making reliable springs for wind-up clocks possible, that longitude could be determined at sea.

That was why the "trade winds" were so important historically during the age of American plunder. There was a latitude where the prevailing wind was to the west. A ship would tack and maneuver south until it was at that latitude, put up its square sails, and sail west until it reached land in the Americas. When the time came to return, there was another latitude where the prevailing wind was to the east. The ship would sail north to that latitude and then let the wind blow it home again to Europe. To do that, all you needed was to determine latitude, which was easy on a clear night if you had a sextant. You didn't need to determine your longitude.
posted by Class Goat at 11:27 PM on June 16, 2008 [4 favorites]


There are some other ways to do it that don't require accurate clocks, but they are rather painful to use. This article talks about some of them.

The history of the Longitude Prize is interesting, too.
posted by Class Goat at 11:43 PM on June 16, 2008 [1 favorite]


In order to determine your longitude you need a clock which keeps sidereal time

Might as well use the Canon's internal clock for this, saving the addition of a further instrument.
posted by -harlequin- at 11:48 PM on June 16, 2008


Well, you could, but it's more difficult. Presumably the camera's internal clock is keeping solar time, not sidereal time. A sidereal day is shorter than a solar day by 3 minutes 56 seconds. So figuring out the sidereal time is a hassle.

A sidereal day is the length of time it takes the earth to rotate 360 degrees. A solar day is the length of time it takes the earth to rotate just a tadge less than 361 degrees. A different way to put it is that there's one more sidereal day per year than there are solar days.

And the calculation doesn't work if you use solar time; you have to use sidereal time for your reference because that's the only thing you can easily determine locally to the accuracy you need.

It is possible to convert solar time to sidereal time if you know the date, but it's a complex calculation.
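If modest accuracy is enough, the conversion is less painful than it sounds; here's a sketch using the standard USNO low-precision approximation (good to a fraction of a second over a few decades around J2000; the function name is mine):

```python
def gmst_hours(jd_ut):
    """Approximate Greenwich Mean Sidereal Time, in hours, for a given
    Julian Date in UT, via the USNO low-precision formula
    GMST = 18.697374558 + 24.06570982441908 * D, D = days since J2000.0."""
    d = jd_ut - 2451545.0
    return (18.697374558 + 24.06570982441908 * d) % 24.0

# The coefficient encodes the solar/sidereal offset: each solar day the
# sidereal clock gains 0.06570982... hours, about 3 min 56.6 s, matching
# the figure quoted above.
gain_s = (24.06570982441908 % 24.0) * 3600.0
print(round(gain_s, 1))  # 236.6
```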
posted by Class Goat at 1:40 AM on June 17, 2008


Best answer: On preview: Class Goat, your almanac takes care of the conversion between solar and sidereal time. Note the illustrated pages show GHA (Greenwich Hour Angle) as a function of UT (effectively GMT or Mean Solar Time @ Greenwich.)

My totally WAG take on the original question: The short answer is, this would be a real project.

Disclaimer: I have no real qualifications in anything related to optical engineering or numerical methods, so this answer may be completely full of shite. My gut tells me it might be good within an order of magnitude or so, but my gut is also telling me my gut might be a bit optimistic.

I don't know much about digital camera firmware, but I do know a little about how to solve the problem of geolocation from sky shots (well, as much as anybody with an astronomy course or two under his belt, I guess), and maybe somebody better with the technical stuff can tell you how realistic it would be to try this with what you've got. In other words, to use the classic mathematics-text blow-off, we'll leave the details as an exercise for the reader.

I'm going to assume you know the basics of celestial navigation -- if not, the Wikipedia article is a good place to start.

For starters, you need to get star charts into your firmware (to identify stars in your shot; navigators do this by knowing the sky), a set of ephemerides for your reference bodies (a/k/a an Almanac), and a way to determine the time at your almanac's reference meridian (usually Greenwich, England). The good news is your star charts don't have to be that detailed, as there aren't that many reference stars for celestial navigation; even if you go down to magnitude 4 you're still under 1,000 stars for the whole celestial sphere.

Here is what your algorithm needs to do:

You need to be able to identify stars* within your shot as well as have horizon or zenith information so you can measure their apparent elevation. Your computer will use star charts and look for a best fit to your shot. This is doable, but not trivial, as there are some gnarly coordinate transformations as well as varying parameters to get a best fit. You need to have a handle on your camera/lens combo's distortions as well to get this to work out.

Take the apparent elevation of these reference points and the time, and use these in conjunction with the Almanac to determine your position on the globe. People used to do this with pencil and paper, so I have no doubt this would be a cakewalk for a computer. The math is a bit wonky, but fortunately it has been worked out for years.
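The core of that pencil-and-paper math is the classic sight-reduction formula; a minimal sketch in Python (the clamping and the function name are mine):

```python
import math

def computed_altitude(lat_deg, dec_deg, lha_deg):
    """Altitude a body *should* have, seen from an assumed latitude, given
    its declination and local hour angle:
        sin(Hc) = sin(lat)sin(dec) + cos(lat)cos(dec)cos(LHA)
    Comparing Hc against the measured altitude gives the intercept used
    to refine a celestial fix."""
    lat, dec, lha = (math.radians(x) for x in (lat_deg, dec_deg, lha_deg))
    sin_hc = (math.sin(lat) * math.sin(dec)
              + math.cos(lat) * math.cos(dec) * math.cos(lha))
    # Guard against rounding nudging the value just past +/-1.
    return math.degrees(math.asin(max(-1.0, min(1.0, sin_hc))))

# A star whose declination equals your latitude passes straight overhead:
print(computed_altitude(45.0, 45.0, 0.0))  # 90.0
```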

Okay, now how accurate is this going to be? Here are a few sources of error off the top of my head:

First, obviously, is the accuracy of your time source. An error of one second equates to one nautical mile of longitude at the equator.

Next, accuracy of elevation. Assume your algorithm can unambiguously identify the standard reference stars (they're far enough apart in the sky that if it can't, I doubt you'd get any kind of answer). Also assume you have radial (pincushion/barrel/moustache) distortions accounted for, as those should be pretty easy to deal with numerically. Anyway, one degree of arc equates to 60 nautical miles of latitude. Making the not-entirely-unreasonable assumption that your camera offers resolution comparable to the human eye (that is, on the order of one minute of arc), that puts your accuracy on the order of a mile. If you can capture horizon data separately, you could use a higher-magnification lens and increase resolution, while a wider-angle lens would decrease the precision of your elevation information. There will be a lower limit to resolution due to diffraction, but for a typical DSLR lens that should be an order of magnitude or two below that arc-minute. IRL you'd run into seeing issues (that is, issues due to atmospheric inhomogeneities) before you hit that anyway, I think. There are ways of ameliorating the seeing issues, but not really with a DSLR and a single picture.
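Two of the numbers in that error budget are easy to sanity-check; a quick sketch (the 40 mm clear aperture is an assumed figure for a typical DSLR lens, not from the thread):

```python
import math

def diffraction_limit_arcsec(aperture_m, wavelength_m=550e-9):
    """Rayleigh criterion: smallest resolvable angle for a circular
    aperture, in arcseconds, at a given wavelength."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600.0

# One arcminute of latitude is one nautical mile, so a ~1-arcmin camera
# gives a ~1-mile fix. A lens with a 40 mm clear aperture resolves:
print(round(diffraction_limit_arcsec(0.04), 1))  # ~3.5 arcsec, well under 1 arcmin
```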

An additional source of elevation error relates to locating the centers of your reference stars; these will not resolve as points on your sensor, so you'll need to do some kind of smart averaging, taking into account coma, astigmatism, and spherical aberration if these are significant for your lens. You'll probably need to tweak the exposure algorithm to optimize this.
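A minimal sketch of the "smart averaging" idea, assuming a plain intensity-weighted centroid with no background subtraction or aberration handling (a real pipeline would need both; the example patch values are made up):

```python
def star_centroid(pixels):
    """Intensity-weighted centroid of a small patch containing one star.
    pixels is a 2-D list of brightness values; returns (row, col) with
    sub-pixel precision."""
    total = wr = wc = 0.0
    for r, row in enumerate(pixels):
        for c, v in enumerate(row):
            total += v
            wr += r * v   # row coordinate weighted by brightness
            wc += c * v   # column coordinate weighted by brightness
    return wr / total, wc / total

# A star whose light spills slightly to the right of the patch center:
patch = [[0, 1, 0],
         [1, 4, 3],
         [0, 1, 0]]
print(star_centroid(patch))  # (1.0, 1.2)
```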

Now, all those errors are for individual star fixes; I'd send you to a statistician to tell you if they'd be expected to add or cancel each other out (I don't know.) Supposedly the Air Force had an automated system that was good to within 300 feet, with 11 star sights. That's about 3 seconds of arc and I'd guess that's close to the theoretical limit. My guess is, with a perfect clock but a typical digital camera and typical seeing conditions you'd be lucky to get consistently within a mile without serious, serious work on getting all the kinks out and probably 5 to 10 miles is closer to reasonable, again for the setup we're talking about.

For the purpose of comparison, 5-10 miles is close to, but not as good as, what a trained, experienced celestial navigator should be able to repeatably achieve with a quality sextant and an accurate time source. But keep in mind they're using very different equipment, designed to solve that specific problem.

----------------------------------
*Planets are tougher, but you can identify them and calculate their altitude if you build dynamic star charts using your base chart in conjunction with planet ephemerides and date info (probably just knowing the date is good enough for planets.)

The Moon adds a couple of complications. If it's up, you can compute the angle of the Moon to other celestial bodies and in theory you should be able to throw away your clock (you'd use a fitting algorithm in conjunction with the moon ephemeris to find the time.) This is notoriously not that accurate with traditional equipment but I suspect you could do a pretty good job with digital sky shots and a computer doing the heavy lifting. The bad news is the brightness of the Moon introduces exposure difficulties -- it's going to be something like 10^5 to 10^6 times as bright as your reference stars and planets and I don't know how easy it is for your real-life camera to deal with that. The techies should. Also, of course, the moon is going to be so big in the image that you'll need to do something pretty smart to locate its center. And thinking about it some more, even a "pretty good job" would probably not be all that great compared to a typical quartz watch.

Things would be different if you had a serious astronomical surveying setup, but IRL even a crap clock will be better than I think you'll be able to get by looking at the Moon's position relative to the stars with a DSLR.

posted by Opposite George at 2:55 AM on June 17, 2008


Ah hell -- yah know what? I forgot about the carpenter's level thing. Obviously that's going to be a major limit (are they good to within even half a degree? I have no idea.) If you could figure out a way to rig an artificial horizon that might be better.

Oh, and another limit on your precision is going to be the difference between geocentric coordinates (which assume the Earth is a sphere, and which I think is what celestial navigation gives you) and geodetic coordinates, which describe your position on the surface of the real, non-spherical Earth. I think these are overshadowed by your other errors, but if the Air Force is getting to within 300 feet they're making an adjustment somewhere.
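That difference is easy to quantify with the standard ellipsoid relation; a sketch using WGS84 (the constant and function name are mine):

```python
import math

WGS84_E2 = 0.00669437999014  # first eccentricity squared of the WGS84 ellipsoid

def geocentric_from_geodetic(lat_deg):
    """Geocentric latitude (what a spherical-Earth celestial fix yields)
    from geodetic latitude (what maps and GPS report), ignoring observer
    altitude: tan(phi_c) = (1 - e^2) * tan(phi_d)."""
    phi = math.radians(lat_deg)
    return math.degrees(math.atan((1.0 - WGS84_E2) * math.tan(phi)))

# The discrepancy peaks near 45 degrees latitude:
diff_arcmin = (45.0 - geocentric_from_geodetic(45.0)) * 60.0
print(round(diff_arcmin, 1))  # 11.5
```

At mid-latitudes the offset reaches about 11.5 arcminutes, i.e. roughly 11 nautical miles, so it's actually comparable to the 5-10 mile estimate above rather than negligible.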
posted by Opposite George at 3:22 AM on June 17, 2008


Tycho is considered the greatest naked-eye astronomer in history. Using enormous (house-sized) quadrants, he achieved about 3 arc-minutes of accuracy under poor conditions (his observatory was on a very cloudy island). You'd have optics, which puts you ahead in that respect, but unless you had a very steady and precisely calibrated mount for the camera, I suspect you'd be at a net disadvantage. A carpenter's level wouldn't cut it—eyeballing off that, you'd be lucky to get within 5°. Even if you bodged a camera mount onto a nice machinist's protractor, you'd have a hard time getting readings accurate beyond about 0.5°.
posted by adamrice at 7:04 AM on June 17, 2008


adamrice, can you explain that statement? A carpenter's level is more accurate than 5°; more like 0.5° (the best are five ten-thousandths of an inch (0.0005) per inch, or about 0.03°).
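A quick check of that slope spec, converting rise-per-run to an angle:

```python
import math

# 0.0005 inch of rise per inch of run, converted to an angle in degrees:
slope = 0.0005
angle_deg = math.degrees(math.atan(slope))
print(round(angle_deg, 3))  # 0.029
```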
posted by IAmBroom at 7:46 AM on June 17, 2008


IAmBroom, I can try. It's not clear what kind of rig the OP is proposing, but if he just had a camera, a protractor left over from high-school geometry class (which he didn't mention, but I'll give him one anyhow), and a level, and was holding these together with chewing gum and baling wire, the precision of the level is not going to be the limiting factor in the overall precision of the rig. If you've got vertical figured out to a fare-thee-well but are just eyeballing the angular deviation from vertical, you'll be lucky to get 5° of precision.

Now, if he's got the camera clamped to the level, and is only sighting whatever stars happen to be straight up, that changes things. But if I understand correctly, you usually measure the altitude of a known star to get your latitude, rather than looking straight up at 90° and then figuring out what star you're seeing.
posted by adamrice at 8:17 AM on June 17, 2008


Response by poster: Awesome, thank you everybody.
posted by jayCampbell at 11:46 AM on June 17, 2008


On reread, a correction: a time error of four seconds equals one nautical mile at the equator. I guess that helps a bit (okay, not really.)
posted by Opposite George at 7:54 PM on June 17, 2008


This thread is closed to new comments.