Cambridge scientists have developed photo-recognition software that matches pictures snapped with cameraphones against a database of building photographs on a remote server, helping phone users figure out where in the hell they are standing. Of course, at the moment it only works in a small area, but with location accuracy down to one meter, the developers hope to parlay their work into a commercial service. I've developed a similar service, wherein the user snaps a small image of a street sign and I tell them what corner they're standing on.
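The core idea, stripped of the computer vision, is just nearest-neighbour retrieval: extract a feature vector from the phone photo, find the closest match in a database of pre-surveyed building images, and return that building's location. Here's a minimal sketch of that lookup step; the feature vectors, landmark names, and coordinates below are all made up for illustration, and a real system would use proper image descriptors rather than three-number toy vectors:

```python
import math

# Hypothetical server-side database: each building image is reduced
# offline to a feature vector (here a toy 3-number "histogram") and
# stored alongside its surveyed location. Names and coordinates are
# illustrative placeholders, not real survey data.
DATABASE = {
    (0.9, 0.1, 0.3): ("King's College Chapel", 52.2044, 0.1166),
    (0.2, 0.8, 0.5): ("Senate House", 52.2053, 0.1180),
    (0.4, 0.4, 0.9): ("Great St Mary's", 52.2050, 0.1187),
}

def locate(query):
    """Return the name and location stored for the database image
    whose feature vector is nearest (Euclidean) to the query."""
    best = min(DATABASE, key=lambda ref: math.dist(query, ref))
    return DATABASE[best]

# A phone photo whose features resemble the first database entry:
print(locate((0.85, 0.15, 0.35)))
# -> ("King's College Chapel", 52.2044, 0.1166)
```

The hard part the Cambridge group actually solved is upstream of this lookup: producing features robust enough that two photos of the same building, taken from different angles and in different light, land near each other in feature space.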