Sun Seeker – Seeing the Light with Augmented Reality

I am pleased to announce that my new app “Sun Seeker” was approved by Apple on the second attempt, 31 days after the initial submission, and is now available in the iTunes App Store. Note – As it requires use of a compass, it will only work on the iPhone 3GS.

Sun Seeker in App Store

I have recorded a brief video demo showing how it works.

This app shows you where the sun is now, and what path it takes through the sky, either for today or for any day of the year, for your current location.
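For those curious about what drives this display: the sun’s position can be computed from the date, time and location alone, using standard solar-position astronomy. Below is a minimal sketch of the kind of calculation involved, based on the well-known NOAA approximation equations; the function and all names are purely illustrative, not taken from the app itself.

```swift
import Foundation

// Approximate solar position (degrees) from date, time and location,
// using the NOAA simplified equations - accurate to a fraction of a
// degree, which is ample for an on-screen overlay.
func sunPosition(dayOfYear: Int, hourUTC: Double,
                 latitude: Double, longitude: Double) -> (azimuth: Double, elevation: Double) {
    let rad = Double.pi / 180
    // Fractional year in radians (longitude positive east)
    let g = 2 * Double.pi / 365 * (Double(dayOfYear) - 1 + (hourUTC - 12) / 24)

    // Equation of time (minutes) and solar declination (radians)
    let eqTime = 229.18 * (0.000075 + 0.001868 * cos(g) - 0.032077 * sin(g)
                 - 0.014615 * cos(2 * g) - 0.040849 * sin(2 * g))
    let decl = 0.006918 - 0.399912 * cos(g) + 0.070257 * sin(g)
             - 0.006758 * cos(2 * g) + 0.000907 * sin(2 * g)
             - 0.002697 * cos(3 * g) + 0.00148 * sin(3 * g)

    // True solar time (minutes) and hour angle (0° at solar noon)
    let solarTime = hourUTC * 60 + eqTime + 4 * longitude
    let hourAngle = solarTime / 4 - 180

    let lat = latitude * rad
    let ha = hourAngle * rad
    let elevation = asin(sin(lat) * sin(decl) + cos(lat) * cos(decl) * cos(ha)) / rad

    // Azimuth measured clockwise from true north
    var azimuth = 180 + atan2(sin(ha), cos(ha) * sin(lat) - tan(decl) * cos(lat)) / rad
    azimuth = azimuth.truncatingRemainder(dividingBy: 360)
    return (azimuth, elevation)
}

// Example: Sydney (33.87°S, 151.21°E) at 02:00 UTC on the June solstice -
// the sun sits low in the northern sky around local midday.
let pos = sunPosition(dayOfYear: 172, hourUTC: 2.0, latitude: -33.87, longitude: 151.21)
print(String(format: "azimuth %.1f°, elevation %.1f°", pos.azimuth, pos.elevation))
```

Sampling this function across the hours of a day gives the full day/night path arcs that the app draws.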

It has two main views.

  • A flat compass view
  • An augmented reality camera overlay view

It is valuable for real estate buyers (to find the sun and light exposure of any room throughout the course of the year), for gardeners and landscapers (to find hours of sun exposure for any location in the garden), for photographers (to find when the sun will be shining at the right angle), and for anyone interested in daily variations of rise and set times of the sun.

Sun Seeker

The above shot shows the opening view – which displays the sun’s day/night path segments using the flat compass. Typically you would hold the iPhone horizontally in your hand, and you can then easily see the directions of the rise point and set point, and the direction the sun is in right now (the yellow triangle). The other information displayed here is:

  • Current latitude and longitude (from built-in GPS)
  • How long since the sun rose, and until it sets; or if at night, how long since it set and how long until it rises
  • The sun’s heading (azimuth) angle and elevation. If you watch these you will see them ticking over as the sun moves.
  • Shadow ratio (the length of a shadow compared with the vertical height of the object casting it) and path length (the number of atmospheric thicknesses through which the sunlight has traveled).
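
Both of those last two quantities fall straight out of the sun’s elevation angle. Here is a minimal sketch of the underlying trigonometry, assuming a flat atmosphere for the path length; these are illustrative formulas rather than necessarily the exact ones the app uses.

```swift
import Foundation

// Shadow ratio: shadow length relative to the height of the vertical
// object casting it. Path length ("air mass"): how many atmospheric
// thicknesses the sunlight traverses before reaching the ground.
func shadowRatio(elevationDegrees: Double) -> Double {
    1 / tan(elevationDegrees * .pi / 180)
}

func pathLength(elevationDegrees: Double) -> Double {
    // Flat-atmosphere approximation; a refraction-corrected formula
    // would be needed for accuracy very near the horizon.
    1 / sin(elevationDegrees * .pi / 180)
}

// With the sun 30° above the horizon, a flagpole casts a shadow about
// 1.73x its height, and sunlight passes through 2 atmospheric thicknesses.
print(shadowRatio(elevationDegrees: 30))  // ≈ 1.732
print(pathLength(elevationDegrees: 30))   // = 2.0
```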

Tapping the camera icon changes the app into its augmented reality overlay view.

Sun Seeker AR View

The types of information you see here are:

  • If the sun is not already in view, then a pointer showing which direction to turn towards to find the sun
  • The current heading (azimuth) and elevation of the centre of your camera view
  • The sun’s current position and its opposite shadow point
  • The sun’s path throughout today with hour positions marked – including the nighttime segment below the horizon
  • Optionally also the sun’s path on the shortest day of the year (in blue) and on the longest day of the year (in red)
  • Grid lines of equal heading (purple for cardinal compass directions E/S/W/N and red for others) and elevation (blue)
  • The horizon line (green)
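
In case you are wondering how an overlay like this works in principle: every item above is defined by a heading and an elevation, and drawing it is just a matter of mapping those two angles into the camera’s field of view. Below is a minimal pinhole-camera sketch that ignores device roll; all names here are mine, not the app’s internals.

```swift
import CoreGraphics
import Foundation

// Map a world direction (heading/elevation, degrees) to a screen point,
// given the camera's pointing direction and field of view. Returns nil
// when the direction lies outside the camera's current "window".
func screenPoint(objectHeading: Double, objectElevation: Double,
                 cameraHeading: Double, cameraElevation: Double,
                 horizontalFOV: Double, verticalFOV: Double,
                 screenSize: CGSize) -> CGPoint? {
    // Signed angular offsets from the view centre, wrapped to ±180°
    var dh = (objectHeading - cameraHeading).truncatingRemainder(dividingBy: 360)
    if dh > 180 { dh -= 360 } else if dh < -180 { dh += 360 }
    let dv = objectElevation - cameraElevation

    guard abs(dh) < horizontalFOV / 2, abs(dv) < verticalFOV / 2 else { return nil }

    // Pinhole projection: tangent of the offset, scaled to pixels
    let rad = Double.pi / 180
    let x = screenSize.width / 2 * CGFloat(1 + tan(dh * rad) / tan(horizontalFOV / 2 * rad))
    let y = screenSize.height / 2 * CGFloat(1 - tan(dv * rad) / tan(verticalFOV / 2 * rad))
    return CGPoint(x: x, y: y)
}
```

When the function returns nil the target is off-screen, which is exactly the case where the app shows a directional pointer instead.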

You may find this especially valuable if you look towards the rise and set points near a room’s window or on a balcony. You can then see the range of directions through which the sun rises (or sets), and therefore when it will shine through that window or onto that balcony, and for roughly how many hours at different times of the year.

Further details you can obtain are shown in the following view.

Further Details

So you can see that this app uses augmented reality a little differently from most other newly released apps, and it can provide genuinely valuable information that is not easily available by any other means. It effectively turns your iPhone into an advanced sun tracking device.

I created this app because I was in the process of buying property myself, and it was just what I needed. I hope that some of you might also find it useful, as well as fun to use and to show off your iPhone!

* * *

Note – Sun Seeker is now available for Android! (March 2012)

https://market.android.com/details?id=com.ajnaware.sunseeker

* * *

Geotagging and Augmented Reality – New Standards Needed?

Augmented reality (AR) apps are generating a lot of buzz at the moment, and they do offer some exciting possibilities. In fact I have already created an iPhone app using certain AR techniques myself, and intend to submit it to the app store as soon as Apple will permit it. More later. 😉

However, whilst considering some of the implications of using AR, I was surprised to find that current geotagging standards don’t seem to be on a par with what AR technology permits. There is definitely scope for extending the current standards, and it seems very likely to me that demand for this will grow rapidly from now on.

In its simplest form, geotagging is a means of tagging a piece of data (such as a photo) with a latitude and longitude, thus allowing the data to be mapped or accessed via locational search.

An implicit assumption (given that it is called geotagging) is that the data is associated with a point on the earth’s surface, whatever the altitude of the surface might be at that location. In that case an altitude is not strictly required, because the earth’s topography can be assumed to be reasonably constant and knowable via other sources. But what about the case, for example, when a photo is taken from an aeroplane? Altitude would then be an additional parameter required to correctly differentiate it from a photo taken on the earth’s surface. Given that GPS systems typically record altitude, it is hardly a stretch to include it as a standard additional tag item.

And going further still, given that augmented reality devices measure the spatial attitude of the camera, would it not make sense to enable any photo to be tagged with the heading (aka azimuth) and elevation (ie. local horizon coordinates) at which it was taken, and further also with the camera tilt (eg. whether portrait or landscape relative to the horizon, or any angle in between), and with the angle subtended by the camera shot, both vertically and horizontally? All of this extra geotagging information would be necessary in order to correctly reproduce/model the exact “window” into space that the photo represents. Whilst this additional data would be unnecessary for common scenic or portrait photography, it could be quite valuable in other situations. For example, two such fully (and accurately) geotagged photos taken of the same object from different locations would allow a 3-D reconstruction of that object to be created. This is not a trivial implication!
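
To illustrate what that “window” means in practice, here is a sketch of how such tags would let software recover the world-space direction of any pixel in a photo. It uses a small-angle approximation that is adequate for modest fields of view, and the coordinate conventions are my own assumptions, not part of any existing standard.

```swift
import Foundation

// East-North-Up direction vector: x = east, y = north, z = up
struct Vector3 { var x, y, z: Double }

// World-space ray through a pixel of a fully geotagged photo.
// (u, v) locate the pixel as a fraction of the frame, with the centre
// at (0, 0), u positive to the right and v positive upward.
func pixelRay(heading: Double, elevation: Double, tilt: Double,
              horizontalFOV: Double, verticalFOV: Double,
              u: Double, v: Double) -> Vector3 {
    let rad = Double.pi / 180
    // Angular offset of the pixel from the camera axis, rotated by the
    // tilt angle (small-angle approximation)
    let du = u * horizontalFOV / 2
    let dv = v * verticalFOV / 2
    let offH = du * cos(tilt * rad) - dv * sin(tilt * rad)
    let offV = du * sin(tilt * rad) + dv * cos(tilt * rad)

    // Unit vector for the resulting heading/elevation (heading measured
    // clockwise from north, elevation upward from the horizontal)
    let az = (heading + offH) * rad
    let el = (elevation + offV) * rad
    return Vector3(x: sin(az) * cos(el), y: cos(az) * cos(el), z: sin(el))
}
```

Two such rays from photos taken at known locations can then be intersected, at least in a least-squares sense, to place the photographed object in three dimensions.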

One of the existing geotagging standards (FlickrFly) also allows an additional item to be specified, namely the distance or range from the camera to the subject of the photo, which presumably is necessary when a locational search is being done for the subject of the photo rather than for the location from which the photo was taken.

In order to avoid missing out on the possibilities that AR apps can already provide, and to be able to start constructing better, more powerful geolocational systems and applications which take advantage of this, I would propose that existing geotagging standards be extended to include all of the following:

  • date, time, timezone offset (to provide instant in time)
  • geographic latitude, longitude and altitude (to provide location relative to earth’s surface and mean sea level)
  • camera viewpoint central heading (0°=N, 90°=E, 180°=S, 270°=W)
  • camera viewpoint central elevation (0°=horizontal, -90°=vertical downwards, +90°=vertical upwards)
  • camera tilt (0°=portrait/upright, 90°=landscape/top points left, 180°=landscape/top points down, 270°=portrait/top points down)
  • camera angle subtended vertically (ie. along nominal top to bottom)
  • camera angle subtended horizontally (ie. along nominal side to side)
  • range of point of interest from camera (if there is a relevant POI involved)
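
To make the proposal concrete, here is one possible representation of the full tag set, eg. as a structure that could be serialised to a JSON sidecar alongside a photo. The field names are purely illustrative; no such standard currently exists.

```swift
import Foundation

// One possible encoding of the proposed extended geotag.
// Field names are illustrative only, not an existing standard.
struct ExtendedGeotag: Codable {
    // Instant in time
    var date: String                 // eg. "2009-09-01"
    var time: String                 // eg. "14:30:05"
    var timezoneOffsetMinutes: Int

    // Location relative to the earth's surface and mean sea level
    var latitude: Double             // degrees, +N / -S
    var longitude: Double            // degrees, +E / -W
    var altitudeMetres: Double

    // Camera viewpoint
    var headingDegrees: Double       // 0=N, 90=E, 180=S, 270=W
    var elevationDegrees: Double     // 0=horizontal, +90=straight up
    var tiltDegrees: Double          // 0=portrait upright, 90=landscape

    // Angles subtended by the shot
    var verticalFOVDegrees: Double
    var horizontalFOVDegrees: Double

    // Range from camera to point of interest, if any (as in FlickrFly)
    var rangeMetres: Double?
}

// A record like this can be written out with Foundation's JSONEncoder,
// giving a compact, self-describing sidecar for AR-aware applications.
```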

I would welcome feedback on this proposal from anyone knowledgeable about the current state of geotagging. I am certainly not an expert in this area, but for those who have the capability to influence these standards, I believe there may be opportunities for valuable advances, especially in relation to the new generation of AR technology.