Sun Seeker – Finding a Window’s Sunlight Exposure


This article explains how you can use the Sun Seeker iOS app on-site to get valuable information about a property’s sunlight exposure.

Whether you are interested in the solar exposure of your existing home or office, or want to understand more about the sunlight availability of a property you are thinking of renting, buying or moving into, you can easily assess each room or window of interest using Sun Seeker, by following these steps.

Firstly, stand by the window you are interested in, make sure the Sun Seeker app is running, then tap on the 3D View button to open the augmented reality camera view (shown above). You can use the app either in portrait or landscape mode. I am using landscape in this case because it allows me to show more points of interest for this article, but you can choose whichever helps you see more of the relevant parts of the sun’s path.

Due to the inherently limited field of view of the device’s camera, you may need to move it around to see the entire range of view through the window. In this particular example, however, we will start by looking towards the position of the sun at sunrise and noting the features of interest there.

In this case I have in fact taken a screenshot at precisely the time of sunrise on this day (12th May). Choosing a time at which the sun is actually visible from the window is helpful because it allows us to confirm that the displayed sun position and path are aligned with the true direction of the sun, i.e. that we have an accurately calibrated heading. Of course this will not always be possible, so there is a detailed explanation below of various other ways of calibrating the heading if needed; for now we will continue on the assumption that we already have the best available heading calibration (as we obviously do in the sample screenshot above, given that the sun icon aligns exactly with the actual sun).

What we can see in the above screenshot is that today the sun is rising at about 6:50am and will follow the yellow path line, disappearing behind the building at about 9:20am – this means that today we could get a maximum, on a cloudless day, of about 2.5 hours of direct sunlight.

By contrast, the blue path line shows the path at winter solstice, which is limited to a maximum of about 2.2 hrs of direct sunlight.

The green line shows the path at equinox (and therefore intersects the horizon roughly due east). This path extends well above the top of this particular screenshot, so you would need to point the camera upwards to see its full extent, as I did here to obtain the following screenshot.

Here we can see that on the green equinox path line the sun rises at around 7am and remains out until just after 12pm, so it provides a maximum of just over 5 hours of direct sunlight.

Following the red line (summer solstice) in a similar fashion shows the sun rising just before 6am and also disappearing at about 12pm, so it provides a maximum of about 6 hours of direct sunlight.

What further conclusions could we draw?

This window gets morning sunlight year-round. Nice! However, in mid-winter it receives a maximum of only just over 2 hours (on sunny days), and then only at low solar elevation (disappearing behind the building when it has only risen to about 20 degrees elevation). This means that this room might be somewhat cool in winter – and especially so when the morning is cloudy. In spring, summer or autumn/fall though, it receives quite copious amounts of sunshine and you might need shades if you aren’t a full-on sun lover, or want to protect items in the room from UV exposure.

How to Calibrate the Heading

As you can see from the above analysis of this window’s sunlight, having an accurate heading on your device is important: an inaccurate heading can significantly distort the projected timing of the sun’s emergence from, or disappearance into, shade.

Sun Seeker calculates the position of the sun very accurately – to better than one second of arc – but the device’s compass may not be very precise, can be significantly affected by any surrounding magnetic interference, and may need “massaging” to bring it to optimum accuracy.

Therefore I strongly recommend that, if you are doing anything more than getting a rough overview of solar exposure, rather than relying on the compass being accurate, you use one of the following methods instead.

  1. Set the heading manually from the actual sun’s position (only possible if the sun is visible from your current position, of course). To do this, first switch the 3D view into gyroscope-only mode (by tapping on the compass icon in the toolbar), then manually drag the screen until the sun icon aligns with the actual visible sun. Once you have done this, the heading will likely only drift very slowly out of correct alignment – just recheck and re-align it in the same way from time to time, as needed, to maintain accuracy.
  2. Set the heading manually from the known position of a visible landmark, by using Sun Seeker’s Reference Azimuth feature. For a full explanation of how to use this feature see this earlier blog post: https://ajnaware.wordpress.com/2014/08/19/sun-seeker-how-to-use-the-new-azimuth-calibration-feature/

If you must rely on the compass because these other methods are not feasible in your current circumstances (i.e. sun not visible, and no known landmarks are visible), then I strongly recommend that you perform a compass calibration manoeuvre instead, to get the best possible compass performance and accuracy. This involves rotating your device a few times around 3 different axes, while an app that uses the compass (such as Sun Seeker) is open on your device. See the following video for a demonstration.


Sun Seeker Sizzles!

The Sun Seeker app has continued to be enhanced and honed in a series of updates, the most recent being v2.8, which includes a modernised interface – yep, those are indeed flat button faces. 😉

Sun Seeker v2.8 Compass Screen

These updates seem to have attracted a bit of positive attention resulting in several new blog reviews, culminating in the following fabulous review from top-notch reviewer John Martellaro (@jmartellaro) of The Mac Observer. This is a must-read review, not just because it is positively glowing, but also because John saw fit to include my detailed answers to his probing questions, including some inside information on how the app works, and especially importantly on how to ensure optimum compass calibration of your device.

Just to show a little more of the app here, my own favorite feature update is the ability to select date and time on the map view via a scroller.

Sun Seeker Map View

Sun Seeker Update v1.5

As part of a series of planned updates for the Sun Seeker augmented reality iPhone app, the latest update v1.5 has just been approved by Apple.

For a video demo of Sun Seeker see this earlier blog post.

The main changes for v1.5 are:

  • Enhanced performance for smoother compass dial rotation
  • Added new table of annual rise and set times
  • Added new table of sun’s daily azimuth and elevation
  • Added tap action to rise/set label to see local times instead of intervals
  • Better handling of disabled Location Services
  • More efficient use of GPS – now only used briefly on startup
  • Fixed some minor bugs and issues with date changes

[Note – v1.5.1 update has been submitted to fix OS3.0 backward compatibility and missing sunset times for some locations west of GMT.]

Following is a more detailed description of some of these items.

1. Enhanced Compass Dial Rotation

The compass dial in Sun Seeker includes text which retains its orientation relative to the device regardless of the compass rotation, and this means that at least part of the image needs to be re-rendered for each incremental rotation of that dial as the compass rotates. Previously the whole image was being redrawn each time, and that performance hit meant that the dial motion was quite jerky when it needed to make large rotational changes. The new implementation involves much less redrawing, and hence allows the compass to be much more responsive.
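For illustration, here is a minimal Swift/Core Animation sketch of this kind of layer-based approach (my own reconstruction, not the app’s actual code): the dial rotates via a single layer transform, and each label layer is counter-rotated so its text stays upright, with no per-frame redrawing.

```swift
import UIKit

// Sketch: rotate the dial as one layer transform per heading update,
// and counter-rotate the text sublayers so they stay upright on screen.
final class CompassDialView: UIView {
    private let dialLayer = CALayer()
    private var labelLayers: [CATextLayer] = []

    override init(frame: CGRect) {
        super.init(frame: frame)
        dialLayer.frame = bounds
        dialLayer.contents = UIImage(named: "dial")?.cgImage  // pre-rendered dial artwork
        layer.addSublayer(dialLayer)

        // One text layer per cardinal point, positioned around the dial.
        for (index, text) in ["N", "E", "S", "W"].enumerated() {
            let label = CATextLayer()
            label.string = text
            label.fontSize = 16
            label.alignmentMode = .center
            label.contentsScale = UIScreen.main.scale
            label.frame = CGRect(x: 0, y: 0, width: 24, height: 20)
            let angle = Double(index) * .pi / 2
            let radius = Double(bounds.width) / 2 - 20
            label.position = CGPoint(x: Double(bounds.midX) + radius * sin(angle),
                                     y: Double(bounds.midY) - radius * cos(angle))
            dialLayer.addSublayer(label)
            labelLayers.append(label)
        }
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not supported") }

    // Called on each heading update (heading in radians, clockwise from north).
    func setHeading(_ heading: CGFloat) {
        CATransaction.begin()
        CATransaction.setDisableActions(true)  // avoid implicit animation lag
        dialLayer.setAffineTransform(CGAffineTransform(rotationAngle: -heading))
        // Counter-rotate each label so its text stays upright.
        for label in labelLayers {
            label.setAffineTransform(CGAffineTransform(rotationAngle: heading))
        }
        CATransaction.commit()
    }
}
```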

2. New Tables

The table of rise and set times spans the entire year, and hence allows you to look up rise and set times for any given date.

The table of the solar path lists the sun’s azimuth and elevation at 15-minute intervals throughout the currently selected day.
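As a sketch of how such a table can be assembled (illustrative only – `solarPosition` here is a hypothetical stand-in for the app’s internal solar calculations), one row per 15-minute step:

```swift
import Foundation

// Hypothetical stand-in for the app's solar maths: azimuth/elevation
// in degrees for a given instant and place.
func solarPosition(at date: Date, latitude: Double, longitude: Double)
    -> (azimuth: Double, elevation: Double) {
    // ... astronomical calculation elided ...
    return (azimuth: 0, elevation: 0)
}

// Build one table row per 15-minute step through the selected day.
func solarPathTable(for day: Date, latitude: Double, longitude: Double,
                    calendar: Calendar = .current)
    -> [(time: Date, azimuth: Double, elevation: Double)] {
    let startOfDay = calendar.startOfDay(for: day)
    return stride(from: 0, to: 24 * 60, by: 15).map { minutes in
        let time = startOfDay.addingTimeInterval(TimeInterval(minutes * 60))
        let position = solarPosition(at: time, latitude: latitude, longitude: longitude)
        return (time, position.azimuth, position.elevation)
    }
}
```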

3. Tap action to see rise/set time instead of intervals

This was added simply for clarity. Tapping on the rise/set labels on the compass screen toggles the display between showing the rise and set times in local time and showing the time intervals between now and those events.
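A small sketch of the two display modes such a toggle switches between (illustrative names, not the app’s source):

```swift
import Foundation

// The two display modes the tap action toggles between.
enum RiseSetDisplayMode { case localTime, interval }

func riseLabelText(riseDate: Date, now: Date = Date(),
                   mode: RiseSetDisplayMode) -> String {
    switch mode {
    case .localTime:
        let formatter = DateFormatter()
        formatter.timeStyle = .short          // e.g. "6:50 AM"
        return "Rise " + formatter.string(from: riseDate)
    case .interval:
        let formatter = DateComponentsFormatter()
        formatter.allowedUnits = [.hour, .minute]
        formatter.unitsStyle = .abbreviated   // e.g. "2h 30m"
        let interval = riseDate.timeIntervalSince(now)
        let text = formatter.string(from: abs(interval)) ?? "--"
        return interval >= 0 ? "Rises in " + text : "Rose " + text + " ago"
    }
}
```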

4. Better handling of Location Services status

A problem to date has been users switching off the device’s Location Services, or (perhaps accidentally) disabling them for this particular app. In these cases the app can only use its last acquired location data, so it shows data which is correct for that old location but incorrect for the user’s current location.

This issue has been the biggest generator of email support requests to date, but I now expect that this will lessen considerably, because I have implemented clear warning messages which pop up whenever Location Services are disabled, each time the app starts up or resumes from the background.
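For illustration, a minimal sketch of this kind of startup/resume check (assuming modern CoreLocation APIs; not the app’s actual code):

```swift
import CoreLocation
import UIKit

// Warn whenever location access is unavailable, so the user knows the
// displayed sun data may be for a previous location. Call this on app
// startup and again whenever the app returns to the foreground.
func warnIfLocationUnavailable(from viewController: UIViewController,
                               manager: CLLocationManager) {
    let status = manager.authorizationStatus
    let unavailable = !CLLocationManager.locationServicesEnabled()
        || status == .denied || status == .restricted
    guard unavailable else { return }

    let alert = UIAlertController(
        title: "Location Unavailable",
        message: "Location Services are off or denied for this app, so the "
            + "sun data shown may be for a previous location.",
        preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "OK", style: .default))
    viewController.present(alert, animated: true)
}
```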

5. More efficient use of GPS

Previous versions of the app left GPS on continuously while the app was active (although off when inactive or in the background), which provided ongoing positional updates while it was open. For the sake of efficiency, it seemed unnecessary to keep GPS on once the location had been determined to reasonable accuracy, so GPS is now only on while the location has not yet been found to that accuracy. An important point, however, is that the app should re-query its location not only every time it starts up, but also whenever it resumes from the background. The reason, of course, is that the device may have changed location while in the background – for example, it may resume after the user has traveled somewhere by air!
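The pattern described here might be sketched like this (an illustration under assumed names, not the app’s code):

```swift
import CoreLocation

// "GPS only until a good fix": start updates, stop once the fix is
// accurate enough, and re-query on every startup and resume.
final class OneShotLocationFinder: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let requiredAccuracy: CLLocationAccuracy = 100  // metres

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyHundredMeters
    }

    // Call on app startup AND on resume from background, since the
    // device may have moved (e.g. air travel) while suspended.
    func refreshLocation() {
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last,
              location.horizontalAccuracy >= 0,
              location.horizontalAccuracy <= requiredAccuracy else { return }
        manager.stopUpdatingLocation()  // good enough: stop using GPS
        // ... recompute sun data for `location` ...
    }
}
```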

6. Future Updates

By far the most common request from users has been to allow selection of other cities/locations rather than just the current location, and this is the next major feature planned. But please note that it is not a trivial update! A particular difficulty is ensuring that the local times reported for other locations respect the correct timezones and daylight saving rules for those locations throughout the year. However, I do have a solution planned, and hope to be able to deliver it within a reasonable timeframe.
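To illustrate the timezone issue: formatting the same instant for a remote city requires that city’s own `TimeZone`, which carries its daylight-saving rules, rather than the device’s. A minimal sketch using standard IANA zone identifiers:

```swift
import Foundation

// Format an instant as wall-clock time in a named city's timezone,
// with that zone's own daylight-saving rules applied automatically.
func localTimeString(for date: Date, timeZoneIdentifier: String) -> String? {
    guard let timeZone = TimeZone(identifier: timeZoneIdentifier) else { return nil }
    let formatter = DateFormatter()
    formatter.timeZone = timeZone     // applies that zone's DST rules
    formatter.timeStyle = .short
    return formatter.string(from: date)
}

// The same rise instant, shown as each city's own local time:
// localTimeString(for: rise, timeZoneIdentifier: "Australia/Sydney")
// localTimeString(for: rise, timeZoneIdentifier: "America/New_York")
```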

The next most common request has been for an Android version. Due to the particularly technical nature of the app, and the fact that I personally have no grounding in Android development, this is a much more difficult proposition. However I have been looking to outsource it. I apologise to those who have been waiting impatiently, and I can assure you that these plans are progressing.

In the meantime, I hope you continue to enjoy the app!

See Breeze – Augmented Reality Wind Visualizer for iPhone and iPad

Ajnaware’s latest app “See Breeze” has just been approved by Apple, and is now available from the App Store. The app is a universal binary, so it can be installed on either an iPhone 3GS or an iPad from the same purchase.

Like Sun Seeker, this app pushes the boundaries of what augmented reality on mobile devices can be used for. The app description is as follows.

Provides both a FLAT VIEW COMPASS and an AUGMENTED REALITY 3-D VIEW showing the local wind and weather conditions with animated wind vectors.

Ideal For:
– Aviators, Sailors, Surfers, Windsurfers, Kite Flyers, Cyclists, Fire Fighters, Weather Hobbyists and any other outdoor enthusiasts

Main Features:
– Compass view showing animated wind vectors for nearest weather stations with wind, temperature and humidity readings
– 3-D augmented reality view with animated wind vectors
– List of local observation stations (up to 10 nearest), from which any may be selected for individual wind viewing
– Map view of all local stations with weather arrows showing direction, speed and temperature
– Uses official Bureau of Meteorology data within Australia, and NOAA metar data (from airports) for rest of world

Feature Device Dependencies:
– iPhone – interface runs only in portrait mode, 3-D View is shown as an overlay on the camera view
– iPad – interface runs in any device orientation, 3-D View is displayed with an opaque background (due to absence of camera)

I had the idea for this app about the same time as I had the idea for Sun Seeker, but I had to choose just one to do first, and even when I did start it, I found that it took a lot longer than expected due to the various technical challenges involved. The first major challenge was learning some OpenGL ES, and the second one was figuring out how to get OpenGL ES to respond correctly to device orientation and heading changes. Many thanks to Jeff LaMarche for some great blog articles on the former, and as for the latter, I pretty much had to figure it out for myself. I did post on Stack Overflow, but ended up answering my own question.
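As a rough illustration of that second challenge (my own reconstruction, not the app’s actual code), the scene’s view rotation can be composed from the compass heading and the device’s pitch and roll; the axis conventions below are one common choice and would need matching to the renderer’s:

```swift
import Foundation
import simd

// Compose a camera orientation from heading (yaw about the vertical),
// pitch, and roll. Angles in radians; conventions are illustrative.
func viewRotation(headingRadians h: Float,
                  pitchRadians p: Float,
                  rollRadians r: Float) -> simd_float3x3 {
    func rotX(_ a: Float) -> simd_float3x3 {
        simd_float3x3(rows: [SIMD3(1, 0, 0),
                             SIMD3(0, cos(a), -sin(a)),
                             SIMD3(0, sin(a),  cos(a))])
    }
    func rotY(_ a: Float) -> simd_float3x3 {
        simd_float3x3(rows: [SIMD3( cos(a), 0, sin(a)),
                             SIMD3(0, 1, 0),
                             SIMD3(-sin(a), 0, cos(a))])
    }
    func rotZ(_ a: Float) -> simd_float3x3 {
        simd_float3x3(rows: [SIMD3(cos(a), -sin(a), 0),
                             SIMD3(sin(a),  cos(a), 0),
                             SIMD3(0, 0, 1)])
    }
    // Heading about the vertical axis, then pitch, then roll.
    return rotZ(r) * rotX(p) * rotY(h)
}
```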

Adapting the app to iPad was also an interesting issue to deal with. I ended up with quite a few conditional branches in the code to deal with cosmetic differences. But the end result more than justified the extra effort. It looks superb on the iPad. Credit for the excellent app artwork goes to Peter Fellows once again, whose work on the Oz Weather program was brilliant. Here are a few screenshots from the iPhone app.

and one from the iPad app, which of course has much nicer mapping ability…

Sun Seeker – What is it Good For?

There have been quite a few news and blog mentions of Sun Seeker since its release (described in this previous post), which has created some good interest in it – and yes, some good app sales too. But a common reaction of press reviewers seems to be to question what you would use it for. I have to say, frankly, that I am a little surprised. How could you not immediately understand how useful this app really is?!

SunSeeker for Real Estate

But then it dawned on me (whoops, no pun intended!). We are not all born the same. Some of us do seem to have that extra geek gene, which means that some things which seem really obvious to us are pretty much obscure to others. And vice-versa of course, as I know all too well, often to my own detriment. 😉

Thankfully, however, some of those who bought the app do already “get it”, and a few kind souls have left some great comments explaining exactly how they find it useful – some of them in ways that I had not even imagined myself. As these comments are spread around different countries’ app stores, I thought it might help to list a few of them here. I have added highlighting to various words and phrases to emphasize the ways people are using the app.

I bought this app to track the suns position on the cockpit window during my trips as an airline pilot, this app works better than I had hoped. I now use this app as a situational awareness tool, keeping track of possible solar glare on final approaches to particular runways. It works awsome in the virtual 3d view because of the slaved compass I can find the suns relative position with reference to any runway. This is really a great app. (Lwm5 – USA)

Fantastic – shows the true utility of augmented reality apps. As an architect I have been doing solar analysis of sites by printing solar charts, taking pictures and noting bearings & altitue of horizon (trees mnts structures etc) – then combining info in Photoshop. With this app it’s as easy as pointing the camera to get a sense of the solar access of a site at different times of day / year. (smh_iTunes – USA)

I work in the Solar industry and this works exceptionally well for aligning solar arrays and showing customers the path of the sun. GREAT app 🙂 (Clear James – Australia)

The perfect app for DOP’s Gaffers and anyone that needs to know where the sun path will be and where you will lose the sun behind a building etc. The augmented reality is flawless and helps anyone plan out a photo/film shoot to the hour. A steal at this price. (Metromadman – Australia)

It might also be worth noting that, currently, the best sales of this app are being made in… Japan. How fitting, given that it is sometimes known as the land of the rising sun!

Currently Sun Seeker is #6 in paid apps in the Navigation category there. I’m guessing that this might have something to do with the fact that the Japanese are known for being early and enthusiastic adopters of new technology. This helps in two ways – firstly because there may be strong uptake of the latest 3GS iPhone model (required for this app), and secondly because augmented reality is so new to the consumer space, and offers exciting new ways of using the technology which may not be immediately obvious to those more reluctant to embrace unfamiliar technology.

Now why can’t Westerners be more like the Japanese?

So until next time – Konnichiwa! 🙂

Sun Seeker – Seeing the Light with Augmented Reality

I am pleased to announce that my new app “Sun Seeker” was approved by Apple on the second attempt, 31 days after the initial submission, and is now available in the iTunes App Store. Note – as it requires use of a compass, it will only work on iPhone 3GS devices.

Sun Seeker in App Store

I have recorded a brief video demo showing how it works.

This app shows you where the sun is now, and what path it takes through the sky, either for today or for any day of the year, for your current location.

It has two main views.

  • A flat compass view
  • An augmented reality camera overlay view

It is valuable for real estate buyers (to find the sun and light exposure of any room throughout the course of the year), for gardeners and landscapers (to find hours of sun exposure for any location in the garden), for photographers (to find when the sun will be shining at the right angle), and for anyone interested in daily variations of rise and set times of the sun.

Sun Seeker

The above shot shows the opening view, which displays the sun’s day/night path segments on the flat compass. Typically you would hold the iPhone horizontally in your hand, so that you can easily see the directions of the rise point, the set point, and the direction the sun is in right now – the yellow triangle. The other information displayed here is:

  • Current latitude and longitude (from built-in GPS)
  • How long since the sun rose, and until it sets; or if at night, how long since it set and how long until it rises
  • The sun’s heading (azimuth) angle and elevation. If you watch these you will see them ticking over as the sun moves.
  • Shadow ratio (the length of a shadow in comparison with the vertical height of an object) and path length (the multiple of atmospheric thicknesses through which the sunlight has traveled) – see the short sketch below
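For the mathematically curious, these two quantities reduce, in their simplest flat-atmosphere, no-refraction approximations, to trigonometric functions of the sun’s elevation (a sketch – the app may well use more refined formulas):

```swift
import Foundation

// Shadow length per unit object height: cot(elevation).
func shadowRatio(elevationDegrees e: Double) -> Double {
    let rad = e * .pi / 180
    return 1.0 / tan(rad)
}

// Multiples of one atmospheric thickness traversed: ~1/sin(elevation).
func relativePathLength(elevationDegrees e: Double) -> Double {
    let rad = e * .pi / 180
    return 1.0 / sin(rad)
}

// e.g. at 30° elevation: shadowRatio ≈ 1.73, relativePathLength = 2.0
```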

Tapping the camera icon changes the app into its augmented reality overlay view.

Sun Seeker AR View

The types of information you see here are:

  • If the sun is not already in view, then a pointer showing which direction to turn towards to find the sun
  • The current heading (azimuth) and elevation of the centre of your camera view
  • The sun’s current position and its opposite shadow point
  • The sun’s path throughout today with hour positions marked – including the nighttime segment below the horizon
  • Optionally, the sun’s path on the shortest day of the year (in blue) and on the longest day of the year (in red)
  • Grid lines of equal heading (purple for cardinal compass directions E/S/W/N and red for others) and elevation (blue)
  • The horizon line (green)

You may find this especially valuable if you look towards the rise and set points near a room’s window or on a balcony. You can then see the range of directions through which the sun rises (or sets), and therefore when it will shine through that window or onto that balcony, and for roughly how many hours at different times of the year.

Further details you can obtain are shown in the following view.

Further Details

So you can see that this app uses augmented reality a little differently from most other newly released apps, and it can provide genuinely valuable information that is not easily available by any other means. It effectively turns your iPhone into an advanced sun tracking device.

I created this app because I was in the process of buying property, and it was just what I needed. I hope that some of you might also find it useful, as well as fun to use and to show off your iPhone!

* * *


Note – Sun Seeker is now available for Android! (March 2012)

https://market.android.com/details?id=com.ajnaware.sunseeker

Yet Another Dubious App Rejection Story

As alluded to in a previous article, I have created an app with augmented reality capabilities, using Apple’s new camera overlay API calls which were first introduced in the OS3.1 SDK beta.

Given that “augmented reality” has caused such a buzz, I was of course keen to try to get it published as soon as possible. I noted that a number of other developers had already submitted apps which used camera overlays and yet were OS3.0 apps, and that some of them had been accepted into the app store, despite the widespread suspicions that such apps must be using private API calls to achieve this – something that Apple explicitly forbids.

But I didn’t want to risk raising the ire of Apple by trying to sneak through the cracks of the review process. After all, I depend on them for my living, and on the whole they have been very supportive of my efforts – for example by featuring Oz Weather prominently and repeatedly over the course of many months, which has kept it in the top 20 paid apps in Australia for much of the time.

So having more or less completed the app some time ago using the OS 3.1 beta, I dutifully waited for the final release of OS3.1 SDK, which upon arrival I promptly downloaded, verified that my app worked correctly with it, and then re-built and submitted it via iTunes Connect. That app submission was on 10th September.

Of course, as usual, I then had to sit back and wait for my app to reach the front of the queue and be reviewed by Apple’s team. The first and only sign you get that this has happened is an email either approving or rejecting your app. This arrived on 23rd September, i.e. 13 days later. And of course it was a rejection.

The Reason for Apple’s Rejection

Although I have previously made at least a dozen app submissions (new and updates), only one had ever been rejected, and that was due to a crash that occurred during Apple testing, so a fully understandable rejection. But this rejection was different. The reason given was as follows.

Thank you for submitting [redacted] to the App Store.  Unfortunately it cannot be added to the App Store because it is modifying or extending an undocumented API, which as outlined in the iPhone Developer Program License Agreement section 3.3.1 is prohibited:

“3.3.1 Applications may only use Documented APIs in the manner prescribed by Apple and must not use or call any private APIs.”

There is no documentation for the custom subclasses or self-contained views of UIImagePickerController in iPhone OS 3.0.1.  This includes PLCameraView and its custom subclasses (PLImageTile, PLRotationView, PLImageScroller, PLImageView, PLCropOverlay, PLCropLCDLayer, TPBottomDualButtonBar, TPPushButton and TPCameraPushButton).

Additional Camera APIs are now available in iPhone OS 3.1.  Please review these new APIs to see if they meet your needs.  If any additional APIs are desired, please file an enhancement request via the Bug Reporter, <http://bugreport.apple.com>.

So apparently they were rejecting it because they believed I was using UIImagePickerController in an illegal manner under OS3.0. Huh?

My Denial

Well of course the app was doing no such thing. I had gone to considerable effort to ensure that I was conforming fully with Apple’s rules, and the app was compiled with OS3.1 as its base SDK. But… I am guessing that my “mistake” was to try to ensure backward compatibility, with a graceful and minor degradation of capability for those still running OS3.0.

What I had done was to set the deployment SDK target to OS3.0, so that the app could also run on devices with the older OS3.0, while avoiding any use of the UIImagePickerController in that case. The reasons for doing so were:

  1. The camera overlay view is not essential for the app to be useful – the OS3.0 view with a plain background still gives a good 3-D “augmented reality” perspective which is more than adequate for the type of data being presented by the app.
  2. If users upgrade from OS3.0 to 3.1 (a free upgrade anyway), then the camera view will become available to them without needing to obtain a new or upgraded app, so it should serve simply to encourage the OS upgrade.

So to re-iterate – the app was NOT using any private API calls, under ANY circumstances.

I can only assume that the tester/s jumped to the erroneous conclusion that it did so because they did not read or understand my app description (in which I explained the app’s difference in behaviour on OS 3.0), and/or that they did not test it on OS3.0 as well as on OS3.1.

The above is essentially exactly what I explained to Apple in my emailed response to their rejection notice, and I asked them to confirm whether or not they were maintaining their rejection, and if so, whether recompiling the app to run on OS3.1 only would be required.

Apple’s Further Response

I am pleased to say that it was less than 24 hours before I heard from Apple again – this time via a phone call from the USA. However, the bad news was that Apple was requesting that I resubmit the app for deployment only on OS3.1. Of course I queried why this was necessary, given that I was not using any private APIs, and the (polite) response I got was that, for the sake of the approval process, it would simply be necessary for me to make this change. With hindsight I should have pushed for a more detailed justification, but being on a crowded and noisy bus, and not sensing that any negotiation might be possible, I agreed to make the change and resubmit.

The Status Quo

So, despite the fact that Apple’s rejection was justified with spurious reasons, the situation now is that I have resubmitted the app for deployment only on OS3.1.

The main question I have now, of course, is whether or not I have gone to the back of the queue once more, and will likely therefore have another two week wait. I’m not holding my breath – I’m betting on at least another two weeks. 😦

The saddest part of this story, of course, is that it is just one of many weird and wonderful rejection stories. I’m not going to dump on Apple, because there are some great upsides to working with them, but I have now joined the ranks (throngs?) of those who find the whole app approval process somewhat arbitrary and disempowering.

If Apple does have any formal rules and guidelines, why can’t they let developers know what they are? Why can’t they have an arbitration or escalation process for those whose rejections appear to be based on dubious grounds? And why oh why can’t we know exactly where our apps are in the queue?

Geotagging and Augmented Reality – New Standards Needed?

Augmented reality (AR) apps are all the buzz at the moment, and they do offer some exciting possibilities. In fact I have already created an iPhone app using certain AR techniques myself, and intend to submit it to the app store as soon as Apple will permit it. More later. 😉

However, whilst considering some of the implications of using AR, I was surprised to find that current geotagging standards don’t seem to be on a par with what AR technology permits. There is definitely scope for extending the current standards, and it seems very likely to me that demand for this will grow rapidly from now on.

In its simplest form, geotagging is simply a means of tagging a piece of data (such as a photo) with a latitude and longitude, and thus allowing data to be mapped or accessed via locational search.

An implicit assumption (given that it is called geotagging) is that the data is associated with a point on the earth’s surface, whatever the altitude of the surface might be at that location, in which case an altitude is not strictly required, because the earth’s topography can be assumed to be reasonably constant and knowable via other sources. But what about the case, for example, when a photo is taken from an aeroplane? In that case, altitude would be an additional parameter required in order to correctly differentiate it from a photo taken on the earth’s surface. Given that GPS systems typically record altitude, it is hardly a stretch to include it as a standard additional tag item.

And going further still, given that augmented reality devices measure the spatial attitude of the camera, would it not make sense to enable any photo taken to be tagged with the heading (aka azimuth) and elevation (i.e. local horizon coordinates) at which it was taken, and further with the camera tilt (e.g. whether portrait or landscape relative to the horizon, or any angle in between), and with the angle subtended by the camera shot, both vertically and horizontally? It would be necessary to provide all this extra geotagging information in order to correctly reproduce/model the exact “window” into space that the photo represents. Whilst this additional data would be unnecessary for common scenic or portrait photography, it could be quite valuable in other situations. For example, two such fully (and accurately) geotagged photos taken of the same object from different locations would allow a 3-D reconstruction of that object to be created. This is not a trivial implication!

One of the existing geotagging standards (FlickrFly) also allows an additional item to be specified, i.e. the distance or range from the camera to the subject of the photo, which presumably is necessary when a locational search is being done for the subject of the photo rather than for the location from which the photo was taken.

In order to avoid missing out on the possibilities that AR apps can already provide, and to be able to start constructing better, more powerful geolocational systems and applications which take advantage of them, I would propose that existing geotagging standards be extended to include all of the following (one possible encoding is sketched after the list):

  • date, time, timezone offset (to provide instant in time)
  • geographic latitude, longitude and altitude (to provide location relative to earth’s surface and mean sea level)
  • camera viewpoint central heading (0°=N, 90°=E, 180°=S, 270°=W)
  • camera viewpoint central elevation (0°=horizontal, -90°=vertical downwards, +90°=vertical upwards)
  • camera tilt (0°=portrait/upright, 90°=landscape/top points left, 180°=landscape/top points down, 270°=portrait/top points down)
  • camera angle subtended vertically (i.e. along the nominal top-to-bottom axis)
  • camera angle subtended horizontally (i.e. along the nominal side-to-side axis)
  • range of point of interest from camera (if there is a relevant POI involved)
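To make the proposal concrete, here is one possible encoding of this tag set as a simple data structure (purely illustrative – this is not an existing standard):

```swift
import Foundation

// One possible encoding of the proposed tag set.
struct ARGeoTag: Codable {
    // Instant in time
    var timestamp: Date
    var timeZoneOffsetMinutes: Int

    // Location relative to the earth's surface and mean sea level
    var latitude: Double          // degrees, +N
    var longitude: Double         // degrees, +E
    var altitudeMetres: Double

    // Camera viewpoint, per the conventions listed above
    var headingDegrees: Double    // 0 = N, 90 = E, 180 = S, 270 = W
    var elevationDegrees: Double  // 0 = horizontal, ±90 = vertical
    var tiltDegrees: Double       // 0 = portrait upright, 90 = top points left

    // Angles subtended by the camera's field of view
    var verticalFieldOfViewDegrees: Double
    var horizontalFieldOfViewDegrees: Double

    // Range to a point of interest, if one is relevant
    var poiRangeMetres: Double?
}
```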

I would welcome feedback on this proposal from anyone knowledgeable about the current state of geotagging. I am certainly not an expert in this area, but for those who have the capability to influence things here, I believe there may be opportunities for valuable advances, especially in relation to the new generation of AR technology.