This year’s Space Apps Hackathon saw me (Kris Sum) teaming up with Jon Day (ex-SwitchSystems staff), Jamie Reynolds (BelieveIn Design) and Boris Tane (Engineering student). We knew we wanted to do something with data-visualisation, but weren’t quite sure what…


We looked through the challenges again, and we kept coming back to the Space Station Telemetry challenge. Jamie made a great point – lots of the I.S.S. locator websites are just maps with the space station plotted on top, any information they do show about the space station is pretty ‘stale’, and you generally have to download a special app for your phone if you want anything fancy-looking. We set out to fix that with something no one has ever done before*…

Use your mobile phone to track the International Space Station!

*No plugins or downloads required! – CLICK TO LAUNCH


Day 1

What we wanted to build was really pushing the boundaries of what’s possible with a standard mobile web browser, but we did some investigation and managed to find everything we needed:

  • GPS location – to find out where the user is standing
  • Gyroscope information – to work out which direction and at what angle the phone is pointing
  • WebGL/Canvas support – for rendering a 3D environment
  • NASA APIs – for grabbing the space station location
  • Some complicated maths and a server API – to do the number crunching
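Each of those browser features can be sniffed before committing to the full build. A rough sketch of that kind of feature detection is below – the `win`/`nav` parameters stand in for the browser globals purely so the logic can be exercised outside a browser; this is an illustration, not our actual page code:

```javascript
// Check for the capabilities listed above. In a real page you would
// pass in window and navigator; taking them as parameters here is an
// assumption of this sketch so it can run anywhere.
function checkCapabilities(win, nav) {
  return {
    geolocation: !!(nav && nav.geolocation),                        // GPS position
    orientation: typeof win.DeviceOrientationEvent !== "undefined", // gyro data
    webgl: typeof win.WebGLRenderingContext !== "undefined",        // 3D rendering
  };
}

// Example: a browser-like stub that supports everything
const stubWindow = {
  DeviceOrientationEvent: function () {},
  WebGLRenderingContext: function () {},
};
const stubNavigator = { geolocation: {} };

console.log(checkCapabilities(stubWindow, stubNavigator));
console.log(checkCapabilities({}, {})); // an environment with none of them
```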

OK, research over, so that gave us… erm… 22 hours to bring it all together! Time to get down to work.

The science bit

We’d need a few things to make this all work. Our tech stack ended up being: PyEphem, Three.js, jQuery, DeviceOrientation, Geolocation… and lots of caffeine.

Working out where to look in the sky

We needed code that could take our latitude, longitude and altitude and return the compass bearing (azimuth) and angle above the horizon (elevation) to look at to find the ISS. Cue the whiteboard! It started off quite blank…


Then ended up being quite full…


As soon as it was all worked out, we stumbled upon the awesome PyEphem library, which would do the maths for us – definitely better than working it all out ourselves. I also found that it could work out the positions of the Sun and the Moon, which would come in handy for testing, since you can’t spot the ISS during the day! I wrote some Python code and wrapped it in a web API that returns JSON data with everything you need to track the space station, Sun and Moon. We even went out at night when the ISS was passing over to check that it was right (it was!).
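The heavy lifting happens server-side in Python/PyEphem, so on the phone all we need to do is consume the JSON. The sketch below shows the idea; the exact field names (`bodies`, `az`, `el`) are illustrative assumptions here – check the repo for the real response shape:

```javascript
// Parse a tracking payload from the server API. Field names are
// illustrative, not the actual API contract.
function parseTracking(json) {
  const data = JSON.parse(json);
  return data.bodies.map((b) => ({
    name: b.name,
    azimuth: b.az,    // compass bearing in degrees (0 = north, 90 = east)
    elevation: b.el,  // angle above the horizon in degrees
    visible: b.el > 0, // below the horizon means nothing to point at
  }));
}

// A made-up sample payload: the ISS is up, the Moon has set.
const sample = JSON.stringify({
  bodies: [
    { name: "ISS", az: 223.4, el: 41.2 },
    { name: "Moon", az: 97.1, el: -12.6 },
  ],
});

console.log(parseTracking(sample));
```

Returning the Sun and Moon alongside the ISS in the same shape is what made daytime testing possible.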


Will it work on the phones?

We needed to make sure that our mobile devices would give us the gyro and location information, so we wrote some HTML snippets to confirm we could get everything we needed. It wasn’t until day 2 that we found out that iOS devices weren’t quite behaving the same way as Android devices (you have to use the .webkitCompassHeading value, not the .alpha value that other phones use!). So yes, in the end, our plan for the phones was going to be OK!
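That iOS quirk boils down to a few lines. A simplified sketch of how the two cases can be normalised into one heading (a real version would also need to account for screen rotation, which this sketch ignores):

```javascript
// Normalise a deviceorientation event into a compass heading in
// degrees (0 = north, increasing clockwise). iOS exposes a ready-made
// webkitCompassHeading; other browsers report alpha, which increases
// anticlockwise, so we flip it. Simplified sketch only.
function compassHeading(event) {
  if (typeof event.webkitCompassHeading === "number") {
    return event.webkitCompassHeading; // iOS
  }
  return (360 - event.alpha) % 360;    // Android and friends
}

// Both of these stub events describe a phone pointing due east:
console.log(compassHeading({ webkitCompassHeading: 90 })); // 90
console.log(compassHeading({ alpha: 270 }));               // 90
```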


Starting to feel tired…

It was pretty late at night by now, so we all headed home for some much-needed sleep. Jon and I spent the evening researching how we could render the visuals on the phone, and Jamie started work on finding other API feeds.

Day 2

Jamie and Jon started work on the data overlays, and we fixed the location to Exeter to save some development time. Jamie had come up with some nice designs overnight, but it was looking like we weren’t going to have enough time to implement them.


We figured we could use Three.js to render a 3D sphere with a camera placed inside it. I tied the camera controls into the device gyro, so when the device was moved, the camera looked at a different part of the sphere. Awesome!

{ video }

Now we just needed to make our scene align itself to north, and convert the azimuth and elevation data from the API feed into X,Y,Z coordinates for the 3D scene. Turns out, that was pretty tricky. Especially when you forget that Three.js uses an XYZ scheme different from all the maths texts you’re referencing. And that the world is 90 degrees off. And that iOS phones don’t seem to be aligned to north properly. And that the deadline is in 20 minutes. ARGH!
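The conversion itself can be sketched as below. Three.js uses a Y-up coordinate system, and mapping north onto −Z (the direction a default Three.js camera faces) is our convention in this sketch, not anything Three.js dictates – getting that mapping wrong is exactly the “world is 90 degrees off” problem:

```javascript
// Convert azimuth/elevation (in degrees) to a point on a sphere in
// Three.js-style coordinates: Y is up, and we choose to map north
// (az = 0) onto -Z. East (az = 90) then lands on +X.
function azElToXYZ(azDeg, elDeg, radius = 1) {
  const az = (azDeg * Math.PI) / 180;
  const el = (elDeg * Math.PI) / 180;
  return {
    x: radius * Math.cos(el) * Math.sin(az),  // east component
    y: radius * Math.sin(el),                 // up component
    z: -radius * Math.cos(el) * Math.cos(az), // north is -z
  };
}

// Due north on the horizon lands straight ahead of a default camera:
console.log(azElToXYZ(0, 0)); // { x: 0, y: 0, z: -1 }
// Straight overhead (elevation 90) lands on the +Y axis:
console.log(azElToXYZ(0, 90));
```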

Presentation Time

Needless to say, we didn’t quite hit the deadline. We managed to fix things enough for the presentation demo, but we could see that people weren’t looking in the same direction as each other! Also, we were so busy trying to get things working that we didn’t leave much time for our presentation.

We didn’t win. But we were so happy with the progress we had made that we continued with the project. On Sunday evening I was able to watch the ISS fly overhead whilst my phone tracked it perfectly on the screen!



The top-right dot halfway through the video is the ISS!


We’ve handily packaged everything up in our GitHub repo if you want to check out our Python script, HTML and JavaScript code – just bear in mind that this was a very time-constrained hackathon, so there’s some extremely hacky code in there!

Continued Development (1 week later)

Jamie and I have rewritten most of the project now – it has a new front-end GUI which shows the next 5 ISS passes and the weather for your location, a robust self-updating API which now also returns planetary/star data, new world sprites, and 2D rendering support (so we can now position labels inside our 3D scene).

As of Sunday 12th April:


As of Sunday 19th April:
