Augmented reality for satellite spotting: point your phone at the sky and see exactly where the ISS is, where Starlink trains are about to streak, where a recent SpaceX upper stage is decaying. LaunchDetect's mobile app ships this as a core feature; this week you build the math and the code that powers it.
From any observer on Earth, a celestial object's position is described by two angles: azimuth (compass direction, 0° = north, increasing clockwise) and elevation (degrees above the horizontal). Week 9 covered how to compute these from an observer's lat/lon and a satellite's ephemeris.
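The two angles pin down a unit direction vector in the observer's local East-North-Up frame, which makes the convention concrete (a sketch; the function name and the ENU representation are my choices, not LaunchDetect code):

```javascript
// Convert azimuth/elevation (degrees) to a unit vector in the local
// East-North-Up frame: azimuth 0° = north, increasing clockwise toward east;
// elevation is degrees above the horizontal.
function azElToEnu(azDeg, elDeg) {
  const az = (azDeg * Math.PI) / 180;
  const el = (elDeg * Math.PI) / 180;
  return {
    e: Math.cos(el) * Math.sin(az), // east component
    n: Math.cos(el) * Math.cos(az), // north component
    u: Math.sin(el),                // up component
  };
}
```

Sanity checks: azimuth 0°, elevation 0° points due north along the horizon; azimuth 90° points due east; elevation 90° points straight up regardless of azimuth.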
AR is the inverse problem: given the satellite's azimuth and elevation right now, where should the satellite's icon appear in the camera's viewport? Three coordinate systems are involved: the sky's azimuth/elevation frame, the phone's orientation frame reported by the device sensors, and the 2D pixel frame of the viewport.
The browser's DeviceOrientation API tells you the phone's orientation:
window.addEventListener('deviceorientation', (event) => {
  const alpha = event.alpha; // compass heading, 0–360°
  const beta  = event.beta;  // front-back tilt, -180° to 180°
  const gamma = event.gamma; // left-right tilt, -90° to 90°
  updateOverlay(alpha, beta, gamma);
});
iOS quirk: Apple requires user permission via DeviceOrientationEvent.requestPermission(), and the call must come from inside a user-gesture handler such as a button tap. Without that gesture, no events fire. Build the UX accordingly.
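A minimal sketch of that permission flow (the function name and the non-iOS fallback are my assumptions, not LaunchDetect's actual code):

```javascript
// Call this from a user-gesture handler, e.g. a "Start AR" button's click.
// DeviceOrientationEvent.requestPermission exists only on iOS 13+ Safari;
// other browsers fire orientation events without any permission step,
// so we fall back to 'granted' there.
async function requestOrientationPermission() {
  if (typeof DeviceOrientationEvent !== 'undefined' &&
      typeof DeviceOrientationEvent.requestPermission === 'function') {
    return DeviceOrientationEvent.requestPermission(); // 'granted' | 'denied'
  }
  return 'granted'; // non-iOS: no permission gate
}
```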
The DeviceOrientation alpha is referenced to magnetic north, not true north. The two differ by the local magnetic declination, which varies from ~0° in much of South America to ~20°+ at high latitudes and can flip sign over a few hundred kilometers. To overlay satellite positions accurately, convert from true north (what skyfield gives you) to magnetic north (what alpha gives you) using a magnetic declination model — the World Magnetic Model (WMM) or the related International Geomagnetic Reference Field (IGRF).
JavaScript libraries exist for this: geomagnetism on npm wraps the WMM and returns the declination in degrees for a given (lat, lon, date).
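Once you have the declination, converting a true-north azimuth into the magnetic frame that alpha uses is one wrapped subtraction (a sketch; the sign convention assumed here is the standard one, declination positive when magnetic north lies east of true north):

```javascript
// magnetic azimuth = true azimuth - declination, wrapped to [0, 360).
// declinationDeg would come from a WMM wrapper such as the geomagnetism
// npm package for the observer's (lat, lon, date).
function trueToMagneticAzimuth(azTrueDeg, declinationDeg) {
  return (((azTrueDeg - declinationDeg) % 360) + 360) % 360;
}
```

For example, with a declination of +15° (magnetic north 15° east of true), a satellite at true azimuth 10° sits at magnetic azimuth 355°.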
Given the phone's orientation and the satellite's (azimuth_true, elevation), compute the angular offset between the camera's center direction and the satellite's direction. If that offset is within the camera's field of view (typically ~67° horizontal, ~52° vertical for a modern smartphone), place the icon at the corresponding pixel.
// Wrap an angle difference to [-180, 180)
function wrap180(deg) {
  return (((deg + 180) % 360) + 360) % 360 - 180;
}

function projectSatellite(satAzimuth, satElevation, phoneAzimuth, phoneElevation, W, H) {
  const dAz = wrap180(satAzimuth - phoneAzimuth);
  const dEl = satElevation - phoneElevation;
  const fovH = 67, fovV = 52; // typical smartphone camera FOV, degrees
  if (Math.abs(dAz) < fovH / 2 && Math.abs(dEl) < fovV / 2) {
    const x = (W / 2) + (dAz / (fovH / 2)) * (W / 2);
    const y = (H / 2) - (dEl / (fovV / 2)) * (H / 2); // screen y grows downward
    placeIcon(x, y);
  }
}
The simplest implementation approach: a <video> camera stream with CSS-positioned icons. It's ~100 lines of code and works on every modern browser; transform: translate3d(...) on the icon elements is enough for smooth motion.

You'll build a minimal browser-based AR satellite spotter: request DeviceOrientation permission, get the user's GPS lat/lon, compute the ISS's current azimuth and elevation with satellite.js, convert to the magnetic-north reference, project onto the camera viewport, and overlay a moving dot. By the end you'll have the same core experience as LaunchDetect's mobile AR feature, in 200 lines of vanilla JavaScript.
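The translate3d positioning can live in a tiny pure helper (a sketch; the function name and the pixel rounding are my own choices):

```javascript
// Build the CSS transform string for an icon at viewport position (x, y).
// translate3d promotes the element to its own compositor layer, so the dot
// moves smoothly without forcing layout on every orientation event.
function iconTransform(x, y) {
  return `translate3d(${Math.round(x)}px, ${Math.round(y)}px, 0)`;
}

// Usage (hypothetical): icon.style.transform = iconTransform(pos.x, pos.y);
```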
Build a browser-based AR demo using the DeviceOrientation API. Show a moving dot at the correct azimuth/elevation for the ISS overlaid on the camera view.
Test yourself. Answer key on the certificate-track page (Gold-tier feature: progress tracking and auto-grading).