An innovative AR overlay for the Map My Tracks iOS app, allowing users to visualise live tracking pins and local points of interest directly on the horizon in real time.
What we did
- Native iOS development using Swift and ARKit
- Integration of real-time GPS data with augmented reality overlays
- UI/UX design for immersive spectator experiences
- Implementation of a live-tracking "Spectator Mode" using virtual pins
- Optimisation of spatial mapping and geographical coordinate translation
- Development of a low-latency data synchronisation engine

About the project
Map My Tracks has long been a pioneer in the fitness tracking space, providing outdoor enthusiasts with a robust platform to record and share their adventures. As our own in-house product, it serves as a playground for innovation where we test the boundaries of what is possible in sports app development. In 2017, with the release of Apple’s ARKit, we identified a unique opportunity to enhance the way spectators and participants interact with live events. By bridging the gap between digital data and the physical world, we aimed to create a more visceral connection between the user and the landscape.
The objective was to introduce an Augmented Reality (AR) feature that would allow users to look through their iPhone camera and see virtual markers floating in the real world. These markers represent points of interest or, more importantly, the live location of friends and family participating in a race. By integrating this into our existing iOS app development services workflow, we aimed to solve a common problem faced by spectators at large-scale endurance events: the difficulty of locating a specific athlete in a crowded field or across a vast distance.
This ambitious addition was part of a broader minimum viable product strategy designed to bring cutting-edge tech to our users quickly. By focusing on the core utility of live tracking in an AR environment, we were able to launch a feature that felt like the future of sports spectating. This project not only improved the Map My Tracks experience but also solidified our reputation as forward-thinking mobile app developers in the UK, capable of mastering new frameworks as soon as they are released to the public.

The Challenge
Developing for augmented reality in 2017 presented a steep learning curve, particularly when combining it with real-world geographical data. The challenge was not just to show an image on a screen, but to make that image feel anchored to a physical location miles away. Our primary hurdles included:
- Maintaining the spatial accuracy of virtual pins when users were moving at high speeds or in areas with fluctuating GPS signals.
- Preventing "visual jitter" where the AR markers would bounce or drift away from their true geographical coordinates, breaking the immersion.
- Managing the significant battery drain associated with running the camera, GPS, and high-intensity graphics processing simultaneously.
- Designing a user interface that remained readable on a horizon cluttered with different terrain, light conditions, and weather.
- Developing a real-time synchronisation logic that ensured spectator pins reflected the tracker’s position with minimal latency across mobile networks.
- Translating 2D latitude and longitude coordinates into a 3D AR space that accounted for the earth's curvature and user elevation.

Our Solution
Harnessing Apple's ARKit Framework
Our development team used the then-new ARKit framework to handle world tracking and visual odometry. By combining data from the iPhone’s camera and motion sensors, we created a stable coordinate system. We developed custom maths libraries to translate GPS coordinates into the AR world-space. This ensured that if a friend was tracking five miles to the north-west, their virtual pin appeared precisely in that direction on the user's screen.
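The core of that translation can be sketched in a few lines of Swift. This is an illustrative simplification, not our production code: it assumes an AR session configured with `worldAlignment = .gravityAndHeading` (so ARKit's -Z axis points true north), and the function name is hypothetical.

```swift
import ARKit
import CoreLocation

// Sketch: with worldAlignment = .gravityAndHeading, -Z is true north
// and +X is east, so a GPS target can be placed by converting its
// bearing and distance from the user into X/Z offsets.
func arPosition(for target: CLLocationCoordinate2D,
                from user: CLLocation) -> SCNVector3 {
    let targetLocation = CLLocation(latitude: target.latitude,
                                    longitude: target.longitude)
    let distance = user.distance(from: targetLocation) // metres

    // Initial bearing from the user to the target, in radians from north
    let lat1 = user.coordinate.latitude * .pi / 180
    let lat2 = target.latitude * .pi / 180
    let dLon = (target.longitude - user.coordinate.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let bearing = atan2(y, x)

    return SCNVector3(Float(distance * sin(bearing)), // east offset
                      0,                              // elevation ignored here
                      Float(-distance * cos(bearing))) // north is -Z
}
```

In practice the elevation term also matters (a pin for a runner on a hilltop should sit above the horizon line), which is one of the challenges noted above.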
Live Pin Spectator Mode
The standout feature of the update was the live tracking overlay. We built a specific "Spectator Mode" where friends and family could open the AR view at an event. The app queries the Map My Tracks real-time API to fetch the coordinates of anyone tracking live. A virtual pin is then rendered on the horizon, showing the person's name and their current distance. This creates a powerful "X-ray" vision effect, allowing a spectator to see exactly where their athlete is, even if they are hidden behind a hill or several kilometres away.
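Conceptually, Spectator Mode is a polling loop over the live API feeding pin updates into the AR scene. The sketch below is hedged: the endpoint URL and JSON shape are placeholders, not the actual Map My Tracks API, and a production build would favour a socket or push channel over a timer to keep latency down.

```swift
import Foundation

// Illustrative model of a live pin; field names are assumptions.
struct LivePin: Decodable {
    let name: String
    let latitude: Double
    let longitude: Double
}

final class SpectatorFeed {
    // Placeholder endpoint, not the real API URL
    private let endpoint = URL(string: "https://example.com/api/live-pins")!
    private var timer: Timer?

    func start(onUpdate: @escaping ([LivePin]) -> Void) {
        // Poll every few seconds and hand decoded pins back on the main
        // queue, where the AR scene can move or create marker nodes.
        timer = Timer.scheduledTimer(withTimeInterval: 5, repeats: true) { _ in
            URLSession.shared.dataTask(with: self.endpoint) { data, _, _ in
                guard let data = data,
                      let pins = try? JSONDecoder().decode([LivePin].self,
                                                           from: data)
                else { return }
                DispatchQueue.main.async { onUpdate(pins) }
            }.resume()
        }
        timer?.fire()
    }

    func stop() { timer?.invalidate() }
}
```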
Horizon-Based Point of Interest (POI) Logic
To provide additional context for the user, we implemented a POI layer. This included local landmarks, mountain peaks, and course milestones. We used SceneKit to render these markers, ensuring they scaled realistically based on distance. A marker for a peak ten miles away would appear smaller and higher on the horizon than a nearby course marker. This required a high degree of technical precision in our backend services to serve the correct data based on the user's immediate vicinity.
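A minimal SceneKit sketch of such a marker might look like the following. The function name and the exact scale curve are assumptions for illustration; the key ideas are the billboard constraint (so markers always face the viewer) and distance-based scaling, clamped so far-off peaks stay legible.

```swift
import SceneKit
import UIKit

// Sketch: a flat POI marker that faces the camera and shrinks with
// distance, so a ten-mile peak reads smaller than a nearby milestone.
func makePOINode(title: String, distanceMetres: Double) -> SCNNode {
    let plane = SCNPlane(width: 2, height: 1)
    plane.firstMaterial?.diffuse.contents = UIColor.white

    let node = SCNNode(geometry: plane)
    node.name = title

    // Always rotate the marker to face the point of view
    node.constraints = [SCNBillboardConstraint()]

    // Simple inverse scale curve, clamped between 0.2 and 1.0
    let scale = Float(max(0.2, min(1.0, 500.0 / distanceMetres)))
    node.scale = SCNVector3(scale, scale, scale)
    return node
}
```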
Optimising for the MVP Approach
Because AR was a new frontier, we adopted a strict MVP approach to the initial rollout. We focused on the core "look and see" functionality, ensuring the tracking was rock-solid before adding more decorative elements. This allowed us to gather user feedback on the ergonomics of holding a phone up at a race, which informed later updates such as semi-transparent UI elements that didn't block the view of the actual race. This minimum viable product strategy meant we were one of the first fitness apps in the world to offer a functional AR tracking tool.

The Results
The introduction of Augmented Reality to Map My Tracks was a significant milestone for Tinderhouse, proving that our team could take experimental technology and turn it into a functional business tool.
- First-to-Market Advantage: Map My Tracks became one of the first fitness apps to use ARKit, resulting in significant global press coverage and a surge in new users.
- Enhanced Spectator Engagement: Users reported a much higher sense of connection to races, with the AR view becoming a "must-have" tool at major marathon and cycling events.
- Technical Validation: The project served as a successful proof-of-concept for our ability to handle complex spatial data, leading to interest from other sectors looking for AR solutions.
- Increased App Retention: The "cool factor" of the AR feature, combined with its genuine utility, led to a measurable increase in daily active users during the 2017 event season.
- Seamless Performance: Despite the technical complexity, the feature maintained high frame rates on supported devices, providing a smooth and professional user experience.

Technical Highlights
- Language: Swift 4.0 for native iOS performance.
- Frameworks: ARKit for world tracking; SceneKit for 3D marker rendering.
- Location Services: CoreLocation for high-precision GPS and compass heading data.
- Data Handling: Real-time RESTful API integration for live tracking updates.
- Graphics: Custom shaders to ensure pins remained visible in high-contrast outdoor light.
- Mathematics: Spherical trigonometry for distance and bearing calculations between GPS points.
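The spherical trigonometry mentioned above is the standard haversine formulation for great-circle distance, shown here as a self-contained example (the function name is ours, for illustration):

```swift
import Foundation

// Haversine great-circle distance between two GPS points, in metres.
// Uses a mean earth radius of 6,371 km.
func haversineDistance(lat1: Double, lon1: Double,
                       lat2: Double, lon2: Double) -> Double {
    let earthRadius = 6_371_000.0
    let phi1 = lat1 * .pi / 180
    let phi2 = lat2 * .pi / 180
    let dPhi = (lat2 - lat1) * .pi / 180
    let dLambda = (lon2 - lon1) * .pi / 180

    let a = sin(dPhi / 2) * sin(dPhi / 2)
          + cos(phi1) * cos(phi2) * sin(dLambda / 2) * sin(dLambda / 2)
    return earthRadius * 2 * atan2(sqrt(a), sqrt(1 - a))
}

// Example: distance from central London to central Canterbury,
// roughly 87 km as the crow flies.
let d = haversineDistance(lat1: 51.5074, lon1: -0.1278,
                          lat2: 51.2802, lon2: 1.0789)
```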

Ready to Build the Future of Your App?
Whether you are looking to integrate Augmented Reality or need a solid MVP development plan for a new idea, our Kent-based team has the expertise to help. We specialise in turning complex technology into user-friendly mobile experiences.