Strava · medium · 2026-03-17 · 11 min read

Strava Heat Maps: How Fitness Data Exposed Every Secret Military Base

In 2018, a 20-year-old student noticed Strava's global heat map glowed in places it shouldn't. Remote deserts, Arctic ice, supposedly unoccupied Pacific atolls. He had found every classified military base on Earth by following soldiers who ran laps. Eight years later, Strava still leaks. A deep dive into fitness-data OSINT and what it means for your threat model.

Tre Trebucchi · Founder, Valtik Studios. Penetration tester based in Connecticut, serving the US mid-market.

The moment fitness data became a national security leak

We see this pattern show up on almost every engagement.

Nathan Ruser was 20 years old and studying international security at the Australian National University. In November 2017, Strava, the fitness-tracking social network, released an updated "global heat map" showing aggregated, anonymized activity data from its users: billions of GPS data points from 27 million users, visualized as glowing lines across the Earth. The heat map looked cool. It also revealed something Strava and its users had not considered.

Ruser zoomed in on remote areas where fitness activity should be sparse. The results were immediate and extraordinary:

  • US Forward Operating Base outlines in Syria, Iraq, Afghanistan, Niger, and Djibouti, all clearly visible as glowing running routes against otherwise dark terrain
  • French military bases in Niger
  • Russian bases in Syria (from Russian special forces operators who had been running laps with fitness trackers)
  • Turkish military patrol routes in Syria
  • NSA station locations in undisclosed areas
  • CIA black sites
  • Nuclear weapons facilities in multiple countries
  • Classified intelligence installations the US has never officially acknowledged exist

Ruser posted his findings on Twitter on January 27, 2018. Within 24 hours, the story had gone global. Within 48 hours, the Pentagon acknowledged operational security concerns and ordered a review of wearable device policies. Within weeks, every NATO military had issued guidance on fitness tracker use in operational environments.

Strava eventually reduced the heat map's resolution and gave users more opt-out controls. But the fundamental dynamic has not changed: individual fitness tracking combined with social-network publishing creates global OSINT datasets. Every year since 2018 has produced new leaks following the same pattern.

This post walks through how fitness-data OSINT works, what's still exposed in 2026, and what the implications are for individuals, militaries, and organizations that employ the kind of people who like to run.

How the heat map reveals secrets

Strava's heat map aggregates all user activity (running, cycling, swimming, walking) into a geographic density visualization. Bright areas = lots of activity. Dark areas = little or no activity.

In populous areas, the map shows roads, parks, and trails, the places people exercise. That's background noise: New York City's entire Manhattan grid lights up, San Francisco's neighborhoods glow, every marathon route in every major city is traced.

In depopulated areas (oceans, deserts, tundra, wilderness), the heat map should be mostly dark. When activity appears where it shouldn't be, it's a signal.
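To make that signal concrete, here's a minimal sketch of the detection logic: bucket GPS points into a coarse grid, count activity per cell, and flag cells that light up inside a bounding box you'd expect to be dark. Everything here (cell size, thresholds, the bounding box) is an illustrative assumption, not how Strava's actual pipeline works.

```python
import math
from collections import Counter

CELL = 0.01  # grid cell size in degrees (~1 km at mid-latitudes); arbitrary choice

def to_cell(lat: float, lon: float) -> tuple[int, int]:
    """Bucket a GPS coordinate into a coarse grid cell."""
    return (math.floor(lat / CELL), math.floor(lon / CELL))

def hot_cells(points, min_hits: int = 50) -> dict:
    """Aggregate raw GPS points into per-cell activity counts.

    Individual runs are noise; thousands of points landing in the
    same cells is what makes perimeter routes glow on a heat map.
    """
    counts = Counter(to_cell(lat, lon) for lat, lon in points)
    return {cell: n for cell, n in counts.items() if n >= min_hits}

def anomalies(points, dark_box, min_hits: int = 50):
    """Flag active cells inside a region that should be dark.

    dark_box = (lat_min, lat_max, lon_min, lon_max), e.g. a patch
    of desert with no towns or public roads (hypothetical values).
    """
    lat_min, lat_max, lon_min, lon_max = dark_box
    for (clat, clon), n in hot_cells(points, min_hits).items():
        lat, lon = clat * CELL, clon * CELL
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            yield (lat, lon, n)  # activity where there should be none
```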

Consider the use cases:

Forward Operating Base in the Syrian desert. A hundred US soldiers stationed there. Most of them wear fitness trackers. Garmin, Apple Watch, Fitbit, etc. Some opt to sync their runs to Strava for bragging rights with friends back home. They run in the mornings. They run the perimeter of the base. Hundreds of runs logged over months. On the heat map: the base outline, perimeter routes, and internal paths glow clearly against an otherwise dark desert background.

Russian Ministry of Defense special operations base in Syria. Similar pattern. Russian troops also exercise, also wear fitness trackers, also sync to Strava. The base location, which Russia officially denied existed, is clearly visible.

NSA listening station. Staff run routes around the perimeter. Entire building outline visible.

Classified satellite ground station. Personnel jog the access road. The road reveals the location of a facility that wasn't on any public map.

The pattern works because:

  1. Fitness culture is real. Soldiers, intelligence officers, contractors, and support staff all exercise. Many of them use fitness trackers. Many of those sync to Strava or similar services.
  2. Secret facilities are staffed. No matter how classified the facility, the people working there exist. They exercise. They commute.
  3. Aggregation reveals patterns. Individual runs might be noise. Aggregated across hundreds of people and months of activity, patterns become obvious.
  4. Default privacy settings allowed sharing. Strava's defaults in 2017 were "share activity with the global heat map." Most users never changed defaults. The heat map effectively crowdsourced a global military infrastructure map.

What Strava did after 2018

Under pressure, Strava implemented several changes:

  • Reduced heat map resolution. The current heat map has less detail, making building outlines harder to read.
  • Updated default privacy settings. New accounts no longer default to sharing with the heat map.
  • Enhanced opt-out. Users can exclude specific areas ("privacy zones") from their public activities.
  • Removed some identifying data. Individual activity identifiability in heat map data was reduced.

But none of these changes eliminated the fundamental leak. The heat map still exists. Aggregate patterns still reveal facility locations, at slightly lower fidelity. Subsequent researchers have shown the heat map still exposes substantial OSINT value in 2024-2026.

What's still leaking in 2026

Fitness tracking OSINT has matured into a specialty. Analysts routinely use:

Strava segment leaderboards

Strava's segments feature lets users create named route segments and compete for times on them. Segment leaderboards are typically public.

An analyst investigating a specific facility can:

  1. Look at the heat map for the facility
  2. Identify segments created around the facility (named things like "Camp Lemonnier 5K")
  3. Access the segment leaderboard
  4. See individual users who've run the segment
  5. Click through to their profiles
  6. See their other activities, names, photos, and locations

From the original anonymous aggregate data, the analyst now has a named list of individuals who work at the facility, with their routines, their home city (from activity patterns on weekends and off-duty time), and often their real identities.
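This chain is mostly clicking through a web UI, but it automates readily. The sketch below is deliberately hypothetical: the base URL, endpoints, response fields, and token are illustrative stand-ins rather than Strava's actual API, which has restricted programmatic leaderboard access over the years. It's here only to show the shape of the pivot from a segment to named individuals.

```python
import requests

API = "https://api.example.com/strava"  # hypothetical endpoint, not Strava's real API
TOKEN = "REDACTED"                      # placeholder credential

def leaderboard_athletes(segment_id: int) -> list[dict]:
    """Walk a segment leaderboard and pull public profile details.

    Field names ('entries', 'athlete_id', etc.) are illustrative; a
    real implementation depends on whatever access the platform
    exposes at the time.
    """
    headers = {"Authorization": f"Bearer {TOKEN}"}
    board = requests.get(f"{API}/segments/{segment_id}/leaderboard",
                         headers=headers, timeout=30).json()
    athletes = []
    for entry in board.get("entries", []):
        profile = requests.get(f"{API}/athletes/{entry['athlete_id']}",
                               headers=headers, timeout=30).json()
        athletes.append({
            "name": profile.get("name"),        # real name, if public
            "city": profile.get("city"),        # home city from profile
            "photo": profile.get("photo_url"),  # face, if public
            "elapsed": entry.get("elapsed_time"),
        })
    return athletes

# Usage: everyone on a "Camp Lemonnier 5K" leaderboard was physically
# inside the wire, and is now tied to a name, a city, and a face.
# athletes = leaderboard_athletes(1234567)
```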

This isn't hypothetical. Journalists, OSINT researchers, and nation-state intelligence services have used this exact chain to identify undercover personnel at CIA, NSA, DGSE, Mossad, and other agencies' covert facilities.

Activity-timing correlation

Every activity Strava records includes:

  • Start time and end time
  • GPS track
  • Heart rate (if wearable supports it)
  • Cadence, power, pace (varies by activity)
  • Device model
  • App version

For analysts building surveillance profiles, activity timing enables:

  • Shift identification. "Person A runs 5:00-5:45 AM every day, then GPS shows them moving to a specific building for the rest of the day." That's a commute + on-shift inference from fitness data alone.
  • Travel detection. A user who normally runs in Virginia suddenly logs runs in Kabul. That's deployment evidence.
  • Co-location inference. Two users whose runs consistently overlap in time and location are likely colleagues or romantic partners.
  • Home address triangulation. Weekend runs typically start and end near the user's home. Over time, the start/end locations converge to the home address (see the clustering sketch after this list).
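Here's a minimal sketch of that home-address convergence, assuming a hypothetical activity schema with a start coordinate and a weekday field. It buckets weekend start points into block-sized cells and averages the densest cell; real tooling would use proper clustering (DBSCAN or similar) plus end points and timing.

```python
from collections import Counter
from statistics import mean

def estimate_home(activities, precision: float = 0.001):
    """Estimate a home location from weekend activity start points.

    activities: iterable of dicts like
        {"start": (lat, lon), "weekday": 0-6}   # hypothetical schema
    precision of 0.001 degrees is roughly a city block.
    """
    starts = [a["start"] for a in activities if a["weekday"] >= 5]
    if not starts:
        return None
    # Bucket starts into ~block-sized cells and find the densest one.
    cells = Counter((round(lat / precision), round(lon / precision))
                    for lat, lon in starts)
    (home_cell, _), = cells.most_common(1)
    # Average the raw points in that cell for a refined estimate.
    in_cell = [(lat, lon) for lat, lon in starts
               if (round(lat / precision), round(lon / precision)) == home_cell]
    return (mean(p[0] for p in in_cell), mean(p[1] for p in in_cell))
```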

Other fitness platforms

The problem isn't unique to Strava. Similar OSINT vectors exist on:

  • Garmin Connect. Less social-network-like but still has public activity sharing
  • Apple Fitness. Sharing with contacts is opt-in, but iMessage sharing of workout summaries exposes location
  • Fitbit. Social features include step competitions that can reveal user identity
  • Nike Run Club. Public activity feeds
  • MyFitnessPal. Less GPS-heavy but still reveals location via gym check-ins
  • Zwift. Indoor cycling platform with geolocation in profile

Every one of these has documented OSINT leakage patterns.

Wearable device fingerprinting

Beyond app-layer social features, the wearable devices themselves broadcast identifiers:

  • Bluetooth MAC addresses (some devices rotate, others don't)
  • Device-specific advertising patterns
  • Heart rate strap signals that can correlate wearers over time

Sufficiently motivated adversaries can use wearable device fingerprinting to track specific individuals even without ever accessing their cloud account.
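As a sense check on how observable wearables are over the air, here's a minimal passive-survey sketch using the bleak BLE library (pip install bleak); the round count and scan duration are arbitrary choices. It simply records which advertising addresses persist across scan rounds: an address that never changes is trivially trackable.

```python
import asyncio
from collections import Counter
from datetime import datetime, timezone

from bleak import BleakScanner  # pip install bleak

async def survey(rounds: int = 6, scan_seconds: float = 10.0) -> None:
    """Passively log BLE advertisers and count reappearances.

    An address seen in every round either isn't rotating its MAC,
    or rotates slowly enough to follow between rounds.
    """
    seen: Counter[str] = Counter()
    for i in range(rounds):
        devices = await BleakScanner.discover(timeout=scan_seconds)
        stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
        for d in devices:
            seen[d.address] += 1
            print(f"{stamp} round={i} addr={d.address} name={d.name}")
    persistent = [a for a, n in seen.items() if n == rounds]
    print(f"{len(persistent)} addresses visible in every round: {persistent}")

if __name__ == "__main__":
    asyncio.run(survey())
```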

Recent fitness-OSINT incidents

2023: South Korean intelligence officer identification. A Strava segment near a South Korean National Intelligence Service facility was used to identify officers via leaderboard analysis. Reported by investigative journalists in the Hankyoreh.

2024: Russian FSB officer tracking. Ukrainian OSINT researchers used Strava activity patterns to identify Russian FSB officers operating in Crimea, tracing them to their home addresses in mainland Russia.

2024: Area 51 / Nellis AFB activity. Researchers noted continued Strava activity in classified areas of Nellis Air Force Base and adjacent facilities, despite official policies requiring wearable device opt-out.

2025: Saudi Arabia oil facility patterns. Aramco and associated security personnel fitness activity patterns used by researchers to map facility schedules and shift patterns.

2025-2026: Multiple journalist investigations using Strava and other fitness data to track executives, politicians, and intelligence officers.

What this means for your threat model

For individuals

If you're a normal person who runs for fun, the fitness-OSINT threat to you is limited. Your home address might be inferable and your weekend patterns can be deduced, but aside from narrow stalking scenarios, the practical risk is low.

However, if you're in any of these categories, fitness-data OSINT is a real concern:

  • Government employees with security clearances
  • Military personnel (active or reserve)
  • Intelligence community
  • Diplomats
  • Journalists (particularly those covering sensitive topics)
  • Law enforcement (especially undercover or protective services)
  • Business executives of any stakeholder-exposed company
  • Domestic abuse survivors or anyone evading a specific identified threat
  • Politicians and political staff
  • Researchers studying sensitive topics

For all of these, the defensive posture should include fitness-data minimization.

For organizations

If your organization has facilities that you prefer not to be publicly mapped, employee fitness-tracking policies matter:

  • Secure / classified facilities: explicit prohibition on wearables with GPS/activity sharing
  • Executive protection: fitness tracker opt-outs for protected individuals, or dedicated fitness-tracking devices isolated from social sharing
  • Corporate campuses with competitive IP concerns: awareness of patterns that might reveal workforce size and schedules

A surprising number of organizations have no policy on wearable device use and no awareness of fitness-data OSINT exposure.

Defenses that work

Individual-level defenses

1. Disable contribution to global heat maps.

On Strava: Settings → Privacy Controls → Aggregated Data Usage → "Opt out of data contribution."

On other platforms: look for equivalent settings.

2. Make activities private by default.

On Strava: Settings → Privacy Controls → Activities → "Only You."

You can still share individual activities with specific people (coaches, running partners) without making them publicly searchable.

3. Use privacy zones.

Strava lets you hide parts of activities that occur within user-defined "privacy zones" (usually your home address, workplace). Activities still record, but the public view doesn't show start/end points near these zones.
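Strava's actual privacy-zone implementation isn't public; the sketch below just illustrates the general mechanism: strip every trackpoint within a radius of a protected coordinate before publishing. Note the residual risk this implies: the visible track still begins at the zone's edge, which shrinks the search area around your home rather than removing it.

```python
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(a, b) -> float:
    """Great-circle distance in meters between (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(h))

def apply_privacy_zone(track, center, radius_m: float = 500.0):
    """Return the publicly visible track with in-zone points removed."""
    return [p for p in track if haversine_m(p, center) > radius_m]
```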

4. Don't use segments.

Segment leaderboards are a significant deanonymization vector. If you participate in public segments, you're participating in a global social identification graph.

5. Consider whether you need fitness data in the cloud at all.

Many Garmin devices, Apple Watches, and others can record activities without syncing to a social platform. If you want the training data without the social element, disconnect the cloud sharing entirely.

6. Audit your profile.

Your Strava profile probably has:

  • Your full name
  • Your profile photo
  • A bio that may reveal employment
  • Kudos from specific friends (revealing your social graph)
  • Clubs you belong to (which often reveal employer or locale)

Review and minimize.

Organizational defenses

For organizations that need to address this:

1. Written policy on wearables and fitness sharing.

Especially for classified environments, executive protection, and intelligence-adjacent work. Policy should specify what devices are allowed, what connectivity is allowed, and enforcement mechanisms.

2. Training.

Employees need to understand the OSINT risk from fitness data, not merely follow a policy they don't understand.

3. Physical security integration.

Wearable device policy should be part of facility security briefings, travel security briefings, and protective security detail protocols.

4. Regular OSINT self-assessment.

Check what your organization looks like on Strava's heat map. Look at segment leaderboards around your facilities. Search for your organization's name on fitness platforms. The first time you do this, you'll find things you didn't expect.

The broader pattern

Fitness-data OSINT is an instance of a much larger pattern: consumer technology, aggregated at scale, produces surveillance datasets that were never intended. Other examples:

  • Geofence warrants using Google Location History
  • License plate reader networks (Flock, municipal ALPR)
  • Cellular tower dumps
  • Social media "check-in" networks identifying employee routines
  • Ring / Nest doorbell networks mapped into law enforcement surveillance
  • Smart home devices creating household-level behavior graphs

None of these were designed as surveillance systems. All of them became surveillance systems because of aggregation.

The pattern suggests a defensive principle: any behavior logged at scale will eventually be queryable by people you don't want querying it. Your best defense is minimization at the source. Don't produce the data in the first place, because once it's in the aggregation, controlling downstream use is structurally difficult.

What Valtik does in this space

Valtik provides OSINT self-assessment engagements for organizations that want to understand their public-data exposure:

  • Employee OSINT audit. We search public sources for identifiable information about your workforce that could support targeting attacks.
  • Facility OSINT audit. We check fitness platforms, commercial satellite imagery, social media, and other sources to see what your physical infrastructure looks like from the outside.
  • Executive OSINT assessment. Focused on senior leaders whose personal exposure creates corporate risk.

For individuals in high-risk roles (journalists, executives, politically exposed persons), we offer personal OSINT reviews: finding what an adversary would find, and helping you reduce exposure.

Reach out via https://valtikstudios.com.

The honest takeaway

Strava's global heat map in 2018 was a viral moment. The underlying dynamic has not been fixed: consumer fitness tracking creates global OSINT datasets. Every individual's contribution is minor. The aggregate is a surveillance system.

If your threat model includes adversaries with access to OSINT aggregation (intelligence services, organized crime, motivated private investigators, stalkers, nation-state-linked journalists doing investigations), fitness data minimization should be part of your posture.

If your threat model doesn't include any of that, you're probably fine. But it's worth knowing that every time you hit "upload" on a Garmin watch, you're contributing to a dataset that may eventually be queried in ways you didn't consent to.

Sources

  1. Strava Heat Map Military Base Reveal. The Washington Post (2018)
  2. Strava Heatmap Reveals Military Bases. BBC
  3. Fitness Tracking App Strava Gives Away Location of Secret US Army Bases. The Guardian
  4. Nathan Ruser Original Twitter Thread
  5. Strava Metadata Leaks Home Addresses. The Register
  6. Department of Defense Fitness Tracker Policy Update. DOD
  7. Strava Privacy Controls Documentation
  8. OSINT Techniques for Fitness Data. Bellingcat
  9. The Polar Fitness App Data Leak. De Correspondent
  10. Privacy Implications of Wearable Technology. EFF
Tags: osint · strava · fitness tracking · location privacy · military osint · data privacy · operational security · opsec · surveillance · research

Want us to check your Strava setup?

Our scanner detects this exact misconfiguration, plus dozens more across 38 platforms. Free website check available, no commitment required.
