Community Manager

What's Your Take? - "Fitness app lights up staff at military bases"?

Fitness app Strava lights up staff at military bases


From the BBC: “Online fitness tracker Strava has published a "heatmap" showing the paths its users log as they run or cycle. It appears to show the structure of foreign military bases in countries including Syria and Afghanistan as soldiers move around them. The US military was examining the heatmap, a spokesman said.”


What are your thoughts about the implications of users at your organization sharing so much data with third parties?


What are the implications/responsibilities for vendors collecting all this data (even if anonymous) and sharing it for promotional purposes (or any reason at all)?


What's your reaction? 

11 Replies
Contributor II

I think it comes under "loose lips sink ships."  With counterintelligence and operational security, sometimes it seems like the more things change, the more things stay the same.


Any information employees put out on social media like LinkedIn is a track for OSINT collection by a threat actor. Want to find the AD administrators? Look on LinkedIn. Want to find the DB admins? Look on LinkedIn.


Organizations can leak capabilities, or the lack thereof, through RFP/RFQ documents or job postings as well. Looking for a firewall administrator with advanced knowledge of company X's product. Looking for a new analytics platform. Maybe there is no incumbent, maybe there is. Either way, these leak information about the current or future state of your network.


In some cases where it isn't going to cause issues for the mission (fake RFPs are probably not a good idea), social media provides a channel for leveraging deception operations to confuse and delay the adversary. These public data maps of user trackers offer such an opportunity as well, if leveraged appropriately. Like Patton's ghost army on the English coast before D-Day in 1944, forged data here could provide a deception opportunity to telegraph phantom troop locations to confuse enemy intelligence gathering, exaggerate force deployment, or even potentially lead the enemy into a trap based on their faulty intelligence.


But Big Bank Co. probably isn't going to be leading enemy h4x0rz into a feint and then enveloping them with a pincer and applying overwhelming force.  Most of us not in the govt or the military space probably just need to deal with information leaks that can't really be stopped and need an education and/or HR type solution rather than a technical one. In the end, if an employee posts details from his or her CV on LinkedIn then posts about vacation plans on Facebook with lax privacy settings, it is our (infosec's) problem (potentially) but not really our responsibility.

-- wdf//CISSP, CSSLP
Community Champion

Comes around to awareness again. We need to keep bringing to light, for our users, just how much data is being collected. One of my favorite examples is that of a chocolate candy maker asking a data warehouse for a mailing list of anyone who had signed up for weight-loss programs such as Jenny Craig, Weight Watchers, etc., and then mailing them brochures for chocolates, knowing that they would probably be weak and order something sweet. Now, nothing nefarious or illegal about that. Is it unethical to prey on the presumed weakness of someone who has clearly shown a lack of control over their appetite and who has reached out to weight-loss organizations for help? I guess it depends on your moral compass. Some societies would see it as offensive and others as capitalism, but regardless it shows that:

1) The collectors of the data make money on selling the data, not on the ethical nature of its use.

2) Data can identify things about you (patterns, tendencies, weaknesses, etc.) that you might not have thought about.

3) More and more data is being collected every day. Do you really want to contribute to the stockpile?


Another example I use of how we are being tracked, even when we think we have outsmarted the data gatherers, is location tracking. Even if you have turned off location tracking on your phone, enough data gathered (and shared) about the Wi-Fi hotspots your phone discovers as you move around can be used to narrow down your location. Several years ago, cell phones were automatically relaying the discovered hotspots back to the data hoarders, who used them to build maps of hotspot locations so they could still provide location-specific services even when location tracking had been turned off.
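To make the hotspot-tracking point concrete, here is a minimal sketch of how a crowdsourced BSSID-to-coordinates database can localize a phone with GPS turned off. Everything here is made up for illustration (the BSSIDs, coordinates, and the simple weighted-centroid method); real geolocation services use far more sophisticated models, but the principle is the same.

```python
# Hypothetical crowdsourced database mapping Wi-Fi BSSIDs to known
# coordinates (lat, lon) -- the kind of map the "data hoarders" build.
BSSID_DB = {
    "aa:bb:cc:00:00:01": (47.6205, -122.3493),
    "aa:bb:cc:00:00:02": (47.6210, -122.3500),
    "aa:bb:cc:00:00:03": (47.6199, -122.3488),
}

def estimate_location(scan):
    """Estimate a phone's position from a Wi-Fi scan.

    `scan` maps BSSIDs to received signal strength (RSSI, in dBm).
    Stronger signals (closer to 0) weigh more in a simple weighted
    centroid -- no GPS involved at any point.
    """
    total_w = lat_acc = lon_acc = 0.0
    for bssid, rssi in scan.items():
        if bssid not in BSSID_DB:
            continue
        # Convert dBm to a positive weight; -45 dBm (strong) >> -90 dBm (weak).
        weight = 10 ** (rssi / 20.0)
        lat, lon = BSSID_DB[bssid]
        lat_acc += lat * weight
        lon_acc += lon * weight
        total_w += weight
    if total_w == 0:
        return None  # no known hotspots in sight
    return (lat_acc / total_w, lon_acc / total_w)

# A phone with "location off" still sees these access points:
print(estimate_location({
    "aa:bb:cc:00:00:01": -45,
    "aa:bb:cc:00:00:02": -70,
    "aa:bb:cc:00:00:03": -60,
}))
```

The estimate lands inside the cluster of known hotspots, which for an urban scan can mean building-level accuracy.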


I listened to a podcast yesterday about a hacker who had defeated 2FA and was able to reset some prominent people's passwords by forwarding their cell phone SMS to the hacker's own phone and then requesting a password reset that was protected by SMS-based 2FA. The 2FA message was forwarded to the hacker's phone, so the hacker was able to defeat the 2FA method. How was the hacker able to do this? The phone companies had a default setting that let anyone who knew the user's last four SSN digits set up call forwarding.


I think people just have a very naïve trust that their device only tracks their fitness (or whatever the intended use of the device is), and they don't think about exactly what data it is collecting and sending out. It is our job to help educate them.

Newcomer II

I think the privacy concern in the context of IoT is still not fully understood, nor have all the ramifications been totally recognized. This is making headlines now because of the possible military implications, but what about the headlines involving baby monitors, or hackers gaining access to houses and internal cameras through IoT devices? Security was not the foremost concern when these devices were developed, because they do not store ePHI or CHD, but in combination with the other devices the same person or household is using, they give up enough information to invade your privacy to the same degree as, or even more than, hacking into the EMR system you use as a patient.

I am still driving a car whose only modern developments are power windows (and a radio), because I do not want the IoT messing with my driving or knowing where I have been. I only have a smartphone because my workplace required me to have one.

Once we take the approach that any new gadget gets a security assessment before going into production, and that security best practices must be built in and continuously updated after the device is sold to consumers, I may be more willing to purchase Fitbits and the like. Until then, I tell people where I will be during the day rather than letting some stranger log into an interface and follow some gadget to find out what I am doing.

Newcomer I

Old lessons being revisited with new technology: users are the weakest link in security plans.

Also, “Anonymous” doesn’t mean unidentifiable or non-sensitive. “Anonymizing” seems to be code for minimal scrubbing to get free data to attract more customers. Please see the Privacy Policy. You were warned.
Contributor I

Just to echo/build upon some things already mentioned:


This further highlights what is obvious to security professionals around the world, and proves once again the inherent (and ignored) dangers that come with connected technologies.


When "anonymous" information about "anonymous" users is shared in such a way, it is perceived as inherently benign. However, no consideration is given to the fact that a group of "anonymous" users with a very non-anonymous profession, such as military personnel, presents a clear and dangerous depiction of habitual tasks.
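A toy simulation of that effect (all routes and numbers here are synthetic, invented for illustration): each individual "anonymous" track is noisy and uninteresting, but binning many users' points onto a grid, which is essentially what a heatmap does, makes the shared route stand out plainly.

```python
import random
from collections import Counter

random.seed(1)  # deterministic for the example

def perimeter_points(n=200, jitter=0.3):
    """GPS-like points from one user jogging laps of a 10x5 rectangle
    (a stand-in for a base's perimeter track), with Gaussian noise."""
    pts = []
    for _ in range(n):
        t = random.random() * 30  # position along the 30-unit perimeter
        if t < 10:
            x, y = t, 0.0
        elif t < 15:
            x, y = 10.0, t - 10
        elif t < 25:
            x, y = 10 - (t - 15), 5.0
        else:
            x, y = 0.0, 5 - (t - 25)
        pts.append((x + random.gauss(0, jitter), y + random.gauss(0, jitter)))
    return pts

# Aggregate 50 anonymous users into 1x1 grid cells -- a crude heatmap.
heat = Counter()
for _user in range(50):
    for x, y in perimeter_points():
        heat[(round(x), round(y))] += 1

# Cells on the perimeter are far hotter than interior cells:
print("perimeter cell:", heat[(5, 0)], "vs interior cell:", heat[(5, 2)])
```

No single user was identified, yet the aggregate draws the shape of the facility, which is exactly the Strava problem.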


I fear that until some sort of agreed-upon framework that such IoT devices must comply with is produced and utilised, we will continue to see data leaking into the world like this. I fear most that, until something tragic strikes, we will continue to see little to no change.


Let's hope that I'm wrong.... 

Newcomer III

Strava Heatmap has been around for 4-5 years now.  Nobody considered it a really big deal until yesterday.


If you're in a zone where GPS signals are scrambled, your tracking means nothing.  So folks in really, really secure bases are having their tracks moved around 20 - 40 miles on a weekly basis.


We... already know where most of the bases are. In any case, Strava isn't just for Americans. The idea that any Strava track anywhere in the world means American military are there is just silly. Literally anyone with an email address can sign up for Strava. I've used it to find routes all over the world.


Strava already has the tools needed to ensure privacy. Even if you do decide to sign up with your government-issued email address (you shouldn't), you choose your privacy levels. You can remain 100% hidden from all reporting, you can set up Hidden Zones that don't report your GPS activity, and you can even easily (despite news reports) opt out of the anonymized tracking.
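The idea behind a hidden zone is simple enough to sketch: drop every point of a track that falls within some radius of a protected location before the track is published. This is only an illustrative sketch of the concept, not Strava's actual implementation, and the coordinates below are made up.

```python
import math

def hidden_zone_filter(track, zones):
    """Drop GPS points inside any privacy zone.

    `track` is a list of (lat, lon) points; `zones` is a list of
    (lat, lon, radius_m) circles around locations to protect.
    """
    def meters(p, q):
        # Equirectangular approximation; accurate enough at city scale.
        dlat = math.radians(q[0] - p[0])
        dlon = math.radians(q[1] - p[1]) * math.cos(math.radians(p[0]))
        return 6_371_000 * math.hypot(dlat, dlon)

    return [
        pt for pt in track
        if all(meters(pt, (zlat, zlon)) > r for zlat, zlon, r in zones)
    ]

# Hide everything within 500 m of a (made-up) home location:
home_zone = (38.8895, -77.0353, 500)
track = [(38.8895, -77.0353), (38.8990, -77.0353), (38.9100, -77.0353)]
print(hidden_zone_filter(track, [home_zone]))
```

The first point sits at the zone's center and is dropped; the other two are more than 500 m away and survive. The catch, of course, is that users have to know the setting exists and turn it on.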


A Strava track doesn't mean anyone has been there.  Seriously.  Just north of Vanuatu, there's a pair of tiny islands that have hosted thousands of bike rides and races.  Except the islands have no roads.  They're the setting for the popular training program Zwift.  The Strava interface allows anyone to log rides anywhere.  This allows people to build virtual rides and with the advent of "smart trainers", bike trainers that simulate uphill and downhill grades, people can ride perfectly smooth roads anywhere in the world from the convenience of their basements.  In prepping to ride in Europe, I've ridden the French Alps and Dolomites hundreds of times.  Virtually.  I've also ridden race courses in Virginia, South Carolina, Utah, and California.  Virtually.


Even if this all breaks down, Strava doesn't tell you where people are; it just tells you where people (probably) were. From an intelligence perspective, even after you hack Strava, tie an account to a particular user, untangle the anonymization, and tie the data and device to that user, its only utility would be to establish patterns.
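"Establishing patterns" is itself a mechanical exercise worth seeing. Given a de-anonymized activity log, even a stale one, a few lines suffice to surface a routine. The log below is entirely hypothetical, invented for illustration.

```python
from collections import Counter
from datetime import datetime

# Hypothetical de-anonymized activity log: (ISO start time, rounded
# start coordinates). Pattern-of-life analysis asks *when* and *where*
# someone habitually is, not where they are right now.
activities = [
    ("2018-01-22T06:05:00", (34.95, 69.26)),
    ("2018-01-23T06:10:00", (34.95, 69.26)),
    ("2018-01-24T06:02:00", (34.95, 69.26)),
    ("2018-01-24T18:30:00", (34.94, 69.27)),
    ("2018-01-25T06:08:00", (34.95, 69.26)),
]

# Count (hour of day, location) pairs to surface routines.
pattern = Counter(
    (datetime.fromisoformat(ts).hour, loc) for ts, loc in activities
)
habit, count = pattern.most_common(1)[0]
print(f"Most habitual slot: {habit[0]:02d}:00 at {habit[1]} ({count} days)")
```

Which is exactly why "it only shows where people were" is cold comfort: predictable pasts imply predictable futures.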

You only say it's impossible because nobody's done it and lived.
Newcomer I

I really like this news, as there is now a perfect visualization and use case of what can happen with Big Data even when it is anonymized.

This can be used in any awareness presentation in the near future, because otherwise you're stuck with "yeah, there's a risk of your data being aggregated and assumptions being made from it. Always be aware of what data you are sharing with whom."

Now you can say: "Look here, people were running here and there, this is a military base etc."


In regards to implications in general:

There's not much you can do about it other than making people aware. The US military was in this case even handing their staff fitness trackers...

Newcomer III

A heat map will hardly give anyone useful information about overseas installations; especially with all the other sources that are already at those locations (local national support personnel, Google maps, embedded reporters that are ALWAYS so concerned with operational security, etc.).  At best, the enemy will learn that Soldiers exercise...the only surprise may be that they are not all running at Olympic pace!

Newcomer II

As has been said already, the locals know where these bases are, whether they appear on the public satellite databases or not. Details about sensitive areas (or easier targets such as gyms and canteens) are likely to be readily available from sympathisers within the locally employed civilian or local troop (or wider TCN) communities. 


Most problematic is the use of the tracking devices on patrol - but patrol routes should be randomised anyway.


Slightly problematic is the tracking of any exercise routes outside base perimeters - but you shouldn't be going jogging outside the base if there is any local security threat.