Using a fitness app taught me the scary truth about why privacy settings are a feminist issue

The social fitness app Strava is meant to help people compare data on their workouts.
Image: Reuters/Eddie Keogh

As a lifelong runner, I’ve become adept at predicting the best times, routes, and strategies to jog in cities while avoiding street harassers. From circumventing stops at traffic lights to steadfastly avoiding eye contact with passersby, I’ve adopted behaviors that are unfortunately standard practice for a lot of urban women. But recently, when using the social fitness tracking app Strava, I noticed a different kind of potential threat—one I wasn’t prepared for.

After I’d completed my usual 5-kilometer loop near my London flat, a complete stranger “liked” my workout—even though I had enabled stricter privacy settings, which I thought would shield my workouts from public view. This happened several more times while I jogged the same route, and then again when I was on vacation in Barcelona. Alarmed that strangers could see the routes I run two or three times a week, I embarked on an investigation into Strava’s privacy settings. What I learned wasn’t reassuring for an urban woman—or anyone concerned about location-based privacy.

It should be said that for many Strava users, the whole point of the app is to receive “kudos”—Strava’s equivalent of an Instagram like—from strangers. Indeed, for the (mostly male) users who dominate Strava’s feature discussion forum, the public and granular nature of Strava’s user data is what allows them to quantify, compare, and compete on their performance with rigorous attention to detail. Tracking everything from speed and elevation to calories burned and personal records, Strava’s users can see how they are progressing against their own past performance and against other users who run or cycle the same routes.

This social aspect is appealing to me, too, but with a crucial caveat: I only want people who I’ve allowed to follow me to see where I run. When you’re a woman whose personal and digital space is invaded with alarming regularity, you think carefully about how your digital life intersects with your real one—especially when the data you’re sharing is quite literally close to your front door.

I soon learned that the first problem was my assumption that “Enhanced Privacy” on Strava meant my data and running routes were visible only to my approved followers. In fact, it means no such thing. Strava’s “Leaderboard” function ranks the pace of all athletes who complete the same Segment, or a set distance on a given route that has been mapped by a user and added to the app. Though I had Enhanced Privacy on, I hadn’t enabled “Hide from Leaderboards,” which is a separate toggle in the app’s privacy settings.

This meant that if I ran a particularly fast 200-meter segment in the park, landing me temporarily on a Leaderboard, anyone examining that segment in the app—whether or not I’d allowed them to follow me—could see my workout that day. Troublingly, it would also let them see my first and last name and the photo attached to my profile.

With Leaderboards enabled (which is the default setting, even with Enhanced Privacy on), going for a run in the park is the opposite of private or anonymous. In effect, it’s like having a private Instagram account—and then finding your photos are viewable on Instagram’s Explore page.

Failing to opt out of Leaderboards was admittedly my blunder. But from there, it seemed there was an ever-growing list of things I needed to opt out of to keep strangers from learning my first and last name based on where I run. In the privacy settings, I needed to toggle on Group Activity Enhanced Privacy, so my data wouldn’t be shared if I happened to run with other Strava users. I also needed to switch on Hide from FlyBys; the FlyBys feature lets users see the other athletes they crossed paths with on a given route (with picture and full name on the default public setting, and picture plus first name and last initial when Enhanced Privacy is on). Furthermore, if I chose to join a Challenge—for example, “run a half marathon in August,” which has more than 52,000 digital participants—that data would be public no matter what. (There’s no disclaimer when you opt in, either.) This, as Strava’s support told me, is in the interest of maintaining “athletic and competitive integrity.”

Strava communications lead Andrew Vontz noted that the three levels of privacy Strava offers—totally public, the aforementioned Enhanced Privacy, and the Private by Default option, which strips the app of any social aspect whatsoever—are designed to “balance protecting our member’s data but also creating a community that is engaging and social.” But in practice, it feels more like six levels. Even if all the above settings are in place, there is still a chance the specifics of a user’s run could be publicly viewable if, as Vontz said, “you’ve shared it directly, shared it on other social media channels, or one of your followers has shared it in some way. It is also unlikely but possible that someone could find it through random sampling.”

It’s true that as a user, it’s my responsibility to know what I’m opting into when I use a service, especially a free one. But the fact that it took me three rounds of emails with a support rep, a call with Vontz, and a follow-up email exchange with him to fully understand how to prevent strangers from seeing my running routes is troubling. To borrow a phrase from English law, “the man on the Clapham omnibus”—the average, ordinary, reasonable person—would likely not assume that their first and last name would be broadcast so indiscriminately despite having an app’s enhanced privacy enabled.

Indeed, when I polled a few other female friends who use the app, they too were unaware of the need to opt out of Leaderboards or FlyBys to maintain full control over who can see their workouts. Women online and in the support forum have also complained about “random people following and giving kudos,” as well as the unsettling way FlyBys can reveal first and last names to strangers, often without the user’s knowledge.

Vontz stressed that Strava takes athlete privacy and safety very seriously, and it’s true the company has introduced other features to address the issue. In addition to its clever Privacy Zones feature—which lets users block out areas where workouts commonly begin and end, like home or the office—the Beacon tool in the premium version sends live links of a user’s activity to chosen contacts. I understand that Strava is a social network as much as a fitness-tracking tool, and it must balance those functions accordingly. But the multi-layered, opt-out-heavy, and rather unclear nature of its settings still seems like a problem.

“If you don’t like something, you can opt out of it” is something we hear a lot in the consumer-facing tech world—whether it’s Facebook newsfeed spam, incessant push notifications, or location-tagged posts. The problem with this attitude is that it puts the onus on consumers to ensure they’re being respected and lets companies off the hook—the assumption being that companies can bank on a good number of users being too lazy, confused, or negligent to opt out. That’s an unsustainable approach if companies want to retain the goodwill of their users. And in cases where privacy is a concern, it can be downright dangerous.