Internet Privacy—Or Lack Thereof—Is Quietly Affecting Your Mental Health

The insidious trickle of tech tracking into every aspect of our lives can be toxic in ways we aren't even conscious of.

Most of us have a growing awareness that our personal data is less and less private. Whether it’s handing over our contact info to score a quick discount or allowing apps to access our photos and messages, these trade-offs have become an unsettling norm of the digital age. Yet it’s increasingly clear—not least from cases like Facebook’s exploitation of teen girls’ insecurities on Instagram—that the exchange comes at the risk of not just our privacy but also our mental health.

The insidious trickle of tech tracking into every aspect of our lives can be toxic in ways we may not even be fully conscious of. “Research has shown that privacy mediates important psychological needs, such as the ability to have a fresh start, recover from setbacks, and achieve catharsis,” says Elias Aboujaoude, clinical professor of psychiatry at the Stanford University School of Medicine and author of Virtually You: The Dangerous Powers of the E-Personality. “These needs are difficult to meet if we are constantly under a microscope, yet this is the state we’re in thanks to social media and companies whose business model is to endlessly mine and sell our personal data.”

Of course, we also get something out of the deal: the convenience and entertainment of the many digital amenities that enhance our lives. “You can’t really avoid data collection anymore,” says Sandra Wachter, associate professor at the University of Oxford Internet Institute, who studies the ethical implications of A.I., big data, and technology. “People are aware that data collection happens all the time, but they’re not fully aware of what risks that might bear. They think, What’s the big deal? Let them have my email, I get a free app. But the saying is true that if it’s free, you’re not the customer; you’re the product. You’re paying for it—you just don’t know how high the price actually is.”

Because the world of digital tech is so vast, complex, and rapidly evolving, your data is somewhat out of your hands even when you try to be careful about giving it away. So don’t beat yourself up if you don’t read the fine print of every (or any) terms and conditions agreement you sign. “Big companies are often aware that no one has time to review them, and that it’s a dodgy practice,” says Wachter. But it’s still important to understand the ramifications of this exchange, because they go far beyond the sudden appearance of banner ads for those shoes you googled last week.

“These algorithms and the laws that regulate them are designed to control who has access to data, but don’t govern what happens after it’s collected,” says Wachter. So even if you feel you have nothing to hide from companies collecting your data, that doesn’t mean you have nothing to lose. “An algorithm can take seemingly benign data, spot hidden correlations, and infer information you never volunteered—like gender, sexual orientation, ethnicity, race, religious beliefs, and political activity—and then use that information in a very subtle way, like excluding specific users from seeing certain products or offers.”

Research shows that discriminatory digital ads have skewed housing opportunities and job postings by race, socioeconomic status, and gender (making it likelier for women to see jobs with, say, an entertainment or domestic focus—some of which may be lower-paying). Credit card companies have also targeted ads by age, excluding younger demographics. And online advertisers have exploited health issues like addiction, underage drinking, and teen vaping for profit. Even something as innocent-seeming as your level of extraversion can be used by marketers to boost product sales.  

“There are endless examples of how privacy risk [turns into] discrimination risk,” says Wachter. “You can lose out on opportunities without being aware, and you have very little protection. It’s hard to control what people learn about you because you don’t know what your data says, so you can’t fully understand the consequences. If you don’t know something bad happened, what remedy do you have?”

Turns out that gut sense of unease you get when you blindly accept another app’s terms is grounded in cold, hard reality. Whether it’s a twinge of anxiety when you spot a creepily targeted ad or a flare of anger at headlines about another tech giant’s infractions, our growing awareness of personal data violations (and the associated risks) can weigh us down by diminishing the emotional security that comes with privacy.

But that doesn’t mean there’s nothing we can do to protect our data—and our mental health. The first step, say experts, is not surrendering in the face of what might feel like a losing battle. “We need to start putting our money where our mouth is when it comes to privacy,” says Aboujaoude. “Appreciate it as a human right worth defending, and avoid the defeatist trap that suggests nothing can be done. Think of your mental health: Giving up control of the details that make us who we are is disempowering and psychologically destabilizing.”

Short of going off the grid (yeah, right), there are actions you can take to offset the negative consequences of digital privacy invasions, from low-level everyday anxiety to systemic injustices like discrimination and exploitation. These steps might seem small, but they matter.

Personalize your settings to protect yourself.

“Very few of us take full advantage of the privacy features offered by our favorite platforms,” says Aboujaoude. When websites or programs offer you the choice to customize your privacy settings—often when you sign up or download an update—take the time to opt out of unnecessarily invasive features. For example, many apps allow the option to enable geolocation services only when the app is in use, versus all the time. And by disabling something as simple as Gmail’s “always display external images” setting, you can prevent email trackers from detecting when you open messages.

Clean up your digital footprint.

Have you been meaning to delete some old, embarrassing tweets or Insta posts? Sure, once you’ve publicly posted something online, it’s hard to remove entirely (thanks to automated archives), but you can make it less accessible. Carefully consider whether it’s worth having any public social media profiles—you can always start a separate, protected account for private content. Periodically review your downloaded apps, programs, and add-ons, and uninstall those you don’t use. Regularly clear your browser cache and cookies. Sign up for an identity theft protection program (offered free through some banks and credit check services) that will alert you to data and password breaches.

Educate yourself—and join the conversation.

“The disequilibrium between the individual and big tech is so vast that this can no longer be a private citizen’s battle,” says Aboujaoude. “We need legislation to help us.” Laws that protect digital privacy vary by state and are in different stages of legislation, so it’s worth checking to see what’s in the works where you live, and reaching out to your local representatives to express support or dissent. The more of us who get involved in the conversation, the better. “We need to get people, old and young, interested in privacy protection,” adds Wachter. “As people find out about data misuses, they start caring more and voting with their feet. Everyone has something to lose.”

This story originally appeared on Glamour. Author: Condé Nast
