Selectively revealing yourself to the world -- Privacy in the 21st century

Fifteen years ago, if you wanted to join Facebook (then The FaceBook), you needed a valid college email address, and the site did not offer much more than study group meetups or a place to chat and share pictures with high school friends. Today, Facebook is a juggernaut with roughly 2.5 billion monthly active users -- and as one of the world's largest ad platforms, the amount of data it holds on those users is staggering. But in 2018, the Cambridge Analytica scandal profoundly shook users' confidence in the social network -- and in the entire tech industry as well. Since then, there have been Congressional hearings, lawsuits, antitrust concerns and even the complete demise of Cambridge Analytica. But the questions did not end there -- consumers began asking how all of big tech used their data. Why does Google track people's location even when they have turned tracking off? Is Alexa recording my private conversations?

Thanks to the fallout from the Cambridge Analytica scandal, and the endless stream of data-sharing scandals since, consumers are more aware of their online privacy and are beginning to question how companies monetize their data. Let's look at how the rise of social media created this data economy, how the tech industry has attempted to regulate itself, and how the U.S. government remains woefully unprepared to address modern privacy challenges. How do we progress in a world where every detail of our days is tracked?

Social Media’s Rise and Data Monetization

In his Harvard dorm room in 2004, Mark Zuckerberg unknowingly created what would become the 21st century's most influential platform -- The Facebook. Originally a tool to connect Harvard students online, Facebook, together with Google, accounted for roughly 60 percent of the total U.S. digital advertising market in 2019. Facebook is not alone -- with its rise in popularity came a host of other companies, including Instagram (which, of course, Facebook acquired in 2012), TikTok, and Snapchat, to name a few. But social media does not have a monopoly on data collection. From streaming sites to online shopping, companies now collect and store unprecedented amounts of data gathered from a growing user base that is all too eager to share. So, what types of data are these companies keeping on you?

Surveillance Economy

A harmless survey about which Hogwarts House you belong to, a ten-year challenge where you post a picture of yourself from 2010 next to your current self, a location check-in at your favorite neighborhood restaurant. Each of these activities seems innocuous, but the data Facebook is gathering on its users is alarming. Recent reports find that it has access to some of your most private health data, and that pictures scraped from its sites are used to tune facial recognition algorithms. And social media companies, along with the companies they sell that data to, are making a fortune from it. That is because advertisers use the data to fine-tune which ads they target you with. Information about where you travel, what you are reading, medical issues you have looked up online, and many other data points is broadcast to hundreds, if not thousands, of companies that have paid for access to it.

In the short term, it may seem like a minor inconvenience, or slightly creepy, when an advertisement for a product you mentioned once follows you around the internet, but it is important to keep an eye on the bigger picture. What dangers can come from companies profiling individuals based on their online presence? Could it affect your ability to obtain a mortgage or a credit card? Health insurance?

In a worst-case scenario, this profiling could shake the very foundation of our democracy. The 2016 election saw a huge rise in political ads on Facebook that contained false or misleading claims and were coordinated across a range of social media platforms. Many of these ads were targeted at key voters in select states or regions of the country, and their impact is still a discussion point for pundits as we enter a new election season.

Regulation

The Cambridge Analytica scandal ushered in a new awareness of consumers' online privacy, and that awareness has had a ripple effect across social media. Last month, Facebook launched a new tool called "Off-Facebook Activity," which gives users insight into which organizations and advertisers benefit from Facebook's data collection. The results are unsettling: the tool reveals linkages between Facebook and your broader online behavior, and it demonstrates the limitations of self-regulation.

But one problem with the Off-Facebook Activity tool is that it amounts to Facebook policing itself -- which does not instill confidence that all tracking will be removed. Regulations like GDPR and CCPA are good starts at data protection, but will government oversight truly stem this surveillance economy? The rapid rise of Facebook and other tech startups has meant that government regulation has been slow to address these challenges. In response to past scandals, the U.S. government has pursued antitrust violations, but those efforts only demonstrate how far behind the technology curve the government is.

In a November antitrust hearing, FTC Chairman Joseph Simons said that a 100-year-old agency with outdated tools is not equipped to tackle current privacy challenges. Simons is quoted as saying, "if you want us to do more on the privacy front, then we need help from you … We've done as much as we can do with the tools we have." Additional legislation has focused on data brokers and demanded greater transparency and disclosure of data practices, while some bills would even weaken personal data protection in an attempt to penalize big tech. Clearly, data protection regulation needs to catch up to the digital age, but despite the rhetoric from the Hill, there is little sign of progress. If the Cambridge Analytica scandal taught us anything, it is that we must remain wary of the data we knowingly share and become better versed in how apps and websites use our data. Relying on the government won't offer much protection at this point in time.

We live in an age of overwhelming data, and the one question users, companies and government agencies should be asking themselves is how this data can be used to build a mutually beneficial relationship without exacerbating privacy concerns. The answers to that question will shape our privacy and security in the coming decades.

Image credit: alphaspirit / Shutterstock

Dr. Andrea Little Limbago is Chief Social Scientist, Virtru, and presenter at RSA Conference 2020. She is a computational social scientist specializing in the intersection of technology, cybersecurity and society. At Virtru she researches and publishes on the geopolitics of cybersecurity, global data protection and privacy trends, workforce development and usable security. Limbago is also the Program Director for the Emerging Technologies program at the National Security Institute at George Mason University. She was previously the Chief Social Scientist at Endgame. Prior to that, Limbago worked at the Department of Defense, where she was recognized for her analytic support and technical excellence. Limbago earned a Ph.D. in political science from the University of Colorado at Boulder.

