Turmoil in the online world has drawn headlines lately, whether it’s the shakeup at Twitter or the ongoing efforts to ban TikTok on U.S. government systems. 

As a security practitioner, I know never to let a crisis go to waste. We can harness these heightened data privacy concerns to take action that will have a much more lasting and holistic effect than merely banning one specific app. 

Today’s digital world is a modern marvel of convenience, information and entertainment. Algorithms enable each of us to easily navigate that big and sometimes messy ecosystem. At best, these algorithms are extremely useful. At worst, they are weapons of mass manipulation, causing significant harm to us, our families and our society. But good or bad, we can’t avoid them and deserve to know how they work and how they are being used.    

These algorithms don’t cause immediate, noticeable changes. Rather, they fuel relentless micro-manipulations that, over time, substantially reshape our society, politics and opinions. It doesn’t matter if you are able to resist the manipulation or if you opt out of the apps powered by these algorithms. If enough of your neighbors and friends are making these almost imperceptible changes in attitudes and behavior, your world will change, and not in ways that benefit you, but in ways that benefit the individuals who own and control the platforms. 

Finally, a move for data privacy

Data privacy activists have sounded alarms about these algorithms for years but have had little success in driving meaningful change. Now, however, there’s finally a chance to do something about the problem: a piece of federal legislation that the House Energy and Commerce Committee in the last Congress sent forward for a vote by the full House. 

The bill, known as the American Data Privacy and Protection Act (ADPPA), would, for the first time, start to hold the creators of these algorithms accountable — and require them to demonstrate that their engagement formulas are not harming the public.

I like to think of it as comparable to the Generally Accepted Accounting Principles the SEC requires of publicly traded companies. In this case, the enforcement agency would be the Federal Trade Commission.

Unfortunately, a vote on the ADPPA did not take place before the last Congress adjourned. And there’s no telling whether the new House, now controlled by a different party, will be inclined to take it up. But citizens of all political persuasions who care about data privacy should urge their lawmakers to revive the legislation or devise a new version addressing what some critics saw as its shortcomings.

As a former FBI Cyber Special Agent who now works at a cybersecurity company, I urge every cybercitizen to pay attention to this issue — and implore their lawmakers to take action.   

Why you should worry    

Common examples of the algorithms I’m referring to are the ones that generate the “you might also like” suggestions on sites like Amazon or Netflix. They seem harmless enough, but they are designed to coax us to buy more stuff or binge-watch more shows, which I suppose is okay if you have time or money to burn. 
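To make the mechanics a little more concrete, here is a minimal, hypothetical sketch of how a simple “you might also like” engine can work: items that frequently appear together in users’ histories get suggested to anyone who engaged with one of them. The item names and viewing histories below are invented for illustration, and real platforms’ recommenders are far more elaborate and opaque than this.

```python
# Toy illustration (not any real platform's system) of a co-occurrence-based
# "you might also like" recommender.
from collections import Counter
from itertools import combinations

# Hypothetical viewing histories, one set of items per user.
histories = [
    {"show_a", "show_b", "show_c"},
    {"show_a", "show_b"},
    {"show_b", "show_d"},
]

# Count how often each pair of items appears together in a user's history.
co_occurrence = Counter()
for items in histories:
    for a, b in combinations(sorted(items), 2):
        co_occurrence[(a, b)] += 1

def recommend(item, top_n=3):
    """Return the items most often watched alongside `item`."""
    scores = Counter()
    for (a, b), count in co_occurrence.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return [other for other, _ in scores.most_common(top_n)]

print(recommend("show_a"))  # e.g. ['show_b', 'show_c']
```

Even this toy version shows the core dynamic: whatever keeps people clicking gets reinforced, with no built-in check on whether the reinforced behavior is good for the user.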

But other algorithms are pernicious, like those used by some online financial institutions that have been accused of encoding racism or other biases into their loan-approval processes, and those that drive algorithmic radicalization by feeding users increasingly extreme content on topics ranging from politics to healthcare.  

Then there’s TikTok, the “free” social media app used by 80 million Americans. It is so addictive that some critics call it “digital fentanyl.” Revelations regarding TikTok’s data collection and data storage activities have also raised serious concerns. It’s unclear if the Chinese government is privy to the data that TikTok collects on its users, but national security leaders say they don’t want to wait around to find out. 

Controlling data collection

These concerns have led the U.S. Senate to unanimously approve a bill banning the app from all federally issued devices, with at least 11 states following suit by ordering similar bans on state-owned devices.

FBI Director Chris Wray also testified in November before the House Homeland Security Committee that China could potentially weaponize the app to influence or control users and their devices — setting up a virtually endless flow of information from which attackers could launch phishing or social manipulation campaigns targeted at American users.  

But with strong and clear data privacy regulation and enforcement, Americans could use social media apps like TikTok with far less fear. If we were better able to control what information was being collected, where it was being stored, with whom it was being shared, and could verify those facts, these kinds of concerns would be greatly ameliorated.

More importantly, if we could gain insight into the algorithms being used to influence users, we could set rules on what we will allow and even gain the ability to opt out of these manipulative systems.  

A crucial step toward data privacy    

The ADPPA is far from perfect, but it’s the first time in decades that the federal government has seriously attempted to protect consumers’ data privacy. Some states, notably California, already have stricter data privacy laws, and critics of the ADPPA want the bill amended so that it wouldn’t preempt states from enacting tougher protections.    

But internet data doesn’t respect state borders. And even if the ADPPA is only a first step on behalf of the entire nation’s cybercitizens, it would be a significant stride. We need a federal-level legal framework that protects everyone and avoids the pitfalls of a patchwork of uneven laws across various states.   

This bill, as drafted, is a reminder to us all: Don’t let the perfect be the enemy of the good. I’d like to see the FTC’s rulemaking powers increased and its budget expanded to accomplish the tasks outlined in the bill. In addition, we need more detail and clarity around the “private right of action” that lets consumers take legal action directly against companies for data privacy abuses.  

Data collection: a sophisticated science that can turn destructive

With that said, one of the most valuable parts of the ADPPA is that it highlights how the sophisticated science of data collection can be turned into something dangerous and destructive. Right now, we’re relying on companies to do the right thing. Many aren’t.

The ADPPA would finally create a mechanism that requires companies to certify that private data won’t be misused. And it would give every consumer the right to opt out of having their data tracked and shared with third parties.  

In the business-to-business world where I now work, everyone recognizes the value of data. So they take all sorts of measures, including legally binding contracts, to keep other businesses from exploiting it for their benefit.  

Today, consumers have little say in how their equally valuable personal data is used — and by whom — for someone else’s profit. The ADPPA would give consumers remedies that include the right, in some instances, to sue companies for abusive data practices. In addition, consumers have little visibility into powerful algorithms that underlie our current use of the internet. 

A bill like the ADPPA would provide a process to start understanding how these algorithms operate, allowing consumers to influence how they work and how they are being used.  

We, the people, need to hold algorithm creators and data collectors accountable. The ADPPA would create a much-needed foundation on which we can build a much safer and more transparent online world for all of us.  

Adam Marrè, a former FBI Cyber Special Agent, is CISO at Arctic Wolf.  
