This year is turning out to be a banner one for flawed proposals that would allow businesses to share information about Americans’ online activity with the Department of Homeland Security (DHS) in the name of cybersecurity. First came the White House plan in January, then the Cybersecurity Information Sharing Act (CISA) — which passed the Senate Intelligence Committee on a 14-1 vote earlier this month — and on Tuesday, the House introduced the Protecting Cyber Networks Act.

Each of these three proposals throws industry a bone by waiving liability for violating even our very inadequate privacy rules. None of the three narrowly and specifically identifies the categories of information that Congress wants to allow to be shared notwithstanding those privacy rules. As Sen. Ron Wyden (D-Ore.) aptly put it following his no vote on CISA, it’s “not a cybersecurity bill — it’s a surveillance bill by another name.” Don’t we have enough domestic surveillance already?

Information sharing on its own isn’t going to solve the network security problem. But it is a valuable tool, and a relatively easy starting point for improving digital security. I’ve spent my career advocating against rules that inhibit security information sharing, and even wrote a law review article about it in 2005. Pretty much everyone agrees that vulnerability information sharing is a good idea. So what does Congress need to do to get it right?

Good information sharing

First, we need to be clear about what types of information we are talking about sharing in the name of enhanced security. We are talking about sharing vulnerability information: software flaws, virus signatures, threat signatures — the things system administrators need to know to check their systems for, and protect them against, attacks that others have identified or suffered.

There are real-world examples of vulnerability information, the kind we want to share more of: bulletins explaining how to tell whether a machine is infected with a particular malicious keystroke logger, or how to detect an exploit kit infection that has been spreading through websites. Congress should take a close look at these and other bulletins. You’ll note that there is no personally identifying information in the notifications. Nor is there information protected by privacy laws like the Wiretap Act, the Electronic Communications Privacy Act (ECPA), the Family Educational Rights and Privacy Act (FERPA), or any other.
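To make concrete how indicator-only these bulletins are, here is a minimal sketch, in Python, of how an administrator might use the kind of data they contain. Everything in it is hypothetical: the hash and domain values are placeholders and the scanned path is invented for illustration; nothing is drawn from an actual bulletin.

```python
# Illustrative sketch only. The indicator values below are hypothetical
# placeholders, not drawn from any real bulletin.
import hashlib
from pathlib import Path

# The kind of data a threat bulletin shares: hashes of known malware samples
# and domains used to distribute them. No names, no message contents, no
# account records.
KNOWN_BAD_SHA256 = {
    "0" * 64,  # placeholder hash of a hypothetical malware sample
}
KNOWN_BAD_DOMAINS = {
    "malicious-example.invalid",  # placeholder malware-distribution domain
}

def sha256_of(path: Path) -> str:
    """Hash a file so it can be compared against shared indicators."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_directory(directory: Path) -> list[Path]:
    """Return files whose hashes match the shared malware indicators."""
    return [
        candidate
        for candidate in directory.rglob("*")
        if candidate.is_file() and sha256_of(candidate) in KNOWN_BAD_SHA256
    ]

def flag_bad_domains(dns_log_lines: list[str]) -> list[str]:
    """Return DNS log lines that reference a domain from the indicator list."""
    return [
        line
        for line in dns_log_lines
        if any(domain in line for domain in KNOWN_BAD_DOMAINS)
    ]

if __name__ == "__main__":
    for hit in scan_directory(Path("/srv/uploads")):  # hypothetical path
        print(f"possible infection: {hit}")
```

Nothing in that workflow touches the content of anyone’s communications; it compares file hashes and domain names against a list the bulletin’s author chose to publish.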

(It’s worth pointing out that all members of the panel on information sharing at the White House Cyber Summit held at Stanford University in February seemed to agree that, as a general rule, the type of information we need to share to mitigate these vulnerabilities does not include private data.)

Threat signatures sometimes do, as with the exploit kit example, list domains and IP addresses used to host the malware. That information helps potential victims block malicious incoming connections. But those IP addresses are not protected from sharing by ECPA. ECPA doesn’t stop IP addresses from being shared with private entities, and it only (lightly) protects the IP addresses of subscribers to, or customers of, certain publicly offered services. The malware delivery server will very rarely fit that definition. I’m not saying there will never be situations where something that fits the definition of vulnerability information is legally protected from voluntary sharing. I’m saying that is — by far — the exception and not the rule.
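Acting on those indicators is equally mechanical and equally free of subscriber data. Here is a minimal sketch, again with entirely hypothetical addresses (they come from the RFC 5737 documentation ranges), of turning a shared list of malware-hosting IPs into inbound firewall drop rules:

```python
# Illustrative sketch only. The addresses below are from the RFC 5737
# documentation ranges, not real indicators.
import ipaddress
import subprocess

SHARED_MALICIOUS_IPS = [
    "192.0.2.10",     # placeholder malware-hosting address
    "198.51.100.23",  # placeholder malware-hosting address
]

def block_inbound(ip: str, dry_run: bool = True) -> None:
    """Drop inbound traffic from a malware-hosting address using iptables."""
    ipaddress.ip_address(ip)  # validate before touching the firewall
    command = ["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"]
    if dry_run:
        print(" ".join(command))             # show the rule that would be added
    else:
        subprocess.run(command, check=True)  # apply it (requires root)

if __name__ == "__main__":
    for indicator in SHARED_MALICIOUS_IPS:
        block_inbound(indicator)  # set dry_run=False to actually apply rules
```

The point of the sketch is what it doesn’t need: no subscriber records and no communications content, just the addresses the attacker happens to be using.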

Bad information sharing

Now let’s be clear about what we are NOT talking about legislating. We are not talking about sharing evidence that my machines were attacked, evidence about who may have attacked me, or giving the government access to my network to help me catch the attacker. We already have laws regulating how victims can report crimes and how the government can investigate those crimes. Those laws are called ECPA, the Wiretap Act, Rule 41, and the Fourth Amendment. They are replete with rules about when legal process is and is not needed for investigating particular threats and specific attacks, and they already include provisions for sharing in emergencies, for protecting the rights and property of the provider, and more. We don’t need to waive those rules to promote digital security. In fact, the opposite is true: the rules need to be stronger to protect online privacy and security, and absolutely not weaker.

What’s holding up good information sharing?

So why does the government keep pressing ahead with these information sharing proposals? It is probably true that entities with useful vulnerability information are not sharing it frequently enough with the government. And when the government asks them why they don’t share, they say, “because we’d like liability protection.” Because what even slightly regulated corporation doesn’t want liability protection?

The problem with this logic is that there are no laws that would create liability for sharing most of the kinds of data we want to make more widely available.

My guess is that the real reason we aren’t seeing more robust sharing with DHS is that some commercial sectors don’t see that it is worth their while to share with the government. I’ve been told that the government doesn’t share back. Silicon Valley engineers have wondered aloud what value the Department of Homeland Security has to offer them in their efforts to secure their employers’ services. It’s not like DHS is setting a great security example for anyone to follow. And there’s a very serious trust issue: any company has to think at least twice about sharing how it is vulnerable with a government that hoards security vulnerabilities and exploits them to conduct massive surveillance.

Meanwhile, information sharing is happening … just not as much with DHS as DHS wants. Companies are sharing vulnerability data with each other in all kinds of ways, both commercial and voluntary. More vulnerability information sharing would be a good thing. But we need not sacrifice what little privacy we have on the altar of government involvement. Congress should reject all three of these proposals and go back to the drawing board: what, exactly, do you want private parties and commercial entities to share with DHS, and what security benefits, exactly, can DHS offer the public in exchange for this data? Instead of legislation that would mitigate a basically non-existent liability risk, Congress should think about what the benefit of sharing might be. We can both share security information and protect privacy.

A version of this post also appears at the Center for Internet and Society at Stanford Law.