Get ready for another round of the crypto-wars.

A recent report in the New York Times indicates that the Justice Department has been quietly discussing with researchers ways to provide the government “backdoor” access to encrypted communications while minimizing the security vulnerability that any such access inevitably introduces. Even before the Times broke details of the ongoing discussions, impassioned encryption supporters were already assailing the government’s efforts. Perhaps most notably, the Center for Democracy and Technology’s chief technologist decried the Justice Department’s new push as “nonsensical.”

All told, the debate over encryption appears to be heating up again. This round of the crypto-wars began taking shape almost a year ago in London, where, in the wake of three terrorist attacks in a six-week span, Prime Minister Theresa May pledged to revive previously tabled legislation that would bar encrypted communications platforms such as WhatsApp from operating in their current form, or even from being offered to users on forums like Apple’s App Store. Shortly thereafter, Australian Prime Minister Malcolm Turnbull called for legislation banning end-to-end encryption. The debate then moved to Washington, where new leadership at the Justice Department and the FBI revisited one of former FBI Director Jim Comey’s favorite topics: his insistence that widely available end-to-end encryption poses intolerable challenges to law enforcement.

Encryption is a devilishly tough issue for those who care both about the government’s interest in defending its citizens from bad actors and about everyone’s interest in privacy and data security, and the news that experts inside and outside government are discussing it rigorously should be welcomed. Nonetheless, the likelihood that these discussions will yield, in the near term, backdoor access that both the government and the tech sector can accept strikes me as low. Right or wrong, tech leadership—perhaps most notably Apple CEO Tim Cook—appears to remain staunchly opposed to backdoors, given the inherent and ultimately inescapable connection between facilitating government access and introducing at least some security vulnerability. Until that changes, it may not make sense to “go big” and strive in vain to develop an acceptable backdoor—though, if that remains the conversation’s direction, its participants would be wise to heed my former DOJ colleague Alan Rozenshtein’s sage advice not to treat security as “an all-or-nothing proposition” and instead to ask what’s “secure enough.”

To my mind, however, there are less ambitious but potentially more helpful ways in which the debate over encryption can quickly become more practical and more meaningful than it has been thus far.

In both the United Kingdom and the United States, the debate has remained unhelpfully conceptual. One side—including many of the government voices—has insisted that, to paraphrase President Obama, one’s iPhone shouldn’t be different in kind from one’s underwear drawer. That is, if a duly issued warrant or other lawful process can authorize a search of an underwear drawer for whatever may be hidden there, then a court order must similarly be able to authorize a search of encrypted communications and yield meaningful results. That, in turn, demands that some actor—WhatsApp, Apple, etc.—be able to produce the contents of communications within the scope of the warrant. The other side—Cook and much of the tech industry—has noted that any backdoor offering access to such contents by definition introduces a vulnerability that can be exploited by bad actors. Consequently, industry has refused to sacrifice day-to-day user security to facilitate meaningful compliance with court orders.

Those are the terms on which Prime Minister May revived the debate last summer. But they’re conceptual terms, not practical ones, which makes them a poor basis for evaluating what really makes us safer. To get practical, consider what would happen if May’s government were to succeed in proscribing end-to-end encryption. Companies responsive to British law would either withdraw their products from the British market or redesign them to ensure that the contents of communications could be retrieved. Either way, one consequence seems likely: savvy bad actors would stop using those products, as experts such as Wired’s Kim Zetter have emphasized.

And, these days, even terrorists who aren’t savvy themselves can receive up-to-the-minute direction online from those who are about which products to use and which to avoid. So savvy bad actors presumably would switch to products not responsive to British law—such as, perhaps, the enigmatic Telegram, created by two Russian dissident brothers and reportedly built on a corporate structure designed to maximize flexibility and minimize accountability to governments. In the era of the world wide web, bad actors will always be able to download such platforms somewhere online, even if they’re banished from mainstream forums like the App Store.

Is it better or worse to push the worst actors onto such platforms? The upside for governments would come if such platforms have weaker encryption than mainstream products—but it’s not clear that they do or that, over time, they always will. The downside is that the companies offering such renegade platforms generally don’t cooperate with governments at all, whereas companies responsive to Western legal regimes generally do provide at least some non-content metadata in response to appropriate lawful process. So it’s possible that mandating backdoors would simply push bad actors further to the fringes, yielding less visibility into their activities rather than more. Whatever the right answer may be, weighing these kinds of practical considerations is necessary for a mature public debate on encryption.

Moreover, in the United States, the encryption debate remains largely stuck where the unresolved court battle over the San Bernardino iPhone left it—focused on hardware, like phones and computers. The FBI sought to crack the PIN code to “open” the shooter’s phone, rather than, for example, seeking the contents of his communications stored on a server somewhere far from the phone itself. (It’s worth reading the interesting commentary on the San Bernardino saga recently penned by my Just Security colleague Julian Sanchez.) But accessing hardware is just one element of the encryption debate, and perhaps one of diminishing importance. Today’s increasingly popular platforms, such as Signal, offer settings that automatically delete the contents of communications within five minutes of transmission—or even five seconds. What’s left to find on the phone, even if Apple could be forced to unlock it? Perhaps some non-content metadata; but, if Signal is to be believed, no content, and as little metadata as Signal has found it technologically possible to retain.

What would it matter, then, if Apple were forced by court order or legislative mandate to ensure that it can unlock iPhones? As bad actors switch increasingly to platforms like Signal, there may be nothing left to find. Acknowledging that development pushes the conversation away from hardware and “data at rest,” and toward apps and “data in motion.” And that, in turn, brings us back to the question of whether the changes governments could realistically negotiate with industry, or ultimately mandate by law, would on balance be good or bad for public safety.

These are hard questions. Let’s at least make sure we’re asking the right ones, and ones on which near-term progress toward answers may be possible. Doing so won’t bring peace to the crypto-wars; but it may, at least for now, set the right battle lines.