This post is the latest installment of our “Monday Reflections” feature, in which a different Just Security editor examines the big stories from the previous week or looks ahead to key developments on the horizon.

With the advent of GPS in the late 1990s, the US Naval Academy stopped teaching midshipmen to navigate by the stars. Last year, cybersecurity concerns prompted the Academy to resume teaching midshipmen celestial navigation. As one Academy official explained, “We went away from celestial navigation because computers are great. … The problem is, there’s no backup.” This is just one example of how entities and individuals are beginning to “give up” on cybersecurity.

In a forthcoming essay, I argue that “giving up” on cybersecurity captures instances where cybersecurity concerns prompt strategic retreats from digital dependence. These are cases where individuals, businesses, and governments assume that efforts to secure digital information and networked devices will fail. They then adapt to that scenario by either (1) adopting low-tech redundancies for high-tech capabilities or digital information, or (2) engaging in technological regression or arrest, choosing to forego capabilities that technology could provide because of concerns about cybersecurity risks. Scattered examples of giving up have already occurred, and they will likely become increasingly frequent in the coming years.

Low-Tech Redundancy

Low-tech redundancy involves deliberate decisions to retain low-tech or no-tech versions of capabilities or non-digital versions of content. It assumes that cybersecurity measures will fail and that digital files or technological capabilities will be rendered inaccessible, inoperable, or untrustworthy. When that occurs, the low-tech alternatives function as a failsafe, allowing continued operations and perhaps restoration of high-tech capabilities.

In addition to the celestial navigation example above, consider another example of low-tech redundancy. In the wake of the 2000 presidential election with its “hanging chads,” many jurisdictions purchased electronic voting machines. But concerns about the machines’ susceptibility to hacks have prompted more recent requirements that electronic voting machines produce a paper record of votes cast.

Low-tech redundancy may also be prudent to guard against the rising threat of ransomware — malicious software that encrypts an individual’s or entity’s files, rendering them inaccessible unless a ransom is paid. Recent ransomware attacks have forced hospitals in California and Washington, D.C. to revert to hard copies of medical records while their electronic systems were inaccessible.

Technological Regression or Arrest

The impetus for technological regression and arrest is the assumption that cybersecurity measures will fail and that the implications of such failure are sufficiently dire to justify foregoing a technological capability entirely. Technological regression occurs when the security implications are recognized only after the technology has been developed or deployed; technological arrest occurs when the security concerns are appreciated ex ante.

A 2013 “60 Minutes” interview with former Vice President Dick Cheney revealed one example of technological regression. Cheney confirmed that “his doctor ordered the wireless functionality of his heart implant disabled due to fears it might be hacked in an assassination attempt.”

Revelations about US government surveillance of foreign targets prompted other examples of technological regression. Media reports indicate that Russia’s Federal Guard Service ordered typewriters in an attempt to keep communications from being surveilled, and German parliamentarians apparently considered taking a similar measure.

Some of Apple’s recent briefing in the San Bernardino iPhone case made an argument for technological arrest. Among its many objections to the order requiring it to assist the government in accessing the shooter’s iPhone, Apple argued that the order would require “Apple to design, create, test, and validate a new operating system that does not exist, and that Apple believes—with overwhelming support from the technology community and security experts—is too dangerous to create” (p. 16). Apple also pointed to the risk that the code could be leaked or stolen by hackers to support its refusal to write the code in the first place (pp. 19–20).

Giving Up Strategically

The desire for efficiency, convenience, and new capabilities often pushes society to network first and consider security later. Government officials and regulatory bodies aren’t immune from this impulse. Consider the rush to deploy electronic voting machines, and the regulations pushing adoption of electronic medical records that are now subject to hacking and ransomware.

As I explore more fully in the essay, however, law and government may be able to play a constructive role in promoting giving up on cybersecurity strategically. Government actors can explicitly consider adopting low-tech redundancies for governmental capabilities or incorporate consideration of technological regression or arrest into risk assessments. And they can encourage non-governmental entities to do the same.