The headquarters of the US Federal Trade Commission (FTC) in Washington, DC, November 18, 2024. (Photo by ROBERTO SCHMIDT/AFP via Getty Images)

Before Enforcing the New Foreign Data Law (PADFAA), Congress Must Fix These Five Things

When Congress enacted the Protecting Americans’ Data from Foreign Adversaries Act (PADFAA) in 2024, it did so with the right goal: preventing Americans’ sensitive personal data from reaching foreign adversaries. But the law Congress actually passed suffers from five critical drafting flaws that make responsible enforcement nearly impossible. Before the Federal Trade Commission (FTC) or any agency brings its first enforcement action, Congress must fix these problems: (1) add a knowledge requirement, (2) narrow the definition of “controlled by a foreign adversary,” (3) align the “data broker” definition with established law, (4) require DOJ consultation on enforcement, and (5) narrow the overly broad treatment of web browsing data as categorically sensitive.

These are not cosmetic edits. PADFAA as written could penalize legitimate U.S. companies for routine global operations while failing to deliver the targeted national security tool Congress intended. The law was included in a high-stakes supplemental appropriations package that also funded military aid to Israel, Ukraine, and Taiwan, and carried the TikTok divestiture measure. As a result, PADFAA never received the careful legislative scrutiny that major statutes typically require. That speed created two structural problems: substantive overbreadth that sweeps in normal commerce, and a mismatched enforcement structure that assigns national security determinations to the FTC, a consumer protection agency never designed for that role.

Kevin Moriarty has argued that the FTC’s failure to enforce PADFAA undermines the case for federal privacy legislation. But this analysis misreads the problem. The issue isn’t FTC reluctance. It’s that Congress gave enforcement authority to the wrong agency for the wrong reasons and drafted a statute with critical flaws that make responsible enforcement nearly impossible without congressional fixes first.

Congressional Turf Wars, Not National Security, Put the FTC in Charge

At first glance, it looks odd that a statute framed as a national security measure was handed to the FTC rather than the Department of Justice (DOJ) or one of the national security components of the executive branch. That was not a thoughtful, security-driven design decision. It was a jurisdictional workaround.

The House Energy and Commerce Committee (E&C) played the lead role in moving PADFAA. E&C has jurisdiction over the FTC and over consumer protection. It does not have jurisdiction over DOJ, over DOJ’s National Security Division, or over the broader national security and counterintelligence apparatus of the executive branch. If the bill had been drafted to vest primary enforcement power in DOJ, or to require DOJ to make core determinations, it would likely have been referred to the House Judiciary Committee. That would have slowed or jeopardized the package.

To keep the bill within E&C’s jurisdiction, the drafters gave enforcement authority to the FTC. That move protected the bill’s path to passage, especially because the entire supplemental package was on a tight timeline and politically sensitive. But that same move produced a predictable capability gap. The FTC is an expert consumer protection and competition regulator. It is not, and has never claimed to be, a national security agency.

The legislative record underscores just how unusual that process was. The official House Energy and Commerce Committee Report (H. Rept. 118-418) on PADFAA spans only a few pages and contains almost no substantive policy discussion. The report’s “Purpose and Summary” section merely restates that the bill would prohibit data brokers from transferring sensitive data of U.S. individuals to foreign adversaries. The “Background and Need for Legislation” cites only a few news stories and a decade-old FTC study, while offering no analysis of definitions, scope, or the rationale for giving enforcement to the FTC instead of the Department of Justice. There is no record of a classified briefing, no legislative findings about national security risks, and no committee report language explaining how key terms like “controlled by a foreign adversary” or “data broker” should be interpreted. For a statute carrying significant civil penalties and broad national security implications, the absence of such analysis is striking. PADFAA moved through the committee and the House floor with extraordinary speed, reflecting political urgency, not policy deliberation.

That mismatch matters. PADFAA prohibits certain transfers of “personally identifiable sensitive data” about U.S. individuals to “foreign adversary countries,” or to private entities that are “controlled by a foreign adversary.” Violations can trigger penalties of more than $50,000 per incident. But applying those standards in the real world requires answers to national security questions: Which foreign entities are acting as cutouts for adversary governments? Which corporate structures are effectively state influenced? Who exercises indirect control? What is the adversary’s access model?

Those are determinations that usually rely on classified reporting, interagency intelligence analysis, and visibility into mergers, investments, and ownership structures across borders. The FTC simply does not have routine access to that type of material, or to the infrastructure needed to handle it, such as sensitive compartmented information facilities (SCIFs). It has few staff with active security clearances. It has no independent authority to compel intelligence agencies to share classified threat information for civil enforcement purposes. And it has no organic capacity to do adversarial ownership tracing in the same way Treasury’s Office of Foreign Assets Control or DOJ’s National Security Division can.

The Senate never had the opportunity to consider PADFAA on its own terms. The measure was attached to the supplemental appropriations package. Senators were presented with a binary choice: accept the entire package or risk derailing urgently needed national security funding. It was understandable that the Senate chose not to hold up the supplemental over a data provision. But the consequence was that PADFAA became law without the Senate holding committee hearings, without Senators receiving expert testimony, and without the kind of inter-chamber reconciliation that would normally refine definitions and enforcement structure. The Senate’s procedural constraint, combined with the House’s jurisdictional shortcut, produced a statute that now asks a consumer protection agency to manage a national security regime.

Two Frameworks, Zero Coordination

This jurisdictional workaround didn’t just create an institutional mismatch; it produced direct conflicts with national security frameworks already in place. The Department of Justice and the Department of Commerce have already built a structured, interagency process to manage the very risks PADFAA was intended to address. That framework, developed under Executive Order 14117 and formalized in the Department of Justice’s Bulk Data Regulations, offers a useful contrast.

That framework defines covered “bulk data” transactions through clear thresholds and categories, focusing on data sets large enough to enable strategic exploitation, such as geolocation, health, genomic, and biometric data. It also requires interagency consultation, intelligence-informed risk assessments, and case-by-case review before designating a transaction or sector as restricted. Importantly, DOJ must coordinate with Commerce, Treasury, State, and the intelligence community, allowing for a comprehensive threat evaluation before enforcement.

By contrast, PADFAA hands the Federal Trade Commission, a civil consumer protection regulator, sole authority to interpret and enforce prohibitions on transfers of “personally identifiable sensitive data” to “foreign adversaries.” There is no requirement for interagency consultation, no classified input, and no formal mechanism to resolve conflicts with the DOJ and Commerce framework. The FTC may issue civil investigative demands and seek penalties, but it lacks the analytic or procedural infrastructure to make national security determinations.

The differences are not academic. They create direct conflicts in law and practice. For instance, DOJ’s Bulk Data Rule defines foreign “control” using a 50 percent ownership threshold, while PADFAA applies a 20 percent standard, meaning the same entity could be legal under DOJ’s rule but unlawful under FTC enforcement. DOJ’s framework includes licensing exceptions, mitigation measures, and safe harbors for legitimate transactions; PADFAA has none. DOJ’s process ties enforcement to risk-based categories, while PADFAA applies a flat prohibition to any “personally identifiable sensitive data,” regardless of scale or context.

The result is a two-track system that can produce contradictory outcomes. A U.S. company compliant under DOJ and Commerce rules could still be exposed to massive FTC penalties. A retailer that shares loyalty card data with a European analytics partner partially owned by a Chinese investor, for example, could be deemed “controlled by a foreign adversary” under PADFAA but not under DOJ’s test. A health app that uses a third-party software development kit (a code package that adds features like analytics or advertising) cleared under Commerce’s risk review could nonetheless face FTC scrutiny. Even routine logistics tracking data, protected under DOJ’s licensing process, could trigger liability under PADFAA.

These inconsistencies create real compliance paralysis. Companies cannot reconcile two systems that define risk differently and delegate enforcement to agencies with opposing mandates. DOJ and Commerce rely on intelligence-driven review and coordinated oversight; the FTC operates through notice, subpoena, and civil penalty. Congress’s decision to split these roles was a procedural workaround, not a policy plan, and it has yielded a fragmented and contradictory enforcement regime.

Why the FTC Can’t Just Enforce

These contradictions aren’t just bureaucratic inconveniences. They reveal why the FTC cannot simply begin enforcing PADFAA without congressional fixes. Others have argued that the FTC already has the tools to enforce PADFAA and that its failure to announce early cases is troubling, highlighting several prior FTC matters involving data transfers or alleged transfers to companies in China.

It is true that the FTC has an extensive privacy and data security record. The Commission has, for example, brought cases against firms that allegedly allowed sensitive data to flow to China-based analytics providers, including in the digital health space. It has brought cases against mobile device makers that embedded undeclared data collection software. It has brought cases against location data brokers that sold precise geolocation data revealing visits to places such as health clinics, places of worship, and shelters.

But PADFAA is not simply a privacy-oriented unfairness or deception statute. Those traditional FTC cases typically hinge on consumer notice, consent, transparency, retention, or misuse. PADFAA is different. It is explicitly framed as a national security control on data flows to certain countries and certain entities. That distinction matters for at least three reasons.

First, the factual predicates are different. To bring a PADFAA case, the FTC would need to determine not only that data was transferred, but that the recipient was “controlled by a foreign adversary” or located in a “foreign adversary country.” That requires national security determinations, and in some cases intelligence reporting, that the FTC does not generate and is not guaranteed to receive.

Second, the penalty posture is different. PADFAA authorizes significant civil penalties, and it links those penalties to findings that look, in substance, like sensitive national security designations. If the FTC were to bring an aggressive early case against a global company that later turned out not to fall within the statutory definition of “controlled by a foreign adversary,” that would not just be a litigation loss. It could be an international diplomatic event.

Third, the resource posture is different. The FTC is already carrying an expanded merger enforcement program, a broader AI and commercial surveillance program, and multiple rulemakings. The Commission’s total headcount is well under 1,200 personnel, a fraction of the more than 115,000 employees at the Department of Justice, which has dedicated national security divisions, cleared personnel, and intelligence infrastructure. It has been asked to “do more with less” for years. Congress did not stand up a new cleared unit inside the FTC or provide a surge of national security personnel when it assigned PADFAA to the Commission. The expectation that the FTC can absorb a sensitive, high-stakes national security enforcement program on top of its existing load is detached from budget reality.

Seen in that light, the FTC’s public caution is not dereliction. It is prudence. The contrast with the Department of Justice and Commerce Department’s joint rulemaking under Executive Order 14117 is telling. That process, which produced the Bulk Data Regulations, involved months of notice-and-comment rulemaking, interagency consultation, and technical briefings with cleared experts from the intelligence community. The FTC, by contrast, was given enforcement power without any corresponding procedural infrastructure. Its staff were not granted new security clearances or liaison authorities, and there was no interagency working group to align PADFAA with the Bulk Data Regulations. The result is two overlapping national-security data regimes that define key terms differently and risk placing companies in impossible compliance conflicts.

Five Amendments Congress Must Make Before Enforcement

Even if the FTC had infinite capacity, PADFAA as written would still need repair before responsible enforcement. The statute suffers from at least five core drafting problems. Fixing them would preserve national security protections while reducing collateral damage to legitimate commerce. Most troubling, PADFAA directly conflicts with the Department of Justice’s Bulk Data Regulations in several places, creating the possibility that a company could comply with one framework and still violate the other.

I. Knowledge Requirement

The current text does not clearly require that a company know it is transferring data to a prohibited recipient. A data broker could be penalized for transferring data to an entity that is later determined, through some opaque national security analysis the company never saw, to be “controlled by a foreign adversary.” This exposes companies to strict liability for ownership structures they cannot possibly map with certainty.

The proposed fix is straightforward. Congress should amend the statute so that it is unlawful for a data broker to knowingly sell, license, rent, trade, transfer, release, disclose, provide access to, or otherwise make available personally identifiable sensitive data of a United States individual to either a foreign adversary country or any entity that the data broker knows is controlled by a foreign adversary.

A knowledge standard would focus enforcement on willful or reckless conduct. It would also align PADFAA with other FTC-enforced privacy laws, such as the Children’s Online Privacy Protection Act, which ties liability to knowing collection or disclosure.

II. “Controlled by a Foreign Adversary”

PADFAA currently defines “controlled by a foreign adversary” so broadly that it could reach ordinary global business structures. For example, an entity can be deemed controlled if a person from a “foreign adversary country” directly or indirectly owns as little as 20 percent. In practice, many U.S. companies with operations in China, or with Chinese national employees who have access to certain internal systems, could be characterized as “controlled” under a literal reading.

This definition is not administrable. It asks private actors to make national security status determinations about their vendors, advertising partners, analytics providers, logistics networks, franchisees, or affiliates, and to do that across borders. It also creates serious overbreadth problems. Normal coordination between a U.S. parent company and its affiliated business in China could be reinterpreted, after the fact, as a PADFAA violation carrying massive per-violation penalties. Consider a U.S. cloud provider that licenses its software to a European analytics firm partly owned by a sovereign wealth fund in an adversary country. Under DOJ’s Bulk Data Regulations, that transaction would be permitted because it does not involve a bulk transfer of sensitive data. Under PADFAA’s current text, however, the same transaction could trigger penalties simply because of minority foreign ownership, even when no national-security risk exists.

Congress should adopt one of two alternative approaches to restore clarity. One approach would require a presidential determination, accompanied by notice to Congress, that a particular entity presents a significant national security threat before that entity is treated as “controlled by a foreign adversary.” Another approach would align the definition with existing federal lists and designations, such as the Treasury Department’s Specially Designated Nationals list, the Commerce Department’s Entity List, the Federal Communications Commission’s covered list, or the Defense Department’s lists of Chinese military companies.

Either approach would give companies an objective way to understand who counts as a prohibited counterparty. It would also ensure that PADFAA remains tied to clearly articulated national security concerns rather than general suspicion of any company with a China footprint.

III. Data Broker

PADFAA defines “data broker” in a way that departs from settled practice. The statute currently sweeps in any entity that “provides access” to data, even if that entity is a service provider or has a direct relationship with the consumer. By contrast, state laws in California, Vermont, and Oregon define a data broker as a business that sells or trades personal information about a consumer with whom the business does not have a direct relationship.

Why does this matter? Because without a narrower definition, PADFAA could unintentionally cover retailers, restaurants, franchisors, logistics providers, and advertisers whose ordinary operations involve sharing consumer data with service providers or affiliates. Those are not the “shady data brokers” that lawmakers routinely describe when they talk about Americans’ data being sold on open markets. Those are mainstream businesses that are trying to comply with U.S. law while operating globally. The divergence between PADFAA’s and DOJ’s definitions compounds the confusion. DOJ’s rule applies to large-scale bulk data sets that could be exploited for intelligence purposes, while PADFAA extends to individual records and routine consumer transactions. A company could therefore follow DOJ’s framework and still be exposed to liability under PADFAA for the same conduct.

The fix is to bring PADFAA’s “data broker” definition into line with state law and common understanding of the term. Congress should amend the statute to clarify that a data broker is an entity that knowingly collects and sells to third parties the personal information of a consumer with whom the business does not have a direct relationship.

IV. Department of Justice Consultation

Congress should correct the procedural design flaw that resulted from the House jurisdiction issue. The ideal solution would be to transfer primary PADFAA enforcement authority to the Department of Justice, which already administers the Bulk Data Regulations and has the national security infrastructure, classified intelligence access, and interagency coordination mechanisms necessary to make “controlled by a foreign adversary” determinations.

Short of that transfer, Congress should at minimum require the FTC to consult with the Department of Justice on allegations, evidence, and proposed enforcement related to PADFAA, and give DOJ the opportunity to review proposed regulations or guidance to ensure consistency with national security goals.

This is not cosmetic. It is how you inject national security expertise back into a statute that was structurally diverted from DOJ for jurisdictional reasons. Requiring DOJ consultation would not sideline the FTC. Instead, it would create an accountability and validation loop that protects both agencies. The FTC would no longer be forced to make sensitive designations alone. DOJ would have insight into, and some responsibility for, the national security posture of PADFAA enforcement.

V. Web Browsing Data as “Sensitive” Information

PADFAA also goes further than any previous federal law by classifying ordinary web browsing history as “personally identifiable sensitive data.” That is not a carefully considered policy choice; it is an overreach born of haste. For the first time, a federal statute places routine online activity such as visiting news sites, shopping platforms, or social media pages on the same legal footing as medical, biometric, or financial records.

No existing federal privacy or security framework treats web browsing information as categorically sensitive. The Federal Trade Commission’s own data security precedents, the Gramm-Leach-Bliley Act, and the Health Insurance Portability and Accountability Act all distinguish between data that is inherently sensitive and data that becomes sensitive only through context. Even the most far-reaching state laws, like the California Consumer Privacy Act, treat browsing data as sensitive only when it reveals particular characteristics such as health, religion, or sexual orientation. The European Union’s General Data Protection Regulation (GDPR) takes the same approach, requiring heightened protection only for such special categories of information. PADFAA, by contrast, treats all browsing data as sensitive without qualification or threshold.

That overbreadth would make compliance nearly impossible for any company with a digital footprint. Every advertising exchange, analytics provider, and e-commerce platform necessarily processes some form of browsing information. Under PADFAA’s current language, even basic functions like displaying an ad impression or logging website traffic could be construed as handling “personally identifiable sensitive data.” If any downstream data recipient is later found to have a foreign minority investor, that normal commercial activity could trigger severe penalties.

This is not a rational or risk-based approach to national security. By sweeping ordinary web traffic into a national security statute, Congress has blurred the line between legitimate privacy protection and unwarranted regulation of the internet economy. If left unchanged, this definition will generate confusion, deter investment, and invite selective enforcement untethered from any real security concern. Congress should narrow the definition to cover only web browsing data that reveals genuinely sensitive categories of information or is aggregated at a scale that poses a clear intelligence risk.

Conclusion

PADFAA was enacted with the right intent but the wrong architecture. Congress must adopt five targeted amendments before enforcement begins: adding a knowledge requirement, narrowing “controlled by a foreign adversary” to align with federal designation lists or requiring presidential determinations, conforming the “data broker” definition to state law standards, requiring FTC-DOJ consultation on all enforcement decisions and regulatory guidance, and limiting web browsing data protections to genuinely sensitive categories or intelligence-scale aggregation. Ideally, Congress should transfer primary enforcement to DOJ, which already administers the Bulk Data Regulations with proper national security infrastructure.

In parallel, the FTC and DOJ should issue joint interim guidance before bringing any enforcement actions. That guidance should clarify how “controlled by a foreign adversary” will be interpreted, what evidence will be required before a case is filed, and what safe harbors will apply for companies making good-faith compliance efforts. Joint guidance would reduce uncertainty for legitimate U.S. businesses while signaling to actual bad actors that the government intends to act decisively and coherently. Given that neither agency has issued PADFAA enforcement guidance despite the statute taking effect in June 2024, this coordinated approach is both urgent and overdue.

Protecting Americans’ data from foreign adversaries is too important to leave to a flawed statute. Congress does not need to revisit the core question that animated PADFAA; there is bipartisan agreement that Americans’ sensitive data should not be for sale to hostile governments or their proxies. What Congress does need to do is adopt these targeted statutory corrections before enforcement creates precedent that becomes difficult to reverse. The necessary amendments are straightforward and could be enacted quickly.

Amend first, then enforce. That is the responsible path for both national security and the legitimacy of federal privacy enforcement going forward.
