2024-24582. Provisions Pertaining to Preventing Access to U.S. Sensitive Personal Data and Government-Related Data by Countries of Concern or Covered Persons  

  • Table VII-1—Selected Data Broker Revenue and Employee Figures

    Data broker Total revenue U.S. revenue Foreign revenue Employees
    Acxiom (2018) a $917.4 million $834.6 million $82.8 million 3,380
    LexisNexis (2021) b $974.3 million n/a c n/a c 10,200
    Oracle America (2023) d $50 billion $31 billion $19 billion g  164,000
    Equifax (2023) e $5.3 billion $4.1 billion $1.2 billion 14,900
    Experian (2022) f $6.6 billion $4.4 billion $2.2 billion 22,000
    a  Acxiom LLC 2018 Annual Report, AnnualReports.com (2018), https://www.annualreports.com/​HostedData/​AnnualReports/​PDF/​NASDAQ_​ACXM_​2018.pdf [ https://perma.cc/​6BVA-DQS5].
    b  Latka, How LexisNexis Hit $974.3M Revenue with a 10.2K Person Team in 2021, SaaS Database, https://getlatka.com/​companies/​lexisnexis [ https://perma.cc/​M4DM-HAC9].
    c  LexisNexis is owned by RELX and is consolidated into RELX's annual report, which therefore does not report separate domestic and foreign revenue figures for LexisNexis.
    d  Oracle, Oracle Announces Fiscal 2023 Fourth Quarter and Fiscal Full Year Financial Results (June 12, 2023), https://investor.oracle.com/investor-news/news-details/2023/Oracle-Announces-Fiscal-2023-Fourth-Quarter-and-Fiscal-Full-Year-Financial-Results/default.aspx [ https://perma.cc/DL8Y-H2VM]. The U.S. Revenue entry of $31 billion is for "the Americas." The Foreign Revenue entry of $19 billion is for "Europe/Middle East/Africa" and "Asia/Pacific." Oracle, Culture and Inclusion Empowers Diversity, https://www.oracle.com/careers/culture-inclusion/best-practices/ [ https://perma.cc/3M2B-GQ7F].
    e  Equifax Inc., 2023 Annual Report (2024), https://investor.equifax.com/​sec-filings/​annual-reports##document-3666-0001308179-24-000246-2 [ https://perma.cc/​WU9A-NHZ2].
    f  Experian, Annual Report 2023 (2023), https://www.experianplc.com/​content/​dam/​marketing/​global/​plc/​en/​assets/​documents/​reports/​2023/​annual-report/​experian_​annual_​report_​2023_​web.pdf [ https://perma.cc/​7QRT-GN3T].
    g  Oracle Corp., Annual Report (Form 10-K) (June 20, 2023), https://www.sec.gov/​Archives/​edgar/​data/​1341439/​000095017023028914/​orcl-20230531.htm [ https://perma.cc/​4ADX-R6EJ].

    iii. Products Sold by Data Brokers

    Data brokers often collect data regarding, for example, where the average person goes, where they shop, and what they search for online.[442] Notably, researchers from Duke University who used a secret shopper approach were offered access to thousands of records of military personnel and military veterans' data containing names, addresses, emails, phone numbers, military agency or branch, medical ailments, political affiliations, religion, gender, age, income, credit rating, and even details on children in the household.[443] Not all brokers sell the same data, with many targeting niche industries or markets to help them gain a competitive advantage. Brokers also trade and combine their data with primary collectors to create detailed profiles they can package and commercialize.[444]

    iv. Price Information

    Depending on its type and volume, personal data can be purchased for prices ranging from less than $1 for one personal record to millions of dollars for a large dataset. In a secret shopper study, Duke University researchers found that they could purchase a single record for as little as $0.12 and spend upwards of $10,000 for approximately 50,000 records of service members and military veterans. The price did not noticeably vary based on the data subjects' IP location (United States vs. Singapore). The Duke University researchers found that if one broker could not sell the information to them, another one could. There are estimates that mental health datasets could range between $15,000 and $100,000 and may be sold for even higher prices if the datasets include more detailed demographic data.[445]

    v. Customers of Data-Brokerage Products

    Data brokers are known to sell datasets both domestically and internationally; however, specific transactions with particular purchasers are difficult to ascertain from available financial reports. The U.S. Bureau of Economic Analysis (“BEA”) faces significant limitations in estimating the size of the domestic and international markets because BEA data does not break out data brokerage as a separate industry. Nevertheless, based on sample financial data for the data-brokerage firms listed in Table VII-1 of this preamble, the Department estimates that the U.S. market produces over 60 percent of data broker revenue.

    c. Agreements Affected by the Proposed Regulation

    Due to the scope and nature of these agreements, it is difficult to determine even an approximate number of affected vendor agreements, employment agreements, and investment agreements entered into by U.S. persons in any given year. Each of these three types of agreements is considered a restricted transaction if it involves access to government-related data or bulk U.S. sensitive personal data. The Department welcomes comments that provide a source for the annual number of vendor agreements, employment agreements, and investment agreements that might be affected by the regulation.

    i. Vendor Agreements

    According to the proposed rule, a vendor agreement is defined as “any agreement or arrangement, other than an employment agreement, in which any person provides goods or services to another person, including cloud-computing services, in exchange for payment or other consideration.” See § 202.258(a). A potential example of a vendor agreement covered by the proposed rule is a medical facility in the United States that contracts with a company headquartered in a country of concern to provide information technology (“IT”) related services. The medical facility has bulk personal health data on its U.S. patients, and the IT services provided under the contract involve access to the medical facility's systems containing that bulk personal health data. ( See Example 2 in § 202.258(b)). The NPRM also discusses additional examples of vendor agreements pertaining to technology services and data storage. ( See Examples 3 and 4 in § 202.258(b)).

    The costs of compliance with the security requirements will vary. U.S. persons with vendor agreements within the scope of the proposed rule may face costs associated with either replacing a vendor located in a country of concern or spending more on compliance (e.g., implementing the security requirements) to maintain those vendor agreements. Some U.S. companies may choose to terminate vendor agreements altogether rather than bear the cost of complying with the security requirements. In contrast, most Fortune 500 companies and companies in sectors already subject to cybersecurity regulations have cybersecurity controls in place and might need only minor modifications to their existing vendor agreements and data security controls, while companies with less mature cybersecurity programs may require more significant changes.

    ii. Employment Agreements

    This NPRM defines an employment agreement as “any agreement or arrangement in which an individual, other than as an independent contractor, performs work or performs job functions directly for a person in exchange for payment or other consideration, including employment on a board or committee, executive-level arrangements or services, and employment services at an operational level.” See § 202.217(a). A potential example of an employment agreement is a U.S. company that employs a team of individuals who are citizens of and primarily reside in a country of concern and have access to back-end IT services and company systems that contain bulk human genomic data (see Example 1 in § 202.217(b)). Similarly, the employment of a lead project manager or a CEO of a U.S. company who primarily resides in a country of concern and who has access to bulk U.S. sensitive personal data would be considered a restricted transaction (see Examples 2 and 3 in § 202.217(b)).

    Any employment agreements involving government-related data or bulk U.S. sensitive personal data between U.S. persons and countries of concern or covered persons would need to comply with security requirements. The cost of security and due diligence requirements may drive some companies to cease employment agreements with these covered persons, while other companies may incur costs to ensure compliance or even implement job transfers to eliminate the potential cost of compliance with the proposed regulation. Ultimately, employment agreements may incur larger upfront costs when the proposed regulation first takes effect; these costs should diminish over time as the initial market disruptions settle, the costs associated with job transfers decline, and firms learn how to operate in the changed environment.

    iii. Investment Agreements

    This NPRM defines an investment agreement as “an agreement or arrangement in which any person, in exchange for payment or other consideration, obtains direct or indirect ownership interests in or rights in relation to (1) real estate located in the United States or (2) a U.S. legal entity.” See § 202.228(a). An example is a U.S. company that intends to build a data center located in a U.S. territory to store bulk personal health data on U.S. persons, where a foreign private equity fund located in a country of concern agrees to provide capital for the construction of the data center in exchange for acquiring a majority ownership stake in it (see Example 1 in § 202.227(c)). Ultimately, investment agreements may incur larger upfront costs when the proposed regulation first takes effect; these costs should diminish over time.

    iv. Security Requirements

    The proposed rule authorizes three classes of otherwise prohibited transactions (vendor agreements, employment agreements, and investment agreements) if they meet the security requirements proposed by CISA. The goal of the proposed security requirements is to address national security and foreign-policy threats that arise when countries of concern and covered persons access government-related data or bulk U.S. sensitive personal data that may be implicated by the categories of restricted transactions. The proposed security requirements (incorporated by reference in § 202.402 of this NPRM) have been developed by DHS through CISA, which has published the proposed requirements on its website, as announced via a Federal Register request for comments on the proposed security requirements, issued concurrently with this proposed rule. After CISA receives and considers public input, it will revise as appropriate and publish the security requirements.

    Regarding investment agreements, as described in § 202.228 and § 202.508, the proposed rule would treat investment agreements entered into by U.S. persons with countries of concern or covered persons as restricted transactions even if they are also covered transactions subject to CFIUS review, unless and until CFIUS issues an interim order, enters into a mitigation agreement, or imposes a condition with respect to a particular covered transaction.[446] As a result, any investment agreement that is both a restricted transaction under the proposed rule and a covered transaction subject to CFIUS review would be subject to the security requirements under the proposed rule unless and until the transaction is filed with CFIUS and CFIUS takes a “CFIUS action,” as defined in the proposed rule, by entering into a mitigation agreement or imposing mitigation measures. Because the security requirements are likely at least similar to, and potentially less burdensome than, any bespoke mitigation measures that CFIUS would enter into or impose, the parties to such a covered transaction would likely face only the marginal cost of complying with the security requirements before CFIUS takes action. Because this cost of compliance is marginal, and because the Department's experience suggests that many investment agreements by countries of concern or covered persons involving access to sensitive personal data would also be covered transactions subject to CFIUS review, it appears unlikely that complying with the security requirements will impose a meaningful cost on investment agreements.

    v. Due Diligence and Recordkeeping

    Due diligence and recordkeeping requirements apply to U.S. persons engaging in restricted transactions and may also be imposed as conditions of a license (general or specific); they are similar to certain requirements of IEEPA-based sanctions programs administered by OFAC. Section 202.1101 of the proposed rule requires U.S. persons subject to these affirmative requirements to maintain documentation of their due diligence to assist in inspections and enforcement and to maintain the results of annual audits that verify their compliance with the security requirements and, where relevant, with the conditions of any licenses they may hold. Entities may be required to collect, maintain, and analyze readily available information to make appropriate judgments regarding their transactions and their potential obligations under the proposed regulation. They may also be required to make available to the Department any annual audits that verify the U.S. person's compliance with the security requirements and any conditions on a license.

    vi. Audits

    The proposed rule imposes certain audit requirements on restricted transactions to ensure compliance with the security requirements for covered data transactions, such as appointing a qualified auditor to annually assess compliance. Such audits would address the nature of the U.S. person's covered data transaction and whether it is in accordance with applicable security requirements, the terms of any license issued by the Attorney General, or any other aspect of the regulations.

    vii. Licenses

    General and specific licenses would be available under the proposed regulation. Such licenses would permit transactions that are otherwise prohibited by the proposed regulation. Both general and specific licenses could include a range of requirements or obligations as the Department deems appropriate. The benefits of this type of regime include giving regulated parties the ability to bring specific concerns to the Department and seek appropriate regulatory relief and affording the Department the flexibility to resolve varied cases either generally or individually.

    4. Need for Regulatory Action

    There are many statutes, regulations, and programs that aim to keep America secure by monitoring, restricting, prohibiting, or otherwise regulating the flow of goods, services, investments, and information to foreign countries and foreign nationals, especially countries considered to be adversaries. For example, CFIUS has the authority to take action to mitigate any national security risk arising from certain foreign investments in U.S. businesses or involving U.S. real estate, or to recommend that the President suspend or prohibit a transaction on national security grounds. In addition, OFAC “administers and enforces economic and trade sanctions based on U.S. foreign-policy and national security goals against targeted foreign countries and regimes, terrorists, international narcotics traffickers, those engaged in activities related to the proliferation of weapons of mass destruction, and other threats to the national security, foreign policy, or economy of the United States.” [447]

    In Executive Order 13873, the President authorized the Department of Commerce to prohibit transactions in the information and communications technology and services supply chain or to impose mitigation measures to address an unusual and extraordinary threat to the national security, foreign policy, and economy of the United States.[448] The Secretary of Commerce exercises this authority through the Bureau of Industry and Security.[449] Executive Order 14034 takes various steps to protect sensitive personal data from foreign adversaries.[450]

    While existing legislation provides the Department of Justice with authority to promulgate this proposed rule, no existing statute replicates the measures undertaken here. Neither do any of the previous executive actions set forth in Executive Order 14117[451] broadly empower the government to prohibit or otherwise restrict the sale or transfer of government-related data or bulk U.S. sensitive personal data to countries of concern. Therefore, the proposed regulation will not be duplicative of any existing regulatory regime.

    As relevant here, the regulatory philosophy of Executive Order 12866 provides that agencies should issue regulations when there is a compelling public need, such as a market failure.[452] Executive Order 12866 further directs agencies issuing new regulations to identify, where applicable, the specific market failure that warrants new agency action and to assess its significance. In perfect, unregulated markets, supply and demand lead to transactions that allocate resources efficiently, fully supply the market at prices that buyers are willing to pay, and do not harm third parties. However, some transactions result in market failures known as “negative externalities”; that is, harms to parties not directly involved in the transactions. The sale of government-related data or bulk U.S. sensitive personal data to adversaries is an example. Such transactions are mutually beneficial to the parties: U.S. data brokers obtain monetary benefits, and adversaries obtain possession of a potentially strategic asset of sensitive data that they can put to malicious use. However, when the data can be used to harm U.S. nationals who are not directly involved in the transactions by presenting a risk to national security or foreign policy, the transaction creates negative externalities. These market failures demonstrate a need for the proposed regulation, which would eliminate or reduce the risk to national security and foreign policy from such transactions. Circular A-4 also recognizes a common need for regulation to protect civil rights and civil liberties or to advance democratic values, all of which are threatened if government-related data or bulk U.S. sensitive personal data end up in the hands of adversaries.[453]
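
    This externality logic can be summarized in a simple welfare identity. The notation below is an illustrative formalization introduced only for exposition; it is not part of the proposed rule or the Order. Let S_broker and S_purchaser denote the private surplus that the U.S. data broker and the foreign purchaser obtain from a transaction, and let H denote the external harm imposed on third parties (here, the national security and foreign-policy harm to U.S. persons and the United States):

        W = S_broker + S_purchaser - H

    The parties will transact whenever S_broker + S_purchaser > 0, but the transaction reduces net social welfare (W < 0) whenever H > S_broker + S_purchaser. That wedge between private and social incentives is the market failure the proposed rule is intended to address.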

    5. Baseline (Without the Proposed Rule)

    The baseline refers to what the world would look like without the regulatory changes being proposed here, which is closely related to the need for the regulation described above. To inform the public of the rationale behind the agency's proposed regulations, the Department must analyze the quantifiable and qualitative costs and benefits of the proposed action. The baseline for the proposal under consideration is the state of the world without the regulation, often referred to as the “no-action alternative,” which here includes the current regulatory regime.

    a. Baseline National Security and Foreign-Policy Risks by Category of Data

    i. Human Genomic and Human ’Omic Data

    Human genomic data presents characteristics that, under certain circumstances, allow for misuse. Although humans share more than 99 percent of their DNA, the remaining differences play a significant role in physical and mental health.[454] In addition to genomic data, which describes a person's DNA sequence, data characterizing other human systems, known as ’omic data, can also uniquely identify individuals when combined with other data. For example, transcriptomic data describes RNA transcripts, or the expression of genes as impacted by environmental factors; proteomic data describes the complete set of proteins expressed by a cell, tissue, or organism; and metabolomic data describes the small molecule metabolites found within a biological sample.

    Human genomic data and human ’omic data are therefore highly personal, and there are both person- and population-level risks to national security associated with the potential sale of such data to foreign adversaries.[455] Genomic data has been widely acknowledged in scientific, policy, and ethics literature to have the potential, when paired with other personally identifiable information, to be used to track individuals; breach their privacy; and expose them to discrimination, exclusion, or social embarrassment, such as by coloring public perception of a person's competence or health. Genomic data that is de-identified by standard healthcare practices (i.e., removal of name and date of birth) can, in some cases, be re-identified by methods that combine genomic data with other privately and publicly available information.[456] There are also other potential risks related to such data.[457] For instance, the 2023 Annual Threat Assessment of the U.S. Intelligence Community explains that generally, “[r]apid advances in dual-use technology, including bioinformatics, synthetic biology, nanotechnology, and genomic editing, could enable development of novel biological weapons that complicate detection, attribution, and treatment.” [458] Additionally, as the National Counterproliferation and Biosecurity Center has stated, “[r]esearch in genome editing by countries with different regulatory or ethical standards than those of Western countries probably increases the risk of the creation of potentially harmful biological agents or products.” Furthermore, because biological relatives share some genetic traits, the misuse of genomic and related information can potentially harm not only the individual but also, to some degree, their current and future biological relatives.

    For example, Huntington's Disease is neurodegenerative, is frequently highly incapacitating, has no cure, and is tied to specific genetic variants.[459] Revealing that an individual carries the variants for Huntington's Disease could therefore be used to claim that a political candidate for office may soon become incapacitated or to harm that person's family members mentally or emotionally.

    At a population level, there are multiple examples of the risks posed by harmful use of genomic data. For example, the PRC has collected and used genetic data from minority groups and potential political dissidents to carry out human-rights abuses against those groups and to support state surveillance.[460] The PRC's collection of healthcare data from the United States poses equally serious risks, not only to the privacy of Americans, but also to the economic and national security of the United States.[461]

    It is conceivable that bulk genomic data in the wrong hands could be used to identify or track ethnic or racial subgroups in the United States and to target them for physical, mental, or emotional harm. As the NCSC has publicly explained, for example, “[c]oncerns over the exploitation of healthcare and genomic data by the [People's Republic of China] are not hypothetical,” as China “has a documented history of exploiting DNA for genetic surveillance and societal control of minority populations in Xinjiang, China.” [462] Specifically, China “has established a high-tech surveillance system across Xinjiang, as part of a province-wide apparatus of oppression aimed primarily against traditionally Muslim minority groups.” [463] This apparatus includes an “initiative launched by the PRC government in 2014” that “has been used to justify the collection of biometric data from all Xinjiang residents ages 12 to 65.” [464] Chinese authorities have “collected DNA samples, fingerprints, iris scans, and blood types” and linked the biometric data “to individuals' identification numbers and centralized [it] in a searchable database used by PRC authorities.” [465] As NCSC has further explained, “[s]pecific abuses by the PRC government as part of this effort include mass arbitrary detentions, severe physical and psychological abuse, forced labor, oppressive surveillance used arbitrarily or unlawfully, religious persecution, political indoctrination, and forced sterilization of members of minority groups in Xinjiang.” [466] In 2020, the Department of Commerce “sanctioned two subsidiaries of China's BGI for their role in conducting genetic analysis used to further the PRC government's repression of Uyghurs and other Muslim minority groups in Xinjiang.” [467] As this example shows, “[t]he combination of stolen PII, personal health information, and large genomic data sets collected from abroad affords the PRC”—and other countries of concern—“vast opportunities to precisely target individuals in foreign governments, private industries, or other sectors for potential surveillance, manipulation, or extortion.” [468] The potential exploitation of this kind of data is not limited to targeting and repression within the borders of a country of concern, as this data could help “not only recruit individuals abroad, but also act against foreign dissidents.” [469]

    There are additional risks to national security associated with the sale of bulk genomic data to countries of concern. For example, BGI Group, a Chinese company, grew exponentially during the COVID-19 pandemic by selling COVID-19 test kits in 180 countries around the world, which enabled it to collect biospecimens and DNA sequences from the individuals tested.[470] The company also built laboratories in 18 countries, widely distributing its genetic sequencing/gathering technology across the globe, and the government of China helped to coordinate some of BGI's arrangements with other countries. The human genetic samples that BGI collected may be shared publicly on China's government-funded National GeneBank, creating individual privacy risks, and the Chinese government has indicated that its backing of BGI is intended to support China in commanding a significant position in the international biotechnology industry.[471]

    ii. Biometric Identifiers

    As previously discussed, the gathering and aggregating of biometric data can be a complex process, and much about the market for biometric data is still unknown. The legitimate use of biometrics across many areas of technology is increasing rapidly, and the exposure of biometric data to countries of concern could prove to be especially damaging since the physical characteristics linked to biometrics are often difficult or impossible to change.

    The PRC already has a demonstrated ability to collect and exploit the biometric data of its citizens, an effort that has been especially targeted at oppressed groups within its population. This has included gathering data such as “DNA samples, fingerprints, iris scans, and blood types” and creating a database where such data is linked with an individual's personal identifier.[472] These capabilities will likely continue to be developed as the technology improves and could easily be used to undermine U.S. national security.

    iii. Precise Geolocation Data

    Precise geolocation data in the hands of foreign adversaries poses national security risks with respect to two areas: (1) operations, including missions, deployments, exercises, and activities of national security personnel; and (2) personnel, including those in the military and their families, as well as nonmilitary persons with the potential to obtain or hold information vital to national security.[473]

    Potentially sensitive information can be gleaned outside direct conflict zones. Precise geolocation data can be readily used to identify the location and purpose of important national security-related infrastructure, facilities, and equipment, all of which could lead to immense harm to national security.

    Precise geolocation data can also be used to coerce military personnel, State Department officials, and anyone else with access to sensitive national security information, including through the use of such data on their family members or other close associates.[474] Compromising information gleaned from geolocation data can be used by adversaries for surveillance and intelligence gathering as well as to extort, blackmail, dox, and manipulate behavior to obtain sensitive national security information. With all the information that is readily available from data brokers, it is quite feasible to develop effective strategies to identify national security personnel and diplomatic/foreign-policy personnel working with specific sensitive information and to track their movements and behavior.

    Adversaries could use these datasets to identify where national security personnel work, then use the personnel's health or financial information to bribe or blackmail them into providing the adversaries with access to restricted systems, sensitive information, or critical programs or infrastructure. Precise geolocation data could also allow these countries to track service members' and other national security personnel's movements, impersonate personnel online or in email, and identify personnel working on specific tasks within the national security community.

    Countries of concern can also exploit access to government-related data, regardless of its volume. As one report has explained, for example, location-tracking data on individuals ( e.g., military members, government employees and contractors, or senior government officials) can “reveal sensitive locations—such as visits to a place of worship, a gambling venue, a health clinic, or a gay bar[,]” or “reputationally damaging lifestyle characteristics, such as infidelity[,]” which “could be used for profiling, coercion, blackmail, or other purposes[.]” [475]

    In addition, these geolocation capabilities, combined with photography, “can expose personal information, locations, routines and numbers of [Department of Defense (DOD)] personnel, and potentially create unintended security consequences and increased risk to the joint force and mission.” [476]

    iv. Personal Health Data

    Personal health data also presents threats in the hands of foreign adversaries. There are a few documented cases of data brokers selling sensitive health information to foreign governments, including those discussed in part IV.D.1.a of this preamble. Purchasers may have direct, indirect, or undisclosed ties to foreign officials that provide these entities with access to otherwise prohibited data. Given the presence of foreign adversaries in the U.S. health data market, the variety and volume of American health data available in the data-brokerage ecosystem create risk. Notably, the types of sensitive personal data available (e.g., mental health or HIV/AIDS diagnoses), paired with the increasing speed and ease with which artificial intelligence and other technologies can re-identify individuals using as few as 15 demographic attributes (e.g., ZIP code, date of birth, gender, citizenship, race, occupation) from another dataset, have the potential to produce harmful outcomes for the American public if placed in the wrong hands.[477]

    Currently, health data brokers collect and sell a wealth of information encompassing everything from general health conditions to addiction and prescription drug use. Additional discussion of these risks associated with personal health data can be found in part V.A.4 of this preamble.

    v. Personal Financial Data

    Personal financial data in the hands of foreign adversaries can also pose national security threats. In particular, a foreign adversary could use financial data as leverage to target U.S. persons for recruitment. Data about an individual's credit, charge, or debit card, or bank account, including purchases and payment history; data in a bank, credit, or other financial statement, including assets, liabilities and debts, and transactions; or data in a credit or “consumer report” expose that individual to more than monetary losses.[478] The threat of exposing an individual's spending habits, particularly spending that may be embarrassing, can render that person open to extortion or blackmail.[479] In instances where a threatened individual has access to especially sensitive information, national security may be at risk.

    vi. Covered Personal Identifiers

    Covered personal identifiers are a form of sensitive personal data that are both widely available and highly variable in nature. For example, covered personal identifiers may include demographic or contact data (e.g., first and last name, birthplace, ZIP code, residential street or postal address, phone number, and email address and similar public account identifiers) that is linked to financial account numbers. Many types of covered personal identifiers can be used effectively in combination with other types of sensitive personal data. The versatility of this data could make covered personal identifiers a valuable target for foreign adversaries attempting to increase the effectiveness of the bulk sensitive personal data in their possession by linking together separate datasets. For example, the PRC has both stolen data on U.S. persons that included covered personal identifiers (e.g., names and Social Security numbers, as evidenced in the 2015 hack of the health insurer Anthem, Inc.) and effectively used personal identifiers within internal datasets on its own citizens to more effectively surveil marginalized groups.[480]

    vii. Government-Related Data

    It has become increasingly evident in recent years that government-related location data is at risk of being exploited by countries of concern through location information collected from electronic devices, including cell phones and fitness apps.[481] Such data can be used not only to track the movements of targeted government-affiliated individuals but also to link them to sensitive activities and vices, such as gambling or prostitution. This information can then be used to pressure these persons to reveal sensitive information, thereby compromising U.S. national security. Methods include malicious cyber-enabled activities, espionage, and blackmail.[482]

    b. Baseline: Total Potential U.S. Population Affected by Risks

    Part IV.A.1 of this preamble explains how adversaries can use their access to Americans' bulk sensitive personal data to engage in malicious cyber-enabled activities and malign foreign influence and to track and build profiles on U.S. individuals, including members of the military and government employees and contractors, for illicit purposes such as blackmail and espionage. As of July 2021, one of the largest data brokers, Acxiom, sold products that purported to cover 45.5 million current and former U.S. military personnel and 21.3 million current and former government employees.[483] The proposed rule also observes that countries of concern can exploit their access to Americans' bulk sensitive personal data to collect information on activists, academics, journalists, dissidents, political figures, and members of nongovernmental organizations or marginalized communities to intimidate them; curb political opposition; limit the freedoms of expression, peaceful assembly, or association; or enable other forms of suppression of civil liberties. Even family members of primary targets can be ensnared in such malicious activity. Finally, individuals with access to advanced intellectual property, such as semiconductor designs, could be high-value targets of countries of concern.

    Tables VII-2 and VII-3 of this preamble provide estimates of the size of these targeted populations, but these figures should not be added together to calculate a single population figure, since a single individual could be a member of more than one of the communities.

    Several of the estimates presented in Table VII-2 of this preamble required calculations based on certain assumptions. Because data on the number of current Federal employees provided by the Office of Personnel Management does not include employees of the U.S. Postal Service, the Office of the Director of National Intelligence, or the Central Intelligence Agency, data on those groups was obtained from other sources and added in separate lines. The estimated number of former Federal Government contractors was calculated by applying the economy-wide labor turnover rate from 2001 to 2023 to the number of current Federal Government contractors; the Department assumed that half of the labor turnover involved workers staying in Federal Government contracting and half involved workers leaving the industry. The estimated number of family members of military veterans was calculated by applying the average number of family members per current military member to the military veteran population.
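
    The structure of these two derived estimates can be illustrated with a short calculation sketch. The Python sketch below is illustrative only: the values marked as assumed are placeholders chosen for exposition and are not the Department's actual inputs, which come from the sources cited in the notes to Table VII-2.

        # Illustrative sketch of the derived estimates described above.
        # Values marked "assumed" are placeholders, not the Department's actual inputs.

        current_contractors = 4_100_000       # current Federal Government contractors (Table VII-2)
        annual_separation_rate = 0.42         # assumed economy-wide annual separations rate
        share_leaving_industry = 0.5          # stated assumption: half of turnover leaves contracting

        former_contractors = (current_contractors * annual_separation_rate
                              * share_leaving_industry)

        current_family_members = 2_482_499    # current military family members (Table VII-2)
        current_service_members = 2_100_000   # assumed count of current service members
        veterans = 17_680_000                 # military veterans (Table VII-2)

        family_members_per_member = current_family_members / current_service_members
        former_military_family = family_members_per_member * veterans

        print(f"Former contractors (illustrative): {former_contractors:,.0f}")
        print(f"Former military family members (illustrative): {former_military_family:,.0f}")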

    Table VII-2—Affected Population—Government-Related Groups

    Category Population
    Current Federal Government employees (excluding employees of the U.S. Postal Service, Office of the Director of National Intelligence, and Central Intelligence Agency) a 2,271,498
    U.S. Postal Service employees b 525,469
    Office of the Director of National Intelligence employees c 1,750
    Central Intelligence Agency employees d 20,000
    Former Federal Government employees e 4,103,208
    Current Federal Government contractors f 4,100,000
    Former Federal Government contractors  f g 1,715,850
    Department of Defense active duty h 1,304,720
    Coast Guard active duty h 39,485
    Ready Reserve h 994,860
    Standby Reserve h 5,253
    Retired Reserve h 183,728
    Current military family members h 2,482,499
    Military veterans i 17,680,000
    Former military family members  h i 21,188,328
    a  U.S. Off. of Pers. Mgmt., Status Data: Employment, Federal Workforce Data (Feb. 2024), https://perma.cc/​7NF9-CTSC.
    b  Number of Postal Employees Since 1926, U.S. Postal Service (Feb. 2024), https://about.usps.com/who/profile/history/employees-since-1926.htm [ https://perma.cc/6W5W-VJJ6].
    c  Charles C. Clark, Lifting the Lid, Gov't Exec. (Sept. 1, 2012), https://www.govexec.com/​magazine/​features/​2012/​09/​lifting-lid/​57807/​ [ https://perma.cc/​N8Z8-GEL8].
    d  Michael J. O'Neal, CIA, Formation and History, Encyclopedia.com, https://www.encyclopedia.com/politics/encyclopedias-almanacs-transcripts-and-maps/cia-formation-and-history [ https://perma.cc/RZ24-YJAE].
    e  U.S. Off. of Pers. Mgmt., Dynamics Data: Separations, FY 20052005-FY 2009 (data cube), Federal Workforce Data (Feb. 2024), https://www.fedscope.opm.gov/​ibmcognos/​bi/​v1/​disp?​b_​action=​powerPlayService&​m_​encoding=​UTF-8&​BZ=​1AAABv9rMvcF42pVOQW6DQAz8jE2SQyOvYRM4cFjYRcmhkAYuPVXbZFNFpRAB~1cFqEraW2dkyR6PR~bKYl1WxdHsddwPbef2eonMVxOQkYFKhJbbgEIZCl9xsNlkyt8mUhAyr7zx1qhjujuoahcjZ6e2GVwzIGeXtj67DmWCATX2y6GvFwd7_​rQfrn8r3c12dri2Tb9AqZGz27z67X_​wIVPVueaMTMvsFZmYSCLTEzL9zNFqDPN0ma7TIs9NWu2LPFfPJv53kJe8xBciEEQkBAEAgSRggpEA90BkQh7TVF0jRdoO7o8EyCGyT8hOIL8jR7Mg7gJMQPZH_​wPExKmbn5lqfmHGNxVvb8I%3D [ https://perma.cc/​L42Z-AFAA]; U.S. Off. of Pers. Mgmt., Dynamics Data, Separations, FY 2010-FY 2014 (data cube), Federal Workforce Data (Feb. 2024), https://www.fedscope.opm.gov/​ibmcognos/​bi/​v1/​disp?​b_​action=​powerPlayService&​m_​encoding=​UTF-8&​BZ=​1AAABv4ldgp142pVOwW6CQBD9mR3UQ83sg1U4cAB2iR4KVrj01FBdG1MKBvj~NEAabW99L5PMvHnzMk6Rr4syP5q9Dvuh7exeLwm4GqmiTZAoX_​vAg_​~HasNbTwdBbCL4W0PAyhlvTXRMdoeo3IWE9NQ2g20GQnpp67PtSMXkcVN9WXL14lCdPqsP278V9lZ11XBtm35BShPS27z67X_​wEbjsbHMm8DJ9JTBYMoGfCPwze6sxzNFFsk7yLDNJuc_​zLHo24b_​DnPglvDALycxSshCChWIBFiOFuAcSmDCmRXVNHOhqsH8kQfAJLhOsJLwTglmQd0FMILij~QFy4tTNz0w1vzDjG3wAb~w%3D [ https://perma.cc/​5D43-3SY8]; U.S. Off. of Pers. Mgmt., Dynamics Data, Separations, FY 2015-FY 2019 (data cube), Federal Workforce Data (Feb. 2024), https://www.fedscope.opm.gov/​ibmcognos/​bi/​v1/​disp?​b_​action=​powerPlayService&​m_​encoding=​UTF-8&​BZ=​1AAABv6ePmJp42pVOQW6DQAz8jE2SQyOvFyI4cFjYReFQSAOXnqptsqmiUoiA~6sCVCXtrTOyZI~HI3tVua3q8mhyHQ9j17tcr5H5GpJvRCgCtVPCJyOjROosldpPDCU7Qci88aZbo47p~qDqfYycnbp2dO2InF265ux6DBL0qbVfDqVeHezp03644a1yN9vb8dq1wwoDjZzdltVv~4MNmeretWdkWmevyMQkAmR6QqafOdpMYZ6u0m1aFoVJ67wsCvVs4n8HeclLfCECQURCEAAQBARMMBHgHohMyFOaahqkSNvR~ZEAOUSWhOwE8jtytAjiLsAMZDnZHyBmzt3yzFzLCwu_​AVRTb_​0%3D [ https://perma.cc/​646G-6NBA]; U.S. Off. of Pers. Mgmt., Dynamics Data, Separations, FY 2020-FY 2024 (data cube), Federal Workforce Data (Feb. 2024), https://www.fedscope.opm.gov/​ibmcognos/​bi/​v1/​disp?​b_​action=​powerPlayService&​m_​encoding=​UTF-8&​BZ=​1AAABv3dM77542pVOwW6DMAz9GZu2h1WOAa05cIAkqD0MusKlpylr06kagwr4f00BTe1223uyZD8%7EPzmoynVVlwez08kwdr3b6SUyX2PJmeFYylCrSD9HKemNyFiQkkJpEyHzKvC3Jj2o7T6ttwlyfura0bUjcn7pmrPrMc4wotZ_​OQz1Ym9Pn%7EbDDW_​Vu9nejteuHRYYa_​T8Nq9_​_​x9syFT3rj0j0zI%7EIpMnMj0h088crXxYoCu1VmVRGFXvyqJIX0zy76Age00uRCCISAgCAIKYgAk8Ae6B6K99Wto0SFLb0f2RAHmDHBKyE8jvyHIWxF2ACcihtz9ATJy6_​Zmp5hdmfANkKW%7Ev [ https://perma.cc/​58FZ-T7VA].
    f  David Welna & Marisa Peñaloza, Not Expecting Back Pay, Government Contractors Collect Unemployment, Dip into Savings, NPR (Jan. 7, 2019), https://www.npr.org/​2019/​01/​07/​682821224/​most-contractors-do-not-expect-to-get-back-pay-when-the-shutdown-ends [ https://perma.cc/​K4AW-ARFW].
    g  U.S. Bureau of Labor Stat., Total Separations Rate, Total Nonfarm, Not Seasonally Adjusted (JTU000000000000000TSR), Job Openings and Labor Turnover Survey, https://data.bls.gov/toppicks?survey=jt [ https://perma.cc/VQQ3-7AT3] (data extracted Sept. 2024) (select “Total separations rate, Total nonfarm, not seasonally adjusted” from list; then click “Retrieve data”). Estimate is based on the average annual separations rate from 2000 to 2022. The estimate of former Federal Government contractors is based on the average turnover rate for all employees in the economy.
    h  U.S. Dep't of Def., ICF, 2022 Demographics: Profile of the Military Community (2022), https://download.militaryonesource.mil/12038/MOS/Reports/2022-demographics-report.pdf [ https://perma.cc/TP2G-UADR].
    i  U.S. Bureau of Labor Stat., Population Level—Total Veterans, 18 Years and Over (LNU00049526), Labor Force Statistics from the Current Population Survey, https://beta.bls.gov/​dataViewer/​view/​timeseries/​LNU00049526 [ https://perma.cc/​P396-7M3A] (data extracted Feb. 2024).

    The proposed rule highlights data associated with “activists, academics, journalists, dissidents, political figures, or members of nongovernmental organizations or marginalized communities” that could be used to “intimidate such persons; curb political opposition; limit freedoms of expression, peaceful assembly, or association; or enable other forms of suppression of civil liberties.” [484] Table VII-3 of this preamble describes the size of these populations.

    Table VII-3 of this preamble also contains several figures that required calculations based on certain assumptions. The estimated number of activists was calculated using a survey from the Washington Post and Kaiser Family Foundation that asked respondents whether they considered themselves activists; the percentage that answered “yes” was then applied to the current adult population. No data was available on the number of people residing in the United States who would be considered dissidents, so an estimate for this group was derived from the immigration and asylum statistics cited in the notes to Table VII-3. For this analysis, marginalized communities were assumed to include members of the lesbian, gay, bisexual, or transgender (“LGBT”) community; religious minorities; and racial minorities.[485] The number of religious minorities was calculated using the percentage of the population that identified as Jewish, Muslim, Buddhist, Hindu, or another religion.[486]
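
    As with Table VII-2, the survey-share method described above can be sketched in a few lines. The Python sketch below is illustrative only; the share and population values are assumed placeholders, not the figures actually drawn from the cited survey and Census sources.

        # Illustrative sketch of the survey-share method described above.
        # All values below are assumed placeholders, not the actual inputs.

        adult_population = 258_000_000    # assumed U.S. adult population (ACS-type figure)
        activist_share = 0.18             # assumed share of adults self-identifying as activists
        religious_minority_share = 0.055  # assumed share identifying with a minority religion

        estimated_activists = activist_share * adult_population
        estimated_religious_minorities = religious_minority_share * adult_population

        print(f"Estimated activists (illustrative): {estimated_activists:,.0f}")
        print(f"Estimated religious minorities (illustrative): {estimated_religious_minorities:,.0f}")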

    As noted earlier in this section, the populations affected by risks have substantial overlap, so the Department is unable to provide a single estimate of the affected population. Nonetheless, these estimates show that a substantial portion of the U.S. population is currently affected by the risks resulting from adversaries' access to bulk sensitive personal data.

    Table VII-3—Populations Affected by Risks—Other Groups

    Category Population
    Activists  a b 46,973,153
    Academics c 1,380,290
    Journalists d 44,530
    Dissidents  e f g h 127,929
    Political figures i 519,682
    Members of non-governmental organizations j 715,790
    Marginalized communities—LGBT k 13,942,200
    Marginalized communities—Religious  b l 14,352,908
    Marginalized communities—Race m (white Hispanic population not included) 130,398,545
    a  Wash. Post & Kaiser Family Found., Survey on Political Rallygoing and Activism (Apr. 2018), https://files.kff.org/​attachment/​Topline-Washington-Post-Kaiser-Family-Foundation-Survey-on-Political-Rallygoing-and-Activism [ https://perma.cc/​7ELT-NM6L].
    b  U.S. Census Bureau, K200104: Population by Age, American Community Survey, 1-Year Supplemental Estimates (2022), https://data.census.gov/​table/​ACSSE2022.K200104 [ https://perma.cc/​D6KP-JTKD].
    c  U.S. Bureau of Labor Stat., 25-1000: Postsecondary Teachers, National Occupational Employment and Wage Estimates (May 2023), https://www.bls.gov/​oes/​current/​oes_​nat.htm [ https://perma.cc/​8FTZ-FCMW].
    d  U.S. Bureau of Labor Stat., 27-3023: News Analysts, Reporters and Journalists, National Occupational Employment and Wage Estimates (May 2023), https://www.bls.gov/​oes/​current/​oes_​nat.htm [ https://perma.cc/​8FTZ-FCMW].
    e  U.S. Dep't of Homeland Sec., Off. of Immigr. Stat., 2003 Yearbook of Immigration Statistics (2004), https://www.dhs.gov/​ohss/​topics/​immigration/​yearbook/​2003 [ https://perma.cc/​FSJ3-XRSG].
    f  U.S. Dep't of Homeland Sec., Off. of Immigr. Stat., 2012 Yearbook of Immigration Statistics (2013), https://www.dhs.gov/​ohss/​topics/​immigration/​yearbook/​2012 [ https://perma.cc/​XZG6-WL65].
    g  U.S. Dep't of Homeland Sec., Off. of Immigr. Stat., 2022 Yearbook of Immigration Statistics (2023), https://www.dhs.gov/​ohss/​topics/​immigration/​yearbook/​2022 [ https://perma.cc/​9YXU-Y2EF].
    h  U.S. Dep't of Just., Exec. Off. for Immigr. Rev., Adjudication Statistics: Asylum Decision Rates by Nationality (2023), https://www.justice.gov/​eoir/​page/​file/​1107366/​dl [ https://perma.cc/​PJ7C-GRK4].
    i  How Many Politicians Are There in the USA? PoliEngine, https://poliengine.com/blog/how-many-politicians-are-there-in-the-us [ https://perma.cc/D5DG-KJHM].
    j  Ctr. on Nonprofits, Philanthropy, and Soc. Enter., George Mason U., Nonprofit Employment Data Project Jobs Recovery Data Dashboard (Jan. 10, 2023), https://nonprofitcenter.schar.gmu.edu/​nonprofit-employment-data-project/​resources-and-dashboards/​ [ https://perma.cc/​2XBZ-ACDW].
    k  Andrew R. Flores & Kerith J. Conron, Adult LGBT Population in the United States, Williams Inst., UCLA Sch. of L. (2023), https://williamsinstitute.law.ucla.edu/​publications/​adult-lgbt-pop-us/​ [ https://perma.cc/​MZQ2-EAP9].
    l  2022 PRRI Census of American Religion: Religious Affiliation Updates and Trends, PRRI (Feb. 24, 2023), https://www.prri.org/spotlight/prri-2022-american-values-atlas-religious-affiliation-updates-and-trends/ [ https://perma.cc/BQA7-MK2J].
    m  U.S. Census Bureau, K200201: Race, American Community Survey, 1-Year Supplemental Estimates (2022), https://data.census.gov/​table/​ACSSE2022.K200201 [ https://perma.cc/​XJ3V-LH8J].

    c. Summary of Baseline (Without the Proposed Rule)

    As stated in part IV.A.1 of this preamble, the government-related data or bulk U.S. sensitive personal data discussed here may, under certain conditions, be used against individuals—such as members of the military, government employees, and government contractors—for illicit purposes, including blackmail and espionage. The risks of any particular individual or group being targeted may vary depending on the circumstances, and these data illustrate the range of such activities. Countries of concern can also use access to government-related data or Americans' bulk U.S. sensitive personal data to collect information on activists, academics, journalists, dissidents, political figures, and members of nongovernmental organizations or marginalized communities to intimidate such persons; curb political opposition; limit freedoms of expression, peaceful assembly, or association; or enable other forms of suppression of civil liberties.

    The individuals within the subgroups most at risk of having their sensitive personal data exploited by countries of concern not only play important roles in American society but also make up a large portion of the population. Considering that the spouses and children of targeted individuals could also be threatened, the Department estimates that the total population of individuals who could potentially be targeted is well over 100 million, or more than one-third of the entire American population. Failing to prevent the current and future sale or transfer of government-related data or bulk U.S. sensitive personal data to countries of concern effectively forgoes all the benefits that could be realized from preventing such transfers. Given the nature of the benefits of protecting national security and foreign policy from malicious actors, these benefits cannot be monetized or quantified, but they are contrasted below with the estimated costs of the proposed regulation.

    6. Alternative Approaches

    In addition to the proposed action, the Department considered two alternatives. The first alternative, the No Action alternative, would take no regulatory action and allow the unrestricted transfer of bulk U.S. sensitive personal data to any foreign company, person, or country, including the countries of concern. The No Action alternative would not achieve the benefits of reducing the risks to the targeted populations or to U.S. national security and foreign policy. The growing threat that foreign adversaries will use bulk U.S. sensitive personal data, aided by advancing technologies such as artificial intelligence, to expose American citizens to blackmail and other malicious actions would continue. In addition to sales of bulk U.S. sensitive personal data, vendor, employment, and investment agreements would also continue without restriction, as would the risks that the proposed rule is intended to reduce. Of course, the No Action alternative would also result in no additional costs to industry. As explained in part VII.A.7 of this preamble, however, the Department considers the expected benefits of the proposed regulation to greatly exceed the estimated costs, resulting in net benefits that are not realized by the No Action alternative. Therefore, the Department rejects the No Action alternative.

    The second alternative considered was a prohibition on the transfer to countries of concern of all data that would fall within the scope of the proposed rule. This alternative would go further than directed by the Order, whose provisions are directed at bulk U.S. sensitive personal data, and would entail more complicated and costly enforcement efforts than the proposed rule. Because this alternative would prohibit not only transfers of bulk U.S. sensitive personal data but also small transfers of sensitive personal data that may not present any substantial marginal risk to national defense or foreign policy, the small additional benefits are unlikely to justify the value of lost transactions and compliance costs, which would be much larger than the proposed rule's estimated cost of $502 million. Because the marginal costs of this alternative relative to the proposed regulation are expected to be larger than any marginal benefits associated with it, this alternative would necessarily have lower net benefits, as measured by total benefits less total costs.

    More generally, in addressing proposals from commenters, the Department also considered alternatives that could broaden the scope of the rule (and thus potentially be more costly) or narrow its scope (and thus potentially be less costly). These alternatives include, for example, lowering or raising the proposed bulk thresholds, prohibiting or restricting (instead of exempting) additional categories of transactions such as those involving telecommunications or clinical-trial data, expanding the list of countries of concern, and expanding the categories of covered persons. The Department declined to adopt these alternatives because they would not appropriately tailor the proposed rule to the national security risks and could cause unintended economic effects, for the reasons discussed more fully with respect to those proposals.

    7. Benefits of the Proposed Rule

    As mentioned in part VII.A.2 of this preamble, the benefits of the proposed rule associated with reducing threats to national security and foreign policy are difficult to measure. Even so, without the proposed rule, foreign adversaries have ample opportunity to access and exploit Americans' sensitive data. This situation and the best available information indicate both a high likelihood of harm to national security and a potentially severe magnitude of harm, suggesting that the expected benefits of the proposed rule exceed its expected costs. Even if the likelihood of harm is low in some circumstances, the potential damage to national security remains high, suggesting that even modest risk reductions are justified.
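
    This expected-value reasoning can be restated compactly. The symbols below are illustrative and are introduced only for exposition; the Department does not assign numerical values to them. Let dP be the reduction in the probability of a national security harm attributable to the rule, H the magnitude of that harm if it occurs, and C the rule's total compliance and transaction costs. The rule yields positive expected net benefits when

        dP x H - C > 0,

    so even a small reduction in probability (dP) can justify the costs when the potential harm (H) is very large.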

    The proposed rule focuses on the risk of access to government-related data or bulk U.S. sensitive personal data by countries of concern and covered persons. Countries of concern can use their access to Americans' bulk sensitive personal data to engage in malicious cyber-enabled activities and malign foreign influence as well as to track and build profiles of U.S. individuals, including members of the military and Federal employees and contractors, for illicit purposes such as blackmail and espionage. Countries of concern can also exploit their access to Americans' bulk sensitive personal data to collect information on activists, academics, journalists, dissidents, political figures, and members of nongovernmental organizations or marginalized communities to intimidate them; curb political opposition; limit freedoms of expression, peaceful assembly, or association; or enable other forms of suppression of civil liberties. Nongovernmental experts have underscored these risks.

    Reducing these threats may produce many qualitative benefits, such as improving the security of the American people and safeguarding democratic values, none of which can be reliably quantified in monetary terms. Other benefits may also arise, such as the creation of new businesses that provide vetting information to firms seeking to engage in restricted transactions, or advances in industry-wide cybersecurity technology that result in more secure systems. The Department makes no attempt to quantify these potential benefits but welcomes comments that may allow it to do so.

    8. Costs of the Proposed Rule

    The economic costs of the proposed rule are the lost economic value of the covered transactions that are prohibited or forgone, referred to as “direct costs,” and the costs of compliance (for restricted transactions, the cost of complying with the security requirements established by DHS/CISA, affirmative due diligence requirements, audit requirements, and affirmative reporting requirements).

    Other provisions included in the regulations—including regulations of investment, employment, and vendor agreements through the imposition of security requirements—will have a mixture of economic impacts, such as one-time costs of switching to approaches that will comply with new regulations, and economic benefits, such as improved cybersecurity controls.

    Estimating the economic impact with any degree of precision is challenging: because no regulations currently prohibit cross-border transactions in bulk U.S. sensitive personal data, available data provide incomplete, unreliable, or irrelevant estimates of the types, volume, and value of such transfers, which creates uncertainty in estimates of the value lost as a result of the proposed rule.

    Similarly, the estimates of the costs of requirements for affirmative due diligence, security, recordkeeping, affirmative reporting, and audits are very preliminary in this analysis because the size of the industry, per-company costs, and per-transaction costs are very difficult to estimate precisely. Furthermore, based on its experience with similar regulations related to economic sanctions and export controls, the Department expects the costs of compliance with this proposed rule to vary significantly across companies.

    Our estimates reflect costs for firms that currently engage in transactions involving bulk U.S. sensitive personal data. The universe of firms that engage in transactions involving bulk U.S. sensitive personal data is larger than the subset of such firms that knowingly transfer such data to countries of concern or covered persons; this larger universe of firms will need to undertake some due diligence measures to ensure that their typical data transfers are not in fact going to countries of concern or covered persons. Comments are solicited and welcome on the estimates that follow.

    a. Value of Lost and Forgone Transactions

    The costs of the proposed rule would include the economic value of lost or forgone transactions related to the sale, transfer, and licensing of bulk U.S. sensitive personal data, as well as biospecimens, to the six countries of concern. These lost or forgone transactions would include transactions that are prohibited as well as covered data transactions that are forgone because an entity decides not to bear the costs of complying with the due diligence and security requirements necessary to engage in a restricted transaction.

    The total economic value of lost and forgone transactions should not exceed the total economic value of such exports to these countries of concern, as, all else equal, an entity would forgo a transaction if its expected compliance costs exceed the expected economic value of the transaction. The anticipated value of potentially regulated transactions with all countries of concern except China is negligible, given the lack of general cross-border transactions involving bulk sensitive personal data and the existing impediments to trade, such as economic sanctions. More specifically, in recent years, the proportions of U.S. bidirectional trade and investment ( print page 86184) represented by trade with Iran,[487] North Korea,[488] and Cuba,[489] respectively, have been negligible or unknown. Trade with Venezuela represents 0.1 percent of U.S. totals.[490]

    Trade with Russia averaged about 0.5 percent of all U.S. trade from 2014 to 2023, dropping to 0.1 percent in 2023.[491] U.S. cross-investment ( i.e., two-way foreign direct investment) [492] with Russia averaged about 0.2 percent of total U.S. foreign cross-investment from 2013 to 2022.[493] In contrast, the shares of U.S. imports/exports and cross-investment conducted with China have averaged about 12 percent and 1.3 percent, respectively, during the same periods.[494] Given this, the estimation of the economic costs of lost or forgone transactions here focuses primarily on China, although Russia is also considered.

    Similarly, foreign direct investment into the United States from countries of concern in the information industry is a relatively small portion of the total level of foreign direct investment in this sector. Chinese investment, which has been the highest among countries of concern, has been steadily falling over the past several years. Foreign direct investment figures, based on the country of the ultimate beneficial owner and the country of direct foreign ownership, are presented in Tables VII-4 and VII-5 of this preamble.

    Table VII-4—Foreign Direct Investment Position in the United States on a Historical-Cost Basis in the Information Industry by Country of Ultimate Beneficial Owner

    [millions of dollars]

    | Country | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022 |
    | --- | --- | --- | --- | --- | --- | --- | --- |
    | All Countries Total | 171,474 | 197,266 | 208,932 | 204,950 | 182,913 | 259,867 | 254,691 |
    | China | 1,806 | 2,413 | 2,286 | 2,936 | 1,700 | 1,432 | 452 |
    | Hong Kong | 366 | 416 | 148 | 562 | 413 | 429 | 417 |
    | Venezuela | (*) | (*) | −3 | −5 | −6 | −7 | (D) |
    | China + Hong Kong Share | 1.3% | 1.4% | 1.2% | 1.7% | 1.2% | 0.7% | 0.3% |
    | Venezuela Share | n/a | n/a | 0.0% | 0.0% | 0.0% | 0.0% | n/a |
    Note : (*) indicates a nonzero value that rounds to zero. (D) indicates that the data in the cell have been suppressed to avoid disclosure of data of individual companies.
    Source: U.S. Bureau of Econ. Analysis, Balance of Payments and Direct Investment Position Data: Foreign Direct Investment in the U.S., Foreign Direct Investment Position in the United States on a Historical-Cost Basis, Country of UBO and Industry (NAICS) (Millions of Dollars), https://apps.bea.gov/​iTable/​?ReqID=​2&​step=​1&​_​gl=​1*ubcfnx*_​ga*MjEzMzE0NDY0Ny4xNzA1NTc5Mjcw*_​ga_​J4698JNNFT*MTcyMDYzNjY0My44Mi4xLjE3MjA2MzgzMzQuNTMuMC4w#eyJhcHBpZCI6Miwic3RlcHMiOlsxLDIsMyw0LDUsNywxMF0sImRhdGEiOltbIlN0ZXAxUHJvbXB0MSIsIjIiXSxbIlN0ZXAxUHJvbXB0MiIsIjEiXSxbIlN0ZXAyUHJvbXB0MyIsIjEiXSxbIlN0ZXAzUHJvbXB0NCIsIjIyIl0sWyJTdGVwNFByb21wdDUiLCIyMiJdLFsiU3RlcDVQcm9tcHQ2IiwiMSJdLFsiU3RlcDdQcm9tcHQ4IixbIjI4LDI5LDMwLDMxLDMyLDMzLDM0LDM1LDM2LDM3LDM4LDM5LDQwLDQxLDQyLDQzLDQ4LDQ5LDUyLDU1LDU2LDU4LDYwLDYxLDY1LDY2Il1dLFsiU3RlcDhQcm9tcHQ5QSIsWyI3Il1dLFsiU3RlcDhQcm9tcHQxMEEiLFsiMSJdXV19 [ https://perma.cc/​X7SK-2LD8].

    Table VII-5—Foreign Direct Investment Position in the United States on a Historical-Cost Basis in the Information Industry by Country of Direct Foreign Parent

    [millions of dollars]

    | Country | 2016 | 2017 | 2018 | 2019 | 2020 | 2021 | 2022 |
    | --- | --- | --- | --- | --- | --- | --- | --- |
    | All Countries Total | 171,474 | 197,266 | 208,932 | 204,950 | 182,913 | 259,867 | 254,691 |
    | China | 134 | (D) | 4,286 | 4,882 | 3,480 | 2,809 | 772 |
    | Hong Kong | (D) | (D) | −53 | 389 | 284 | 282 | 196 |
    | Venezuela | −4 | (D) | −3 | −2 | −2 | (*) | (D) |
    | China + Hong Kong Share | n/a | n/a | 2.0% | 2.6% | 2.1% | 1.2% | 0.5% |
    | Venezuela Share | 0.0% | n/a | 0.0% | 0.0% | 0.0% | n/a | n/a |
    Note : (*) indicates a nonzero value that rounds to zero. (D) indicates that the data in the cell have been suppressed to avoid disclosure of data of individual companies.
    Source: U.S. Bureau of Econ. Analysis, Balance of Payments and Direct Investment Position Data: Foreign Direct Investment in the U.S., Foreign Direct Investment Position in the United States on a Historical-Cost Basis, By Country and Industry (NAICS) (Millions of Dollars) https://apps.bea.gov/​iTable/​?ReqID=​2&​step=​1&​_​gl=​1*ay559c*_​ga*MTM5ODMzNTkyNy4xNzEwMjgzOTY0*_​ga_​J4698JNNFT*MTcyMjk0ODk5My4yNi4xLjE3MjI5NTA2NTEuNTcuMC4w#eyJhcHBpZCI6Miwic3RlcHMiOlsxLDIsMyw0LDUsNywxMF0sImRhdGEiOltbIlN0ZXAxUHJvbXB0MSIsIjIiXSxbIlN0ZXAxUHJvbXB0MiIsIjEiXSxbIlN0ZXAyUHJvbXB0MyIsIjEiXSxbIlN0ZXAzUHJvbXB0NCIsIjIyIl0sWyJTdGVwNFByb21wdDUiLCIxIl0sWyJTdGVwNVByb21wdDYiLCIxIl0sWyJTdGVwN1Byb21wdDgiLFsiNjYiLCI2NSIsIjYxIiwiNjAiLCI1OCIsIjU2IiwiNTUiXV0sWyJTdGVwOFByb21wdDlBIixbIjciXV0sWyJTdGVwOFByb21wdDEwQSIsWyIxIl1dXX0=​ [ https://perma.cc/​NH7V-GZNU].
    ( print page 86185)

    Note that some fraction of the lost or forgone data transactions may be for beneficial uses. Beneficial uses of bulk U.S. sensitive personal data in the countries of concern may include consumer-choice improvements and the use of artificial intelligence to expedite innovation in drug discovery and increase knowledge of patterns of, for example, consumption, commerce, transportation, traffic, information/news transmission, nutrition, and health.

    The following analysis relies in part on data available from the BEA on U.S. exports of telecommunications, computer, and information services, both in total and for each of the three sub-categories, to China and Russia. Tables VII-6 and VII-7 of this preamble present the market-research and BEA data used to approximate the value of lost transactions.

    i. Global Market Value of Genomic, Biometric, and Location Data

    Genomic data includes data that is used in drug discovery and development, specifically in developing products such as systems, software, and reagents, and in developing processes such as cell isolation, sample preparation, and genomic analysis.[495] Biometric data is used in consumer electronics and automotive applications for safety, surveillance, and identification methods, including facial, posture, voice, fingerprint, and iris recognition technologies.[496] Location data is used in smart devices, network services for improved connectivity, systems integration, monitoring, and satellite location technology, as well as to produce timely, relevant, and personalized offers and information for customers.[497]

    According to market research data from one company,[498] the global genomics market's value is estimated at $27.58 billion in 2021, $32.56 billion in 2022, and, given an estimated compound annual growth rate of 18.2 percent, $38.49 billion in 2023 and $45.49 billion in 2024.[499] One estimate of the global biometric technology market values it at $34.27 billion in 2022 and, given an estimated 20.4 percent compound annual growth rate, $41.26 billion in 2023 and $49.68 billion in 2024.[500] Finally, one estimate values the global location data market at $18.52 billion in 2023 and, given an estimated 15.6 percent compound annual growth rate, $21.41 billion in 2024.[501] Table VII-6 of this preamble presents these global totals for genomic, biometric, and location data estimates.[502]

    Table VII-6—Global Technology Market Value Estimates for Genomic, Biometric, and Location Data for 2021-2024 (in Billions of 2022 Dollars) With Compound Annual Growth Rate (“CAGR”)

    | Data type | 2021 | 2022 | 2023 | 2024 | CAGR (%) |
    | --- | --- | --- | --- | --- | --- |
    | Genomic a | $27.58 | $32.56 | $38.49 | $45.49 | 18.2 |
    | Biometric b | n/a | 34.27 | 41.26 | 49.68 | 20.4 |
    | Location c | n/a | n/a | 18.52 | 21.41 | 15.6 |
    aGenomics Global Market to Reach $63.5 Billion in 2026 at a CAGR of 18.2%, Globe Newswire (Sept. 6, 2022), https://www.globenewswire.com/​en/​news-release/​2022/​09/​06/​2510235/​28124/​en/​Genomics-Global-Market-to-Reach-63-5-Billion-in-2026-at-a-CAGR-of-18-2.html [ https://perma.cc/​SUV8-VVMK].
    b  Grand View Research, Report ID No. 978-1-68038-299-0, Biometric Technology Market Size, Share & Trends Analysis Report, 2023-2030 (2023), https://www.grandviewresearch.com/​industry-analysis/​biometrics-industry [ https://perma.cc/​KN36-3KZW].
    c  Grand View Research, Report ID No. GVR-2-68038-401-7, Location Intelligence Market Size, Share & Trends Analysis Report, 2024-2030 (2024), https://www.grandviewresearch.com/​industry-analysis/​location-intelligence-market [ https://perma.cc/​WS6U-2324].

    ii. U.S. Exports to Relevant Specific Categories and to Countries of Concern

    Data is available on U.S. exports in the category of telecommunications, computer, and information services, both in total and for each of these three service subcategories, to China and Russia.[503] Data on exports in a relevant sub-category of information services—database and other information services—is available globally and for both China and Russia individually.

    Telecommunications, Computer, and Information Services is one of the eleven service categories BEA presents in the U.S. international transactions accounts. The Telecommunications Services sub-category includes basic services ( e.g., transmitting messages between destinations) as well as value-added and support services. The Computer Services sub-category includes software, computing and data-storage services, hardware and software consultancy, and licensing agreements tied to downloading applications. The category of information services includes database services and web search portals, which belong to one subcategory, and news agency services, which belong to the other.[504] This Database Services sub-category includes data brokers.

    Table VII-7 of this preamble presents the value of U.S. exports of telecommunications services, computer services, information services, and the database and other information services component of information services. The table also reports exports to China for the three components combined and the exports to China and Russia individually for database and other information services. ( print page 86186)

    Table VII-7—U.S. Exports of Telecommunications, Computer, and Information Services

    [In billions of 2023 dollars]

    | Service | All | China | Russia |
    | --- | --- | --- | --- |
    | Components | | | |
    | Telecommunications Services | $9.329 | $0.095 | $0.041 |
    | Computer Services | $50.328 | $1.847 | $0.113 |
    | Information Services | $10.972 | $0.318 | $0.034 |
    | Total Value | $70.629 | $2.260 | $0.188 |
    | Percentage of Total | 100% | 3.20% | 0.27% |
    | Database and Other Information Services | | | |
    | Value | $10.768 | * $0.318 | $0.032 |
    | Percentage of Total | 100% | 2.94% | 0.30% |
    Source: U.S. Bureau of Econ. Analysis, International Transactions, International Services, and International Investment Position Tables, Tables 2.2, https://www.bea.gov/​itable/​direct-investment-multinational-enterprises [ https://perma.cc/​9XWQ-A8YQ] (last updated July 23, 2024).
    * An upper bound for 2023 is $0.318.

    The $10.768 billion Database and Other Information Services sub-category accounts for most (98.1 percent) of the $10.972 billion Information Services category, with the remaining $0.275 billion (2.6 percent) in the News-Agency Services category. For the Database and Other Information Services sub-category, U.S. exports to China are $0.318 billion ($318 million) for 2023, which is 2.94 percent of the U.S. export total. U.S. exports to Russia are $0.032 billion ($32 million), which is 0.30 percent of the U.S. export total.
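
    To illustrate how the country shares above follow from the Table VII-7 figures, the following minimal Python sketch recomputes them from the reported export values; it is purely illustrative, and small differences from the rounded percentages in the text reflect rounding.

```python
# Illustrative check of the export shares discussed above, using the Table VII-7
# figures for database and other information services (billions of 2023 dollars).
db_total = 10.768   # total U.S. exports of database and other information services
db_china = 0.318    # U.S. exports to China
db_russia = 0.032   # U.S. exports to Russia

print(f"China share of U.S. exports:  {db_china / db_total:.2%}")   # about 2.95%
print(f"Russia share of U.S. exports: {db_russia / db_total:.2%}")  # about 0.30%
```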

    These U.S. export estimates are significantly over-inclusive, as they include many kinds of data that are explicitly excluded from regulation under the proposed rule, such as web browser history and other expressive data, in addition to services that do not involve data transfer at all. Consequently, the estimates of the costs due to lost or forgone transactions resulting from the proposed rule are probably overstated.

    Tables VII-6 and VII-7 of this preamble contain the "raw" data on U.S. exports to countries of concern that provide upper-bound estimates of the value of lost or forgone transactions. Given the data available to inform the following analysis, the Department welcomes comments on the use of this data and on any alternative or additional data that could also be employed.

    iii. Estimates of U.S. Exports of Genomic, Biometric, and Location Data

    This section provides estimates of U.S. revenue from sales for three categories of data covered under the proposed rule for which data on the global market are available: genomic, biometric, and location data.

    Given the lack of available published estimates of the value of U.S. exports of such data to China, Russia, and other countries of concern, the Department developed a multi-step method for estimating the value of lost transactions in genomic, biometric, and location data to the countries of concern. To summarize, we began with market research companies' estimates of the size of the global markets in genomic, biometric, and location data shown in Table VII-6 of this preamble. Then we derived estimates of the value of lost transactions through a three-step process that involved estimating the U.S. share (domestic plus export) of the global market, estimating the percentage of U.S. global sales that are domestic, and finally making some data-informed assumptions about the share of global sales in those industries that were to the countries of concern.

    In 2022, the total value of the U.S. location data market was $4.20 billion, with a projected compound annual growth rate of 13.6 percent.[505] Based on these estimates, it can be projected that the U.S. portion in 2023 would be $4.77 billion ($4.20 * 1.136 = $4.77). As shown in Table VII-6 of this preamble, the market research company estimated that the global market value of location data in 2023 was $18.52 billion, so the estimated U.S. portion in 2023 would constitute 25.76 percent of that estimated global value ($4.77/$18.52 = 0.2576). Because the market research company estimated the U.S. compound annual growth rate for location data to be 13.6 percent and the global compound annual growth rate to be 15.6 percent, it can be projected that the U.S. portion of the global market in 2024 would fall slightly, from 25.76 percent to 25.31 percent (($4.77 * 1.136)/($18.52 * 1.156) = 0.2531).

    The market research company estimated that the North American revenue share of the global biometric technology market in 2022 was 30.7 percent.[506] If Canada and Mexico were responsible for 5 percent of global market value, then the U.S. share of the global biometric data market in 2022 would be 25.7 percent, nearly the same as the portion for location data in 2023. Given the alignment in our estimates of the U.S. market share for the location and biometric data markets, we assume that the estimated U.S. share of the location data market in 2024 (25.31 percent) also applies to the U.S. portion of the global genomic and biometric data markets. With that assumption, we estimate that the U.S. genomic data market is worth $11.51 billion in 2024 (25.31 percent of $45.49 billion (from Table VII-6 of this preamble)), the U.S. biometric data market is worth $12.57 billion (25.31 percent of $49.68 billion), and the U.S. location data market is worth $5.42 billion (25.31 percent of $21.41 billion). Table VII-8 of this preamble provides 2024 estimates of U.S. revenue (foreign plus domestic) for those three industries.
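
    The share derivation just described can be reproduced with simple arithmetic. The following Python sketch is purely illustrative, uses the cited market-research figures and growth rates, and rounds the 2023 U.S. estimate to two decimals, as the text does.

```python
# Step 1 (illustrative): estimate the U.S. share of the global location data market.
us_location_2022 = 4.20        # U.S. location data market, 2022 ($ billions)
us_cagr = 0.136                # projected U.S. compound annual growth rate
global_location_2023 = 18.52   # global location data market, 2023 ($ billions)
global_cagr = 0.156            # projected global compound annual growth rate

us_location_2023 = round(us_location_2022 * (1 + us_cagr), 2)   # $4.77 billion
share_2023 = us_location_2023 / global_location_2023            # about 25.76%
share_2024 = (us_location_2023 * (1 + us_cagr)) / (
    global_location_2023 * (1 + global_cagr))                   # about 25.31%

print(f"U.S. location data market, 2023: ${us_location_2023:.2f} billion")
print(f"U.S. share of global market, 2023: {share_2023:.2%}")
print(f"U.S. share of global market, 2024: {share_2024:.2%}")
```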

    Next, the Department assumes that U.S. exports in genomic, biometric, and location data constitute 30 percent of total U.S. revenue (domestic sales plus exports). This assumption is based on market research from a U.S.-based company,[507] which estimated that in ( print page 86187) 2017, 30 percent of revenue for data brokerage companies came from international sales. Applying this assumption to revenue from sales of genomic, biometric, and location data, we estimate that exports of U.S. genomic data in 2023 were worth $3.454 billion (30 percent of $11.513 billion), exports of biometric data were worth $3.772 billion (30 percent of $12.574 billion), and exports of location data were worth $1.626 billion (30 percent of $5.419 billion). Table VII-8 of this preamble presents estimated revenue from U.S. exports (Step 2) alongside the other estimates from which it was derived.

    Table VII-8—Estimated Revenue From International Sales of Genomic, Biometric, and Location Data in 2023

    [In billions of 2024 dollars]

    | Category of data | Global revenue (from Table VII-6) a | U.S. revenue (domestic sales + exports) b | Revenue from U.S. exports c |
    | --- | --- | --- | --- |
    | Genomic | $45.49 | $11.51 | $3.45 |
    | Biometric | $49.68 | $12.57 | $3.77 |
    | Location | $21.41 | $5.42 | $1.63 |
    | Total | $116.58 | $29.51 | $8.85 |
    | U.S. Revenue Share of Global Total | | 25.31% | |
    | U.S. Export Share of U.S. Revenue | | | 30% |
    aGenomics Global Market to Reach $63.5 Billion in 2026 at a CAGR of 18.2%, Globe Newswire (Sept. 6, 2022), https://www.globenewswire.com/​en/​news-release/​2022/​09/​06/​2510235/​28124/​en/​Genomics-Global-Market-to-Reach-63-5-Billion-in-2026-at-a-CAGR-of-18-2.html [ https://perma.cc/​SUV8-VVMK]; Grand View Research, Report ID No. 978-1-68038-299-0, Biometric Technology Market Size, Share & Trends Analysis Report, 2023-2030 (2023), https://www.grandviewresearch.com/​industry-analysis/​biometrics-industry [ https://perma.cc/​KN36-3KZW]; Grand View Research, Report ID No. GVR-2-68038-401-7, Location Intelligence Market Size, Share & Trends Analysis Report, 2024-2030 (2024), https://www.grandviewresearch.com/​industry-analysis/​location-intelligence-market [ https://perma.cc/​WS6U-2324].
    b  Department of Justice estimates based on global revenue data from Table VII-6 of this preamble. U.S. Revenue (Domestic Sales + Exports) is assumed to be 25.31 percent of the total global revenue for each category of data.
    c  Department of Justice estimates based on data in global revenue data from Table VII-6 of this preamble. Revenue from U.S. exports is assumed to be 30 percent of the U.S. revenue for each category of data.
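
    The Table VII-8 estimates can be reproduced directly from the global figures and the two assumed shares. The following Python sketch is illustrative only and simply applies the 25.31 percent and 30 percent assumptions described above.

```python
# Step 2 (illustrative): derive U.S. revenue and U.S. export revenue from the
# global market estimates in Table VII-6 ($ billions).
US_SHARE_OF_GLOBAL = 0.2531   # assumed U.S. share of each global market
EXPORT_SHARE_OF_US = 0.30     # assumed export share of U.S. revenue

global_market = {"Genomic": 45.49, "Biometric": 49.68, "Location": 21.41}

for category, global_value in global_market.items():
    us_revenue = global_value * US_SHARE_OF_GLOBAL
    us_exports = us_revenue * EXPORT_SHARE_OF_US
    print(f"{category}: U.S. revenue ${us_revenue:.2f}B, U.S. exports ${us_exports:.2f}B")
```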

    To reiterate, the Department assumes that U.S. exports in genomic, biometric, and location data constitute 30 percent of total U.S. revenue (domestic sales plus exports). The Department uses this assumption to inform the analysis throughout part VII of this preamble.

    iv. Estimates of U.S. Exports of Genomic, Biometric, and Location Data to the Six Countries of Concern

    As delineated above in this section, the current value of potentially regulated transactions with all countries of concern except China and, to a lesser degree, Russia is negligible, given the lack of general cross-border trade in data and data-driven services and the general impediments to trade with these countries of concern, such as economic sanctions. We therefore focus this part of the analysis on China, and to a lesser degree on Russia due to the $32 million in U.S. exports of database and other information services to Russia.[508] The Department lacks data on cross-border transfers of genomic, biometric, and location data other than sales. One example of such cross-border transfers is the transfer of data within multinational companies.

    As set forth in Table VII-7 of this preamble and the subsequent discussion in part VII.A.8.a.ii of this preamble, approximately 3 percent ($0.318 billion of $10.768 billion) of U.S. exports of database and other information services currently go to China, and approximately 1 percent ($0.111 billion of $10.768 billion) go to Russia. Applying the corresponding shares used in Table VII-9 of this preamble (3.02 percent for China and 1.06 percent for Russia) to the value of U.S. exports of genomic, biometric, and location data set forth in Table VII-8 of this preamble yields estimates of the value of U.S. exports of genomic, biometric, and location data to China and Russia. The estimated U.S. exports of genomic, biometric, and location data total $267 million to China and $94 million to Russia.

    Table VII-9—Estimates of U.S. Exports of Genomic, Biometric, and Location Data to China and Russia

    [In billions of 2022 dollars]

    | Category of data | U.S. exports (from Table VII-8) | U.S. exports to China a | U.S. exports to Russia b |
    | --- | --- | --- | --- |
    | Genomic | $3.45 | $0.10 | $0.04 |
    | Biometric | $3.77 | $0.11 | $0.04 |
    | Location | $1.63 | $0.05 | $0.02 |
    | Total | $8.85 | $0.27 | $0.10 |
    | China share of U.S. exports | | 3.02% | |
    | Russia share of U.S. exports | | | 1.06% |
    a  Revenue from U.S. exports to China is assumed to be 3.02 percent of total revenue from U.S. exports for each category of data.
    b  Revenue from U.S. exports to Russia is assumed to be 1.06 percent of total revenue from U.S. exports for each category of data. ( print page 86188)
    Source: Department of Justice estimates based on market research data from U.S.-based company, including: Genomics Global Market to Reach $63.5 Billion in 2026 at a CAGR of 18.2%, Globe Newswire (Sept. 6, 2022), https://www.globenewswire.com/​en/​news-release/​2022/​09/​06/​2510235/​28124/​en/​Genomics-Global-Market-to-Reach-63-5-Billion-in-2026-at-a-CAGR-of-18-2.html [ https://perma.cc/​SUV8-VVMK]; Grand View Research, Report ID No. 978-1-68038-299-0, Biometric Technology Market Size, Share & Trends Analysis Report, 2023-2030 (2023), https://www.grandviewresearch.com/​industry-analysis/​biometrics-industry [ https://perma.cc/​KN36-3KZW]; Grand View Research, Report ID No. GVR-2-68038-401-7, Location Intelligence Market Size, Share & Trends Analysis Report, 2024-2030 (2024), https://www.grandviewresearch.com/​industry-analysis/​location-intelligence-market [ https://perma.cc/​WS6U-2324].
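
    As a minimal illustration of Step 3, the following Python sketch applies the Table VII-9 country shares to the Table VII-8 export estimates; the totals it prints use the Department's rounded figures, so they can differ slightly from the component-by-component rounding shown in the table.

```python
# Step 3 (illustrative): allocate estimated U.S. exports ($ billions, Table VII-8)
# to China and Russia using the assumed country shares from Table VII-9.
CHINA_SHARE = 0.0302   # assumed share of U.S. exports going to China
RUSSIA_SHARE = 0.0106  # assumed share of U.S. exports going to Russia

us_exports = {"Genomic": 3.45, "Biometric": 3.77, "Location": 1.63}

china_total = sum(value * CHINA_SHARE for value in us_exports.values())
russia_total = sum(value * RUSSIA_SHARE for value in us_exports.values())

print(f"Estimated exports to China:  ${china_total * 1000:.0f} million")   # about $267 million
print(f"Estimated exports to Russia: ${russia_total * 1000:.0f} million")  # about $94 million
```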

    v. Total Estimated Value of Lost and Forgone Transactions

    To reiterate, U.S. exports of genomic, biometric, and location data to China and Russia totaled approximately $361 million in 2022. Some of these exports may have been for beneficial uses, such as consumer-choice improvement; effective medical responses; and increased knowledge of patterns of consumption, commerce, transportation, traffic, information/news transmission, nutrition, and health. The Department's estimates of the value of lost transactions do not include the potential value of any lost positive externalities to U.S. residents. The estimated annual value of lost or forgone transactions is presented in Table VII-10 of this preamble.

    Table VII-10—Estimated Annual Value of Lost Transactions: Genomic, Biometric, and Location Data

    [In millions of 2022 dollars]

    | Country of concern | Value of forgone transactions |
    | --- | --- |
    | China | $267 |
    | Russia | 94 |
    | Total | 361 |
    Source: Department of Justice estimates based on market research data from U.S.-based company, including: Genomics Global Market to Reach $63.5 Billion in 2026 at a CAGR of 18.2%, Globe Newswire (Sept. 6, 2022), https://www.globenewswire.com/​en/​news-release/​2022/​09/​06/​2510235/​28124/​en/​Genomics-Global-Market-to-Reach-63-5-Billion-in-2026-at-a-CAGR-of-18-2.html [ https://perma.cc/​SUV8-VVMK]; Grand View Research, Report ID No. 978-1-68038-299-0, Biometric Technology Market Size, Share & Trends Analysis Report, 2023-2030 (2023), https://www.grandviewresearch.com/​industry-analysis/​biometrics-industry [ https://perma.cc/​KN36-3KZW]; Grand View Research, Report ID No. GVR-2-68038-401-7, Location Intelligence Market Size, Share & Trends Analysis Report, 2024-2030 (2024), https://www.grandviewresearch.com/​industry-analysis/​location-intelligence-market [ https://perma.cc/​WS6U-2324].

    The Department welcomes comments on the use of this data and on any alternative or additional data that could also be employed. The Department reiterates the following limitations on these estimates, described in further detail in the analysis above:

    1. The estimate assumes that the U.S. share of the global market value for location data is the same as the U.S. share of the global market value for genomic and biometric technology.

    2. The export share of each of these respective U.S. markets is assumed to be the same as the share of revenue of U.S. data-brokerage companies that comes from international sales, which is 30 percent.

    3. The estimate assumes that the database (and other information) services category of BEA's data includes most data brokers.

    4. The estimate uses the share of U.S. exports of database and other information services that go to China and Russia to estimate the share of U.S. exports of genomic, biometric, and location data that go to China and Russia.

    5. The Department assumes that the annual economic value of lost and forgone transactions would be equal to the value of all U.S. exports of biometric, location, and genomic data to China and Russia.

    vi. Alternative Methodology for Estimating the Value of Lost and Forgone Transactions

    An alternative estimate of the value of U.S. exports to China and Russia for this analysis can be derived from BEA data on the value of U.S. exports of database and other information services. As shown in the bottom of Table VII-7 of this preamble, BEA's estimates for 2023 were $318 million for China and $32 million for Russia.

    Given the rapid growth of U.S. information services exports to China, the Department projected the 2023 BEA estimate forward to 2024. Based on the growth rates of U.S. exports of information services to China between 2006 and 2023,[509] using an annual growth rate of 5 percent for China [510] would increase BEA's $318 million estimate for 2023 to $334 million for 2024.

    The Department's alternative estimates for the value of lost transactions are $334 million in forgone exports of information services to China and $32 million in forgone exports of information services to Russia, as shown in Table VII-11 of this preamble. ( print page 86189)

    Table VII-11—Estimates of the Annual Value of Lost Transactions Using Alternative Methodology

    [In millions of 2022 dollars]

    | Country of concern | Estimated value of U.S. exports of data and information services |
    | --- | --- |
    | China | $334 |
    | Russia | 32 |
    | Total | 366 |
    Source: U.S. Bureau of Econ. Analysis, Table 2.3. U.S. Trade in Services, by Country or Affiliation and by Type of Service, International Transactions, International Services, and International Investment Position Tables, https://apps.bea.gov/​iTable/​?reqid=​62&​step=​9&​isuri=​1&​product=​4#eyJhcHBpZCI6NjIsInN0ZXBzIjpbMSw5LDEwLDcsN10sImRhdGEiOltbInByb2R1Y3QiLCI0Il0sWyJUYWJsZUxpc3QiLCIzMDU4MyJdLFsiVGFibGVMaXN0U2Vjb25kYXJ5IiwiMzA2NTYiXSxbIkZpbHRlcl8jMSIsWyIxIiwiMiIsIjMiLCI0IiwiNSIsIjYiLCI3IiwiOCIsIjkiLCIxMCIsIjExIiwiMTIiLCIxMyIsIjE0IiwiMTUiLCIxNiIsIjE3IiwiMTgiXV0sWyJGaWx0ZXJfIzIiLFsiMjgiLCI3MyJdXSxbIkZpbHRlcl8jMyIsWyIxIiwiNTUiLCI2MSIsIjYzIl1dLFsiRmlsdGVyXyM0IixbIjAiXV0sWyJGaWx0ZXJfIzUiLFsiMCJdXV19 [ https://perma.cc/​7USS-P3PL].
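
    The alternative estimate can be reproduced with the simple projection described above. The following Python sketch is illustrative and assumes, as the text does, a 5 percent growth rate for the China figure and no adjustment for the Russia figure.

```python
# Alternative methodology (illustrative): project the 2023 BEA export estimates
# forward one year (figures in millions of dollars).
china_2023 = 318     # BEA estimate of database and other information services exports to China
russia_2023 = 32     # BEA estimate for Russia, carried forward unchanged
china_growth = 0.05  # assumed annual growth rate for exports to China

china_2024 = china_2023 * (1 + china_growth)          # about $334 million
alternative_total = round(china_2024) + russia_2023   # about $366 million

print(f"China, projected 2024: ${china_2024:.0f} million")
print(f"Alternative estimate of annual lost transactions: ${alternative_total} million")
```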

    Under this alternative methodology, the Department's estimate of the economic value of transactions lost due to the proposed rule is $366 million per year (Table VII-11 of this preamble), compared with the $361 million estimate reached in the main analysis (Table VII-10 of this preamble).

    The alternative methodology may overestimate the annual value of lost transactions, since the BEA category for Database and Other Information Services may include many kinds of data that are explicitly excluded from regulation under the proposed rule (such as web-browser history and other expressive data).[511] Conversely, the main estimate of $361 million may underestimate the annual value of lost transactions because it does not include personal financial data or health records. With these offsetting limitations in mind, we are estimating the value of lost transactions at the midpoint of these estimates, $364 million, and solicit comments on this total.
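
    For completeness, the midpoint used above is a simple average of the two estimates, as this illustrative snippet shows.

```python
# Illustrative midpoint of the two annual lost-transaction estimates (millions of dollars).
main_estimate = 361         # Table VII-10
alternative_estimate = 366  # Table VII-11

midpoint = (main_estimate + alternative_estimate) / 2
print(f"Midpoint estimate: ${midpoint:.0f} million")  # about $364 million
```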

    The Department estimates that the economic value of lost or forgone transactions would be minimal for firms not engaged in data brokerage ( i.e., for restricted transactions). In other words, the Department assumes that firms not engaged in data brokerage would bear minimal economic costs beyond those that may be associated with implementing and maintaining a risk-based compliance program (as described in § 202.302 of the proposed rule). For example, the Department assumes that the small number of U.S.-based firms currently using Chinese cloud-service providers to store prohibited or restricted data would be able to switch to another cloud-service provider at low or no cost, given that Chinese cloud-service providers constitute only a tiny fraction of the cloud market for U.S.-based firms. The Department assumes that the other five countries of concern do not sell cloud services of any significant value to the United States.[512]

    b. Security Costs

    Data security is an important aspect of protecting government-related data or bulk U.S. sensitive personal data from being improperly accessed by foreign adversaries or in ways that could pose a threat to national security. The proposed rule incorporates by reference proposed security requirements that have been developed by DHS through CISA, which represent conditions that businesses must meet to engage in restricted transactions. These security requirements are intended to address national security and foreign-policy threats that arise when countries of concern and covered persons access government-related data or bulk U.S. sensitive personal data that may be implicated by the categories of restricted transactions. Specifically, the proposed rule would prohibit U.S. persons from engaging in restricted transactions unless they comply with three categories of requirements, the first two of which are addressed in the security requirements proposed to be incorporated by reference:

    1. Organizational and system-level requirements for instituting cybersecurity policies, practices, and requirements for any covered system (which CISA proposes to define as a specific type of information system that is used to conduct a number of activities related to covered data as part of a restricted transaction).

    2. Data-level requirements using any combination of the following capabilities necessary to prevent access to covered data by covered persons or countries of concern:

    a. Data minimization and data masking;

    b. Encryption;

    c. Privacy-enhancing technologies; and

    d. Denial of access.

    3. Compliance-related requirements for independent testing and auditing.

    These requirements would impose new costs on firms to the extent that they are not already voluntarily meeting such requirements. Any additional costs may be offset by reducing cyber incidents and their associated costs. The Department estimates the new cybersecurity-related costs imposed on affected firms by analyzing the costs that companies of different sizes and levels of technological maturity face when implementing existing security standards and frameworks of similar scope.

    i. Similar Security Standards and Frameworks

    The proposed rule would create cybersecurity standards that are based on, and thus overlap with, several similar, widely used cybersecurity standards or frameworks. Currently, firms engaged in transactions that are proposed to be restricted transactions are encouraged—but generally not explicitly required—to comply with existing Federal cybersecurity standards or frameworks. Given the similarities between the proposed rule's security requirements and existing cybersecurity standards and frameworks, the costs of complying with one of the existing, commonly implemented sets of cybersecurity standards or frameworks are likely similar to the costs that would be incurred by businesses to comply ( print page 86190) with the proposed rule. Furthermore, such costs will be offset because, as commenters generally agreed, most companies will already have foundational baseline security requirements in place.

    As required by the Order, the security requirements for firms engaged in restricted transactions are based on the National Institute of Standards and Technology (“NIST”) Cybersecurity Framework (“CSF”) [513] and the NIST Privacy Framework (“PF”).[514] CISA has also leveraged existing performance goals, guidance, practices, and controls, including the CISA Cross-Sector Cybersecurity Performance Goals (“CPGs”),[515] which are themselves based on the NIST CSF and PF.

    The CPGs, NIST CSF, and NIST PF are sets of recommendations, based on current best practices, that companies can voluntarily follow. The CPGs are themselves mapped to the CSF. In its proposed security requirements, CISA has included mapping to the CPGs and NIST CSF and PF, as applicable. Furthermore, the CSF aligns its requirements with other similar standards that are commonly used in industry, including NIST SP 800-53 and International Organization for Standardization/International Electrotechnical Commission (“ISO”)/(“IEC”) 27001:2013. The ISO/IEC 27001:2013 standard, unlike the CISA and NIST frameworks, has a more formal certification process and has granted certificates to 48,671 companies globally and 1,898 companies in the United States.[516] Finally, NIST SP 800-171 rev.3,[517] a common security framework, lays out security standards for firms handling controlled unclassified information, while the Department of Defense's Cybersecurity Maturity Model Certification (“CMMC”) program [518] includes the NIST SP 800-171 requirements but with a more formal auditing and certification process.

    ii. Current Industry Compliance Level

    The majority of firms affected by the proposed rule likely already comply with some portion of the security requirements in the proposed rule. The level of existing adherence to the proposed security requirements for the average U.S. company would vary significantly based on each firm's size, industry, existing regulatory landscape, technological maturity, and internalized priorities.[519] Given the high degree of overlap between the DHS draft security requirements and the existing standards and frameworks discussed above, it is possible that the level of existing compliance among affected firms will be high.

    One survey of business expenditures on cybersecurity conducted by IANS Research indicates that such spending can vary depending on industry and business size. According to this survey, the average firm spent nearly 10 percent of its IT budget on cybersecurity, with firms in industries like technology, healthcare, and business services spending the highest proportion at over 13 percent.[520] Furthermore, in the same report, an analysis of firm size found that smaller businesses spend the highest proportion of their IT budgets on cybersecurity.[521] While it is possible that companies with larger expenditures on cybersecurity would be closer to compliance with the proposed rule, it is difficult to determine with the available data the extent to which companies are using their cybersecurity budgets to keep up with evolving best practices and maintain the capabilities required by the proposed security requirements.

    The level of technological and cybersecurity maturity also varies significantly, even among larger firms. Recent data indicates that there is great variability in cybersecurity practice sophistication and maturity. In 2023, Deloitte conducted a survey of cyber decision makers at firms around the world with at least 1,000 employees and $500 million in annual revenue. The Deloitte study determined that of these organizations, 38 percent had low cyber maturity (as defined in the study), 41 percent had medium cyber maturity, and 21 percent had high cyber maturity.[522] Another survey of IT professionals found that 45 percent of firms did not have a designated Chief Information Security Officer, which would make them noncompliant with the proposed security requirements.[523]

    Furthermore, research suggests that the cost of cybersecurity activities can vary based on factors such as the size of the company, the cybersecurity capabilities required, and whether those capabilities are developed in house or by contracting with third parties. Additionally, the Center for Internet Security provided high and low estimates of cybersecurity budgets based on company size, with an average estimate of $22,000 for a small firm (1 to 10 employees) and about $800,000 for a large firm (100 to 999 employees).[524]

    iii. Costs of Compliance

    Regarding the cost of compliance, the Department assumes that most affected companies will not have to build cybersecurity capabilities from the ground up to meet the requirements of the proposed rule. Given that most firms will have existing cybersecurity protections in place, a more realistic approach to calculating the potential cost of the proposed rule would be to consider the additional expenditures that a company would have to make to increase its cybersecurity standards. The Department assumes, based on the ( print page 86191) design of the proposed rule, that added cybersecurity compliance costs will closely mirror the costs that companies face when complying with similar or more onerous and prescriptive standards, such as NIST 800-171, NIST 800-53, ISO/IEC 27001, and the CMMC program. Those standards therefore provide a helpful, if somewhat conservative, tool for estimating the added compliance costs associated with the data security requirements. Furthermore, some firms may already be in voluntary compliance with standards more stringent than the proposed security requirements, which would allow them to forgo some of the following steps; as a result, not all firms will incur all of these costs.

    The first step that most firms engaging in restricted transactions will take toward compliance is completing an assessment of their current capabilities and shortcomings. For example, it would cost a small to medium-sized firm that is working toward compliance with NIST 800-171 and NIST 800-53 around $30,000 to $35,000 to build in-house assessment capabilities.[525] Along the same lines, another source estimates that an assessment for a CMMC certification for a firm with 250 employees could cost up to $35,000.[526] For the initial assessment stage of the ISO/IEC 27001 certification process, a small business would expect to spend $25,000 to $40,000 to complete the process internally and around $30,000 to hire a consultant.[527] However, a company with more complicated compliance issues could expect to pay as much as $130,000 for the consulting and assessment.[528] Thus, the Department finds that an assessment will cost between $25,000 and $130,000 for most firms, depending on the scale of the compliance needs involved.

    The next step in the compliance process is remediating the issues found in the initial assessment. It is likely that remediation would involve a combination of fixed and recurring costs. One-time remediation costs could involve revising security policies or patching vulnerabilities in covered systems, while recurring costs could include subscriptions to services that provide data encryption, multifactor authentication, or password management services, as well as costs associated with maintaining access controls or required documentation. Estimates for remediation costs for NIST 800-171 compliance range between $35,000 and $115,000.[529] Another estimate suggests that midsized companies with lower levels of technological maturity can expect to pay approximately $100,000 to correct any compliance issues.[530]

    In accordance with the security requirements with which U.S. persons engaged in restricted transactions must comply under the proposed rule, every firm, regardless of its initial compliance level, will need to annually verify its compliance through audits and testing. This is a common cost that firms incur to comply with existing frameworks or standards. For a small company with 50 employees, the annual recertification audit for ISO/IEC 27001 compliance costs an estimated $6,000 to $7,500.[531] The continuous monitoring costs associated with NIST 800-171 compliance for small businesses are estimated to be around $6,500 to $13,000.[532] However, annual surveillance audits to ensure compliance with ISO/IEC 27001 standards can cost as much as $40,000.[533]

    Table VII-12 of this preamble summarizes the high and low estimates for the costs—both one-time and ongoing—that an average U.S. company engaged in restricted transactions may face under the proposed rule. A firm may find itself in the higher-cost category based on either greater size and complexity or a lower level of technological maturity. For this analysis, it is assumed that even firms in the low-cost scenario will have added costs in each category. Furthermore, based on the Department's experience, half of the added one-time remediation costs in both the high and low estimates are assumed to recur annually. Finally, for the added training costs, the low estimate was taken from the small business low-cost figure in a report by the Center for Internet Security, and the high estimate was taken from the midsized business high-cost figure in the same report.[534] A simple roll-up of these figures follows Table VII-12.

    Table VII-12—Costs of Complying With the Proposed Security Requirements

    | Category | Cost—Low | Cost—High | Type |
    | --- | --- | --- | --- |
    | Initial Assessment | $25,000 | $130,000 | One-time. |
    | Remediation | 35,000 | 115,000 | One-time. |
    | Ongoing Remediation | 17,500 | 57,500 | Annually recurring. |
    | Compliance Audits | 6,000 | 40,000 | Annually recurring. |
    | Training | 120 | 3,660 | Annually recurring. |
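
    The following Python sketch is a minimal roll-up of the Table VII-12 figures; the first-year and recurring totals it prints are illustrative sums of the table's entries rather than figures stated elsewhere in this preamble.

```python
# Illustrative roll-up of the Table VII-12 compliance cost estimates (dollars).
costs = {
    "Initial Assessment":  {"low": 25_000, "high": 130_000, "recurring": False},
    "Remediation":         {"low": 35_000, "high": 115_000, "recurring": False},
    "Ongoing Remediation": {"low": 17_500, "high": 57_500,  "recurring": True},
    "Compliance Audits":   {"low": 6_000,  "high": 40_000,  "recurring": True},
    "Training":            {"low": 120,    "high": 3_660,   "recurring": True},
}

for scenario in ("low", "high"):
    first_year = sum(item[scenario] for item in costs.values())
    recurring = sum(item[scenario] for item in costs.values() if item["recurring"])
    print(f"{scenario}-cost scenario: first year ${first_year:,}, ongoing ${recurring:,} per year")
```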

    c. Costs Associated With Compliance Program: Due Diligence, Recordkeeping, and Auditing

    In addition to security requirements, the proposed rule also introduces affirmative due diligence, recordkeeping, affirmative reporting, and auditing requirements, whether as conditions of a license or as requirements for U.S. persons engaged in restricted transactions, each of which would likely impose added costs. In this section, the Department estimates costs for affirmative due diligence, recordkeeping, and auditing for firms engaged in licensed or restricted transactions.

    The compliance program for affirmative due diligence, ( print page 86192) recordkeeping, and auditing would consist partly of risk-based procedures for verifying the data flows involved in any restricted transaction. Further requirements would include a policy describing the compliance program and process, a policy describing the implementation of any applicable security requirements or other conditions, annual certification of such compliance policies, maintenance of records documenting the due diligence performed in implementing the compliance policy with respect to data transactions, and an annual certification of the completeness and accuracy of the records documenting due diligence as supported by an audit.

    With regard to due diligence, recordkeeping, and auditing costs for U.S. companies, precise figures for the number of affected firms, their sizes, and per-company or per-transaction costs are very difficult to estimate. Further, the compliance costs for firms that have established programs under existing Federal and State regulations may be minimal, as their compliance approach can be modified at low or no cost to address the proposed security requirements, whereas firms without such compliance programs would likely incur higher costs.

    In particular, many firms may have existing compliance programs targeted at three notable provisions that were passed and implemented in recent years: the California Consumer Privacy Act of 2018 (“CCPA”),[535] the EU's General Data Protection Regulation (“GDPR”),[536] and the APEC CBPR.[537] More than 10 other U.S. States have also recently passed data privacy legislation, and many others are considering such laws.[538] Due to these laws and existing Federal export-related regulations, it is possible that some expected due diligence costs imposed by the proposed rule may have already been incurred by affected businesses. However, given that the definitions under the proposed rule do not fully align with the definitions used in these frameworks, there are likely to be separate due diligence costs.

    The Department has estimated upper- and lower-bound costs to firms from the proposed rule related to due diligence, recordkeeping, and auditing based on our analysis of the literature regarding the CCPA, GDPR, and other data privacy rules and related research. These upper- and lower-bound estimates and the supporting literature are discussed in part VII.A.8.c of this preamble. In summary, part VII.A.8.c.i of this preamble estimates that Know Your Customer/Know Your Vendor (“KYC”/“KYV”) costs for verifying one business and its executives are between $150 (lower bound) and $4,230 (upper bound). In addition, parts VII.A.8.c.ii through VII.A.8.c.iv of this preamble estimate that the combined annual recordkeeping and auditing costs per firm are between a lower bound of $1,260 ($300 for auditing + $960 for recordkeeping) and an upper bound of $232,500 ($7,500 for auditing + $225,000 for recordkeeping). These estimates are based on the Department's analysis but could differ depending on industry and context. The Department welcomes additional input from stakeholders on this point.

    i. Due Diligence Costs

    The proposed rule requires entities engaged in restricted transactions to perform due diligence that includes KYC/KYV activities, which may involve verifications to confirm the legitimacy and eligibility of customers and vendors. Costs would generally be incurred one time per customer or vendor, but they could be repetitive if there is reason to believe that a customer or vendor's legitimacy or eligibility has changed. The Department estimates the due diligence ( i.e., KYC/KYV) costs for verifying one business and its executives at between $150 (lower bound) and $4,230 (upper bound).

    The upper-bound estimate assumes that the background check costs for one customer or vendor business would include a background check for the business and three background checks for executives residing outside the United States. The Department estimates an upper-bound cost of $1,200 per business background check based on a study showing an upper-bound range of more than $1,000 for a due diligence background check of a business.[539] The Department estimates an upper-bound cost of $1,010 per executive background check based on the highest cost for a background check of an executive residing in a country of concern (which is associated with Venezuela) from Table VII-13 of this preamble.[540] Therefore, the upper-bound background check costs for one customer business with three executives could be as high as $4,230 ($1,200 per business [541] + ($1,010 per executive [542] * 3)).
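
    The upper-bound figure above is simply the sum of the assumed component checks, as the following illustrative Python snippet shows.

```python
# Illustrative composition of the upper-bound KYC/KYV cost estimate (dollars):
# one business background check plus three checks for executives residing in a
# country of concern (the highest per-executive cost in Table VII-13).
business_check = 1_200    # upper-bound business background check
executive_check = 1_010   # highest per-executive screening cost (Venezuela)
num_executives = 3

upper_bound = business_check + executive_check * num_executives
print(f"Upper-bound KYC/KYV cost per customer or vendor: ${upper_bound:,}")  # $4,230
```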

    One company, Global Background Screening, charges $150 to $250 for business background checks, with the higher end for businesses headquartered outside the United States.[543] These business background checks include documentation on directorship, financials, registration, judgments, liens, bankruptcies, and credit risk.[544] Screenings for firm executives appear to be separate costs, which vary by country of residence and type of background check, as summarized in Table VII-13 of this preamble. Countries identified in the proposed rule as countries of concern ( see § 202.209) are included in Table VII-13 of this preamble where sufficient data is available. Another vendor, Santoni, advertises due diligence business background checks, which appear to include foreign firms and officers, ranging from $395 to more than $1,000.[545]

    ( print page 86193)

    Table VII-13—Estimated International Screening Costs for Individuals by Country of Residence

    | Type of screening | China | Russia | Cuba | Venezuela | All nations low | All nations high |
    | --- | --- | --- | --- | --- | --- | --- |
    | Criminal background | $80-$110 | $129 | $169 | $135 | $59 | $260 |
    | Civil judgements | 80 | 145 | 239 | 159 | 45 | 279 |
    | Identity verification | 25 | 25 | 25 | 25 | 25 | 25 |
    | Bankruptcy records | 50 | n/a | n/a | 188 | 23 | 188 |
    | Credit history | 80 | 127 | 274 | 234 | 50 | 532 |
    | Employment verification | 35-99 | 35-99 | 35-99 | 35-99 | 35 | 99 |
    | Education verification | 35 | 35 | 35 | 35 | 35 | 35 |
    | Worldscan (global databases) | 11-65 | 11-65 | 11-65 | 11-65 | 11 | 65 |
    | Social media scan | 50-70 | 50-70 | 50-70 | 50-70 | 50 | 70 |
    | Totals | 446-614 | 557-695 | 838-976 | 872-1,010 | 333 | 1,553 |
    Source: International Screening Checkout Portal, Global Background Screening, https://www.globalbackgroundscreening.com/​online-background-check/​International-Employee-Screening-Select-Country-For-Pricing-p303546142 (reflecting costs of services at the time the Department drafted the proposed rule).

    The remainder of this section summarizes additional research on background check costs for foreign firms and for individuals residing outside the United States, which is broadly consistent with the Department's upper- and lower-bound estimates discussed so far in this section.

    The U.S. International Trade Administration (“ITA”) provides basic background check and in-depth data on foreign firms to help U.S. companies determine the suitability of possible business partners. To be eligible for the service, companies must be export-ready and endeavoring to export goods or services of U.S. origin with at least 51-percent U.S. content. The ITA provides partial profiles that include general business information, background and product data, pertinent executives, reputation information, brief analysis of information collected, and identities of the references used. Fees for these partial profiles range from $150 to $450 depending on the size of the inquiring firm. Full business profiles add onsite visits and interviews of company executives, with costs ranging from $700 to $2,000. Costs could increase if ITA staff are required to travel more than 80 kilometers or 2 hours from an ITA office.[546]

    Diligentia, Inc. categorizes background checks on individuals based on the thoroughness of the investigation. An individual or red flag investigation is designed to identify adverse issues predominantly via online searches, at a cost of $500 to $1,500. A professional background investigation is a more thorough review that adds onsite records depository visits and analysis of documents there, at a cost of $1,500 to $2,500. A comprehensive background investigation goes even further, with additional analyses, reviews of business interests, the use of other intelligence sources, financial investigation, and a credit history, at a cost of more than $2,500. This provider does not advertise a price difference between domestic and foreign background investigations.[547]

    Checkr states that international background checks can range from $30 to $500, with their fees varying from $32 to $300 and covering more than 200 countries. Checkr asserts that a global background check may include searches for criminal history, watchlist posting, education/employment verification, and media checks.[548]

    An ITA partial or full profile of foreign businesses would likely be preferable for U.S. companies due to the real or perceived credibility of a government agency and the comparatively reasonable costs. Accordingly, for verifying one business and its executives, the Department relied on ITA's pricing for the estimated lower-bound cost of $150.

    The sources supporting our cost estimates do not discuss whether the costs include extensive investigations that involve foreign travel or contracting with third parties in foreign locations to perform onsite visits, inquiries, interviews, and other in-depth activities. The Department expects that firms are unlikely to allocate resources for these kinds of investigations, particularly when the ITA service is available. However, some firms may not meet the ITA's eligibility criteria for its services. Thus, it is possible that KYC/KYV activities may need to be performed via a private vendor, such as one of the vendors just described. Again, based on this analysis, the Department estimates the due diligence costs for verifying one business and its executives at between $150 (lower bound) and $4,230 (upper bound).

    These estimates are based on Department analysis but could differ depending on the industry and context. The Department welcomes additional input from stakeholders on this point.

    ii. Recordkeeping Costs

    The proposed rule's recordkeeping requirements would include generating or maintaining documents pertinent to various data transaction details, verifications of transaction partners, transaction agreements, licenses, exemptions, advisory opinions, annual due diligence certifications, and supporting documentation, as applicable. Data brokers incorporated in the United States market and sell data on individuals not only in the United States but also in many other countries; [549] for example, Acxiom markets data coverage for more than 62 countries.[550] Assuming that this data on foreign persons includes individuals protected by EU ( print page 86194) law, these data brokers are subject to the GDPR.

    Since 2018, the GDPR has required all organizations that target or collect data relating to persons in the EU to abide by privacy and security standards outlined in that law. One of the seven data protection principles in the GDPR is accountability or due diligence. Accordingly, data controllers ( i.e., holders of data) must be able to demonstrate compliance relative to accountability by (1) designating data protection responsibilities as appropriate; (2) maintaining comprehensive records of collected data, its use, and those responsible for it; (3) training staff and executing technical and organizational security measures; (4) implementing contracts with third parties that process data on their behalf; and (5) appointing a data protection officer (if a public authority or regularly processing personal data on a large scale).[551] Thus, a portion of the entities subject to the proposed rule are already complying with GDPR recordkeeping requirements and arguably would not incur the full magnitude of these new costs.

    iii. Executive Order on Modernizing Regulatory Review Recordkeeping and Related Costs

    As shown in the following analysis, the annual recordkeeping and related costs per firm are estimated to be between $960 (lower bound) and $225,000 (upper bound).

    The Department calculates a lower-bound estimate of annual recordkeeping costs per firm by starting with the average annual incremental compliance costs/administrative burdens from the EU impact assessment of GDPR. According to the EU's impact assessment of the GDPR, average annual incremental compliance costs/administrative burdens for small and medium-sized enterprises (“SMEs”) [552] are approximately $9,624 (in 2024 dollars).[553] The Department assumes that the incremental recordkeeping costs of the proposed rule would only be about 10 percent of the estimated incremental annual costs for GDPR compliance. This assumption is based on the facts that the GDPR includes extensive recordkeeping requirements [554] and that many of the proposed rule's recordkeeping requirements are similar in scope to the obligations of existing data protection regulations.[555] Furthermore, the EU's impact assessment of the GDPR includes costs of compliance beyond recordkeeping costs. Based on these considerations and input from SMEs, the Department estimates that 1,400 small to medium-sized firms will incur recordkeeping costs of $960 per firm per year.

    The upper-bound estimate of annual recordkeeping costs per firm is based on estimates of companies' annual privacy protection costs, which for large firms were estimated at $4.5 million per firm.[556] The Department further estimates that the incremental recordkeeping costs of the proposed rule for large firms would be approximately 5 percent of the estimated annual costs for privacy protections. This assumption is based on the same factors as those described for the lower-bound annual recordkeeping cost estimate, as well as the fact that the prior study included additional necessary costs ( e.g., IT upgrades) beyond recordkeeping alone. Further, the Department believes that larger firms are generally more sophisticated in complying with existing data privacy and security regimes and already have significant compliance programs and mechanisms in place. Thus, based on this analysis and subject-matter expert input, the Department estimates that 100 firms will incur the higher recordkeeping costs of $225,000 per firm.
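    To make the arithmetic behind these bounds explicit, the following is a minimal sketch (not part of the Department's analysis) that reproduces the stated figures, assuming the 10-percent and 5-percent incremental shares described above; the variable names are illustrative only.

```python
# Illustrative recomputation of the recordkeeping-cost bounds described above.
# The 10-percent and 5-percent incremental shares are the Department's stated
# assumptions; nothing here is an independent estimate.

gdpr_sme_annual_cost = 9_624          # EU impact assessment figure for SMEs, 2024 dollars
large_firm_privacy_cost = 4_500_000   # estimated annual privacy protection cost, large firm

lower_bound = gdpr_sme_annual_cost * 0.10      # incremental share assumed for small firms
upper_bound = large_firm_privacy_cost * 0.05   # incremental share assumed for large firms

print(round(lower_bound))   # 962, presented in the preamble as roughly $960 per firm
print(round(upper_bound))   # 225000, i.e., $225,000 per large firm per year
```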

    To provide context on these upper- and lower-bound recordkeeping costs, this section summarizes additional studies.

    The CCPA mandated that businesses in California update privacy policies, develop mechanisms for providing notice to consumers when collecting personal information (“PI”), and adequately respond to consumer wishes regarding the handling of such data. The State of California Department of Justice, Office of the Attorney General's (“CDOJAG”) standardized regulatory impact assessment for the CCPA regulations estimated the following rule-imposed costs per firm: $959 in one-time operational costs ( e.g., establishing workflows/plans), $7,500 for technological systems development (assumed one-time), $615 per year for training, $984 per year to abide by record-keeping requirements (one data privacy professional at $61.50 per hour * 16 hours),[557] and $492 (assumed per year) to provide financial incentives or differential services/prices to promote non-discriminatory practices in their treatment of consumers exercising their CCPA rights. Apart from the $984 in compliance-related costs, the CDOJAG assumed that there were no incremental costs for collecting the information subject to the CCPA's recordkeeping requirement, as affected businesses likely already had mature mechanisms for identifying, processing, and analyzing PI from their data-mapping and consumer response practices. The CDOJAG's total estimated costs per firm to comply with the CCPA were about $29,000 ($2,900 annually) for the period from 2020 to 2030.[558]
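    As a rough check on how the component figures relate to the roughly $29,000 total, the following sketch combines the one-time and annual CCPA cost elements over an assumed 10-year window; this is one plausible reconstruction, not the CDOJAG's own calculation.

```python
# One plausible reconstruction of the CDOJAG per-firm CCPA cost figures cited above.
# The 10-year window is an assumption; the CDOJAG analysis covers 2020 to 2030.

one_time = 959 + 7_500                # operational setup + technological systems development
annual = 615 + 984 + 492              # training + recordkeeping + financial incentives

total_over_10_years = one_time + 10 * annual
print(total_over_10_years)            # 29,369, close to the cited total of about $29,000
print(total_over_10_years / 10)       # about 2,937 per year, close to the cited ~$2,900 annually
```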

    Christensen et al. estimated GDPR compliance costs at between $5,065 and $12,157 (2024 dollars) [559] per year per SME, which represents a 16- to 40-percent increase in annual IT budgets. These presumably would align with the aforementioned GDPR accountability ( print page 86195) elements identified by Wolford,[560] which are generally consistent with those of the proposed rule.

    A 2018 International Association of Privacy Professionals and Ernst & Young (“IAPP”/“EY”) study/survey identified much higher average expected spending of about $3 million per firm on GDPR compliance, or $300,000 annually if assumed over 10 years. This included $1,276,000 already spent, another $822,000 expected for adaptation of products and services, and $989,000 for other adaptation activities. However, the average annual costs per firm due to GDPR were unclear based on the IAPP/EY 2018 and 2019 surveys. Company annual mean and median privacy-related spending ranged from $128 to $147 on a per-employee basis. Survey respondents were a mix of company sizes ranging from under 100 employees to more than 75,000.[561]

    A 2021 Cisco annual global survey of all major industries found that annual privacy budgets doubled from the previous year to an average of $2.4 million (with smaller firms at a lower end of $1.6 million and larger firms at an upper end of $3.7 million, as reported in 2020).[562] This average figure of $2.4 million is comparable to a high-end estimate found in another study that aimed to project the costs to businesses that would be incurred under possible Florida consumer privacy legislation; that study had a lower-bound estimate of about $733,000 in one-time costs per firm and subsequent ongoing annual costs ranging from about $542,000 to $1.5 million.[563] Organizations may spend an average of about $1,406 per subject rights request by consumers pursuant to privacy regulations.[564] According to one estimate, data processing agreements may have an average cost of $785.[565] Though privacy budgets, including some of their underlying elements, are different in scope and detail from the proposed rule, they nonetheless have relevance for estimating the costs of the proposed rule, as these budgets often serve similar objectives and require companies to undertake similar processes to protect sensitive data.

    The relatively new APEC CBPR is a voluntary accountability framework regulating data transfers between member nations that is somewhat similar to the EU's GDPR, but based on Organisation for Economic Co-operation and Development (“OECD”) privacy principles.[566] In the United States, APEC CBPR annual certification costs range from $15,000 to $40,000.[567] The APEC CBPR's data security [568] due diligence mechanism is another requirement with which firms may already be complying and thus could reduce incremental costs of the proposed rule due to comparable or related requirements.

    In summary, available sources show variations regarding the compliance costs for data privacy and cybersecurity regulations specific to recordkeeping. The Department estimates that it is very likely that incremental recordkeeping costs for at least some firms impacted by the proposed rule are zero, as the CDOJAG discussed in its cost estimates for the CCPA. Conversely, the possibility exists that larger firms have not been subject to the EU's GDPR and would be impacted by the proposed rule, resulting in their incurring some portion of the estimated $4.5 million (in 2024 dollars) in annual privacy protection costs documented by Cisco for such firms.[569] These ranges, along with the other data and analysis discussed throughout this subpart, were taken into consideration for the calculations of the proposed rule's average annual lower- and upper-bound recordkeeping costs per firm of $960 and $225,000. Part VII.F of this preamble estimates that costs due to the proposed annual reporting requirements for certain categories of U.S. persons engaged in certain subsets of restricted transactions would range from $821,100 (lower bound) to $1,642,200 (upper bound).

    iv. Auditing Costs

    As shown in the following analysis, annual auditing costs per firm are estimated to be between $300 (lower bound) and $7,500 (upper bound).

    Auditing costs for restricted transactions would be incurred in the form of independent examinations to support the due diligence certifications. TrustNet offers services to help firms determine their compliance with the CCPA. One such service is a CCPA gap assessment, which covers scope, project management, risk assessment, controls identification, testing/analysis, remediation roadmap, and reporting. The cost of this service starts at $10,000. TrustNet also offers a CCPA compliance assessment with costs starting at $15,000, which covers similar elements.[570]

    According to Neumetric, a cybersecurity products and services company, GDPR accreditation or certification is not offered by the EU or any of its member states. Firms do not need to certify that they are GDPR compliant; however, there are third-party certification bodies/consultants that offer GDPR certification services for consultant fees ranging from $3,000 to $11,000, on average.[571] This does not include internal costs to prepare for certification or other prerequisites for obtaining ISO/IEC 27001 and ISO 27701 certification, which could cost between $1,000 and $4,000.[572] Another source estimated that costs for GDPR certification range from about $5,000 to $20,000 or more (excluding ISO/IEC 27001 and ISO 27701 certification).[573]

    The American Institute of Certified Public Accountants has developed a cybersecurity compliance framework known as Service Organization Control 2 (“SOC 2”). Cybersecurity audit costs can be divided into SOC 2 Type 1 audits and SOC 2 Type 2 audits. Type 1 audits evaluate the suitability of controls at a specific point in time and can cost between $5,000 and $25,000. Type 2 audits gauge the effectiveness of controls over a more extended ( print page 86196) timeframe and can range in costs from $30,000 to $100,000.[574] The latter is more appropriate for firms processing highly sensitive personal data on a regular basis.[575] Dunkelberger provides a slightly different range for SOC 2 Type 1 audits, estimating that they can cost $15,000 to $50,000 for small to medium-sized businesses and $50,000 to $100,000 for large businesses; and for SOC 2 Type 2 audits, estimating that they can cost $30,000 to $75,000 for small to medium-sized businesses and $75,000 to $150,000 for large businesses. These costs include the price of a readiness assessment, audit, remediation, and consultant fees.[576]

    As these estimates show, the costs of annual audits for compliance with CCPA, GDPR, and SOC 2 range from $3,000 to $150,000, depending on audit type and firm size. The Department expects that such examiners may not always charge these full rates separately just to certify compliance with the proposed rule, due to redundancies with existing legislation and efficiencies of conducting simultaneous audits pursuant to multiple rules. Nevertheless, there would be increased costs, as there are likely to be variations in addition to the redundancies. For purposes of this analysis, the Department assumes that for all small firms, the proposed rule would result in audit costs that are 10 percent of the estimated cost of an audit from the reviewed literature, or $300 ($3,000 * 10 percent incremental cost). The Department assumes that for all large firms, the proposed rule would result in audit costs that are 5 percent of the estimated cost of an audit from the reviewed literature, or $7,500 ($150,000 * 5 percent incremental cost).
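    The following minimal sketch restates the incremental audit-cost assumption just described; the 10-percent and 5-percent shares are the Department's stated assumptions, and the reference audit costs are the low and high ends of the reviewed literature.

```python
# Illustrative restatement of the incremental audit-cost assumption described above.

low_reference_audit_cost = 3_000      # low end of the reviewed audit-cost estimates
high_reference_audit_cost = 150_000   # high end of the reviewed audit-cost estimates

small_firm_incremental = low_reference_audit_cost * 0.10    # $300 per small firm per year
large_firm_incremental = high_reference_audit_cost * 0.05   # $7,500 per large firm per year
```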

    v. Estimated Recordkeeping Costs From the Reviewed Literature

    The wide-ranging estimates of recordkeeping costs in the studies reviewed, and the entwinement of those costs with other compliance costs, demonstrate the difficulty of isolating the specific costs attributable to the proposed rule. Further, the literature is not specific to compliance with this proposed rule. Rather, the literature relates to the business costs of protecting personal information from unauthorized dissemination while establishing procedures for its processing and transfer, in addition to protocols for responding to consumer preferences regarding the handling of their own personal information. In recent years, privacy and data protection laws affecting the entities that will likely be impacted by the proposed rule have proliferated. Thus, the recordkeeping costs contemplated under the proposed rule have already been incurred to some extent.

    vi. Summary of a Compliance Program: Due Diligence, Recordkeeping, and Auditing

    From this examination of the available literature, the due diligence, recordkeeping, and auditing requirements are likely to unevenly impact firms that must comply with the proposed rule, depending on the size of each firm and how much it currently spends on the components of due diligence. Although the means by which firms will comply is uncertain, the Department has relied on a variety of research in the topic areas to make preliminary estimates of costs due to the proposed rule.

    Uncertainty is prevalent in these restricted transactions and data-brokerage market cost estimates for several reasons. In particular, the estimates of recordkeeping costs based on the percentage of the costs of compliance with GDPR and other data protection regimes reported in various studies are highly speculative. Estimates of the proposed rule-imposed incremental costs above and beyond similar compliance activities already taking place are also speculative. Consequently, the Department welcomes comments on these cost calculations from affected industries and stakeholders to better inform decision making relative to the proposed rule.

    Beyond the cost impacts of the proposed regulation, there could be adjustments and market movements in reaction to changes in the threshold levels that are being proposed. This analysis assumes that all the current bulk U.S. sensitive personal data transactions are above the lower threshold levels as defined by the proposed rule. If the threshold levels are set at the higher level in the final rule, there may be less immediate market disruption but also a greater risk of more data falling into malicious hands, including through evasion techniques such as structuring and smurfing ( i.e., conducting smaller and more frequent transactions using additional individuals). Since there is no available data on the number of transactions by volume of personal data being transferred, the impacts of selecting one bulk threshold over another within the ranges in the NPRM are uncertain at the time of the proposal, and the Department welcomes comments on this subject.

    Note that the recordkeeping costs discussed here (part VII.A of this preamble) are also included in part VII.F of this preamble (Paperwork Reduction Act), which presents cost estimates for the seven new information collection requests introduced by the proposed rule. The costs of affirmative annual reporting are also discussed above. In addition to recordkeeping costs and the cost of affirmative annual reporting, part VII.F of this preamble presents estimates for the applications for specific licenses, reports of rejected prohibited transactions, requests for advisory opinions, petitions for removal from the Covered Persons List, and reports of known or suspected violations of the onward transfers prohibition. All of those information collections affect a relatively small number of firms. Additional detail on those annual costs is available in the Information Collection Request submitted for Office of Management and Budget review under the Paperwork Reduction Act and publicly available on reginfo.gov.

    9. Summary of Regulatory Analysis

    Regulatory analysis in the areas of national security and foreign policy is often not easily quantifiable or monetizable due to an array of factors, such as inadequate information, the inaccessibility of sensitive or proprietary data, and the absence of a good measure of the effectiveness of the regulations.

    The purpose of the Preliminary RIA is to gather and analyze sufficient information to inform agency decision makers about whether a proposed rulemaking is in the public's interest. The analysis should describe the impacts on firms in the market and in the supply chain, remembering that the intermediate firms in the chain are customers of the suppliers. To the extent possible, the impacts on the general public should be considered, as well as—in the case of this proposed rulemaking—the impact on national security and foreign policy and the impact of data-brokerage restrictions on the positive uses of bulk data. The precision of estimates depends on the availability of data, the confidence in the accuracy of the data, and the degree of understanding of the impacted markets. ( print page 86197)

    These economic impact estimates lack precision due to significant gaps in the available data on the number of firms and data transactions that would be affected by the proposed rule and by the lack of confidence in much of the available data. Due to relatively recent and emerging developments in studying the market for data, relevant, reliable, and representative size, sales, employment, and other descriptive information on the data-brokerage market and other entities that will be subject to the proposed rule does not appear to be currently available. The Department is not aware of reliable data on the exact number of firms that currently engage in prohibited data-brokerage transactions, the size distribution of these firms, or the numbers of firms that sell above or below the threshold levels that would bring them under the proposed rule's umbrella. The Department welcomes additional input on this point. The low and high threshold levels for the different categories of sensitive personal data or government-related data vary by factors of 10 to 1 for human genomic data and 1,000 to 1 for personal health data and personal financial data. Furthermore, the Department lacks data on the broader universe of firms that transact in government-related data or bulk U.S. sensitive personal data in the context of restricted transactions. As noted in the NPRM, firms that transact in bulk U.S. sensitive personal data above the proposed thresholds, as laid out in part V.C of this preamble, will need to ensure that their typical data transfers are not in fact going to countries of concern or covered persons (for prohibited transactions) and to comply with the security and due diligence requirements for restricted transactions.

    This analysis leverages the limited available data on the number of data-brokerage firms and the volume of data-brokerage exports, along with estimates of security and due diligence costs from studies of similar policies and guidelines. The Department finds that, based on certain assumptions, the proposed rule will have at least some measurable economic impacts. From Table VII-10 of this preamble, the Department estimates that the total annual value of lost transactions is $361 million, or an estimated $80,222 per firm for 4,500 firms (3,000 data brokers + 1,500 firms engaged in restricted transactions).

    Table VII-14 of this preamble presents estimates of security compliance costs derived from data shown in part VII.B.2 of this preamble and estimates of due diligence, recordkeeping, and auditing costs derived from data shown in part VII.A.8.c of this preamble. The variations in costs are due to firm size and other factors. As explained in part VII.A.8.c of this preamble, the Department estimates the KYC/KYV ( i.e., due diligence) costs for verifying one business and its executives to be between a lower bound of $150 and an upper bound of $4,230. There is no information on how many verifications a firm will do, but the Department assumes for purposes of this analysis 10 verifications per firm per year, for a total cost of between $1,500 and $42,300. Adding the lower bounds of due diligence costs ($1,500), auditing costs ($300), and recordkeeping costs ($960) per firm yields a lower bound of $2,760 per firm for other annual compliance costs. Adding the upper bounds of due diligence costs ($42,300), auditing costs ($7,500), and recordkeeping costs ($225,000) per firm yields an upper bound of $274,800 per large firm for other annual compliance costs. Table VII-14 of this preamble shows annual compliance costs per firm. The Department welcomes additional input on this point.

    Table VII-14—Annual Compliance Costs per Firm

    [For Firms Engaged in Restricted Transactions]

    Cost category Low (small firms) High (large firms)
    Security: One-Time Costs $60,000 $245,000
    Security: Recurring Costs 23,620 101,160
    Other Compliance Costs (Due Diligence, Audits, Recordkeeping) 2,760 274,800
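    As a minimal sketch, the per-firm “other compliance” bounds in Table VII-14 can be reproduced directly from the stated components, assuming the 10 KYC/KYV verifications per firm per year described above:

```python
# Illustrative recomputation of the "other compliance" bounds in Table VII-14,
# assuming 10 KYC/KYV verifications per firm per year as stated in the preamble.

verifications_per_year = 10
due_diligence_low = 150 * verifications_per_year      # $1,500
due_diligence_high = 4_230 * verifications_per_year   # $42,300
audit_low, audit_high = 300, 7_500
recordkeeping_low, recordkeeping_high = 960, 225_000

other_compliance_low = due_diligence_low + audit_low + recordkeeping_low       # 2,760
other_compliance_high = due_diligence_high + audit_high + recordkeeping_high   # 274,800
```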

    When the proposed rule is finalized and becomes effective, market dynamics will set in, and firms will exit and enter the market as they adjust to the new regulatory environment. As noted in part VII.A.3.b of this preamble, the U.S. data-brokerage market ranges from around $30 billion to $180 billion per year, suggesting average revenues per firm at around $10 million to $60 million per year, assuming an estimated 3,000 firms. The compliance costs per firm will determine whether firms pursue restricted transactions.

    For the purposes of this estimate, the Department assumes that 1,500 firms will engage in restricted transactions, that the largest 100 will incur the high costs, and that the remaining 1,400 will incur the lower costs. Although it is estimated that relatively few U.S.-based firms conducting business with Chinese cloud-service providers may continue these activities under the restrictions, it is expected that a large but unknown number of other firms will pursue the restricted transaction opportunities involving employment and investment agreements. Under these conditions, the Department assumes that about 1,500 firms beyond the 3,000 data brokers will be active in pursuing vendor, employment, and investment agreement opportunities in the restricted transactions market, with the 100 largest incurring the high costs and the remaining 1,400 incurring the lower costs.

    The annualized costs of the proposed rule are determined by deriving the 10-year projections for three cost components: the economic value of lost transactions, security costs, and other compliance costs (due diligence, auditing, and recordkeeping). Our analysis assumes that 4,500 firms, including 3,000 data brokers and 1,500 other firms engaged in restricted transactions, will incur economic costs. The analyses also assume that 4,300 of those firms are small firms (including 2,900 data brokers and 1,400 firms engaged in restricted transactions) and 200 of those firms are large firms (100 data brokers and 100 firms engaged in restricted transactions). The analysis also assumes that the data-brokerage industry affected by the proposed rule is growing at a 5-percent annual rate.

    Turning to compliance costs, the analysis assumes that 1,500 firms will incur compliance costs as a result of the proposed rule. The Department assumes that security costs have one-time components—initial assessment and remediation—that are only realized in the first year, as well as recurring components—ongoing remediation, compliance audits, and training—that ( print page 86198) are present for all 10 years. In addition, it is assumed that the other compliance costs, including affirmative due diligence, auditing, and recordkeeping costs, will decline as firms become more efficient and learn to pursue lower-cost compliance options. These due diligence, auditing, and recordkeeping costs are presented as decreasing annually, but at a decreasing rate. As companies move away from reliance on employees in countries of concern or vendors in countries of concern, the Department assumes that these costs will decrease over time. Further, since the security measures all rely on existing NIST standards and CISA performance goals to which many companies already align their security posture, the Department assumes that due diligence, auditing, and recordkeeping costs will decrease 15 percent in the second year, 12 percent in the third year, 9 percent in the fourth year, 7 percent in the fifth year, and then 5 percent, 4 percent, 3 percent, 2 percent, and 1 percent in each of the sixth through tenth years. The costs are presented undiscounted (0-percent rate) and discounted at a 2-percent rate.

    In sum, the parameter assumptions of the 10-year projections are as follows (an illustrative calculation applying these assumptions appears after the list):

    1. The annual growth rate of the economic value of lost transactions is 5 percent, compounded annually.

    2. Due diligence, auditing, and recordkeeping costs in Year 1 are taken from Table VII-14 of this preamble. Costs in Years 2 through 10 decrease by 15 percent, 12 percent, 9 percent, 7 percent, 5 percent, 4 percent, 3 percent, 2 percent, and 1 percent, respectively ( i.e., they decline at a decreasing rate).

    3. Security costs have both one-time and recurring components in Year 1 and only recurring components in Years 2 through 10 (as shown in Table VII-14 of this preamble).

    4. The analysis assumes either undiscounted costs or a 2-percent annual discount rate.

    5. The value of lost transactions is from Table VII-10 of this preamble.

    6. Small firms will bear “low” costs shown in the security cost and lost transaction totals, and large firms will bear the “high” costs shown in the security cost and lost transaction totals in Tables VII-15 and VII-16 of this preamble.

    7. One thousand five hundred (1,500) firms will incur compliance costs as a result of the proposed rule, and a broader group of 4,500 firms will incur costs due to lost transactions.
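    The following is a minimal sketch, in Python, of the projection mechanics under these assumptions for the 1,400 small firms. It is illustrative only: the preamble does not fully specify how the declining-rate schedule interacts with the 5-percent industry growth assumption or the rounding conventions, so the sketch approximates rather than exactly reproduces Tables VII-15 through VII-17.

```python
# Illustrative 10-year projection for the 1,400 small firms engaged in restricted
# transactions, using the Year 1 inputs from Table VII-14 and the stated assumptions.
# Assumptions specific to this sketch: recurring costs grow with the 5-percent
# industry growth rate, and the declining-rate schedule is applied to due diligence,
# auditing, and recordkeeping costs on top of that growth.

N_SMALL_FIRMS = 1_400
GROWTH = 0.05                                                       # industry growth rate
DECLINES = [0.15, 0.12, 0.09, 0.07, 0.05, 0.04, 0.03, 0.02, 0.01]   # Years 2 through 10

security = [N_SMALL_FIRMS * (60_000 + 23_620)]                      # Year 1: one-time + recurring
security += [N_SMALL_FIRMS * 23_620 * (1 + GROWTH) ** y for y in range(1, 10)]

other = [N_SMALL_FIRMS * 2_760]                                     # due diligence, audits, recordkeeping
for d in DECLINES:
    other.append(other[-1] * (1 + GROWTH) * (1 - d))                # growth offset by efficiency gains

def annualized(stream, rate=0.0):
    """Average annual cost over the 10-year horizon, optionally discounted."""
    total = sum(cost / (1 + rate) ** year for year, cost in enumerate(stream))
    return total / len(stream)

print(annualized(security) / 1e6)        # roughly $50 million per year, undiscounted
print(annualized(security, 0.02) / 1e6)  # roughly $46 million per year at a 2-percent rate
print(annualized(other) / 1e6)           # roughly $3 million per year, undiscounted
```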

    The 10-year annualized cost analysis (undiscounted and for a 2-percent discount rate) for security and other compliance costs (due diligence, auditing, and recordkeeping costs) is presented in Table VII-15 of this preamble for the 1,400 small firms and in Table VII-16 of this preamble for the 100 large firms. These estimates for security, due diligence, auditing, and recordkeeping costs for both small and large firms engaged in restricted transactions are combined with the industry-wide estimates for the economic value of lost transactions to obtain total costs for all firms, which are presented in Table VII-17 of this preamble.

    Table VII-15—10-Year Annualized Cost Analysis for Security, Due Diligence, Auditing (and Recordkeeping) for (the 1,400) Small Firms

    [Millions of dollars]

    Cost category Year 1 Year 2 Year 3 Year 4 Year 5 Year 6 Year 7 Year 8 Year 9 Year 10 Total Annualized
    Undiscounted
    Security $117 $35 $36 $38 $40 $42 $44 $47 $49 $51 $500 $50
    Due Diligence, Audits, and Recordkeeping 4 3 3 3 3 3 3 3 3 3 33 3.3
    Total 121 38 40 41 43 45 47 50 52 55 532 53
    Discount Rate: 2 Percent
    Security 117 34 35 36 37 38 39 41 42 43 463 46
    Due Diligence, Audits, and Recordkeeping 4 3 3 3 3 3 3 3 3 3 30 3.0
    Total 121 37 38 39 40 41 42 43 45 46 493 49
    Key Assumptions: Industry growth rate of 5 percent; due diligence, auditing, and recordkeeping costs decreasing at a decreasing rate of 15-12-9-7-5-4-3-2-1 percent over years 2-10.

    These year-to-year changes are the same in percentage terms for the analysis of large firms in Table VII-16 of this preamble.

    Table VII-16—10-Year Annualized Cost Analysis for Security and Due Diligence (and Recordkeeping) for (the 100) Large Firms

    [Millions of dollars]

    Cost category Year 1 Year 2 Year 3 Year 4 Year 5 Year 6 Year 7 Year 8 Year 9 Year 10 Total Annualized
    Undiscounted
    Security $35 $11 $11 $12 $12 $13 $14 $14 $15 $16 $152 $15
    Due Diligence, Audits, and Recordkeeping 27 25 23 22 22 22 22 22 23 24 232 23
    ( print page 86199)
    Total 62 35 34 34 34 35 35 37 38 40 383 38
    Discount Rate: 2 Percent
    Security 35 10 11 11 11 12 12 12 13 13 140 14
    Due Diligence, Audits, and Recordkeeping 27 24 22 21 20 19 19 19 19 20 212 21
    Total 62 35 33 32 31 31 31 32 32 33 352 35
    Key Assumptions: Industry growth rate of 5 percent; due diligence, auditing, and recordkeeping costs decreasing at a decreasing rate of 15-12-9-7-5-4-3-2-1 percent over years 2-10.

    The total annualized costs of the proposed rule for small and large firms are combined and presented in Table VII-17 of this preamble, estimated at $549 million undiscounted and $502 million discounted at 2 percent (any differences are due to rounding). Tables VII-15 and VII-16 of this preamble only include the costs of the security, due diligence, and recordkeeping requirements of the proposed rule, while Table VII-17 of this preamble also includes the costs associated with the value of lost transactions.

    Table VII-17—10-Year Annualized Cost Analysis for All Firms

    [Millions of dollars]

    Cost category Year 1 Year 2 Year 3 Year 4 Year 5 Year 6 Year 7 Year 8 Year 9 Year 10 Total Annualized
    Undiscounted
    Lost Transactions $364 $382 $401 $421 $442 $465 $488 $512 $538 $565 $4,578 $458
    Security 152 45 48 50 52 55 58 61 64 67 652 65
    Due Diligence, Audits, and Recordkeeping 31 28 26 25 25 25 25 25 26 27 264 26
    Total 547 456 475 497 520 544 571 598 628 659 5,494 549
    Discount Rate: 2 Percent
    Lost Transactions 364 375 385 398 410 422 435 448 461 475 4,173 417
    Security 152 44 46 47 49 50 52 53 55 56 604 60
    Due Diligence, Audits, and Recordkeeping 31 28 25 24 23 22 22 22 22 23 241 24
    Total 547 447 457 469 481 494 508 523 538 554 5,018 502
    Key Assumptions: Industry growth rate of 5 percent; due diligence, auditing, and recordkeeping costs decreasing at a decreasing rate of 15-12-9-7-5-4-3-2-1 percent over years 2-10.

    Table VII-18 of this preamble summarizes the 10-year annualized cost analysis (presented in Tables VII-15, VII-16, and VII-17 of this preamble) for small and large firms separately and in total, both undiscounted and with a discount rate of 2 percent.

    This cost estimate reflects the likelihood that a number of smaller firms will drop out of the market if the costs of compliance are greater than expected revenues ( i.e., if marginal costs exceed marginal revenues). Of course, this could also be true of larger firms that lack the infrastructure or financial resources to comply with the proposed rules and therefore choose to forgo certain transactions or business operations in that market altogether.

    In addition to the potential decrease in the number of firms in the industry, another related effect is that the proposed rule may create a barrier to entry for potential data brokers. That is, the same compliance burdens that affect marginal current brokers will also affect potential ones.

    Table VII-18—Summary of Total 10-Year Annualized Costs

    [Undiscounted and for a 2-Percent Discount Rate]

    Discount rate Total cost
    Undiscounted $549,000,000
    2 Percent 502,000,000

    These preliminary estimated costs of the proposed rule appear to be reasonable when balanced against the expected benefits of preventing the potential risk and harms to national security and foreign policy that are possible when government-related data or bulk U.S. sensitive personal data is transferred to foreign adversaries. These ( print page 86200) benefits are beyond monetary calculation but suggest that the proposed rule will have very large net benefits, including protections to well over 100 million American individuals who are potential targets of adversaries using government-related data or bulk U.S. sensitive personal data. A wide range of benefits of the regulation will also be realized by firms, including the savings associated with improved security potentially reducing the likelihood of data breaches, which are estimated to cost an average of $4.88 million per breach.[577] And firms that sell data to, or buy data from, brokers will have increased confidence in the security and due diligence arrangements associated with the regulation.

    Both the benefits to be realized and the costs to the economy and government will be determined, to some extent, by the effectiveness of compliance and enforcement activities and by the methods that market participants use to attempt to avoid detection of prohibited or restricted activities. For example, “back doors” are used to circumvent economic sanctions, and digital assets are used to hide sanctioned transactions themselves. The countries of concern are known to conduct commercial and military operations through proxies. As shown in parts IV.D.1.b and IV.D.1.f of this preamble, Cuba and Venezuela have acted as third parties to promote malicious acts by other countries of concern. Unless the due diligence requirements are fully complied with and the due diligence procedures and inquiries provide accurate information, the effectiveness of the proposed rule may be weakened, leading to reduced expected benefits.

    One commenter suggested that the Department conduct a retrospective review of the impact after the final rule becomes effective. The Order already requires such a review. Under section 5 of the Order, within 1 year after the final rule becomes effective, the Department must submit a report to the President that addresses, to the extent practicable, the effectiveness of the measures imposed under the Order in addressing threats to the national security of the United States described in the Order and the economic impact of the implementation of the Order, including on the international competitiveness of U.S. industry. The Order requires the Department to solicit public comment in evaluating the economic impact. The Department also intends to regularly monitor the effectiveness and impact of the regulations once they become effective.

    Table VII-19—OMB Circular A-4 Accounting Statement: Provisions Pertaining to Preventing Access to U.S. Sensitive Personal Data and Government-Related Data by Countries of Concern or Covered Persons NPRM

    Category Estimate Units Notes
    Primary Low High Dollar year Discount rate Time horizon
    Benefits
    Annualized monetized benefits The benefits of the proposal include the security of the American people, economic prosperity and opportunity, and democratic values, all of which are beyond a reasonable, reliable, and acceptable estimate of quantified monetary value. Details in NPRM.
    Annualized quantified, but non-monetized, benefits The Department did not identify any benefits that were quantified.
    Unquantified benefits Discussed in NPRM.
    Cost
    Annualized monetized costs $549,000,000 undiscounted Years 1-10 The primary costs of the proposed rule are the lost value of transactions due to the prohibitions and costs related to the restrictions that will require due diligence expenditures for enhanced security, KYC/KYV verifications, recordkeeping, reporting, and audits.
    $502,000,000 2% Years 1-10
    Annualized quantified, but non-monetized, costs
    Unquantified costs
    Transfers
    Annualized monetized Federal budgetary transfers
    From/To:
    Other annualized monetized transfers
    From/To:
    Effects
    Effects on State, local, or Tribal governments The proposed rule would not have Tribal implications warranting the application of Executive Order 13175. It would not have substantial direct effects on one or more Indian Tribes, on the relationship between the Federal Government and Indian Tribes, or on the distribution of power and responsibilities between the Federal Government and Indian Tribes.
    Effects on small businesses This analysis assumes that the small entities affected by the proposed rule will incur compliance costs of around $32,380 per firm annually, compared with an annual compliance cost of $400,460 for the largest affected firms. Both cost figures are undiscounted. The Department estimates that the proposed rule will impact just over 4,000 small entities, and that the highest-cost scenario will apply to approximately 100 firms.
    Effects on wages The Department did not estimate any impacts on wages.
    ( print page 86201)
    Effects on growth The Department did not estimate any impacts on growth.

    B. Regulatory Flexibility Act

    The Department is proposing this rule to address the growing threat posed by the efforts of foreign adversaries to access and exploit the government-related data or Americans' bulk U.S. sensitive personal data. On February 28, 2024, the President issued Executive Order 14117 on “Preventing Access to Americans' Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern.” This Order directs the Attorney General to, among other things, determine which classes of data transactions ought to be prohibited due to the unacceptable risk they pose by allowing countries of concern or covered persons to access government-related data or bulk U.S. sensitive personal data. The Order also directs the Attorney General to work with relevant agencies to identify countries of concern and classes of covered persons, establish a process to issue licenses authorizing transactions that would otherwise be prohibited or restricted transactions, address the need for requirements for recordkeeping and reporting transactions, and determine which classes of transactions will be required to comply with separate security requirements.

    The need for the proposed rule stems from the increased efforts that countries of concern are making to obtain sensitive personal data of Americans and to utilize it in a way that undermines national security and foreign policy. Advances in computing technology, artificial intelligence, and methods for processing large datasets allow countries of concern to more effectively leverage collected data for malicious purposes. The capability currently exists to allow those who obtain government-related data or Americans' bulk U.S. sensitive personal data to combine and manipulate it in ways that could identify sensitive personal data, including personal identifiers and precise geolocation information.

    1. Succinct Statement of the Objectives of, and Legal Basis for, the Proposed Rule

    Through the Order, the President used his authority under IEEPA and the National Emergencies Act to declare national emergencies and regulate certain types of economic transactions in order to protect the country against foreign threats. The Order expands upon the national emergency previously declared by Executive Order 13873 of May 15, 2019 (Securing the Information and Communications Technology and Services Supply Chain), which was modified by Executive Order 14034 of June 9, 2021 (Protecting Americans' Sensitive Data from Foreign Adversaries). Furthermore, the President, under title 3, section 301 of the U.S. Code, authorized the Attorney General, in consultation with the heads of relevant executive agencies, to employ the President's powers granted by IEEPA as may be necessary or appropriate to carry out the purposes of the Order.

    IEEPA empowers the President to “investigate, regulate, or prohibit” foreign exchanges in cases where there is a threat coming from outside the United States that threatens the country's “national security, foreign policy, or economy.” Existing IEEPA-based programs include those administered by OFAC, which enforces economic and trade sanctions, and the Department of Commerce's Bureau of Industry and Security, which is responsible for information and communications technology and services supply chain security.

    2. Description of and, Where Feasible, an Estimate of the Number of Small Entities to Which the Proposed Rule Will Apply

    The proposed rule would affect data-brokerage firms and other firms engaged in covered data transactions that pose a risk of exposing government-related data or bulk U.S. sensitive personal data to countries of concern or covered persons. The Department has estimated that about 4,500 firms, just over 90 percent of which are small businesses (hereafter referred to as “small entities”), would be impacted by the proposed rule. Therefore, the Department estimates that this proposed rule would impact approximately 4,050 small entities and approximately 450 firms that would not be classified as small entities.

    Small entities, as defined by the Regulatory Flexibility Act, include small businesses, small nonprofit organizations, and small governmental jurisdictions. The definition of “small entities” includes the definition of “small businesses” pursuant to section 3 of the Small Business Act of 1953, as amended: “A small business concern . . . shall be deemed to be one which is independently owned and operated, and which is not dominant in its field of operation.” The definition of “small business” varies from industry to industry (as specified by NAICS code and found in 13 CFR 121.201) to reflect the typical company size in each industry.

    NAICS code 518210, “Computing Infrastructure Providers, Data Processing, Web Hosting, and Related Services,” contains all the affected data brokers as well as some of the other entities engaged in one or more of the classes of restricted data transactions.[578] The number of small entities affected by the proposed rule was estimated by using the Small Business Administration (“SBA”) small business size standard for the NAICS code to calculate the proportion of firms that are considered small entities. Data brokers are only a subset of the total firms contained in the identified NAICS code; however, for this analysis, it was assumed that the proportion of small entities was the same for both the broader NAICS industry and the specific data broker industry. Because more than 90 percent of impacted firms across all relevant industries can be considered small entities, the proposed rule would have an impact on a substantial number of small entities.

    ( print page 86202)

    Table VII-20—Small Business Size Standard and Affected Firms

    Number of affected firms Share of affected firms that are small Number of affected small firms
    4,500 Approximately 90 percent Approximately 4,050.

    This analysis assumes that the small entities affected by the proposed rule will incur compliance costs of around $32,380 per firm per year, compared with an annual compliance cost of $400,460 for the largest affected firms.
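    The preamble does not spell out how these per-firm figures are built up, but one plausible reading, shown in the minimal sketch below, is that they combine the recurring security and other compliance costs from Table VII-14 with the one-time security costs spread evenly over the 10-year analysis horizon; the straight-line annualization is an assumption of this sketch.

```python
# One plausible reconstruction (an assumption, not stated in the preamble) of the
# $32,380 and $400,460 per-firm annual compliance cost figures, combining the
# Table VII-14 components with one-time security costs annualized over 10 years.

def annual_compliance_cost(one_time_security, recurring_security, other_compliance, horizon=10):
    return one_time_security / horizon + recurring_security + other_compliance

small_firm = annual_compliance_cost(60_000, 23_620, 2_760)      # 32,380.0
large_firm = annual_compliance_cost(245_000, 101_160, 274_800)  # 400,460.0
```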

    The Department is not aware of reliable revenue data by firm size for the data broker industry, but a reasonable assumption is that if a firm's revenues from data sales are not sufficient to cover the compliance costs, then that firm will have an incentive to exit that market. Furthermore, calculating the proportion of the costs associated with the proposed rule that falls on small firms is complicated by the fact that several of the proposed rule's provisions—specifically the requirements related to cybersecurity, due diligence, recordkeeping, and reporting—likely involve high fixed costs. Even if small entities have less complex business operations, leading to fewer complications related to compliance, they may still face a higher cost burden from the proposed rule than larger firms. Large entities will likely already have a greater portion of the fixed costs associated with the proposed rule covered by existing capabilities. Therefore, while the costs associated with the security and due diligence requirements will be smaller in absolute terms for smaller entities, such entities will likely need to pay a higher proportion of their overall budgets to comply. Due to the unknowns and the large number of small entities, it is possible that a substantial number of small firms will experience a significant impact. The Department welcomes comments on this topic.

    3. Description of the Projected Reporting, Recordkeeping, and Other Compliance Requirements of the Proposed Rule

    The proposed rule would require firms engaged in restricted transactions to adhere to certain standards for data security, due diligence, recordkeeping, and reporting. See § 202.401. To mitigate the risk of sharing government-related data or bulk U.S. sensitive personal data with countries of concern or covered persons through restricted transactions, organizations engaged in restricted transactions would be required to institute the organizational- and system-level cybersecurity policies, practices, and requirements, as well as the data-level requirements, developed by DHS through CISA in coordination with the Department. See § 202.402. Those requirements, which CISA will release through a separate Request for Information, overlap with several similar, widely used cybersecurity standards or frameworks. In addition, the security requirements developed by CISA would require firms to protect the data associated with restricted transactions using combinations of the following capabilities necessary to prevent access to covered data by covered persons or countries of concern:

    1. data minimization and data masking;

    2. encryption;

    3. privacy-enhancing technologies; and

    4. denial of access.

    Firms will also be required to undergo annual independent testing and auditing to ensure their continuing compliance with the security requirements.

    Additionally, in order to ensure that government-related data or Americans' bulk U.S. sensitive personal data are not accessible by countries of concern or covered persons, firms will be required to engage in due diligence before pursuing restricted transactions, which involves utilizing KYC/KYV programs to complete background checks on potential partners. Furthermore, firms will be required to keep records that contain extensive details of their restricted transactions as well as the details of the other parties involved. They will also be required to undergo annual audits of their records to ensure compliance and assess potential risks.

    4. Identification of All Relevant Federal Rules That May Duplicate, Overlap, or Conflict With the Proposed Rule

    As discussed in part IV.K of this preamble, while the PADFAA seeks to address some of the same national security risks as the proposed rule, there are clear differences between the PADFAA, the Order, and this proposed rule, including the scope of regulated data-brokerage activities, the types of bulk sensitive personal data that are covered, and the relevant countries of concern. Further, while the PADFAA allows the FTC to investigate certain data-brokerage activities involving countries of concern as unfair trade practices consistent with the FTC's existing jurisdiction, the proposed rule establishes a new set of consistent regulatory requirements that apply across multiple types of commercial transactions and sectors. Finally, as stated in part IV.K of this preamble, the Department will coordinate closely with the FTC to ensure consistency in how both authorities are implemented.

    Some restricted transactions under the proposed rule could also end up being subject to review and action by CFIUS. The Foreign Investment Risk Review Modernization Act of 2018 gave CFIUS the authority to review certain non-controlling foreign investments that may pose a risk to national security by allowing the sensitive personal data of U.S. citizens to be exploited.[579] However, while CFIUS acts on a transaction-by-transaction basis, the proposed rule would create restrictions and prohibitions on covered data transactions that would apply to categories of data transactions involving the six countries of concern. In a situation where a covered data transaction regulated by the proposed rule was later subject to a CFIUS review, it would be exempt from the proposed rule to the extent that CFIUS takes any of the actions identified in the proposed rule. See §§ 202.207; 202.508.

    Furthermore, the categories of covered data transactions covered by the proposed rule extend beyond the scope of CFIUS, including the provision of government-related data or bulk U.S. sensitive personal data through data brokerage, vendor agreements, and employment agreements. The proposed rule also covers investment agreements that may not be covered by CFIUS as well as cases where the relevant risks do not result from the covered transaction or may occur before a CFIUS action takes place.

    C. Executive Order 13132 (Federalism)

    The proposed rule would not have federalism implications warranting the application of Executive Order 13132. The proposed rule does not have substantial direct effects on the States, on the relationship between the national government and the States, or on the distribution of power and responsibilities among the various levels of government. ( print page 86203)

    D. Executive Order 13175 (Consultation and Coordination With Indian Tribal Governments)

    The proposed rule would not have Tribal implications warranting the application of Executive Order 13175. It does not have substantial direct effects on one or more Indian Tribes, on the relationship between the Federal Government and Indian Tribes, or on the distribution of power and responsibilities between the Federal Government and Indian Tribes.

    E. Executive Order 12988 (Civil Justice Reform)

    This proposed rule meets the applicable standards set forth in sections 3(a) and 3(b)(2) of Executive Order 12988.

    F. Paperwork Reduction Act

    The collections of information contained in this notice of proposed rulemaking have been submitted to the Office of Management and Budget for review in accordance with the Paperwork Reduction Act of 1995, 44 U.S.C. 3507(d), under control number 1124-AA01.

    Written comments on this collection can be submitted by visiting www.reginfo.gov/​public/​do/​PRAMain. Find this document by selecting “Currently Under Review—Open for Public Comments” or by using the search function. Comments on the collection of information should be received by November 29, 2024.

    The Department of Justice is soliciting comments from members of the public concerning this collection of information to:

    • Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility;
    • Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information;
    • Enhance the quality, utility, and clarity of the information to be collected; and
    • Minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated collection techniques or other forms of information technology.

    The proposed rule includes seven new collections of information: annual reports; applications for specific licenses; reports on rejected prohibited transactions; requests for advisory opinions; petitions for removal from the designated Covered Persons List; reports of known or suspected violations of the onward transfers prohibition; and recordkeeping requirements for restricted transactions.

    Based on wage rates from the Bureau of Labor Statistics and lower- and upper-bound estimates (used because this is a new program and there is uncertainty in the estimated number of potential respondents for each of the forms), the following are the estimated burdens of the proposed collections (an illustrative check of this arithmetic follows the list):

    • Annual reports. The Department estimates that 375 to 750 filers will send an average of one annual report per year, spending an estimated average of 40 hours to prepare and submit each annual report. The Department estimates the aggregated costs for all filers at $821,100 to $1,642,200 annually for annual reports.
    • Applications for specific licenses. The Department estimates that 15 to 25 filers will send an average of one application for a specific license per year, spending an estimated average of 10 hours to prepare and submit each application for a specific license. The Department estimates the aggregated costs for all filers at $8,211 to $13,685 annually for applications for specific licenses.
    • Reports on rejected prohibited transactions. The Department estimates that 15 to 25 filers will send an average of one report on a rejected prohibited transaction per year, spending an estimated average of 2 hours to prepare and submit each report on a rejected prohibited transaction. The Department estimates the aggregated costs for all filers at $1,642 to $2,737 annually for reports on rejected prohibited transactions.
    • Requests for advisory opinions. The Department estimates that 50 to 100 filers will send an average of one request for an advisory opinion per year, spending an estimated average of 2 hours to prepare and submit each request for an advisory opinion. The Department estimates the aggregated costs for all filers at $5,474 to $10,948 annually for requests for advisory opinions.
    • Petitions for removal from covered persons list. The Department estimates that 15 to 25 filers will send an average of one petition for removal from the Covered Persons List per year, spending an estimated average of 5 hours to prepare and submit each petition for removal from the Covered Persons List. The Department estimates the aggregated costs for all filers at $4,106 to $6,843 annually for petitions for removal from the Covered Persons List.
    • Reports of known or suspected violations of onward transfers prohibition. The Department estimates that 300 to 450 filers will send an average of one report of known or suspected violations of the onward transfers prohibition per year, spending an estimated average of 2 hours to prepare and submit each report of known or suspected violations of the onward transfers prohibition. The Department estimates the aggregated costs for all filers at $32,844 to $49,266 annually for reports of known or suspected violations of the onward transfers prohibition.
    • Recordkeeping requirements for restricted transactions. The Department estimates that 1,400 small to medium-sized firms will incur a total of $1,344,000 in recordkeeping costs per year. Also, the Department estimates that 100 large firms will incur a total of $84,844,000 in recordkeeping costs per year.
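    As a minimal sketch of the burden arithmetic above: the stated dollar figures are consistent with an hourly labor cost of roughly $54.74, although the preamble does not state that rate explicitly, so it is inferred here for illustration only.

```python
# Illustrative check of the reporting-burden estimates above. The hourly labor cost
# is inferred (not stated in the preamble): roughly $54.74 per hour reproduces the
# stated aggregate dollar figures.

IMPLIED_HOURLY_COST = 54.74

def aggregate_burden(filers, hours_per_response, responses_per_filer=1):
    return filers * responses_per_filer * hours_per_response * IMPLIED_HOURLY_COST

print(aggregate_burden(375, 40))  # 821,100   (annual reports, lower bound)
print(aggregate_burden(750, 40))  # 1,642,200 (annual reports, upper bound)
print(aggregate_burden(50, 2))    # 5,474     (requests for advisory opinions, lower bound)
```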

    Under the Paperwork Reduction Act, an agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a valid control number assigned by the Office of Management and Budget.

    G. Unfunded Mandates Reform Act

    The Unfunded Mandates Reform Act requires that Federal agencies prepare a written statement assessing the effects of any Federal mandate in a proposed or final agency rule that may directly result in the expenditure of $100 million or more in 1995 dollars (adjusted annually for inflation) in any 1 year by State, local, and Tribal governments, in the aggregate, or by the private sector (2 U.S.C. 1532(a)). However, the Unfunded Mandates Reform Act does not apply to “any provision” in a proposed or final rule that is “necessary for the national security” (2 U.S.C. 1503(5)).

    In the Order, the President explained that “[t]he continuing effort of certain countries of concern to access Americans' sensitive personal data and United States Government-related data constitutes an unusual and extraordinary threat, which has its source in whole or substantial part outside the United States, to the national security and foreign policy of the United States.” The Order expanded the scope of the national emergency declared in Executive Order 13873 of May 15, 2019 (Securing the Information and Communications Technology and Services Supply Chain), and further addressed with additional measures in Executive Order 14034 of June 9, 2021 (Protecting Americans' Sensitive Data From Foreign Adversaries). Section 2(a) of the Order thus requires the Attorney General to issue the regulations in this ( print page 86204) part, subject to public notice and comment, “[t]o assist in addressing the national security emergency described” in the Order. Because the entirety of this proposed rule and every provision in it addresses the national emergency described by the President in the Order, the Department has concluded that the Unfunded Mandates Reform Act does not apply to this proposed rule.

    List of Subjects in 28 CFR Part 202

    • Computer technology
    • Health records
    • Incorporation by reference
    • Investments
    • Military personnel
    • National security
    • Personally identifiable information
    • Privacy
    • Reporting and recordkeeping requirements
    • Security measures

    Under the rulemaking authority vested in the Attorney General in 5 U.S.C. 301; 28 U.S.C. 509, 510 and delegated to the Assistant Attorney General for National Security by A.G. Order No. 6067-2024, and for the reasons set forth in the preamble, the Department of Justice proposes to add part 202 to chapter I of title 28 of the Code of Federal Regulations to read as follows:

    PART 202—ACCESS TO U.S. SENSITIVE PERSONAL DATA AND GOVERNMENT-RELATED DATA BY COUNTRIES OF CONCERN OR COVERED PERSONS

Subpart A—General
202.101 Scope.
202.102 Rules of construction and interpretation.
202.103 Relation of this part to other laws and regulations.
202.104 Delegation of authorities.
202.105 Amendment, modification, or revocation.
202.106 Severability.
Subpart B—Definitions
202.201 Access.
202.202 Attorney General.
202.203 Assistant Attorney General.
202.204 Biometric identifiers.
202.205 Bulk.
202.206 Bulk U.S. sensitive personal data.
202.207 CFIUS action.
202.208 China.
202.209 Country of concern.
202.210 Covered data transaction.
202.211 Covered person.
202.212 Covered personal identifiers.
202.213 Cuba.
202.214 Data brokerage.
202.215 Directing.
202.216 Effective date.
202.217 Employment agreement.
202.218 Entity.
202.219 Exempt transaction.
202.220 Former senior official.
202.221 Foreign person.
202.222 Government-related data.
202.223 Human biospecimens.
202.224 Human genomic data.
202.225 IEEPA.
202.226 Information or informational materials.
202.227 Interest.
202.228 Investment agreement.
202.229 Iran.
202.230 Knowingly.
202.231 Licenses; general and specific.
202.232 Linked.
202.233 Linkable.
202.234 Listed identifier.
202.235 National Security Division.
202.236 North Korea.
202.237 Order.
202.238 Person.
202.239 Personal communications.
202.240 Personal financial data.
202.241 Personal health data.
202.242 Precise geolocation data.
202.243 Prohibited transaction.
202.244 Property; property interest.
202.245 Recent former employees or contractors.
202.246 Restricted transaction.
202.247 Russia.
202.248 Security requirements.
202.249 Sensitive personal data.
202.250 Special Administrative Region of Hong Kong.
202.251 Special Administrative Region of Macau.
202.252 Telecommunications service.
202.253 Transaction.
202.254 Transfer.
202.255 United States.
202.256 United States person or U.S. person.
202.257 U.S. device.
202.258 Vendor agreement.
202.259 Venezuela.
Subpart C—Prohibited Transactions and Related Activities
202.301 Prohibited data-brokerage transactions.
202.302 Other prohibited data-brokerage transactions involving potential onward transfer to countries of concern or covered persons.
202.303 Prohibited human genomic data and human biospecimen transactions.
202.304 Prohibited evasions, attempts, causing violations, and conspiracies.
202.305 Knowingly directing prohibited or restricted transactions.
Subpart D—Restricted Transactions
202.401 Authorization to conduct restricted transactions.
202.402 Incorporation by reference.
Subpart E—Exempt Transactions
202.501 Personal communications.
202.502 Information or informational materials.
202.503 Travel.
202.504 Official business of the United States Government.
202.505 Financial services.
202.506 Corporate group transactions.
202.507 Transactions required or authorized by Federal law or international agreements, or necessary for compliance with Federal law.
202.508 Investment agreements subject to a CFIUS action.
202.509 Telecommunications services.
202.510 Drug, biological product, and medical device authorizations.
202.511 Other clinical investigations and post-marketing surveillance data.
Subpart F—Determination of Countries of Concern
202.601 Determination of countries of concern.
Subpart G—Covered Persons
202.701 Designation of covered persons.
202.702 Procedures governing removal from the Covered Persons List.
Subpart H—Licensing
202.801 General licenses.
202.802 Specific licenses.
202.803 General provisions.
Subpart I—Advisory Opinions
202.901 Inquiries concerning application of this part.
Subpart J—Due Diligence and Audit Requirements
202.1001 Due diligence for restricted transactions.
202.1002 Audits for restricted transactions.
Subpart K—Reporting and Recordkeeping Requirements
202.1101 Records and recordkeeping requirements.
202.1102 Reports to be furnished on demand.
202.1103 Annual reports.
202.1104 Reports on rejected prohibited transactions.
Subpart L—Submitting Applications, Requests, Reports, and Responses
202.1201 Procedures.
Subpart M—Penalties and Finding of Violation
202.1301 Penalties for violations.
202.1302 Process for pre-penalty notice.
202.1303 Penalty imposition.
202.1304 Administrative collection and litigation.
202.1305 Finding of violation.
202.1306 Opportunity to respond to a pre-penalty notice or finding of violation.
Subpart N—Government-Related Location Data List
202.1401 Government-Related Location Data List.

Authority: 50 U.S.C. 1701 et seq.; 50 U.S.C. 1601 et seq.; E.O. 14117, 89 FR 15421.

    Subpart A—General

§ 202.101 Scope.

    (a) Executive Order 14117 of February 28, 2024 (Preventing Access to Americans' Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern) (“the Order”), directs the Attorney General to issue regulations that prohibit or otherwise restrict United States persons from engaging in any acquisition, holding, use, transfer, ( print page 86205) transportation, or exportation of, or dealing in, any property in which a foreign country or national thereof has any interest (“transaction”), where the transaction: involves United States Government-related data (“government-related data”) or bulk U.S. sensitive personal data, as defined by final rules implementing the Order; falls within a class of transactions that has been determined by the Attorney General to pose an unacceptable risk to the national security of the United States because the transactions may enable access by countries of concern or covered persons to government-related data or bulk U.S. sensitive personal data; and meets other criteria specified by the Order.

    (b) This part contains regulations implementing the Order and addressing the national emergency declared in Executive Order 13873 of May 15, 2019 (Securing the Information and Communications Technology and Services Supply Chain), and further addressed with additional measures in Executive Order 14034 of June 9, 2021 (Protecting Americans' Sensitive Data from Foreign Adversaries) and Executive Order 14117.

§ 202.102 Rules of construction and interpretation.

    (a) The examples included in this part are provided for informational purposes and should not be construed to alter the meaning of the text of the regulations in this part.

    (b) As used in this part, the term “including” means “including but not limited to.”

    (c) All references to “days” in this part mean calendar days. In computing any time period specified in this part:

    (1) Exclude the day of the event that triggers the period;

    (2) Count every day, including Saturdays, Sundays, and legal holidays; and

    (3) Include the last day of the period, but if the last day is a Saturday, Sunday, or Federal holiday, the period continues to run until the end of the next day that is not a Saturday, Sunday, or Federal holiday.
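    As an illustration (not part of the regulatory text), the day-counting rule in paragraph (c) can be sketched as a short computation. In the following Python sketch, the function name and the caller-supplied set of Federal holidays are assumptions made purely for the example.

```python
from datetime import date, timedelta

def period_end(trigger: date, days: int, federal_holidays: set) -> date:
    """Illustrative sketch of the day-counting rule in paragraph (c):
    exclude the day of the triggering event, count every calendar day,
    and if the last day is a Saturday, Sunday, or Federal holiday, run
    until the end of the next day that is not one of those."""
    end = trigger + timedelta(days=days)  # (c)(1)-(2): start counting the day after the trigger
    while end.weekday() >= 5 or end in federal_holidays:  # (c)(3): roll past weekends and holidays
        end += timedelta(days=1)
    return end

# Hypothetical example: a 45-day period triggered on June 30, 2025,
# with July 4 supplied as a Federal holiday.
print(period_end(date(2025, 6, 30), 45, {date(2025, 7, 4)}))
```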

§ 202.103 Relation of this part to other laws and regulations.

    Nothing in this part shall be construed as altering or affecting any other authority, process, regulation, investigation, enforcement measure, or review provided by or established under any other provision of Federal law, including the International Emergency Economic Powers Act.

§ 202.104 Delegation of authorities.

    Any action that the Attorney General is authorized to take pursuant to the Order or pursuant to this part may be taken by the Assistant Attorney General for National Security or by any other person to whom the Attorney General or Assistant Attorney General for National Security in writing delegates authority so to act.

§ 202.105 Amendment, modification, or revocation.

    Except as otherwise provided by law, any determinations, prohibitions, decisions, licenses (whether general or specific), guidance, authorizations, instructions, orders, or forms issued pursuant to this part may be amended, modified, or revoked, in whole or in part, at any time.

§ 202.106 Severability.

    If any provision of this part is held to be invalid or unenforceable by its terms, or as applied to any person or circumstance, or stayed pending further agency action or judicial review, the provision is to be construed so as to continue to give the maximum effect to the provision permitted by law, unless such holding will be one of utter invalidity or unenforceability, in which event the provision will be severable from this part and will not affect the remainder thereof.

    Subpart B—Definitions

§ 202.201 Access.

    The term access means logical or physical access, including the ability to obtain, read, copy, decrypt, edit, divert, release, affect, alter the state of, or otherwise view or receive, in any form, including through information systems, information technology systems, cloud-computing platforms, networks, security systems, equipment, or software.

§ 202.202 Attorney General.

    The term Attorney General means the Attorney General of the United States or the Attorney General's designee.

§ 202.203 Assistant Attorney General.

    The term Assistant Attorney General means the Assistant Attorney General, National Security Division, United States Department of Justice, or the Assistant Attorney General's designee.

§ 202.204 Biometric identifiers.

    The term biometric identifiers means measurable physical characteristics or behaviors used to recognize or verify the identity of an individual, including facial images, voice prints and patterns, retina and iris scans, palm prints and fingerprints, gait, and keyboard usage patterns that are enrolled in a biometric system and the templates created by the system.

§ 202.205 Bulk.

    The term bulk means any amount of sensitive personal data that meets or exceeds the following thresholds at any point in the preceding 12 months, whether through a single covered data transaction or aggregated across covered data transactions involving the same U.S. person and the same foreign person or covered person:

    (a) Human genomic data collected about or maintained on more than 100 U.S. persons;

    (b) Biometric identifiers collected about or maintained on more than 1,000 U.S. persons;

    (c) Precise geolocation data collected about or maintained on more than 1,000 U.S. devices;

    (d) Personal health data collected about or maintained on more than 10,000 U.S. persons;

    (e) Personal financial data collected about or maintained on more than 10,000 U.S. persons;

    (f) Covered personal identifiers collected about or maintained on more than 100,000 U.S. persons; or

(g) Combined data, meaning any collection or set of data that contains more than one of the categories in paragraphs (a) through (f) of this section, or that contains any listed identifier linked to categories in paragraphs (a) through (e) of this section, where any individual data type meets the threshold number of persons or devices collected or maintained in the aggregate for the lowest number of U.S. persons or U.S. devices in that category of data.
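    For illustration only, the thresholds in paragraphs (a) through (g) can be checked mechanically. The following Python sketch is not regulatory text; the category keys, the function name, and the reading of paragraph (g) as applying the lowest threshold among the categories present in a combined dataset are assumptions made for the example.

```python
# Illustrative sketch only; identifiers and the paragraph (g) reading are assumptions.
BULK_THRESHOLDS = {
    "human_genomic_data": 100,                # paragraph (a), U.S. persons
    "biometric_identifiers": 1_000,           # paragraph (b), U.S. persons
    "precise_geolocation_data": 1_000,        # paragraph (c), U.S. devices
    "personal_health_data": 10_000,           # paragraph (d), U.S. persons
    "personal_financial_data": 10_000,        # paragraph (e), U.S. persons
    "covered_personal_identifiers": 100_000,  # paragraph (f), U.S. persons
}

def meets_bulk_threshold(counts: dict) -> bool:
    """counts: aggregate number of U.S. persons or U.S. devices per category,
    across covered data transactions involving the same U.S. person and the
    same foreign person or covered person in the preceding 12 months."""
    # Paragraphs (a)-(f): any single category meeting its own threshold is bulk.
    if any(counts.get(cat, 0) >= limit for cat, limit in BULK_THRESHOLDS.items()):
        return True
    # Paragraph (g), read here as: for combined data, each category present is
    # measured against the lowest threshold among the categories present.
    present = [cat for cat in BULK_THRESHOLDS if counts.get(cat, 0) > 0]
    if len(present) > 1:
        lowest = min(BULK_THRESHOLDS[cat] for cat in present)
        return any(counts[cat] >= lowest for cat in present)
    return False

# Under this reading, 150 persons' health data combined with 20 persons'
# genomic data would meet the bulk threshold (150 >= 100).
print(meets_bulk_threshold({"personal_health_data": 150, "human_genomic_data": 20}))
```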

§ 202.206 Bulk U.S. sensitive personal data.

    The term bulk U.S. sensitive personal data means a collection or set of bulk data relating to U.S. persons, in any format, regardless of whether the data is anonymized, pseudonymized, de-identified, or encrypted.

§ 202.207 CFIUS action.

    The term CFIUS action means any agreement or condition the Committee on Foreign Investment in the United States has entered into or imposed pursuant to 50 U.S.C. 4565(l)(1), (3), or (5) to resolve a national security risk involving access by a country of concern or covered person to sensitive personal data that the Committee on Foreign Investment in the United States has explicitly designated, in the agreement or document containing the condition, as a CFIUS action, including: ( print page 86206)

    (a) Suspension of a proposed or pending transaction, as authorized under 50 U.S.C. 4565(l)(1);

    (b) Entry into or imposition of any agreement or condition with any party to a covered transaction, as authorized under 50 U.S.C. 4565(l)(3); and

    (c) The establishment of interim protections for covered transactions withdrawn before CFIUS's review or investigation is completed, as authorized under 50 U.S.C. 4565(l)(5).

§ 202.208 China.

    The term China means the People's Republic of China, including the Special Administrative Region of Hong Kong and the Special Administrative Region of Macau, as well as any political subdivision, agency, or instrumentality thereof.

§ 202.209 Country of concern.

    The term country of concern means any foreign government that, as determined by the Attorney General with the concurrence of the Secretary of State and the Secretary of Commerce, (1) has engaged in a long-term pattern or serious instances of conduct significantly adverse to the national security of the United States or security and safety of United States persons, and (2) poses a significant risk of exploiting government-related data or bulk U.S. sensitive personal data to the detriment of the national security of the United States or security and safety of U.S. persons.

§ 202.210 Covered data transaction.

    (a) Definition. A covered data transaction is any transaction that involves any access to any government-related data or bulk U.S. sensitive personal data and that involves:

    (1) Data brokerage;

    (2) A vendor agreement;

    (3) An employment agreement; or

    (4) An investment agreement.

    (b) Examples. (1) Example 1. A U.S. institution conducts medical research at its own laboratory in a country of concern, including sending several U.S.-citizen employees to that laboratory to perform and assist with the research. The U.S. institution does not engage in data brokerage or a vendor, employment, or investment agreement that gives a covered person or country of concern access to government-related data or bulk U.S. sensitive personal data. Because the U.S. institution does not engage in any data brokerage or enter into a vendor, employment, or investment agreement, the U.S. institution's research activity is not a covered data transaction.

    (2) [Reserved]

§ 202.211 Covered person.

    (a) Definition. The term covered person means:

    (1) A foreign person that is an entity that is 50 percent or more owned, directly or indirectly, by a country of concern, or that is organized or chartered under the laws of, or has its principal place of business in, a country of concern;

    (2) A foreign person that is an entity that is 50 percent or more owned, directly or indirectly, by an entity described in paragraph (a)(1) of this section or a person described in paragraphs (a)(3), (4), or (5) of this section;

    (3) A foreign person that is an individual who is an employee or contractor of a country of concern or of an entity described in paragraphs (a)(1), (2), or (5) of this section;

    (4) A foreign person that is an individual who is primarily a resident in the territorial jurisdiction of a country of concern; or

    (5) Any person, wherever located, determined by the Attorney General:

    (i) To be, to have been, or to be likely to become owned or controlled by or subject to the jurisdiction or direction of a country of concern or covered person;

    (ii) To act, to have acted or purported to act, or to be likely to act for or on behalf of a country of concern or covered person; or

    (iii) To have knowingly caused or directed, or to be likely to knowingly cause or direct a violation of this part.

    (b) Examples —(1) Example 1. Foreign persons primarily resident in Cuba, Iran, or another country of concern would be covered persons.

    (2) Example 2. Chinese or Russian citizens located in the United States would be treated as U.S. persons and would not be covered persons (except to the extent individually designated). They would be subject to the same prohibitions and restrictions as all other U.S. persons with respect to engaging in covered data transactions with countries of concern or covered persons.

    (3) Example 3. Citizens of a country of concern who are primarily resident in a third country, such as Russian citizens primarily resident in a European Union country or Cuban citizens primarily resident in a South American country that is not a country of concern, would not be covered persons except to the extent they are individually designated or to the extent that they are employees or contractors of a country of concern government or a covered person that is an entity.

    (4) Example 4. A foreign person is located abroad and is employed by a company headquartered in China. Because the company is a covered person that is an entity and the employee is located outside the United States, the employee is a covered person.

    (5) Example 5. A foreign person is located abroad and is employed by a company that has been designated as a covered person. Because the foreign person is the employee of a covered person that is an entity and the employee is a foreign person, the person is a covered person.

§ 202.212 Covered personal identifiers.

    (a) Definition. The term covered personal identifiers means any listed identifier:

    (1) In combination with any other listed identifier; or

    (2) In combination with other data that is disclosed by a transacting party pursuant to the transaction such that the listed identifier is linked or linkable to other listed identifiers or to other sensitive personal data.

    (b) Exclusion. The term covered personal identifiers excludes:

    (1) Demographic or contact data that is linked only to other demographic or contact data (such as first and last name, birthplace, ZIP code, residential street or postal address, phone number, and email address and similar public account identifiers); and

(2) A network-based identifier, account-authentication data, or call-detail data that is linked only to other network-based identifiers, account-authentication data, or call-detail data as necessary for the provision of telecommunications, networking, or similar services.

    (c) Examples of listed identifiers in combination with other listed identifiers— (1) Example 1. A standalone listed identifier in isolation ( i.e., that is not linked to another listed identifier, sensitive personal data, or other data that is disclosed by a transacting party pursuant to the transaction such that the listed identifier is linked or linkable to other listed identifiers or to other sensitive personal data)—such as a Social Security Number or account username—would not constitute a covered personal identifier.

    (2) Example 2. A listed identifier linked to another listed identifier—such as a first and last name linked to a Social Security number, a driver's license number linked to a passport number, a device Media Access Control (“MAC”) address linked to a residential address, an account username linked to a first and last name, or a mobile advertising ID linked to an email address—would constitute covered personal identifiers. ( print page 86207)

    (3) Example 3. Demographic or contact data linked only to other demographic or contact data—such as a first and last name linked to a residential street address, an email address linked to a first and last name, or a customer loyalty membership record linking a first and last name to a phone number—would not constitute covered personal identifiers.

    (4) Example 4. Demographic or contact data linked to other demographic or contact data and to another listed identifier—such as a first and last name linked to an email address and to an IP address—would constitute covered personal identifiers.

(5) Example 5. Account usernames linked to passwords as part of a sale of a dataset are not linked as a necessary part of the provision of telecommunications, networking, or similar services, so the exclusion in paragraph (b)(2) of this section does not apply. This combination would constitute covered personal identifiers.

(d) Examples of a listed identifier in combination with other data disclosed by a transacting party— (1) Example 1. A foreign person who is a covered person asks a U.S. company for a list of Media Access Control (“MAC”) addresses from devices that have connected to the wireless network of a U.S. fast-food restaurant located in a particular government building. The U.S. company then sells the list of MAC addresses, without any other listed identifiers or sensitive personal data, to the covered person. The disclosed MAC addresses, when paired with the other data disclosed by the covered person—that the devices “have connected to the wireless network of a U.S. fast-food restaurant located in a particular government building”—are linked or linkable to other sensitive personal data, in this case the precise geolocation data of the fast-food restaurant that individuals working in that government building frequent with their devices. This combination of data therefore meets the definition of covered personal identifiers.

    (2) Example 2. A U.S. company sells to a country of concern a list of residential addresses that the company describes (whether in a heading on the list or separately to the country of concern as part of the transaction) as “addresses of members of a country of concern's opposition political party in New York City” or as “addresses of active-duty military officers who live in Howard County, Maryland” without any other listed identifiers or sensitive personal data. The data disclosed by the U.S. company's description, when paired with the disclosed addresses, makes the addresses linked or linkable to other listed identifiers or to other sensitive personal data of the U.S. individuals associated with them. This combination of data therefore meets the definition of covered personal identifiers.

    (3) Example 3. A covered person asks a U.S. company for a bulk list of birth dates for “any American who visited a Starbucks in Washington, DC, in December 2023.” The U.S. company then sells the list of birth dates, without any other listed identifiers or sensitive personal data, to the covered person. The other data disclosed by the covered person—“any American who visited a Starbucks in Washington, DC, in December 2023”—does not make the birth dates linked or linkable to other listed identifiers or to other sensitive personal data. This combination of data therefore does not meet the definition of covered personal identifiers.

    (4) Example 4. Same as Example 3, but the covered person asks the U.S. company for a bulk list of names (rather than birth dates) for “any American who visited a Starbucks in Washington, DC, in December 2023.” The other data disclosed by the covered person—“any American who visited a Starbucks in Washington, DC, in December 2023”—does not make the list of names, without more, linked or linkable to other listed identifiers or to other sensitive personal data. This combination of data therefore does not meet the definition of covered personal identifiers.

    (5) Example 5. A U.S. company sells to a covered person a list of residential addresses that the company describes (in a heading in the list or to the covered person as part of the transaction) as “households of Americans who watched more than 50% of episodes” of a specific popular TV show, without any other listed identifiers or sensitive personal data. The other data disclosed by the U.S. company—“Americans who watched more than 50% of episodes” of a specific popular TV show—does not increase the extent to which the addresses are linked or linkable to other listed identifiers or to other sensitive personal data. This combination of data therefore does not meet the definition of covered personal identifiers.
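    For illustration only, the combination logic of paragraphs (a)(1) and (b) of this section, as applied in the examples in paragraph (c), can be sketched in code. The following Python sketch is not regulatory text: the field names, the category groupings, and the boolean flag are assumptions made for the example, and the sketch does not address the paragraph (a)(2) case of other data disclosed by a transacting party.

```python
# Illustrative groupings of listed identifiers; names are assumptions, not the rule's terms.
DEMOGRAPHIC_OR_CONTACT = {"name", "birth_date", "zip_code", "address", "phone", "email"}
NETWORK_AUTH_OR_CALL = {"ip_address", "cookie", "username", "password", "cpni"}
OTHER_LISTED = {"ssn", "drivers_license", "passport", "financial_account",
                "imei", "mac_address", "advertising_id"}
ALL_LISTED = DEMOGRAPHIC_OR_CONTACT | NETWORK_AUTH_OR_CALL | OTHER_LISTED

def is_covered_combination(fields: set, for_telecom_service: bool = False) -> bool:
    """Rough sketch of paragraph (a)(1) with the exclusions in paragraph (b)."""
    listed = fields & ALL_LISTED
    if len(listed) < 2:
        return False  # a standalone listed identifier is not covered (Example (c)(1))
    if listed <= DEMOGRAPHIC_OR_CONTACT:
        return False  # exclusion (b)(1): demographic/contact data linked only to itself
    if listed <= NETWORK_AUTH_OR_CALL and for_telecom_service:
        return False  # exclusion (b)(2): network/auth/call-detail data for service provision
    return True

print(is_covered_combination({"name", "ssn"}))           # True  (Example (c)(2))
print(is_covered_combination({"name", "email"}))         # False (Example (c)(3))
print(is_covered_combination({"username", "password"}))  # True  (Example (c)(5))
```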

§ 202.213 Cuba.

    The term Cuba means the Republic of Cuba, as well as any political subdivision, agency, or instrumentality thereof.

§ 202.214 Data brokerage.

    (a) Definition. The term data brokerage means the sale of data, licensing of access to data, or similar commercial transactions involving the transfer of data from any person (the provider) to any other person (the recipient), where the recipient did not collect or process the data directly from the individuals linked or linkable to the collected or processed data.

    (b) Examples— (1) Example 1. A U.S. company sells bulk U.S. sensitive personal data to an entity headquartered in a country of concern. The U.S. company engages in prohibited data brokerage.

    (2) Example 2. A U.S. company enters into an agreement that gives a covered person a license to access government-related data held by the U.S. company. The U.S. company engages in prohibited data brokerage.

    (3) Example 3. A U.S. organization maintains a database of bulk U.S. sensitive personal data and offers annual memberships for a fee that provide members a license to access that data. Providing an annual membership to a covered person that includes a license to access government-related data or bulk U.S. sensitive personal data would constitute prohibited data brokerage.

    (4) Example 4. A U.S. company owns and operates a mobile app for U.S. users with available advertising space. As part of selling the advertising space, the U.S. company provides the bulk precise geolocation data, IP address, and advertising IDs of its U.S. users' devices to an advertising exchange based in a country of concern. The U.S. company's provision of this data as part of the sale of advertising space is data brokerage and a prohibited transaction.

    (5) Example 5. Same as Example 4, but the U.S. company provides the data to an advertising exchange based in the United States. As part of the sale of the advertising space, the U.S. advertising exchange provides the data to advertisers headquartered in a country of concern. The U.S. company's provision of the data to the U.S. advertising exchange would not be a transaction because it is between U.S. persons. The advertising exchange's provision of this data to the country of concern-based advertisers is data brokerage because it is a commercial transaction involving the transfer of data from the U.S. advertising exchange to the advertisers headquartered in the country of concern, where those country-of-concern advertisers did not collect or process the data directly from the individuals linked or linkable to the collected or processed data. ( print page 86208) Furthermore, the U.S. advertising exchange's provision of this data to the country of concern-based advertisers is a prohibited transaction.

    (6) Example 6. A U.S. information technology company operates an autonomous driving platform that collects the precise geolocation data of its cars operating in the United States. The U.S. company sells or otherwise licenses this bulk data to its parent company headquartered in a country of concern to help develop artificial intelligence technology and machine learning capabilities. The sale or license is data brokerage and a prohibited transaction.

§ 202.215 Directing.

    The term directing means having any authority (individually or as part of a group) to make decisions for or on behalf of an entity and exercising that authority.

§ 202.216 Effective date.

    The term effective date refers to the effective date of the applicable prohibitions and directives contained in this part, which is 12:01 a.m. ET on [date to be determined].

§ 202.217 Employment agreement.

    (a) Definition. The term employment agreement means any agreement or arrangement in which an individual, other than as an independent contractor, performs work or performs job functions directly for a person in exchange for payment or other consideration, including employment on a board or committee, executive-level arrangements or services, and employment services at an operational level.

    (b) Examples— (1) Example 1. A U.S. company that conducts consumer human genomic testing collects and maintains bulk human genomic data from U.S. consumers. The U.S. company has global IT operations, including employing a team of individuals who are citizens of and primarily resident in a country of concern to provide back-end services. The agreements related to employing these individuals are employment agreements. Employment as part of the global IT operations team includes access to the U.S. company's systems containing the bulk human genomic data. These employment agreements would be prohibited transactions (because they involve access to bulk human genomic data).

    (2) Example 2. A U.S. company develops its own mobile games and social media apps that collect the bulk U.S. sensitive personal data of its U.S. users. The U.S. company distributes these games and apps in the United States through U.S.-based digital distribution platforms for software applications. The U.S. company intends to hire as CEO an individual designated by the Attorney General as a covered person because of evidence the CEO acts on behalf of a country of concern. The agreement retaining the individual as CEO would be an employment agreement. The individual's authorities and responsibilities as CEO involve access to all data collected by the apps, including the bulk U.S. sensitive personal data. The CEO's employment would be a restricted transaction.

    (3) Example 3. A U.S. company has derived U.S. persons' biometric identifiers by scraping public photos from social media platforms. The U.S. company stores the derived biometric identifiers in bulk, including face-data scans, for the purpose of training or enhancing facial-recognition software. The U.S. company intends to hire a foreign person, who primarily resides in a country of concern, as a project manager responsible for the database. The agreement retaining the project manager would be an employment agreement. The individual's employment as the lead project manager would involve access to the bulk biometric identifiers. The project manager's employment would be a restricted transaction.

(4) Example 4. A U.S. financial-services company seeks to hire a data scientist, who is a citizen of a country of concern and primarily resides in that country of concern, to develop a new artificial intelligence-based personal assistant that could be sold as a standalone product to the company's customers. The arrangement retaining the data scientist would be an employment agreement. As part of that individual's employment, the data scientist would have administrator rights that allow that individual to access, download, and transmit bulk quantities of personal financial data not ordinarily incident to and part of the company's underlying provision of financial services to its customers. The data scientist's employment would be a restricted transaction.

    (5) Example 5. A U.S. company sells goods and collects bulk personal financial data about its U.S. customers. The U.S. company appoints a citizen of a country of concern, who is located in a country of concern, to its board of directors. This director would be a covered person, and the arrangement appointing the director would be an employment agreement. In connection with the board's data security and cybersecurity responsibilities, the director could access the bulk personal financial data. The director's employment would be a restricted transaction.

§ 202.218 Entity.

    The term entity means a partnership, association, trust, joint venture, corporation, group, subgroup, or other organization.

§ 202.219 Exempt transaction.

    The term exempt transaction means a data transaction that is subject to one or more exemptions described in subpart E of this part.

§ 202.220 Former senior official.

    The term former senior official means either a “former senior employee” or a “former very senior employee,” as those terms are defined in 5 CFR 2641.104.

§ 202.221 Foreign person.

    The term foreign person means any person that is not a U.S. person.

§ 202.222 Government-related data.

    (a) Definition. The term government-related data means the following:

    (1) Any precise geolocation data, regardless of volume, for any location within any area enumerated on the Government-Related Location Data List in § 202.1401 which the Attorney General has determined poses a heightened risk of being exploited by a country of concern to reveal insights about locations controlled by the Federal Government, including insights about facilities, activities, or populations in those locations, to the detriment of national security, because of the nature of those locations or the personnel who work there. Such locations may include:

    (i) The worksite or duty station of Federal Government employees or contractors who occupy a national security position as that term is defined in 5 CFR 1400.102(a)(4);

    (ii) A military installation as that term is defined in 10 U.S.C. 2801(c)(4); or

    (iii) Facilities or locations that otherwise support the Federal Government's national security, defense, intelligence, law enforcement, or foreign policy missions.

    (2) Any sensitive personal data, regardless of volume, that a transacting party markets as linked or linkable to current or recent former employees or contractors, or former senior officials, of the United States Government, including the military and Intelligence Community.

    (b) Examples of government-related data marketed by a transacting party— (1) Example 1. A U.S. company advertises the sale of a set of sensitive ( print page 86209) personal data as belonging to “active duty” personnel, “military personnel who like to read,” “DoD” personnel, “government employees,” or “communities that are heavily connected to a nearby military base.” The data is government-related data.

    (2) Example 2. In discussing the sale of a set of sensitive personal data with a covered person, a U.S. company describes the dataset as belonging to members of a specific named organization. The identified organization restricts membership to current and former members of the military and their families. The data is government-related data.
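    As a purely illustrative aid to paragraph (a)(1) of this section, the following Python sketch checks whether a geolocation point falls within a listed area. It is not regulatory text: representing a Government-Related Location Data List entry as a center coordinate and a radius is an assumption of this example, the sample coordinates are arbitrary, and the actual areas are defined only by § 202.1401 itself.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical representation of a listed area as (latitude, longitude, radius in meters).
LISTED_AREAS = [
    (38.8719, -77.0563, 500.0),  # illustrative values only
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def falls_in_listed_area(lat, lon):
    """True if the point lies inside any (hypothetical) listed area."""
    return any(haversine_m(lat, lon, c_lat, c_lon) <= radius
               for c_lat, c_lon, radius in LISTED_AREAS)

print(falls_in_listed_area(38.8720, -77.0560))
```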

§ 202.223 Human biospecimens.

    The term human biospecimens means a quantity of tissue, blood, urine, or other human-derived material, including such material classified under any of the following 10-digit Harmonized System-based Schedule B numbers:

    (a) 0501.00.0000 Human hair, unworked, whether or not washed or scoured; waste of human hair

    (b) 3001.20.0000 Extracts of glands or other organs or of their secretions

    (c) 3001.90.0115 Glands and other organs, dried, whether or not powdered

    (d) 3002.12.0010 Human blood plasma

    (e) 3002.12.0020 Normal human blood sera, whether or not freeze-dried

    (f) 3002.12.0030 Human immune blood sera

    (g) 3002.12.0090 Antisera and other blood fractions, Other

    (h) 3002.51.0000 Cell therapy products

    (i) 3002.59.0000 Cell cultures, whether or not modified, Other

    (j) 3002.90.5210 Whole human blood

    (k) 3002.90.5250 Blood, human/animal, other

    (l) 9705.21.0000 Human specimens and parts thereof

§ 202.224 Human genomic data.

    The term human genomic data means data representing the nucleic acid sequences that constitute the entire set or a subset of the genetic instructions found in a human cell, including the result or results of an individual's “genetic test” (as defined in 42 U.S.C. 300gg-91(d)(17)) and any related human genetic sequencing data.

§ 202.225 IEEPA.

    The term IEEPA means the International Emergency Economic Powers Act (50 U.S.C. 1701 et seq.).

§ 202.226 Information or informational materials.

    (a) Definition. The term information or informational materials is limited to expressive material and includes publications, films, posters, phonograph records, photographs, microfilms, microfiche, tapes, compact disks, CD ROMs, artworks, and news wire feeds. It does not include data that is technical, functional, or otherwise non-expressive.

    (b) Exclusions. The term information or informational materials does not include:

    (1) Information or informational materials not fully created and in existence at the date of the data transaction, or the substantive or artistic alteration or enhancement of information or informational materials, or the provision of marketing and business consulting services, including to market, produce or co-produce, or assist in the creation of information or informational materials;

    (2) Items that were, as of April 30, 1994, or that thereafter become, controlled for export to the extent that such controls promote the nonproliferation or antiterrorism policies of the United States, or with respect to which acts are prohibited by 18 U.S.C. chapter 37.

    (c) Examples —(1) Example 1. A U.S. person enters into an agreement to create a customized dataset of bulk U.S. sensitive personal data that meets a covered person's specifications (such as the specific types and fields of data, date ranges, and other criteria) and to sell that dataset to the covered person. This customized dataset is not fully created and in existence at the date of the agreement, and therefore is not information or informational materials.

    (2) Example 2. A U.S. company has access to several pre-existing databases of different bulk sensitive personal data. The U.S. company offers, for a fee, to use data analytics to link the data across these databases to the same individuals and to sell that combined dataset to a covered person. This service constitutes a substantive alteration or enhancement of the data in the pre-existing databases and therefore is not information or informational materials.

§ 202.227 Interest.

    Except as otherwise provided in this part, the term interest, when used with respect to property ( e.g., “an interest in property”), means an interest of any nature whatsoever, direct or indirect.

§ 202.228 Investment agreement.

    (a) Definition. The term investment agreement means an agreement or arrangement in which any person, in exchange for payment or other consideration, obtains direct or indirect ownership interests in or rights in relation to:

    (1) Real estate located in the United States; or

    (2) A U.S. legal entity.

    (b) Exclusion for passive investments. The term investment agreement excludes any investment that:

    (1) Is made:

    (i) Into a publicly traded security, with “security” defined in section 3(a)(10) of the Securities Exchange Act of 1934 (15 U.S.C. 78c(a)(10)), denominated in any currency that trades on a securities exchange or through the method of trading that is commonly referred to as “over-the-counter,” in any jurisdiction;

    (ii) Into a security offered by:

(A) Any “investment company” (as defined in section 3(a)(1) of the Investment Company Act of 1940 (15 U.S.C. 80a-3(a)(1))) that is registered with the United States Securities and Exchange Commission, such as index funds, mutual funds, or exchange traded funds; or

    (B) Any company that has elected to be regulated or is regulated as a business development company pursuant to section 54(a) of the Investment Company Act of 1940 (15 U.S.C. 80a-53), or any derivative of either of the foregoing; or

    (iii) As a limited partner into a venture capital fund, private equity fund, fund of funds, or other pooled investment fund, if the limited partner's contribution is solely capital and the limited partner cannot make managerial decisions, is not responsible for any debts beyond its investment, and does not have the formal or informal ability to influence or participate in the fund's or a U.S. person's decision making or operations;

    (2) Gives the covered person less than 10% in total voting and equity interest in a U.S. person; and

(3) Does not give a covered person rights beyond those reasonably considered to be standard minority shareholder protections, including (i) membership or observer rights on, or the right to nominate an individual to a position on, the board of directors or an equivalent governing body of the U.S. person, or (ii) any other involvement, beyond the voting of shares, in substantive business decisions, management, or strategy of the U.S. person.

    (c) Examples— (1) Example 1. A U.S. company intends to build a data center located in a U.S. territory. The data center will store bulk personal health ( print page 86210) data on U.S. persons. A foreign private equity fund located in a country of concern agrees to provide capital for the construction of the data center in exchange for acquiring a majority ownership stake in the data center. The agreement that gives the private equity fund a stake in the data center is an investment agreement. The investment agreement is a restricted transaction.

    (2) Example 2. A foreign technology company that is subject to the jurisdiction of a country of concern and that the Attorney General has designated as a covered person enters into a shareholders' agreement with a U.S. business that develops mobile games and social media apps, acquiring a minority equity stake in the U.S. business. The shareholders' agreement is an investment agreement. These games and apps developed by the U.S. business systematically collect bulk U.S. sensitive personal data of its U.S. users. The investment agreement explicitly gives the foreign technology company the ability to access this data and is therefore a restricted transaction.

(3) Example 3. Same as Example 2, but the investment agreement either does not explicitly give the foreign technology company the right to access the data or explicitly forbids that access. The investment agreement nonetheless provides the foreign technology company with sufficient ownership interest, rights, or other involvement in substantive business decisions, management, or strategy that the investment does not constitute a passive investment. Because it is not a passive investment, that ownership interest, rights, or other involvement gives the foreign technology company the ability to obtain logical or physical access to the data, regardless of how the agreement formally allocates access rights. The investment agreement therefore involves access to bulk U.S. sensitive personal data and is a restricted transaction.

    (4) Example 4. Same as Example 3, but the U.S. business does not maintain or have access to any government-related data or bulk U.S. sensitive personal data ( e.g., a pre-commercial company or startup company). Because the data transaction cannot involve access to any government-related data or bulk U.S. sensitive personal data, this investment agreement does not meet the definition of a covered data transaction and is not a restricted transaction.

§ 202.229 Iran.

    The term Iran means the Islamic Republic of Iran, as well as any political subdivision, agency, or instrumentality thereof.

§ 202.230 Knowingly.

    (a) Definition. The term knowingly, with respect to conduct, a circumstance, or a result, means that a person has actual knowledge, or reasonably should have known, of the conduct, the circumstance, or the result.

    (b) Examples— (1) Example 1. A U.S. company sells DNA testing kits to U.S. consumers and maintains bulk human genomic data collected from those consumers. The U.S. company enters into a contract with a foreign cloud-computing company (which is not a covered person) to store the U.S. company's database of human genomic data. The foreign company hires employees from other countries, including citizens of countries of concern who primarily reside in a country of concern, to manage databases for its customers, including the U.S. company's human genomic database. There is no indication of evasion, such as the U.S. company knowingly directing the foreign company's employment agreements with covered persons, or the U.S. company engaging in and structuring these transactions to evade the regulations. The cloud-computing services agreement between the U.S. company and the foreign company would not be prohibited or restricted, because that covered data transaction is between a U.S. person and a foreign company that does not meet the definition of a covered person. The employment agreements between the foreign company and the covered persons would not be prohibited or restricted because those agreements are between foreign persons.

    (2) Example 2. A U.S. company transmits the bulk U.S. sensitive personal data of U.S. persons to a country of concern, in violation of this part, using a fiber optic cable operated by another U.S. company. The U.S. cable operator has not knowingly engaged in a prohibited transaction or a restricted transaction solely by virtue of operating the fiber optic cable because the U.S. cable operator does not know, and reasonably should not know, the content of the traffic transmitted across the fiber optic cable.

    (3) Example 3. A U.S. service provider provides a software platform on which a U.S. company processes the bulk U.S. sensitive personal data of its U.S.-person customers. While the U.S. service provider is generally aware of the nature of the U.S. company's business, the U.S. service provider is not aware of the kind or volume of data that the U.S. company processes on the platform, how the U.S. company uses the data, or whether the U.S. company engages in data transactions. The U.S. company also primarily controls access to its data on the platform, with the U.S. service provider accessing the data only for troubleshooting or technical support purposes, upon request by the U.S. company. Subsequently, without the actual knowledge of the U.S. service provider and without providing the U.S. service provider with any information from which the service provider should have known, the U.S. company grants access to the data on the U.S. service provider's software platform to a covered person through a covered data transaction, in violation of this part. The U.S. service provider itself, however, has not knowingly engaged in a restricted transaction by enabling the covered persons' access via its software platform.

    (4) Example 4. Same as Example 3, but in addition to providing the software platform, the U.S. company's contract with the U.S. service provider also outsources the U.S. company's processing and handling of the data to the U.S. service provider. As a result, the U.S. service provider primarily controls access to the U.S. company's bulk U.S. sensitive personal data on the platform. The U.S. service provider employs a covered person and grants access to this data as part of this employment. Although the U.S. company's contract with the U.S. service provider is not a restricted transaction, the U.S. service provider's employment agreement with the covered person is a restricted transaction. The U.S. service provider has thus knowingly engaged in a restricted transaction by entering into an employment agreement that grants access to its employee because the U.S. service provider knew or should have known of its employee's covered person status and, as the party responsible for processing and handling the data, the U.S. service provider was aware of the kind and volume of data that the U.S. company processes on the platform.

    (5) Example 5. A U.S. company provides cloud storage to a U.S. customer for the encrypted storage of the customer's bulk U.S. sensitive personal data. The U.S. cloud-service provider has an emergency back-up encryption key for all its customers' data, but the company is contractually limited to using the key to decrypt the data only at the customer's request. The U.S. customer's systems and access to the key become disabled, and the U.S. customer requests that the cloud-service provider use the back-up encryption key to decrypt the data and store it on a ( print page 86211) backup server while the customer restores its own systems. By having access to and using the backup encryption key to decrypt the data in accordance with the contractual limitation, the U.S. cloud-service provider does not and reasonably should not know the kind and volumes of the U.S. customer's data. If the U.S. customer later uses the cloud storage to knowingly engage in a prohibited transaction, the U.S. cloud-service provider's access to and use of the backup encryption key does not mean that the U.S. cloud-service provider has also knowingly engaged in a restricted transaction.

    (6) Example 6. A prominent human genomics research clinic enters into a cloud-services contract with a U.S. cloud-service provider that specializes in storing and processing healthcare data to store bulk human genomic research data. The cloud-service provider hires IT personnel in a country of concern, who are thus covered persons. While the data that is stored is encrypted, the IT personnel can access the data in encrypted form. The employment agreement between the U.S. cloud-service provider and the IT professionals in the country of concern is a prohibited transaction because the agreement involves giving the IT personnel access to the encrypted data and constitutes a transfer of human genomic data. Given the nature of the research institution's work and the cloud-service provider's expertise in storing healthcare data, the cloud-service provider reasonably should have known that the encrypted data is bulk U.S. sensitive personal data covered by the regulations. The cloud-service provider has therefore knowingly engaged in a prohibited transaction (because it involves access to human genomic data).

§ 202.231 Licenses; general and specific.

    (a) General license. The term general license means a written license issued pursuant to this part authorizing a class of transactions and not limited to a particular person.

    (b) Specific license. The term specific license means a written license issued pursuant to this part to a particular person or persons, authorizing a particular transaction or transactions in response to a written license application.

§ 202.232 Linked.

    (a) Definition. The term linked means associated.

    (b) Examples— (1) Example 1. A U.S. person transfers two listed identifiers in a single spreadsheet—such as a list of names of individuals and associated MAC addresses for those individuals' devices. The names and MAC addresses would be considered linked.

    (2) Example 2. A U.S. person transfers two listed identifiers in different spreadsheets—such as a list of names of individuals in one spreadsheet and MAC addresses in another spreadsheet—to two related parties in two different covered data transactions. The names and MAC addresses would be considered linked, provided that some correlation existed between the names and MAC addresses ( e.g., associated employee ID number is also listed in both spreadsheets).

    (3) Example 3. A U.S. person transfers a standalone list of MAC addresses, without any additional listed identifiers. The standalone list does not include covered personal identifiers. That standalone list of MAC addresses would not become covered personal identifiers even if the receiving party is capable of obtaining separate sets of other listed identifiers or sensitive personal data through separate covered data transactions with unaffiliated parties that would ultimately permit the association of the MAC addresses to specific persons. The MAC addresses would not be considered linked or linkable to those separate sets of other listed identifiers or sensitive personal data.
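    For illustration only, the situation in Example 2 of this section can be shown with a small sketch: two separately transferred files become linked when a shared key allows the identifiers to be associated with the same person. The column names and sample values below are hypothetical and not drawn from the regulatory text.

```python
# Spreadsheet 1: employee ID -> name; Spreadsheet 2: employee ID -> MAC address.
names = {"E1001": "Jane Doe", "E1002": "John Roe"}
mac_addresses = {"E1001": "00:1A:2B:3C:4D:5E"}

# Because the same employee ID appears in both files, the name and the MAC
# address can be associated with the same person, i.e., they are "linked"
# in the sense of Example 2 above.
linked = {emp_id: (names[emp_id], mac)
          for emp_id, mac in mac_addresses.items() if emp_id in names}
print(linked)  # {'E1001': ('Jane Doe', '00:1A:2B:3C:4D:5E')}
```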

§ 202.233 Linkable.

    The term linkable means reasonably capable of being linked.

    Note to § 202.233. Data is considered linkable when the identifiers involved in a single covered data transaction, or in multiple covered data transactions or a course of dealing between the same or related parties, are reasonably capable of being associated with the same person(s). Identifiers are not linked or linkable when additional identifiers or data not involved in the relevant covered data transaction(s) would be necessary to associate the identifiers with the same specific person(s).

§ 202.234 Listed identifier.

    The term listed identifier means any piece of data in any of the following data fields:

    (a) Full or truncated government identification or account number (such as a Social Security number, driver's license or State identification number, passport number, or Alien Registration Number);

    (b) Full financial account numbers or personal identification numbers associated with a financial institution or financial-services company;

    (c) Device-based or hardware-based identifier (such as International Mobile Equipment Identity (“IMEI”), Media Access Control (“MAC”) address, or Subscriber Identity Module (“SIM”) card number);

    (d) Demographic or contact data (such as first and last name, birth date, birthplace, ZIP code, residential street or postal address, phone number, email address, or similar public account identifiers);

    (e) Advertising identifier (such as Google Advertising ID, Apple ID for Advertisers, or other mobile advertising ID (“MAID”));

    (f) Account-authentication data (such as account username, account password, or an answer to security questions);

(g) Network-based identifier (such as Internet Protocol (“IP”) address or cookie data); or

    (h) Call-detail data (such as Customer Proprietary Network Information (“CPNI”)).

§ 202.235 National Security Division.

    The term National Security Division means the National Security Division of the United States Department of Justice.

§ 202.236 North Korea.

The term North Korea means the Democratic People's Republic of Korea, and any political subdivision, agency, or instrumentality thereof.

§ 202.237 Order.

    The term Order means Executive Order 14117 of February 28, 2024 (Preventing Access to Americans' Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern), 89 FR 15421 (March 1, 2024).

§ 202.238 Person.

    The term person means an individual or entity.

§ 202.239 Personal communications.

    The term personal communications means any postal, telegraphic, telephonic, or other personal communication that does not involve the transfer of anything of value, as set out under 50 U.S.C. 1702(b)(1).

§ 202.240 Personal financial data.

    The term personal financial data means data about an individual's credit, charge, or debit card, or bank account, including purchases and payment history; data in a bank, credit, or other financial statement, including assets, liabilities, debts, or trades in a securities portfolio; or data in a credit report or in a “consumer report” (as defined in 15 U.S.C. 1681a(d)).

    ( print page 86212)
§ 202.241 Personal health data.

    The term personal health data means health information that relates to the past, present, or future physical or mental health or condition of an individual; the provision of healthcare to an individual; or the past, present, or future payment for the provision of healthcare to an individual. This term includes basic physical measurements and health attributes (such as bodily functions, height and weight, vital signs, symptoms, and allergies); social, psychological, behavioral, and medical diagnostic, intervention, and treatment history; test results; logs of exercise habits; immunization data; data on reproductive and sexual health; and data on the use or purchase of prescribed medications.

§ 202.242 Precise geolocation data.

    The term precise geolocation data means data, whether real-time or historical, that identifies the physical location of an individual or a device with a precision of within 1,000 meters.

§ 202.243 Prohibited transaction.

    The term prohibited transaction means a data transaction that is subject to one or more of the prohibitions described in subpart C of this part.

§ 202.244 Property; property interest.

    The terms property and property interest include money; checks; drafts; bullion; bank deposits; savings accounts; debts; indebtedness; obligations; notes; guarantees; debentures; stocks; bonds; coupons; any other financial instruments; bankers acceptances; mortgages, pledges, liens, or other rights in the nature of security; warehouse receipts, bills of lading, trust receipts, bills of sale, or any other evidences of title, ownership, or indebtedness; letters of credit and any documents relating to any rights or obligations thereunder; powers of attorney; goods; wares; merchandise; chattels; stocks on hand; ships; goods on ships; real estate mortgages; deeds of trust; vendors' sales agreements; land contracts, leaseholds, ground rents, real estate and any other interest therein; options; negotiable instruments; trade acceptances; royalties; book accounts; accounts payable; judgments; patents; trademarks or copyrights; insurance policies; safe deposit boxes and their contents; annuities; pooling agreements; services of any nature whatsoever; contracts of any nature whatsoever; any other property, real, personal, or mixed, tangible or intangible, or interest or interests therein, present, future, or contingent.

    Recent former employees or contractors.

The terms recent former employees or recent former contractors mean employees or contractors who worked for or provided services to the United States Government, in a paid or unpaid status, within the 2 years preceding a potential covered data transaction.

    Restricted transaction.

    The term restricted transaction means a data transaction that is subject to subpart D of this part.

    Russia.

    The term Russia means the Russian Federation, and any political subdivision, agency, or instrumentality thereof.

    Security requirements.

The term security requirements means the Cybersecurity and Infrastructure Security Agency (“CISA”) Security Requirements for Restricted Transactions (incorporated by reference, see § 202.402).

    Sensitive personal data.

    (a) Definition. The term sensitive personal data means covered personal identifiers, precise geolocation data, biometric identifiers, human genomic data, personal health data, personal financial data, or any combination thereof.

    (b) Exclusions. The term sensitive personal data excludes:

    (1) Public or nonpublic data that does not relate to an individual, including such data that meets the definition of a “trade secret” (as defined in 18 U.S.C. 1839(3)) or “proprietary information” (as defined in 50 U.S.C. 1708(d)(7));

    (2) Data that is, at the time of the transaction, lawfully available to the public from a Federal, State, or local government record (such as court records) or in widely distributed media (such as sources that are generally available to the public through unrestricted and open-access repositories);

    (3) Personal communications; and

    (4) Information or informational materials.

    Special Administrative Region of Hong Kong.

    The term Special Administrative Region of Hong Kong means the Special Administrative Region of Hong Kong, and any political subdivision, agency, or instrumentality thereof.

    Special Administrative Region of Macau.

    The term Special Administrative Region of Macau means the Special Administrative Region of Macau, and any political subdivision, agency, or instrumentality thereof.

    Telecommunications service.

    The term telecommunications service means “telecommunications service” as defined in 47 U.S.C. 153(53).

    Transaction.

    The term transaction means any acquisition, holding, use, transfer, transportation, exportation of, or dealing in any property in which a foreign country or national thereof has an interest.

    Transfer.

    The term transfer means any actual or purported act or transaction, whether or not evidenced by writing, and whether or not done or performed within the United States, the purpose, intent, or effect of which is to create, surrender, release, convey, transfer, or alter, directly or indirectly, any right, remedy, power, privilege, or interest with respect to any property. Without limitation on the foregoing, it shall include the making, execution, or delivery of any assignment, power, conveyance, check, declaration, deed, deed of trust, power of attorney, power of appointment, bill of sale, mortgage, receipt, agreement, contract, certificate, gift, sale, affidavit, or statement; the making of any payment; the setting off of any obligation or credit; the appointment of any agent, trustee, or fiduciary; the creation or transfer of any lien; the issuance, docketing, filing, or levy of or under any judgment, decree, attachment, injunction, execution, or other judicial or administrative process or order, or the service of any garnishment; the acquisition of any interest of any nature whatsoever by reason of a judgment or decree of any foreign country; the fulfillment of any condition; the exercise of any power of appointment, power of attorney, or other power; or the acquisition, disposition, transportation, importation, exportation, or withdrawal of any security.

    United States.

    The term United States means the United States, its territories and possessions, and all areas under the jurisdiction or authority thereof.

    United States person or U.S. person.

    (a) Definition. The terms United States person and U.S. person mean any United States citizen, national, or lawful permanent resident; any individual admitted to the United States as a refugee under 8 U.S.C. 1157 or granted ( print page 86213) asylum under 8 U.S.C. 1158; any entity organized solely under the laws of the United States or any jurisdiction within the United States (including foreign branches); or any person in the United States.

    (b) Examples— (1) Example 1. An individual is a citizen of a country of concern and is in the United States. The individual is a U.S. person.

    (2) Example 2. An individual is a U.S. citizen. The individual is a U.S. person, regardless of location.

    (3) Example 3. An individual is a dual citizen of the United States and a country of concern. The individual is a U.S. person, regardless of location.

    (4) Example 4. An individual is a citizen of a country of concern, is not a permanent resident alien of the United States, and is outside the United States. The individual is a foreign person.

    (5) Example 5. A company is organized under the laws of the United States and has a foreign branch in a country of concern. The company, including its foreign branch, is a U.S. person.

    (6) Example 6. A parent company is organized under the laws of the United States and has a subsidiary organized under the laws of a country of concern. The subsidiary is a foreign person regardless of the degree of ownership by the parent company; the parent company is a U.S. person.

    (7) Example 7. A company is organized under the laws of a country of concern and has a branch in the United States. The company, including its U.S. branch, is a foreign person.

    (8) Example 8. A parent company is organized under the laws of a country of concern and has a subsidiary organized under the laws of the United States. The subsidiary is a U.S. person regardless of the degree of ownership by the parent company; the parent company is a foreign person.

    U.S. device.

    The term U.S. device means any device with the capacity to store or transmit data that is linked or linkable to a U.S. person.

    Vendor agreement.

    (a) Definition. The term vendor agreement means any agreement or arrangement, other than an employment agreement, in which any person provides goods or services to another person, including cloud-computing services, in exchange for payment or other consideration.

    (b) Examples— (1) Example 1. A U.S. company collects bulk precise geolocation data from U.S. users through an app. The U.S. company enters into an agreement with a company headquartered in a country of concern to process and store this data. This vendor agreement is a restricted transaction.

    (2) Example 2. A medical facility in the United States contracts with a company headquartered in a country of concern to provide IT-related services. The contract governing the provision of services is a vendor agreement. The medical facility has bulk personal health data on its U.S. patients. The IT services provided under the contract involve access to the medical facility's systems containing the bulk personal health data. This vendor agreement is a restricted transaction.

    (3) Example 3. A U.S. company, which is owned by an entity headquartered in a country of concern and has been designated a covered person, establishes a new data center in the United States to offer managed services. The U.S. company's data center serves as a vendor to various U.S. companies to store bulk U.S. sensitive personal data collected by those companies. These vendor agreements are restricted transactions.

    (4) Example 4. A U.S. company develops mobile games that collect bulk precise geolocation data and biometric identifiers of U.S.-person users. The U.S. company contracts part of the software development to a foreign person who is primarily resident in a country of concern and is a covered person. The contract with the foreign person is a vendor agreement. The software-development services provided by the covered person under the contract involve access to the bulk precise geolocation data and biometric identifiers. This is a restricted transaction.

    (5) Example 5. A U.S. multinational company maintains bulk U.S. sensitive personal data of U.S. persons. This company has a foreign branch, located in a country of concern, that has access to this data. The foreign branch contracts with a local company located in the country of concern to provide cleaning services for the foreign branch's facilities. The contract is a vendor agreement, the foreign branch is a U.S. person, and the local company is a covered person. Because the services performed under this vendor agreement do not “involve access to” the bulk U.S. sensitive personal data, the vendor agreement would not be a covered data transaction.

    Venezuela.

    The term Venezuela means the Bolivarian Republic of Venezuela, and any political subdivision, agency, or instrumentality thereof.

    Subpart C—Prohibited Transactions and Related Activities

    Prohibited data-brokerage transactions.

    (a) Prohibition. Except as otherwise authorized pursuant to subparts E or H of this part or any other provision of this part, no U.S. person, on or after the effective date, may knowingly engage in a covered data transaction involving data brokerage with a country of concern or covered person.

    (b) Examples— (1) Example 1. A U.S. subsidiary of a company headquartered in a country of concern develops an artificial intelligence chatbot in the United States that is trained on the bulk U.S. sensitive personal data of U.S. persons. While not its primary commercial use, the chatbot is capable of reproducing or otherwise disclosing the bulk sensitive personal health data that was used to train the chatbot when responding to queries. The U.S. subsidiary knowingly licenses subscription-based access to that chatbot worldwide, including to covered persons such as its parent entity. Although licensing use of the chatbot itself may not necessarily “involve access” to bulk U.S. sensitive personal data, the U.S. subsidiary knows or should know that the license can be used to obtain access to the U.S. persons' bulk sensitive personal training data if prompted. The licensing of access to this bulk U.S. sensitive personal data is data brokerage because it involves the transfer of data from the U.S. company ( i.e., the provider) to licensees ( i.e., the recipients), where the recipients did not collect or process the data directly from the individuals linked or linkable to the collected or processed data. Even though the license did not explicitly provide access to the data, this is a prohibited transaction because the U.S. company knew or should have known that the use of the chatbot pursuant to the license could be used to obtain access to the training data, and because the U.S. company licensed the product to covered persons.

    (2) [Reserved]

    Other prohibited data-brokerage transactions involving potential onward transfer to countries of concern or covered persons.

    (a) Prohibition. Except as otherwise authorized pursuant to this part, no U.S. person, on or after the effective date, may knowingly engage in a covered data transaction involving data brokerage with any foreign person that is not a covered person unless the U.S. person: ( print page 86214)

    (1) Contractually requires that the foreign person refrain from engaging in a subsequent covered data transaction involving data brokerage of the same data with a country of concern or covered person; and

    (2) Reports any known or suspected violations of this contractual requirement in accordance with paragraph (b) of this section.

    (b) Reporting known or suspected violations— (1) When reports are due. U.S. persons shall file reports within 14 days of the U.S. person becoming aware of a known or suspected violation.

    (2) Contents of reports. Reports on known or suspected violations shall include the following, to the extent the information is known and available to the person filing the report at the time of the report:

    (i) The name and address of the U.S. person reporting the known or suspected violation of the contractual requirement in accordance with paragraph (b) of this section;

    (ii) A description of the known or suspected violation, including:

    (A) Date of known or suspected violation;

    (B) Description of the data-brokerage transaction referenced in paragraph (a) of this section;

    (C) Description of the contractual provision prohibiting the onward transfer of the same data to a country of concern or covered person;

    (D) Description of the known or suspected violation of the contractual obligation prohibiting the foreign person from engaging in a subsequent covered data transaction involving the same data with a country of concern or a covered person;

    (E) Any persons substantively participating in the transaction referenced in paragraph (a) of this section;

    (F) Information about the known or suspected persons involved in the onward data transfer transaction, including the name and location of any covered persons or countries of concern;

    (G) A copy of any relevant documentation received or created in connection with the transaction; and

    (iii) Any other information that the Department of Justice may require or any other information that the U.S. person filing the report believes to be pertinent to the known or suspected violation or the implicated covered person.

    (3) Additional contents; format and method of submission. Reports required by this section must be submitted in accordance with this section and with subpart L of this part.
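
    As a purely illustrative aid that is not part of the regulatory text, the contents listed in paragraph (b)(2) can be pictured as a structured record. The Python sketch below is a hypothetical representation of those fields; the actual format and method of submission are governed by subpart L of this part, and the field names are invented for illustration.

        from dataclasses import dataclass, field
        from datetime import date
        from typing import Optional

        @dataclass
        class SuspectedViolationReport:
            # Hypothetical container mirroring paragraph (b)(2); not an official submission format.
            reporter_name: str                      # (b)(2)(i) name of the reporting U.S. person
            reporter_address: str                   # (b)(2)(i) address of the reporting U.S. person
            violation_date: Optional[date]          # (b)(2)(ii)(A) date of known or suspected violation
            transaction_description: str            # (b)(2)(ii)(B) the underlying data-brokerage transaction
            contractual_provision: str              # (b)(2)(ii)(C) the onward-transfer restriction
            violation_description: str              # (b)(2)(ii)(D) the known or suspected violation
            participants: list[str] = field(default_factory=list)             # (b)(2)(ii)(E)
            onward_transfer_parties: list[str] = field(default_factory=list)  # (b)(2)(ii)(F)
            documentation: list[str] = field(default_factory=list)            # (b)(2)(ii)(G) copies or references
            other_information: Optional[str] = None                           # (b)(2)(iii) any other pertinent information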

    (c) Examples— (1) Example 1. A U.S. business knowingly enters into an agreement to sell bulk human genomic data to a European business that is not a covered person. The U.S. business is required to include in that agreement a limitation on the European business' right to resell or otherwise engage in a covered data transaction involving data brokerage of that data to a country of concern or covered person. Otherwise, the agreement would be a prohibited transaction.

    (2) Example 2. A U.S. company owns and operates a mobile app for U.S. users with available advertising space. As part of selling the advertising space, the U.S. company provides the bulk precise geolocation data, IP address, and advertising IDs of its U.S. users' devices to an advertising exchange based in Europe that is not a covered person. The U.S. company's provision of this data to the advertising exchange is data brokerage and a prohibited transaction unless the U.S. company obtains a contractual commitment from the advertising exchange not to engage in any covered data transactions involving data brokerage of that same data with a country of concern or covered person.

    Prohibited human genomic data and human biospecimen transactions.

    Except as otherwise authorized pursuant to this part, no U.S. person, on or after the effective date, may knowingly engage in any covered data transaction with a country of concern or covered person that involves access by that country of concern or covered person to bulk U.S. sensitive personal data that involves bulk human genomic data, or to human biospecimens from which bulk human genomic data could be derived.

    Prohibited evasions, attempts, causing violations, and conspiracies.

    (a) Prohibition. Any transaction on or after the effective date that has the purpose of evading or avoiding, causes a violation of, or attempts to violate any of the prohibitions set forth in this part is prohibited. Any conspiracy formed to violate the prohibitions set forth in this part is prohibited.

    (b) Examples— (1) Example 1. A U.S. data broker seeks to sell bulk U.S. sensitive personal data to a foreign person who primarily resides in China. With knowledge that the foreign person is a covered person and with the intent to evade the regulations, the U.S. data broker invites the foreign person to travel to the United States to consummate the data transaction and transfer the bulk U.S. sensitive personal data in the United States. After completing the transaction, the person returns to China with the bulk U.S. sensitive personal data. The transaction in the United States is not a covered data transaction because the person who resides in China is a U.S. person while in the United States (unless that person was individually designated as a covered person pursuant to § 202.211(a)(5), in which case their covered person status would remain, even while in the United States, and the transaction would be a covered data transaction). However, the U.S. data broker has structured the transaction to evade the regulation's prohibitions on covered data transactions with covered persons. As a result, this transaction has the purpose of evading the regulations and is prohibited.

    (2) Example 2. A Russian national, who is employed by a corporation headquartered in Russia, travels to the United States to conduct business with the Russian company's U.S. subsidiary, including with the purpose of obtaining bulk U.S. sensitive personal data from the U.S. subsidiary. The U.S. subsidiary is a U.S. person, the Russian corporation is a covered person, and the Russian employee is a covered person while outside the United States but a U.S. person while temporarily in the United States (unless that Russian employee was individually designated as a covered person pursuant to § 202.211(a)(5), in which case their covered person status would remain, even while in the United States, and the transaction would be a covered data transaction). With knowledge of these facts, the U.S. subsidiary licenses access to bulk U.S. sensitive personal data to the Russian employee while in the United States, who then returns to Russia. This transaction has the purpose of evading the regulations and is prohibited.

    (3) Example 3. A U.S. subsidiary of a company headquartered in a country of concern collects bulk precise geolocation data from U.S. persons. The U.S. subsidiary is a U.S. person, and the parent company is a covered person. With the purpose of evading the regulations, the U.S. subsidiary enters into a vendor agreement with a foreign company that is not a covered person. The vendor agreement provides the foreign company access to the data. The U.S. subsidiary knows (or reasonably should know) that the foreign company is a shell company, and knows that it subsequently outsources the vendor agreement to the U.S. subsidiary's parent company. This transaction has the purpose of evading the regulations and is prohibited. ( print page 86215)

    (4) Example 4. A U.S. company collects bulk personal health data from U.S. persons. With the purpose of evading the regulations, the U.S. company enters into a vendor agreement with a foreign company that is not a covered person. The agreement provides the foreign company access to the data. The U.S. company knows (or reasonably should know) that the foreign company is a front company staffed primarily by covered persons. The U.S. company has not complied with either the security requirements in § 202.248 or other applicable requirements for conducting restricted transactions as detailed in subpart J of this part. This transaction has the purpose of evading the regulations and is prohibited.

    (6) Example 6. A U.S. online gambling company uses an artificial intelligence algorithm to analyze collected bulk covered personal identifiers to identify users based on impulsivity for targeted advertising. For the purpose of evasion, a U.S. subsidiary of a company headquartered in a country of concern licenses the derivative algorithm from the U.S. online gambling company for the purpose of accessing bulk sensitive personal identifiers from the training data contained in the algorithm that would not otherwise be accessible to the parent company and shares the algorithm with the parent company so that the parent company can obtain the bulk covered personal identifiers. The U.S. subsidiary's licensing transaction with the parent company has the purpose of evading the regulations and is prohibited.

    Knowingly directing prohibited or restricted transactions.

(a) Prohibition. Except as otherwise authorized pursuant to this part, no U.S. person, on or after the effective date, may knowingly direct any covered data transaction that, if engaged in by a U.S. person, would be a prohibited transaction or a restricted transaction that fails to comply with the requirements of subpart D and all other applicable requirements under this part.

    (b) Examples— (1) Example 1. A U.S. person is an officer, senior manager, or equivalent senior-level employee at a foreign company that is not a covered person, and the foreign company undertakes a covered data transaction at that U.S. person's direction or with that U.S. person's approval when the covered data transaction would be prohibited if performed by a U.S. person. The U.S. person has knowingly directed a prohibited transaction.

    (2) Example 2. Several U.S. persons launch, own, and operate a foreign company that is not a covered person, and that foreign company, under the U.S. persons' operation, undertakes covered data transactions that would be prohibited if performed by a U.S. person. The U.S. persons have knowingly directed a prohibited transaction.

    (3) Example 3. A U.S. person is employed at a U.S.-headquartered multinational company that has a foreign affiliate that is not a covered person. The U.S. person instructs the U.S. company's compliance unit to change (or approve changes to) the operating policies and procedures of the foreign affiliate with the specific purpose of allowing the foreign affiliate to undertake covered data transactions that would be prohibited if performed by a U.S. person. The U.S. person has knowingly directed prohibited transactions.

    (4) Example 4. A U.S. bank processes a payment from a U.S. person to a covered person, or from a covered person to a U.S. person, as part of that U.S. person's engagement in a prohibited transaction. The U.S. bank has not knowingly directed a prohibited transaction, and its activity would not be prohibited (although the U.S. person's covered data transaction would be prohibited).

    (5) Example 5. A U.S. financial institution underwrites a loan or otherwise provides financing for a foreign company that is not a covered person, and the foreign company undertakes covered data transactions that would be prohibited if performed by a U.S. person. The U.S. financial institution has not knowingly directed a prohibited transaction, and its activity would not be prohibited.

    (6) Example 6. A U.S. person, who is employed at a foreign company that is not a covered person, signs paperwork approving the foreign company's procurement of real estate for its operations. The same foreign company separately conducts data transactions that use or are facilitated by operations at that real estate location and that would be prohibited transactions if performed by a U.S. person, but the U.S. employee has no role in approving or directing those separate data transactions. The U.S. person has not knowingly directed a prohibited transaction, and the U.S. person's activity would not be prohibited.

    (7) Example 7. A U.S. company owns or operates a submarine telecommunications cable with one landing point in a foreign country that is not a country of concern and one landing point in a country of concern. The U.S. company leases capacity on the cable to U.S. customers that transmit bulk U.S. sensitive personal data to the landing point in the country of concern, including transmissions as part of prohibited transactions. The U.S. company's ownership or operation of the cable does not constitute knowingly directing a prohibited transaction, and its ownership or operation of the cable would not be prohibited (although the U.S. customers' covered data transactions would be prohibited).

    (8) Example 8. A U.S. person engages in a vendor agreement involving bulk U.S. sensitive personal data with a foreign person who is not a covered person. Such vendor agreement is not a restricted or prohibited transaction. The foreign person then employs an individual who is a covered person and grants them access to bulk U.S. sensitive personal data without the U.S. person's knowledge or direction. There is no covered data transaction between the U.S. person and the covered person, and there is no indication that the parties engaged in these transactions with the purpose of evading the regulations (such as the U.S. person having knowingly directed the foreign person's employment agreement with the covered person or the parties knowingly structuring a restricted transaction into these multiple transactions with the purpose of evading the prohibition). The U.S. person has not knowingly directed a restricted transaction.

    (9) Example 9. A U.S. company sells DNA testing kits to U.S. consumers and maintains bulk human genomic data collected from those consumers. The U.S. company enters into a contract with a foreign cloud-computing company (which is not a covered person) to store the U.S. company's database of human genomic data. The foreign company hires employees from other countries, including citizens of countries of concern who primarily reside in a country of concern, to manage databases for its customers, including the U.S. company's human genomic database. There is no indication of evasion, such as the U.S. company knowingly directing the foreign company's employment agreements or the U.S. company knowingly engaging in and structuring these transactions to evade the regulations. The cloud-computing services agreement between the U.S. company and the foreign company would not be prohibited or restricted because that transaction is between a U.S. person and a foreign company that does not meet the definition of a covered person. The employment agreements between the foreign company and the covered persons would not be prohibited or restricted ( print page 86216) because those agreements are between foreign persons.

    Subpart D—Restricted Transactions

    Authorization to conduct restricted transactions.

    (a) Restricted transactions. Except as otherwise authorized pursuant to subparts E or H of this part or any other provision of this part, no U.S. person, on or after the effective date, may knowingly engage in a covered data transaction involving a vendor agreement, employment agreement, or investment agreement with a country of concern or covered person unless the U.S. person complies with the security requirements required by subpart D of this part and all other applicable requirements under this part.

(b) This subpart does not apply to covered data transactions involving access to bulk human genomic data, or to human biospecimens from which such data can be derived, that are subject to the prohibition in § 202.303 of this part.

    (c) Examples— (1) Example 1. A U.S. company engages in an employment agreement with a covered person to provide information technology support. As part of their employment, the covered person has access to personal financial data. The U.S. company implements and complies with the security requirements. The employment agreement is authorized as a restricted transaction because the company has complied with the security requirements.

(2) Example 2. A U.S. company engages in a vendor agreement with a covered person to store bulk personal health data. Instead of implementing the security requirements as incorporated by reference in this subpart, the U.S. company implements different controls that it believes mitigate the covered person's access to the bulk personal health data. Because the U.S. person has not complied with the security requirements, the vendor agreement is not authorized and thus is a prohibited transaction.

    (3) Example 3. A U.S. person engages in a vendor agreement involving bulk U.S. sensitive personal data with a foreign person who is not a covered person. The foreign person then employs an individual who is a covered person and grants them access to bulk U.S. sensitive personal data without the U.S. person's knowledge or direction. There is no covered data transaction between the U.S. person and the covered person, and there is no indication that the parties engaged in these transactions with the purpose of evading the regulations (such as the U.S. person having knowingly directed the foreign person's employment agreement with the covered person or the parties knowingly structuring a prohibited transaction into these multiple transactions with the purpose of evading the prohibition). As a result, neither the vendor agreement nor the employment agreement would be a restricted transaction.

    Incorporation by reference.

    (a) Incorporation by reference. Certain material is incorporated by reference into this part with the approval of the Director of the Federal Register under 5 U.S.C. 552(a) and 1 CFR part 51. This incorporation by reference (“IBR”) material is available for inspection at the Department of Justice and at the National Archives and Records Administration (“NARA”). Please contact the Foreign Investment Review Section, National Security Division, U.S. Department of Justice, 175 N St. NE, Washington, DC 20002, telephone: 202-514-8648, NSD.FIRS.datasecurity@usdoj.gov. You may also obtain the material from the National Security Division at https://www.justice.gov/​nsd. For information on the availability of this material at NARA, visit www.archives.gov/​federal-register/​cfr/​ibr-locations.html or email fr.inspection@nara.gov. The material may also be obtained from the sources in the following paragraphs of this section.

    (b) Other sources. The Cybersecurity and Infrastructure Security Agency, Mail Stop 0380, Department of Homeland Security, 245 Murray Lane, Washington, DC 20528-0380, central@cisa.gov, 888-282-0870, http://www.cisa.gov. You may also obtain the material from the Cybersecurity and Infrastructure Security Agency at https://www.cisa.gov/​.

(1) The Cybersecurity and Infrastructure Security Agency (“CISA”), Security Requirements for Restricted Transactions (Final edition 202X Draft), IBR approved for §§ 202.248; 202.304(b)(4); 202.401(a); 202.401(c)(1); 202.401(c)(2); 202.508(b)(8); 202.508(b)(10); 202.508(b)(11); 202.1001(b)(4); 202.1002(b)(1); 202.1002(e)(4); 202.1002(f)(2)(iv); 202.1002(f)(2)(v); 202.1002(f)(2)(vi); 202.1101(b)(2); 202.1101(b)(3).

    (2) [Reserved]

    Subpart E—Exempt Transactions

    Personal communications.

    Subparts C and D of this part do not apply to data transactions to the extent that they involve any postal, telegraphic, telephonic, or other personal communication that does not involve the transfer of anything of value.

    Information or informational materials.

    Subparts C and D of this part do not apply to data transactions to the extent that they involve the importation from any country, or the exportation to any country, whether commercial or otherwise, regardless of format or medium of transmission, of any information or informational materials.

    Travel.

    Subparts C and D of this part do not apply to data transactions to the extent that they are ordinarily incident to travel to or from any country, including importation of accompanied baggage for personal use; maintenance within any country, including payment of living expenses and acquisition of goods or services for personal use; and arrangement or facilitation of such travel, including nonscheduled air, sea, or land voyages.

    Official business of the United States Government.

    (a) Exemption. Subparts C and D of this part do not apply to data transactions to the extent that they are for the conduct of the official business of the United States Government by its employees, grantees, or contractors; any authorized activity of any United States Government department or agency (including an activity that is performed by a Federal depository institution or credit union supervisory agency in the capacity of receiver or conservator); or transactions conducted pursuant to a grant, contract, or other agreement entered into with the United States Government.

    (b) Examples— (1) Example 1. A U.S. hospital receives a Federal grant to conduct human genomic research on U.S. persons. As part of that federally funded human genomic research, the U.S. hospital contracts with a foreign laboratory that is a covered person, hires a researcher that is a covered person, and gives the laboratory and researcher access to the human biospecimens and human genomic data in bulk. The contract with the foreign laboratory and the employment of the researcher are exempt transactions but would be prohibited transactions if they were not part of the federally funded research.

    (2) [Reserved]

    Financial services.

    (a) Exemption. Subparts C and D of this part do not apply to data transactions, to the extent that they are ordinarily incident to and part of the ( print page 86217) provision of financial services, including:

    (1) Banking, capital-markets (including investment-management services), or financial-insurance services;

    (2) A financial activity authorized for national banks by 12 U.S.C. 24 (Seventh) and rules and regulations and written interpretations of the Office of the Comptroller of the Currency thereunder;

(3) An activity that is “financial in nature or incidental to such financial activity” or “complementary to a financial activity,” as set forth in sections 4(k)(1) and 4(k)(4) of the Bank Holding Company Act of 1956 (12 U.S.C. 1843(k)(1), (k)(4)) and rules and regulations and written interpretations of the Board of Governors of the Federal Reserve System thereunder;

    (4) The transfer of personal financial data or covered personal identifiers incidental to the purchase and sale of goods and services (such as the purchase, sale, or transfer of consumer products and services through online shopping or e-commerce marketplaces);

    (5) The provision or processing of payments or funds transfers (such as person-to-person, business-to-person, and government-to-person funds transfers) involving the transfer of personal financial data or covered personal identifiers, or the provision of services ancillary to processing payments and funds transfers (such as services for payment dispute resolution, payor authentication, tokenization, payment gateway, payment fraud detection, payment resiliency, mitigation and prevention, and payment-related loyalty point program administration); and

    (6) The provision of investment-management services that manage or provide advice on investment portfolios or individual assets for compensation (such as devising strategies and handling financial assets and other investments for clients) or provide services ancillary to investment-management services (such as broker-dealers executing trades within a securities portfolio based upon instructions from an investment advisor).

    (b) Examples— (1) Example 1. A U.S. company engages in a data transaction to transfer personal financial data in bulk to a financial institution that is incorporated in, located in, or subject to the jurisdiction or control of a country of concern to clear and settle electronic payment transactions between U.S. individuals and merchants in a country of concern where both the U.S. individuals and the merchants use the U.S. company's infrastructure, such as an e-commerce platform. Both the U.S. company's transaction transferring bulk personal financial data and the payment transactions by U.S. individuals are exempt transactions.

    (2) Example 2. As ordinarily incident to and part of securitizing and selling asset-backed obligations (such as mortgage and nonmortgage loans) to a covered person, a U.S. bank provides bulk U.S. sensitive personal data to the covered person. The data transfers are exempt transactions.

    (3) Example 3. A U.S. bank or other financial institution, as ordinarily incident to and part of facilitating payments to U.S. persons in a country of concern, stores and processes the customers' bulk financial data using a data center operated by a third-party service provider in the country of concern. The use of this third-party service provider is a vendor agreement, but it is an exempt transaction that is ordinarily incident to and part of facilitating payment.

    (4) Example 4. Same as Example 3, but the underlying payments are between U.S. persons in the United States and do not involve a country of concern. The use of this third-party service provider is a vendor agreement, but it is not an exempt transaction because it is not ordinarily incident to facilitating this type of financial activity.

(5) Example 5. As part of operating an online marketplace for the purchase and sale of goods, a U.S. company, as ordinarily incident to and part of U.S. consumers' purchase of goods on that marketplace, transfers bulk contact information, payment information ( e.g., credit-card account number, expiration date, and security code), and delivery address to a merchant in a country of concern. The data transfers are exempt transactions because they are ordinarily incident to and part of U.S. consumers' purchase of goods.

    (6) Example 6. A U.S. investment adviser purchases securities of a company incorporated in a country of concern for the accounts of its clients. The investment adviser engages a broker-dealer located in a country of concern to execute the trade, and, as ordinarily incident to and part of the transaction, transfers to the broker-dealer its clients' covered personal identifiers and financial account numbers in bulk. This provision of data is an exempt transaction because it is ordinarily incident to and part of the provision of investment-management services.

    (7) Example 7. A U.S. company that provides payment-processing services sells bulk U.S. sensitive personal data to a covered person. This sale is prohibited data brokerage and is not an exempt transaction because it is not ordinarily incident to and part of the payment-processing services provided by the U.S. company.

    (8) Example 8. A U.S. bank facilitates international funds transfers to foreign persons not related to a country of concern, but through intermediaries or locations subject to the jurisdiction or control of a country of concern. These transfers result in access to bulk financial records by some covered persons to complete the transfers and manage associated risks. Providing this access as part of these transfers is ordinarily incident to the provision of financial services and is exempt.

(9) Example 9. A U.S. insurance company underwrites personal insurance to U.S. persons residing in foreign countries in the same region as a country of concern. The insurance company relies on its own business infrastructure and personnel in the country of concern to support its financial activity in the region, which results in access by covered persons at the insurance company supporting these activities to the bulk sensitive personal data of some U.S.-person customers residing in the region. Providing this access is ordinarily incident to the provision of financial services and is exempt.

    (10) Example 10. A U.S. bank operates a foreign branch in a country of concern and provides financial services to U.S. persons living within the country of concern. The bank receives a lawful request from the regulator in the country of concern to review the financial activity conducted in the country, which includes providing access to the bulk sensitive personal data of U.S. persons resident in the country or U.S. persons conducting transactions through the foreign branch. Responding to the regulator's request, including providing access to this bulk sensitive personal data, is ordinarily incident to the provision of financial services and is exempt.

    (11) Example 11. A U.S. bank voluntarily shares information, including relevant bulk sensitive personal data, with financial institutions organized under the laws of a country of concern for the purposes of, and consistent with industry practices for, fraud identification, combatting money laundering and terrorism financing, and U.S. sanctions compliance. Sharing this data for these purposes is ordinarily incident to the provision of financial services and is exempt. ( print page 86218)

(12) Example 12. A U.S. company provides wealth-management services and collects bulk personal financial data on its U.S. clients. The U.S. company appoints a citizen of a country of concern, who is located in a country of concern, to its board of directors. In connection with the board's data security and cybersecurity responsibilities, the director could access the bulk personal financial data. The appointment of the director, who is a covered person, is a restricted employment agreement and is not exempt because the board member's access to the bulk personal financial data is not ordinarily incident to the U.S. company's provision of wealth-management services.

    Corporate group transactions.

    (a) Subparts C and D of this part do not apply to data transactions to the extent they are:

    (1) Between a U.S. person and its subsidiary or affiliate located in (or otherwise subject to the ownership, direction, jurisdiction, or control of) a country of concern; and

    (2) Ordinarily incident to and part of administrative or ancillary business operations, including:

    (i) Human resources;

    (ii) Payroll, expense monitoring and reimbursement, and other corporate financial activities;

    (iii) Paying business taxes or fees;

    (iv) Obtaining business permits or licenses;

    (v) Sharing data with auditors and law firms for regulatory compliance;

    (vi) Risk management;

    (vii) Business-related travel;

    (viii) Customer support;

    (ix) Employee benefits; and

    (x) Employees' internal and external communications.

    (b) Examples— (1) Example 1. A U.S. company has a foreign subsidiary located in a country of concern, and the U.S. company's U.S.-person contractors perform services for the foreign subsidiary. As ordinarily incident to and part of the foreign subsidiary's payments to the U.S.-person contractors for those services, the U.S. company engages in a data transaction that gives the subsidiary access to the U.S.-person contractors' bulk personal financial data and covered personal identifiers. This is an exempt corporate group transaction.

(2) Example 2. A U.S. company aggregates bulk personal financial data. The U.S. company has a subsidiary that is a covered person because it is headquartered in a country of concern. The subsidiary is subject to the country of concern's national security laws requiring it to cooperate with and assist the country's intelligence services. The exemption for corporate group transactions would not apply to the U.S. parent's grant of a license to the subsidiary to access the parent's databases containing the bulk personal financial data for the purpose of complying with a request or order by the country of concern under those national security laws to provide access to that data, because granting such a license is not ordinarily incident to and part of administrative or ancillary business operations.

    (3) Example 3. A U.S. company's affiliate operates a manufacturing facility in a country of concern for one of the U.S. company's products. The affiliate uses employee fingerprints as part of security and identity verification to control access to that facility. To facilitate its U.S. employees' access to that facility as part of their job responsibilities, the U.S. company provides the fingerprints of those employees in bulk to its affiliate. The transaction is an exempt corporate group transaction.

    (4) Example 4. A U.S. company has a foreign subsidiary located in a country of concern that conducts research and development for the U.S. company. The U.S. company sends bulk personal financial data to the subsidiary for the purpose of developing a financial software tool. The transaction is not an exempt corporate group transaction because it is not ordinarily incident to and part of administrative or ancillary business operations.

    (5) Example 5. Same as Example 4, but the U.S. company has a foreign branch located in a country of concern instead of a foreign subsidiary. Because the foreign branch is a U.S. person as part of the U.S. company, the transaction occurs within the same U.S. person and is not subject to the prohibitions or restrictions. If the foreign branch allows employees who are covered persons to access the bulk personal financial data to develop the financial software tool, the foreign branch has engaged in restricted transactions.

    Transactions required or authorized by Federal law or international agreements, or necessary for compliance with Federal law.

    (a) Required or authorized by Federal law or international agreements. Subparts C and D of this part do not apply to data transactions to the extent they are required or authorized by Federal law or pursuant to an international agreement to which the United States is a party, including relevant provisions in the following:

    (1) Annex 9 to the Convention on International Civil Aviation, International Civil Aviation Organization Doc. 7300 (2022);

    (2) Section 2 of the Convention on Facilitation of International Maritime Traffic (1965);

    (3) Articles 1, 12, 14, and 16 of the Postal Payment Services Agreement (2021);

    (4) Articles 63, 64, and 65 of the Constitution of the World Health Organization (1946);

    (5) Article 2 of the Agreement Between the Government of the United States of America and the Government of the People's Republic of China Regarding Mutual Assistance in Customs Matters (1999);

    (6) Article 7 of the Agreement Between the Government of the United States of America and the Government of the People's Republic of China on Mutual Legal Assistance in Criminal Matters (2000);

    (7) Article 25 of the Agreement Between the Government of the United States of America and the Government of the People's Republic of China for the Avoidance of Double Taxation and the Prevention of Tax Evasion with Respect to Taxes on Income (1987);

    (8) Article 2 of the Agreement Between the United States of America and the Macao Special Administrative Region of the People's Republic of China for Cooperation to Facilitate the Implementation of FATCA (2021);

(9) Articles II, III, and VII of the Protocol to Extend and Amend the Agreement Between the Department of Health and Human Services of the United States of America and the National Health and Family Planning Commission of the People's Republic of China for Cooperation in the Science and Technology of Medicine and Public Health (2013);

    (10) Article III of the Treaty Between the United States and Cuba for the Mutual Extradition of Fugitives from Justice (1905);

(11) Articles 3, 4, 5, and 7 of the Agreement Between the Government of the United States of America and the Government of the Russian Federation on Cooperation and Mutual Assistance in Customs Matters (1994);

    (12) Articles 1, 2, 5, 7, 13, and 16 of the Treaty Between the United States of America and the Russian Federation on Mutual Legal Assistance in Criminal Matters (1999);

    (13) Articles I, IV, IX, XV, and XVI of the Treaty Between the Government of the United States of America and the Government of the Republic of Venezuela on Mutual Legal Assistance in Criminal Matters (1997); and ( print page 86219)

    (14) Articles 5, 6, 7, 9, 11, 19, 35, and 45 of the International Health Regulations (2005).

    (b) Global health and pandemic preparedness. Subparts C and D of this part do not apply to data transactions to the extent they are required or authorized by the following:

    (1) The Pandemic Influenza Preparedness and Response Framework;

    (2) The Global Influenza Surveillance and Response System; and

    (3) The Agreement between the Government of the United States of America and the Government of the People's Republic of China on Cooperation in Science and Technology (1979).

    (c) Compliance with Federal law. Subparts C and D of this part do not apply to data transactions to the extent that they are ordinarily incident to and part of ensuring compliance with any Federal laws and regulations, including the Bank Secrecy Act, 12 U.S.C. 1829b, 1951 through 1960, 31 U.S.C. 310, 5311 through 5314, 5316 through 5336; the Securities Act of 1933, 15 U.S.C. 77a et seq.; the Securities Exchange Act of 1934, 15 U.S.C. 78a et seq.; the Investment Company Act of 1940, 15 U.S.C. 80a-1 et seq.; the Investment Advisers Act of 1940, 15 U.S.C. 80b-1 et seq.; the International Emergency Economic Powers Act, 50 U.S.C. 1701 et seq.; the Export Administration Regulations, 15 CFR 730 et seq.; or any notes, guidance, orders, directives, or additional regulations related thereto.

    (d) Examples— (1) Example 1. A U.S. bank or other financial institution engages in a covered data transaction with a covered person that is ordinarily incident to and part of ensuring compliance with U.S. laws and regulations (such as OFAC sanctions and anti-money laundering programs required by the Bank Secrecy Act). This is an exempt transaction.

    (2) [Reserved]

    Investment agreements subject to a CFIUS action.

    (a) Exemption. Subparts C and D of this part do not apply to data transactions to the extent that they involve an investment agreement that is subject to a CFIUS action.

(b) Examples— (1) Example 1. A U.S. software provider is acquired in a CFIUS covered transaction by a foreign entity, and the transaction parties sign a mitigation agreement with CFIUS. The agreement has provisions governing the acquirer's ability to access the data of the U.S. software provider and its customers. The mitigation agreement contains a provision stating that it is a CFIUS action for purposes of this part. Before the effective date of the CFIUS mitigation agreement, the investment agreement is not subject to a CFIUS action and remains subject to these regulations to the extent otherwise applicable. Beginning on the effective date of the CFIUS mitigation agreement, the investment agreement is subject to a CFIUS action and exempt from this part.

    (2) Example 2. Same as Example 1, but CFIUS issues an interim order before entering a mitigation agreement. The interim order states that it constitutes a CFIUS action for purposes of this part. Before the effective date of the interim order, the investment agreement is not subject to a CFIUS action and remains subject to these regulations to the extent otherwise applicable. Beginning on the effective date of the interim order, the investment agreement is subject to a CFIUS action and is exempt from this part. The mitigation agreement also states that it constitutes a CFIUS action for purposes of this part. After the effective date of the mitigation agreement, the investment agreement remains subject to a CFIUS action and is exempt from this part.

    (3) Example 3. A U.S. biotechnology company is acquired by a foreign multinational corporation. CFIUS reviews this acquisition and concludes action without mitigation. This acquisition is not subject to a CFIUS action, and the acquisition remains subject to this part to the extent otherwise applicable.

(4) Example 4. A U.S. manufacturer is acquired by a foreign owner, and the transaction parties sign a mitigation agreement with CFIUS. The mitigation agreement provides for supply assurances and physical access restrictions but does not address data security, and it does not contain a provision explicitly designating that it is a CFIUS action. This acquisition is not subject to a CFIUS action, and the acquisition remains subject to this part to the extent otherwise applicable.

    (5) Example 5. As a result of CFIUS's review and investigation of a U.S. human genomic company's acquisition by a foreign healthcare company, CFIUS refers the transaction to the President with a recommendation to require the foreign acquirer to divest its interest in the U.S. company. The President issues an order prohibiting the transaction and requiring divestment of the foreign healthcare company's interests and rights in the human genomic company. The presidential order itself does not constitute a CFIUS action. Unless CFIUS takes action, such as by entering into an agreement or imposing conditions to address risk prior to completion of the divestment, the transaction remains subject to this part to the extent otherwise applicable for as long as the investment agreement remains in existence following the presidential order and prior to divestment.

    (6) Example 6. A U.S. healthcare company and foreign acquirer announce a transaction that they believe will be subject to CFIUS jurisdiction and disclose that they intend to file a joint voluntary notice soon. No CFIUS action has occurred yet, and the transaction remains subject to this part to the extent otherwise applicable.

    (7) Example 7. Same as Example 6, but the transaction parties file a joint voluntary notice with CFIUS. No CFIUS action has occurred yet, and the transaction remains subject to this part to the extent otherwise applicable.

    (8) Example 8. Company A, a covered person, acquires 100% of the equity and voting interest of Company B, a U.S. business that maintains bulk U.S. sensitive personal data of U.S. persons. After completing the transaction, the parties fail to implement the security requirements and other conditions required under this part. Company A and Company B later submit a joint voluntary notice to CFIUS with respect to the transaction. Upon accepting the notice, CFIUS determines that the transaction is a covered transaction and takes measures to mitigate interim risk that may arise as a result of the transaction until such time that the Committee has completed action, pursuant to 50 U.S.C. 4565(l)(3)(A)(iii). The interim order states that it constitutes a CFIUS action for purposes of this part. Beginning on the effective date of these measures imposed by the interim order, the security requirements and other applicable conditions under this part no longer apply to the transaction. The Department of Justice, however, may take enforcement action under this part, in coordination with CFIUS, with respect to the violations that occurred before the effective date of the interim order issued by CFIUS.

    (9) Example 9. Same as Example 8, but before engaging in the investment agreement for the acquisition, Company A and Company B submit the joint voluntary notice to CFIUS, CFIUS determines that the transaction is a CFIUS covered transaction, CFIUS identifies a risk related to data security arising from the transaction, and CFIUS negotiates and enters into a mitigation agreement with the parties to resolve that risk. The mitigation agreement contains a provision stating that it is a CFIUS action for purposes of this part. Because a CFIUS action has occurred before the parties engage in the ( print page 86220) investment agreement, the acquisition is exempt from this part.

    (10) Example 10. Same as Example 8, but before engaging in the investment agreement for the acquisition, the parties implement the security requirements and other conditions required under these regulations. Company A and Company B then submit a joint voluntary notice to CFIUS, which determines that the transaction is a CFIUS covered transaction. CFIUS identifies a risk related to data security arising from the transaction but determines that the regulations in this part adequately resolve the risk. CFIUS concludes action with respect to the transaction without taking any CFIUS action. Because no CFIUS action has occurred, the transaction remains subject to this part.

    (11) Example 11. Same facts as Example 10, but CFIUS determines that the security requirements and other conditions applicable under this part are inadequate to resolve the national security risk identified by CFIUS. CFIUS negotiates a mitigation agreement with the parties to resolve the risk, which contains a provision stating that it is a CFIUS action for purposes of this part. The transaction is exempt from this part beginning on the effective date of the CFIUS mitigation agreement.

    Telecommunications services.

    (a) Exemption. Subparts C and D of this part do not apply to data transactions, other than those involving data brokerage, to the extent that they are ordinarily incident to and part of the provision of telecommunications services, including international calling, mobile voice, and data roaming.

    (b) Examples— (1) Example 1. A U.S. telecommunications service provider collects covered personal identifiers from its U.S. subscribers. Some of those subscribers travel to a country of concern and use their mobile phone service under an international roaming agreement. The local telecommunications service provider in the country of concern shares these covered personal identifiers with the U.S. service provider for the purposes of either helping provision service to the U.S. subscriber or receiving payment for the U.S. subscriber's use of the country of concern service provider's network under that international roaming agreement. The U.S. service provider provides the country of concern service provider with network or device information for the purpose of provisioning services and obtaining payment for its subscribers' use of the local telecommunications service provider's network. Over the course of 12 months, the volume of network or device information shared by the U.S. service provider with the country of concern service provider for the purpose of provisioning services exceeds the applicable bulk threshold. These transfers of bulk U.S. sensitive personal data are ordinarily incident to and part of the provision of telecommunications services and are thus exempt transactions.

    (2) Example 2. A U.S. telecommunications service provider collects precise geolocation data on its U.S. subscribers. The U.S. telecommunications service provider sells this precise geolocation data in bulk to a covered person for the purpose of targeted advertising. This sale is not ordinarily incident to and part of the provision of telecommunications services and remains a prohibited transaction.
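
    For illustration only, and building on Example 1, the sketch below shows one way a service provider might track whether the volume of covered records shared with a single foreign counterparty over a rolling 12-month period exceeds a bulk threshold. The threshold figure, class name, and method names are hypothetical; the applicable bulk thresholds are defined elsewhere in this part, and, as Example 1 notes, transfers that are ordinarily incident to and part of providing telecommunications services are exempt regardless of volume.

```python
# Hypothetical sketch; the bulk thresholds are defined elsewhere in this part,
# and the figure below is a placeholder rather than the regulatory value.
from collections import deque
from datetime import datetime, timedelta

PLACEHOLDER_BULK_THRESHOLD = 100_000  # assumed count of U.S. person records


class RollingVolumeTracker:
    """Aggregates records shared with one counterparty over the preceding
    12 months. Assumes transfers are recorded in chronological order."""

    def __init__(self) -> None:
        self._events = deque()  # (timestamp, record_count) pairs

    def record_transfer(self, when: datetime, record_count: int) -> None:
        self._events.append((when, record_count))

    def exceeds_threshold(self, as_of: datetime) -> bool:
        cutoff = as_of - timedelta(days=365)  # simple 12-month lookback
        while self._events and self._events[0][0] < cutoff:
            self._events.popleft()  # drop transfers outside the window
        return sum(count for _, count in self._events) > PLACEHOLDER_BULK_THRESHOLD
```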

    Drug, biological product, and medical device authorizations.

    (a) Exemption. Subparts C and D of this part do not apply to a data transaction that:

    (1) Involves “regulatory approval data” as defined in this section; and

    (2) Is necessary to obtain or maintain regulatory approval to market a drug, biological product, device, or a combination product in a country of concern, provided that the U.S. person complies with the recordkeeping and reporting requirements set forth in §§ 202.1101(a) and 202.1102 with respect to such transaction.

    (b) Regulatory approval data. For purposes of this section, the term regulatory approval data means de-identified sensitive personal data that is required to be submitted to a country of concern regulatory entity to obtain or maintain authorization or approval to research or market a drug, biological product, device, or combination product, including in relation to post-marketing studies and post-marketing product surveillance activities, and supplemental product applications for additional uses. The term excludes sensitive personal data not reasonably necessary for a regulatory entity to assess the safety and effectiveness of the drug, biological product, device, or combination product.

    (c) Other terms. For purposes of this section, the terms “drug,” “biological product,” “device,” and “combination product” have the meanings given to them in 21 U.S.C. 321(g)(1), 42 U.S.C. 262(i)(1), 21 U.S.C. 321(h)(1), and 21 CFR 3.2(e), respectively.

    (d) Examples —(1) Example 1. A U.S. pharmaceutical company seeks to market a new drug in a country of concern. The company submits a marketing application to the regulatory entity in the country of concern with authority to approve the drug in the country of concern. The marketing application includes the safety and effectiveness data reasonably necessary to obtain regulatory approval in that country. The transfer of data to the country of concern's regulatory entity is exempt from the prohibitions in this part.

    (2) Example 2. Same as Example 1, except the regulatory entity in the country of concern requires that the data be de-anonymized. The transfer of data is not exempt under this section, because the data includes sensitive personal data that is identified to an individual.

    (3) Example 3. Same as Example 1, except the U.S. company enters a vendor agreement with a covered person located in the country of concern to store, organize, and prepare the bulk U.S. sensitive personal data for submission to the regulatory agency. The transaction is not exempt under this section, because the use of a covered person to prepare the regulatory submission is not necessary to obtain regulatory approval.

    Other clinical investigations and post-marketing surveillance data.

    (a) Exemption. Subparts C and D of this part do not apply to data transactions to the extent that those transactions are:

    (1) Ordinarily incident to and part of clinical investigations regulated by the U.S. Food and Drug Administration (“FDA”) under sections 505(i) and 520(g) of the Federal Food, Drug, and Cosmetic Act (“FD&C Act”) or clinical investigations that support applications to the FDA for research or marketing permits for drugs, biological products, devices, combination products, or infant formula; or

    (2) Ordinarily incident to and part of the collection or processing of clinical care data indicating real-world performance or safety of products, or the collection or processing of post-marketing surveillance data (including pharmacovigilance and post-marketing safety monitoring), and necessary to support or maintain authorization by the FDA, provided the data is de-identified.

    (b) [Reserved]

    Subpart F—Determination of Countries of Concern

    Determination of countries of concern.

    (a) Countries of concern. Solely for purposes of the Order and this part, the Attorney General has determined, with the concurrence of the Secretaries of State and Commerce, that the following foreign governments have engaged in a long-term pattern or serious instances of conduct significantly adverse to the national security of the United States or security and safety of U.S. persons and pose a significant risk of exploiting government-related data or bulk U.S. sensitive personal data to the detriment of the national security of the United States or security and safety of U.S. persons:

    (1) China;

    (2) Cuba;

    (3) Iran;

    (4) North Korea;

    (5) Russia; and

    (6) Venezuela.

    (b) Effective date of amendments. Any amendment to the list of countries of concern will apply to any covered data transaction that is initiated, pending, or completed on or after the effective date of the amendment.

    Subpart G—Covered Persons

    Designation of covered persons.

    (a) Designations. The Attorney General may designate any person as a covered person for purposes of this part if, after consultation with other agencies as the Attorney General deems appropriate, the Attorney General determines the person meets any of the criteria set forth in § 202.211(a)(5) of this part.

    (b) Information considered. In determining whether to designate a person as a covered person, the Attorney General may consider any information or material the Attorney General deems relevant and appropriate, classified or unclassified, from any Federal department or agency or from any other source.

    (c) Covered Persons List. The names of persons designated as covered persons for purposes of this part, transactions with whom are prohibited or restricted pursuant to this part, are published in the Federal Register and incorporated into the National Security Division's Covered Persons List. The Covered Persons List is accessible on the National Security Division's website at https://www.justice.gov/nsd.

    (d) Non-exhaustive. The list of designated covered persons described in this section is not exhaustive of all covered persons and supplements the categories in the definition of covered persons in § 202.211.

    (e) Effective date; actual and constructive knowledge. (1) Designation as a covered person will be effective from the date of any public announcement by the Department. Except as otherwise authorized in this part, a U.S. person with actual knowledge of a designated person's status is prohibited from knowingly engaging in a covered data transaction with that person on or after the date of the Department's public announcement.

    (2) Publication in the Federal Register is deemed to provide constructive knowledge of a person's status as a covered person.

    Procedures governing removal from the Covered Persons List.

    (a) Requests for removal from the Covered Persons List. A person may petition to seek administrative reconsideration of their designation, or may assert that the circumstances resulting in the designation no longer apply, and thus seek to be removed from the Covered Persons List pursuant to the following administrative procedures:

    (b) Content of requests. A covered person seeking removal under paragraph (a) of this section may submit arguments or evidence that the person believes establish that insufficient basis exists for the designation. Such a person also may propose remedial steps on the person's part, such as corporate reorganization, resignation of persons from positions in a listed entity, or similar steps, that the person believes would negate the basis for designation.

    (c) Additional content; form and method of submission. Requests for removal from the Covered Persons List must be submitted in accordance with this section and with subpart L of this part.

    (d) Requests for more information. The information submitted by the listed person seeking removal will be reviewed by the Attorney General, who may request clarifying, corroborating, or other additional information.

    (e) Meetings. A person seeking removal may request a meeting with the Attorney General; however, such meetings are not required, and the Attorney General may, in the Attorney General's discretion, decline to conduct such a meeting prior to completing a review pursuant to this section.

    (f) Decisions. After the Attorney General has conducted a review of the request for removal, and after consultation with other agencies as the Attorney General deems appropriate, the Attorney General will provide a written decision to the person seeking removal. A covered person's status as a covered person—including its associated prohibitions and restrictions under this part—remains in effect during the pendency of any request to be removed from the Covered Persons List.

    Subpart H—Licensing

    General licenses.

    (a) General course of procedure. The Department may, as appropriate, issue general licenses to authorize, under appropriate terms and conditions, transactions that are subject to the prohibitions or restrictions in this part. In determining whether to issue a general license, the Attorney General may consider any information or material the Attorney General deems relevant and appropriate, classified or unclassified, from any Federal department or agency or from any other source.

    (b) Relationship with specific licenses. It is the policy of the Department not to grant applications for specific licenses authorizing transactions to which the provisions of a general license are applicable.

    (c) Reports. Persons availing themselves of certain general licenses may be required to file reports and statements in accordance with the instructions specified in those licenses, this part, or the Order. Failure to file timely all required information in such reports or statements may nullify the authorization otherwise provided by the general license and result in apparent violations of the applicable prohibitions that may be subject to enforcement action.

    Specific licenses.

    (a) General course of procedure. Transactions subject to the prohibitions or restrictions in this part or the Order, and that are not otherwise permitted under this part or a general license, may be permitted only under a specific license, under appropriate terms and conditions.

    (b) Content of applications for specific licenses. Applications for specific licenses shall include, at a minimum, a description of the nature of the transaction, including each of the following:

    (1) The types and volumes of government-related data or bulk U.S. sensitive personal data involved in the transactions;

    (2) The identity of the transaction parties, including any ownership of entities or citizenship or primary residence of individuals;

    (3) The end-use of the data and the method of data transfer; and

    (4) Any other information that the Attorney General may require.

    (c) Additional content; form and method of submissions. Requests for specific licenses must be submitted in accordance with this section and with subpart L of this part.

    (d) Additional conditions. Applicants should submit only one copy of a specific license application to the Department; submitting multiple copies may result in processing delays. Any person having an interest in a transaction or proposed transaction may file an application for a specific license authorizing such a transaction.

    (e) Further information to be supplied. Applicants may be required to furnish such further information as the Department deems necessary to assist in making a determination. Any applicant or other party-in-interest desiring to present additional information concerning a specific license application may do so at any time before or after the Department makes its decision with respect to the application. In unique circumstances, the Department may determine, in its discretion, that an oral presentation regarding a license application would assist in the Department's review of the issues involved. Any requests to make such an oral presentation must be submitted electronically by emailing the National Security Division at NSD.FIRS.datasecurity@usdoj.gov or using another official method to make such requests, in accordance with any instructions on the National Security Division's website.

    (f) Decisions. In determining whether to issue a specific license, the Attorney General may consider any information or material the Attorney General deems relevant and appropriate, classified or unclassified, from any Federal department or agency or from any other source. The Department will advise each applicant of the decision respecting the applicant's filed application. The Department's decision with respect to a license application shall constitute final agency action.

    (g) Time to issuance. The Department shall endeavor to respond to any request for a specific license within 45 days after receipt of the request and of any requested additional information and documents.

    (h) Scope. (1) Unless otherwise specified in the license, a specific license authorizes the transaction:

    (i) Only between the parties identified in the license;

    (ii) Only with respect to the data described in the license; and

    (iii) Only to the extent the conditions specified in the license are satisfied. The applicant must inform any other parties identified in the license of the license's scope and of the specific conditions applicable to them.

    (2) The Department will determine whether to grant specific licenses in reliance on representations the applicant made or submitted in connection with the license application, letters of explanation, and other documents submitted. Any license obtained based on a false or misleading representation in the license application, in any document submitted in connection with the license application, or during an oral presentation under this section shall be deemed void as of the date of issuance.

    (i) Reports under specific licenses. As a condition for the issuance of any specific license, the licensee may be required to file reports or statements with respect to the transaction or transactions authorized by the specific license in such form and at such times as may be prescribed in the license. Failure to file timely all required information in such reports or statements may nullify the authorization otherwise provided by the specific license and result in apparent violations of the applicable prohibitions that may be subject to enforcement action.

    (j) Effect of denial. The denial of a specific license does not preclude the reconsideration of an application or the filing of a further application. The applicant or any other party-in-interest may at any time request, by written correspondence, reconsideration of the denial of an application based on new facts or changed circumstances.

    General provisions.

    (a) Effect of license. (1) No license issued under this subpart, or otherwise issued by the Department, authorizes or validates any transaction effected prior to the issuance of such license or other authorization, unless specifically provided for in such license or authorization.

    (2) No license issued under this subpart authorizes or validates any transaction prohibited under or subject to this part unless the license is properly issued by the Department and specifically refers to this part.

    (3) Any license authorizing or validating any transaction that is prohibited under or otherwise subject to this part has the effect of removing or amending those prohibitions or other requirements from the transaction, but only to the extent specifically stated by the terms of the license. Unless the license otherwise specifies, such an authorization does not create any right, duty, obligation, claim, or interest in, or with respect to, any property that would not otherwise exist under ordinary principles of law.

    (4) Nothing contained in this part shall be construed to supersede the requirements established under any other provision of law or to relieve a person from any requirement to obtain a license or authorization from another department or agency of the United States Government in compliance with applicable laws and regulations subject to the jurisdiction of that department or agency. For example, the issuance of a specific license authorizing a transaction otherwise prohibited by this part does not operate as a license or authorization that may otherwise be required from the U.S. Department of Commerce, U.S. Department of State, U.S. Department of the Treasury, or any other department or agency of the United States Government to conclude the transaction.

    (b) Amendment, modification, or rescission. Except as otherwise provided by law, any licenses (whether general or specific), authorizations, instructions, or forms issued thereunder may be amended, modified, or rescinded at any time.

    (c) Consultation. The Department will issue, amend, modify, or rescind a general or specific license in concurrence with the Departments of State, Commerce, and Homeland Security and in consultation with other relevant agencies.

    (d) Exclusion from licenses and other authorizations. The Attorney General reserves the right to exclude any person, property, or transaction from the operation of any license or from the privileges conferred by any license. The Attorney General also reserves the right to restrict the applicability of any license to particular persons, property, transactions, or classes thereof. Such actions are binding upon all persons receiving actual or constructive notice of the exclusions or restrictions.

    Subpart I—Advisory Opinions

    Inquiries concerning application of this part.

    (a) General. Any U.S. person party to a transaction potentially regulated under the Order and this part, or an agent of the party to such a transaction on the party's behalf, may request from the Attorney General a statement of the present enforcement intentions of the Department of Justice under the Order with respect to that transaction that may be subject to the prohibitions or restrictions in the Order and this part (“advisory opinion”).

    (b) Anonymous, hypothetical, non-party and ex post facto review requests excluded. The entire transaction that is the subject of the advisory opinion request must be an actual, as opposed to hypothetical, transaction and involve disclosed, as opposed to anonymous, parties to the transaction. Advisory opinion requests must be submitted by a U.S. person party to the transaction or that party's agent and have no application to a party that does not join the request. The transaction need not involve only prospective conduct, but an advisory opinion request will not be considered unless that portion of the transaction for which an opinion is sought involves only prospective conduct.

    (c) Contents. Each advisory opinion request shall be specific and must be accompanied by all material information bearing on the conduct for which an advisory opinion is requested, and on the circumstances of the prospective conduct, including background information, complete copies of any and all operative documents, and detailed statements of all collateral or oral understandings, if any. Each request must include, at a minimum:

    (1) The identities of the transaction parties, including any ownership of entities or citizenship or primary residence of individuals;

    (2) A description of the nature of the transaction, including the types and volumes of government-related data or bulk U.S. sensitive personal data involved in the transaction, the end-use of the data, the method of data transfer, and any restrictions or requirements related to a party's right or ability to control, access, disseminate, or dispose of the data; and

    (3) Any potential basis for exempting or excluding the transaction from the prohibitions or restrictions imposed in the Order and this part.

    (d) Additional contents; format and method of submissions. Requests for advisory opinions must be submitted in accordance with this section and with subpart L of this part.

    (e) Further information to be supplied. Each party shall provide any additional information or documents that the Department of Justice may thereafter request in its review of the matter. Any information furnished orally shall be confirmed promptly in writing; signed by or on behalf of the party that submitted the initial review request; and certified to be a true, correct, and complete disclosure of the requested information. A request will not be deemed complete until the Department of Justice receives such additional information. In connection with an advisory opinion request, the Department of Justice may conduct any independent investigation it believes appropriate.

    (f) Outcomes. After submission of an advisory opinion request, the Department, in its discretion, may state its present enforcement intention under the Order and this part with respect to the proposed conduct; may decline to state its present enforcement intention; or, if circumstances warrant, may take such other position or initiate such other action as it considers appropriate. Any requesting party or parties may withdraw a request at any time prior to issuance of an advisory opinion. The Department remains free, however, to submit such comments to the requesting party or parties as it deems appropriate. Failure to take action after receipt of a request, documents, or information, whether submitted pursuant to this procedure or otherwise, shall not in any way limit or estop the Department from taking any action at such time thereafter as it deems appropriate. The Department reserves the right to retain any advisory opinion request, document, or information submitted to it under this procedure or otherwise, to disclose any advisory opinion and advisory opinion request, including the identities of the requesting party and foreign parties to the transaction, the general nature and circumstances of the proposed conduct, and the action of the Department in response to any advisory opinion request, consistent with applicable law, and to use any such request, document, or information for any governmental purpose.

    (g) Time for response. The Department shall endeavor to respond to any advisory opinion request within 30 days after receipt of the request and of any requested additional information and documents.

    (h) Written decisions only. The requesting party or parties may rely only upon a written advisory opinion signed by the Attorney General.

    (i) Effect of advisory opinion. Each advisory opinion can be relied upon by the requesting party or parties to the extent the disclosures made pursuant to this subpart were accurate and complete and to the extent the disclosures continue accurately and completely to reflect circumstances after the date of the issuance of the advisory opinion. An advisory opinion will not restrict enforcement actions by any agency other than the Department of Justice. It will not affect a requesting party's obligations to any other agency or under any statutory or regulatory provision other than those specifically discussed in the advisory opinion.

    (j) Amendment or revocation of advisory opinion. An advisory opinion may be amended or revoked at any time after it has been issued. Notice of such amendment or revocation will be given in the same manner as notice of the advisory opinion was originally given or in the Federal Register. Whenever possible, a notice of amendment or revocation will state when the Department will consider a party's reliance on the superseded advisory opinion to be unreasonable, and any transition period that may be applicable.

    (k) Compliance. Neither the submission of an advisory opinion request, nor its pendency, shall in any way alter the responsibility or obligation of a requesting party to comply with the Order, this part, or any other applicable law.

    Subpart J—Due Diligence and Audit Requirements

    Due diligence for restricted transactions.

    (a) Data compliance program. By the effective date of this part, U.S. persons engaging in any restricted transactions shall develop and implement a data compliance program.

    (b) Requirements. The data compliance program shall include, at a minimum, each of the following requirements:

    (1) Risk-based procedures for verifying data flows involved in any restricted transaction, including procedures to verify and log, in an auditable manner, the following:

    (i) The types and volumes of government-related data or bulk U.S. sensitive personal data involved in the transaction;

    (ii) The identity of the transaction parties, including any ownership of entities or citizenship or primary residence of individuals; and

    (iii) The end-use of the data and the method of data transfer;

    (2) For restricted transactions that involve vendors, risk-based procedures for verifying the identity of vendors;

    (3) A written policy that describes the data compliance program and that is annually certified by an officer, executive, or other employee responsible for compliance;

    (4) A written policy that describes the implementation of the security requirements as defined in § 202.248 of this part and that is annually certified by an officer, executive, or other employee responsible for compliance; and

    (5) Any other information that the Attorney General may require.
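
    For illustration only, the sketch below shows one way a U.S. person's compliance team might capture, in an auditable form, the elements that paragraph (b)(1) of this section requires to be verified and logged: the types and volumes of data, the identity and ownership of the transaction parties, the end-use, and the method of transfer. The record layout, field names, and the hash-chaining approach are hypothetical choices and are not prescribed by this part.

```python
# Hypothetical sketch only; neither the record layout nor the tooling shown
# here is prescribed by this part.
import hashlib
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone


@dataclass
class DataFlowRecord:
    # Elements the due-diligence requirements above say must be verified and
    # logged in an auditable manner.
    data_types: list        # e.g., ["covered personal identifiers"]
    record_volume: int      # approximate number of U.S. person records
    parties: list           # identity, ownership, citizenship or primary residence
    end_use: str            # stated end-use of the data
    transfer_method: str    # e.g., "SFTP export", "vendor API"
    logged_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())


def append_record(record: DataFlowRecord, path: str = "dataflow-log.jsonl") -> str:
    """Append the record as a JSON line, chaining a SHA-256 digest of the
    previous line so that later tampering with earlier entries is detectable."""
    prev_digest = "0" * 64
    try:
        with open(path, "rb") as f:
            lines = f.read().splitlines()
        if lines:
            prev_digest = hashlib.sha256(lines[-1]).hexdigest()
    except FileNotFoundError:
        pass
    entry = json.dumps({"prev": prev_digest, **asdict(record)}, sort_keys=True)
    with open(path, "a", encoding="utf-8") as f:
        f.write(entry + "\n")
    return hashlib.sha256(entry.encode("utf-8")).hexdigest()
```

    A tamper-evident, append-only log of this kind is only one way to make the required records auditable; the point is that each element is captured at the time of the transaction and can later be examined by the independent auditor described in this subpart.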

    Audits for restricted transactions.

    (a) Audit required. U.S. persons that engage in any restricted transactions under § 202.401 of this part shall conduct an audit that complies with the requirements of this section.

    (b) Who may conduct the audit. The auditor:

    (1) Must be qualified and competent to examine, verify, and attest to the U.S. person's compliance with and the effectiveness of the security requirements, as defined in § 202.248 of this part, and all other applicable requirements, as defined in § 202.401 of this part, implemented for restricted transactions;

    (2) Must be independent and external; and

    (3) Cannot be a covered person or a country of concern.

    (c) When required. The audit must be performed once for each calendar year in which the U.S. person engages in any restricted transactions.

    (d) Timeframe. The audit must cover the preceding 12 months.

    (e) Scope. The audit must:

    (1) Examine the U.S. person's data transactions;

    (2) Examine the U.S. person's data compliance program required under § 202.1001 of this part and its implementation;

    (3) Examine relevant records required under § 202.1101 of this part;

    (4) Examine the U.S. person's security requirements, as defined by § 202.248 of this part; and

    (5) Use a reliable methodology to conduct the audit.

    (f) Report. (1) The auditor must prepare and submit a written report to the U.S. person within 60 days of the completion of the audit.

    (2) The audit report must:

    (i) Describe the nature of any prohibited transactions, restricted transactions, and exempt transactions engaged in by the U.S. person;

    (ii) Describe the methodology undertaken, including the policies and other documents reviewed, personnel interviewed, and any facilities, equipment, networks, or systems examined;

    (iii) Describe the effectiveness of the U.S. person's data compliance program and its implementation;

    (iv) Describe any vulnerabilities or deficiencies in the implementation of the security requirements that have affected or could affect access to government-related data or bulk U.S. sensitive personal data by a country of concern or covered person;

    (v) Describe any instances in which the security requirements failed or were otherwise not effective in mitigating access to government-related data or bulk U.S. sensitive personal data by a country of concern or covered person; and

    (vi) Recommend any improvements or changes to policies, practices, or other aspects of the U.S. person's business to ensure compliance with the security requirements.

    (3) U.S. persons engaged in restricted transactions must retain the audit report for a period of at least 10 years, consistent with the recordkeeping requirements in § 202.1101.

    Subpart K—Reporting and Recordkeeping Requirements

    Records and recordkeeping requirements.

    (a) Records. Except as otherwise provided, U.S. persons engaging in any transaction subject to the provisions of this part shall keep a full and accurate record of each such transaction engaged in, and such record shall be available for examination for at least 10 years after the date of such transaction.

    (b) Additional recordkeeping requirements. U.S. persons engaging in any restricted transaction shall create and maintain, at a minimum, the following records in an auditable manner:

    (1) A written policy that describes the data compliance program and that is certified annually by an officer, executive, or other employee responsible for compliance;

    (2) A written policy that describes the implementation of any applicable security requirements as defined in § 202.248 of this part and that is certified annually by an officer, executive, or other employee responsible for compliance;

    (3) The results of any annual audits that verify the U.S. person's compliance with the security requirements and any conditions on a license;

    (4) Documentation of the due diligence conducted to verify the data flow involved in any restricted transaction, including:

    (i) The types and volumes of government-related data or bulk U.S. sensitive personal data involved in the transaction;

    (ii) The identity of the transaction parties, including any direct and indirect ownership of entities or citizenship or primary residence of individuals; and

    (iii) A description of the end-use of the data;

    (5) Documentation of the method of data transfer;

    (6) Documentation of the dates the transaction began and ended;

    (7) Copies of any agreements associated with the transaction;

    (8) Copies of any relevant licenses or advisory opinions;

    (9) The document reference number for any original document issued by the Attorney General, such as a license or advisory opinion;

    (10) A copy of any relevant documentation received or created in connection with the transaction; and

    (11) An annual certification by an officer, executive, or other employee responsible for compliance of the completeness and accuracy of the records documenting due diligence.

    Reports to be furnished on demand.

    (a) Reports. Every person is required to furnish under oath, in the form of reports or otherwise, from time to time and at any time as may be required by the Department of Justice, complete information relative to any act or transaction or covered data transaction, regardless of whether such act, transaction, or covered data transaction is effected pursuant to a license or otherwise, subject to the provisions of this part. The Department of Justice may require that such reports include the production of any books, contracts, letters, papers, or other hard copy or electronic documents relating to any such act, transaction, or covered data transaction, in the custody or control of the persons required to make such reports. Reports may be required either before, during, or after such acts, transactions, or covered data transactions. The Department of Justice may, through any person or agency, conduct investigations, hold hearings, administer oaths, examine witnesses, receive evidence, take depositions, and require by subpoena the attendance and testimony of witnesses and the production of any books, contracts, letters, papers, and other hard copy or electronic documents relating to any matter under investigation, regardless of whether any report has been required or filed in connection therewith.

    (b) Definition of the term “document.” For purposes of paragraph (a) of this section, the term document includes any written, recorded, or graphic matter or other means of preserving thought or expression (including in electronic format), and all tangible things stored in any medium from which information can be processed, transcribed, or obtained directly or indirectly, including correspondence, memoranda, notes, messages, contemporaneous communications such as text and instant messages, letters, emails, spreadsheets, metadata, contracts, bulletins, diaries, chronological data, minutes, books, reports, examinations, charts, ledgers, books of account, invoices, air waybills, bills of lading, worksheets, receipts, printouts, papers, schedules, affidavits, presentations, transcripts, surveys, graphic representations of any kind, drawings, photographs, graphs, video or sound recordings, and motion pictures or other film.

    (c) Format. Persons providing documents to the Department of Justice pursuant to this section must produce documents in a usable format agreed upon by the Department of Justice. For guidance, see the Department of Justice's data delivery standards available on the National Security Division's website at https://www.justice.gov/nsd.

    Annual reports.

    (a) Who must report. An annual report must be filed by any U.S. person that is engaged in a restricted transaction involving cloud-computing services, and that has 25% or more of the U.S. person's equity interests owned (directly or indirectly, through any contract, arrangement, understanding, relationship, or otherwise) by a country of concern or covered person.

    (b) Primary responsibility to report. A report may be filed on behalf of a U.S. person engaging in the data transaction described in § 202.1103(a) by an attorney, agent, or other person. Primary responsibility for reporting, however, rests with the actual U.S. person engaging in the data transaction. No U.S. person is excused from filing a report by reason of the fact that another U.S. person has submitted a report with regard to the same data transaction, except where the U.S. person has actual knowledge that the other U.S. person filed the report.

    (c) When reports are due. A report on the data transactions described in § 202.1103(a) engaged in as of December 31 of the previous year shall be filed annually by March 1 of the subsequent year.

    (d) Contents of reports. Annual reports on the data transactions described in § 202.1103(a) shall include the following:

    (1) The name and address of the U.S. person engaging in the covered data transaction, and the name, telephone number, and email address of a contact from whom additional information may be obtained;

    (2) A description of the covered data transaction, including:

    (i) The date of the transaction;

    (ii) The types and volumes of government-related data or bulk U.S. sensitive personal data involved in the transaction;

    (iii) The method of data transfer; and

    (iv) Any persons participating in the data transaction and their respective locations, including the name and location of each data recipient, the ownership of entities or citizenship or primary residence of individuals, the name and location of any covered persons involved in the transaction, and the name of any countries of concern involved in the transaction;

    (3) A copy of any relevant documentation received or created in connection with the transaction; and

    (4) Any other information that the Department of Justice may require.

    (e) Additional contents; format and method of submission. Reports required by this section must be submitted in accordance with this section and with subpart L of this part.

    Reports on rejected prohibited transactions.

    (a) Who must report. A report must be filed by any U.S. person that has received and affirmatively rejected (including automatically rejected using software, technology, or automated tools) an offer from another person to engage in a prohibited transaction involving data brokerage.

    (b) When reports are due. U.S. persons shall file reports within 14 days of rejecting a transaction prohibited by this part.

    (c) Contents of reports. Reports on rejected transactions shall include the following, to the extent known and available to the person filing the report at the time the transaction is rejected:

    (1) The name and address of the U.S. person that rejected the prohibited transaction, and the name, telephone number, and email address of a contact from whom additional information may be obtained;

    (2) A description of the rejected transaction, including:

    (i) The date the transaction was rejected;

    (ii) The types and volumes of government-related data or bulk U.S. sensitive personal data involved in the transaction;

    (iii) The method of data transfer;

    (iv) Any persons attempting to participate in the transaction and their respective locations, including the name and location of each data recipient, the ownership of entities or citizenship or primary residence of individuals, the name and location of any covered persons involved in the transaction, and the name of any countries of concern involved in the transaction;

    (v) A copy of any relevant documentation received or created in connection with the transaction; and

    (vi) Any other information that the Department of Justice may require.

    (d) Additional contents; format and method of submission. Reports required by this section must be submitted in accordance with this section and with subpart L of this part.

    Subpart L—Submitting Applications, Requests, Reports, and Responses

    Procedures.

    (a) Application of this subpart. This subpart applies to any submissions required or permitted by this part, including reports of known or suspected violations submitted pursuant to § 202.302, requests for removal from the Covered Persons List submitted pursuant to subpart G of this part, requests for specific licenses submitted pursuant to § 202.802, advisory opinion requests submitted pursuant to subpart I of this part, annual reports submitted pursuant to § 202.1103, reports on rejected prohibited transactions submitted pursuant to § 202.1104, and responses to pre-penalty notices and findings of violations submitted pursuant to § 202.1306 (collectively, “submissions”).

    (b) Form of submissions. Submissions must follow the instructions in this part and any instructions on the National Security Division's website. With the exception of responses to pre-penalty notices or findings of violations submitted pursuant to subpart M of this part, submissions must use the forms on the National Security Division's website or another official reporting option as specified by the National Security Division.

    (c) Method of submissions. Submissions must be made to the National Security Division electronically by emailing the National Security Division at NSD.FIRS.datasecurity@usdoj.gov or using another official electronic reporting option, in accordance with any instructions on the National Security Division's website.

    (d) Certification. If the submitting party is an individual, the submission must be signed by the individual or the individual's attorney. If the submitting party is not an individual, the submission must be signed on behalf of each submitting party by an officer, director, a person performing the functions of an officer or a director of, or an attorney for, the submitting party. Annual reports submitted pursuant to § 202.1103, and reports on rejected transactions submitted pursuant to § 202.1104, must be signed by an officer, a director, a person performing the functions of an officer or a director, or an employee responsible for compliance. In appropriate cases, the Department of Justice may require the chief executive officer of a requesting party to sign the request. Each such person signing a submission must certify that the submission is true, accurate, and complete.

    Subpart M—Penalties and Finding of Violation

    Penalties for violations.

    (a) Civil and criminal penalties. Section 206 of IEEPA, 50 U.S.C. 1705, is applicable to violations of the provisions of any license, ruling, regulation, order, directive, or instruction issued by or pursuant to the direction or authorization of the Attorney General pursuant to this part or otherwise under IEEPA.

    (1) A civil penalty not to exceed the amount set forth in section 206 of IEEPA may be imposed on any person who violates, attempts to violate, conspires to violate, or causes a violation of any license, order, regulation, or prohibition issued under IEEPA.

    (2) IEEPA provides for a maximum civil penalty not to exceed the greater of $368,136 or an amount that is twice the amount of the transaction that is the basis of the violation with respect to which the penalty is imposed.

    (3) A person who willfully commits, willfully attempts to commit, willfully conspires to commit, or aids or abets in the commission of a violation of any license, order, regulation, or prohibition issued under IEEPA shall, upon conviction, be fined not more than $1,000,000, or if a natural person, may be imprisoned for not more than 20 years, or both.
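
    As a purely arithmetic illustration of the civil ceiling described in paragraph (a)(2) of this section, and noting that the dollar figure is periodically adjusted for inflation as described in paragraph (b) of this section, the maximum civil penalty for a single violation can be computed as the greater of the statutory cap or twice the transaction amount. The snippet below is a sketch, not legal guidance.

```python
# Illustrative only; the statutory cap is adjusted for inflation over time.
STATUTORY_CAP = 368_136  # dollars, the figure cited in paragraph (a)(2)


def max_civil_penalty(transaction_amount: float) -> float:
    """Greater of the statutory cap or twice the amount of the transaction."""
    return max(STATUTORY_CAP, 2 * transaction_amount)


# A $1,000,000 transaction yields a ceiling of $2,000,000, while a $50,000
# transaction yields the statutory cap of $368,136.
assert max_civil_penalty(1_000_000) == 2_000_000
assert max_civil_penalty(50_000) == STATUTORY_CAP
```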

    (b) Adjustment of civil penalties. The civil penalties provided in IEEPA are subject to adjustment pursuant to the Federal Civil Penalties Inflation Adjustment Act of 1990 (Public Law 101-410, as amended, 28 U.S.C. 2461 note).

    (c) Adjustment of criminal penalties. The criminal penalties provided in IEEPA are subject to adjustment pursuant to 18 U.S.C. 3571.

    (d) False statements. Pursuant to 18 U.S.C. 1001, whoever, in any matter within the jurisdiction of the executive, legislative, or judicial branch of the Government of the United States, knowingly and willfully falsifies, conceals, or covers up by any trick, scheme, or device a material fact; or makes any materially false, fictitious, or fraudulent statement or representation; or makes or uses any false writing or document knowing the same to contain any materially false, fictitious, or fraudulent statement or entry shall be fined under title 18, United States Code, imprisoned, or both.

    (e) Other applicable laws. Violations of this part may also be subject to other applicable laws.

    Process for pre-penalty notice.

    (a) When and how issued. (1) If the Department of Justice has reason to believe that there has occurred a violation of any provision of this part or a violation of the provisions of any license, ruling, regulation, order, directive, or instruction issued by or pursuant to the direction or authorization of the Attorney General pursuant to this part or otherwise under IEEPA and determines that a civil monetary penalty is warranted, the Department of Justice will issue a pre-penalty notice informing the alleged violator of the agency's intent to impose a monetary penalty.

    (2) The pre-penalty notice shall be in writing.

    (3) The pre-penalty notice may be issued whether or not another agency has taken any action with respect to the matter.

    (4) The Department shall provide the alleged violator with the relevant information that is not privileged, classified, or otherwise protected, and that forms the basis for the pre-penalty notice, including a description of the alleged violation and proposed penalty amount.

    (b) Opportunity to respond. An alleged violator has the right to respond to a pre-penalty notice in accordance with § 202.1306 of this part.

    (c) Settlement. Settlement discussion may be initiated by the Department of Justice, the alleged violator, or the alleged violator's authorized representative.

    (d) Representation. A representative of the alleged violator may act on behalf of the alleged violator, but any oral communication with the Department of Justice prior to a written submission regarding the specific allegations contained in the pre-penalty notice must be preceded by a written letter of representation, unless the pre-penalty notice was served upon the alleged violator in care of the representative.

    Penalty imposition.

    If, after considering any written response to the pre-penalty notice and any relevant facts, the Department of Justice determines that there was a violation by the alleged violator named in the pre-penalty notice and that a civil monetary penalty is appropriate, the Department of Justice may issue a penalty notice to the violator containing a determination of the violation and the imposition of the monetary penalty. The Department shall provide the violator with any relevant, non-classified information that forms the basis of the penalty. The issuance of the penalty notice shall constitute final agency action. The violator has the right to seek judicial review of that final agency action in Federal district court.

    Administrative collection and litigation.

    In the event that the violator does not pay the penalty imposed pursuant to this part or make payment arrangements acceptable to the Department of Justice, the Department of Justice may refer the matter to the Department of the Treasury for administrative collection measures or take appropriate action to recover the penalty in any civil suit in Federal district court.

    Finding of violation.

    (a) When and how issued. (1) The Department of Justice may issue an initial finding of violation that identifies a violation if the Department of Justice:

    (i) Determines that there has occurred a violation of any provision of this part, or a violation of the provisions of any license, ruling, regulation, order, directive, or instruction issued by or pursuant to the direction or authorization of the Attorney General pursuant to this part or otherwise under IEEPA;

    (ii) Considers it important to document the occurrence of a violation; and

    (iii) Concludes that an administrative response is warranted but that a civil monetary penalty is not the most appropriate response.

    (2) An initial finding of violation shall be in writing and may be issued whether or not another agency has taken any action with respect to the matter.

    (3) The Department shall provide the alleged violator with the relevant information that is not privileged, classified, or otherwise protected, and that forms the basis for the finding of violation, including a description of the alleged violation.

    (b) Opportunity to respond. An alleged violator has the right to contest an initial finding of violation in accordance with § 202.1306 of this part.

    (c) Determination— (1) Determination that a finding of violation is warranted. If, after considering the response, the Department of Justice determines that a final finding of violation should be issued, the Department of Justice will issue a final finding of violation that will inform the violator of its decision. The Department shall provide the violator with the relevant information that is not privileged, classified, or otherwise protected, and that forms the basis for the finding of violation. A final finding of violation shall constitute final agency action. The violator has the right to seek judicial review of that final agency action in Federal district court.

    (2) Determination that a finding of violation is not warranted. If, after considering the response, the Department of Justice determines a finding of violation is not warranted, then the Department of Justice will inform the alleged violator of its decision not to issue a final finding of violation. A determination by the Department of Justice that a final finding of violation is not warranted does not preclude the Department of Justice from pursuing other enforcement actions.

    (d) Representation. A representative of the alleged violator may act on behalf of the alleged violator, but any oral communication with the Department of Justice prior to a written submission regarding the specific alleged violations contained in the initial finding of violation must be preceded by a written letter of representation, unless the initial finding of violation was served upon the alleged violator in care of the representative.

    Opportunity to respond to a pre-penalty notice or finding of violation.

    (a) Right to respond. An alleged violator has the right to respond to a pre-penalty notice or finding of violation by making a written presentation to the Department of Justice.

    (b) Deadline for response. A response to a pre-penalty notice or finding of violation must be electronically submitted within 30 days of electronic service of the notice or finding. The failure to submit a response within 30 days shall be deemed to be a waiver of the right to respond.

    (c) Extensions of time for response. Any extensions of time will be granted, at the discretion of the Department of Justice, only upon specific request to the Department of Justice.

    (d) Contents of response. Any response should set forth in detail why the alleged violator either believes that a violation of the regulations did not occur or why a finding of violation or penalty is otherwise unwarranted under the circumstances. The response should include all documentary or other evidence available to the alleged violator that supports the arguments set forth in the response. The Department of Justice will consider all relevant materials submitted in the response.

    Subpart N—Government-Related Location Data List

    Government-Related Location Data List.

    For each Area ID listed in this section, each of the latitude/longitude coordinate pairs forms a corner of the geofenced area.

    Area ID and latitude/longitude coordinates of the geofenced area (each semicolon-separated pair is one corner):
    Area ID 1: 38.935624, −77.207888; 38.931674, −77.199387; 38.929289, −77.203229; 38.932939, −77.209328
    Area ID 2: 38.950446, −77.125592; 38.952077, −77.120947; 38.947468, −77.120060; 38.947135, −77.122809
    Area ID 3: 38.953191, −77.372792; 38.953174, −77.369764; 38.951148, −77.369759; 38.951152, −77.372781
    Area ID 4: 39.113546, −76.777053; 39.131086, −76.758527; 39.100086, −76.749715; 39.093304, −76.760882
    Area ID 5: 33.416299, −82.172772; 33.416666, −82.164366; 33.406350, −82.163645; 33.406261, −82.172947
    Area ID 6: 21.525093, −158.019139; 21.525362, −158.002575; 21.518161, −158.002233; 21.518010, −158.018364
    Area ID 7: 21.475012, −158.061844; 21.483357, −158.057568; 21.479226, −158.049881; 21.472695, −158.052371
    Area ID 8: 29.449322, −98.646174; 29.452872, −98.637623; 29.448069, −98.637303; 29.444547, −98.640607
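
    For illustration only, the sketch below applies a standard ray-casting point-in-polygon test to the Area ID 1 coordinates listed above (written with ASCII minus signs) to determine whether a latitude/longitude point falls inside that geofenced area. The function name and sample points are hypothetical; nothing in this part prescribes a particular implementation, and a production system would need to cover all listed areas and handle boundary cases.

```python
# Hypothetical sketch: ray-casting point-in-polygon test against Area ID 1 above.
AREA_1 = [
    (38.935624, -77.207888),
    (38.931674, -77.199387),
    (38.929289, -77.203229),
    (38.932939, -77.209328),
]  # the four corners as (latitude, longitude), in the order listed


def in_geofence(lat, lon, corners):
    """Return True if (lat, lon) lies inside the polygon defined by corners."""
    inside = False
    n = len(corners)
    for i in range(n):
        lat1, lon1 = corners[i]
        lat2, lon2 = corners[(i + 1) % n]
        # Count edges that cross the horizontal line at `lat` east of `lon`.
        if (lat1 > lat) != (lat2 > lat):
            lon_cross = lon1 + (lat - lat1) * (lon2 - lon1) / (lat2 - lat1)
            if lon < lon_cross:
                inside = not inside
    return inside


print(in_geofence(38.9324, -77.2050, AREA_1))   # True: roughly central to Area 1
print(in_geofence(38.9072, -77.0369, AREA_1))   # False: a point well outside
```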

Document Information

Published: 10/29/2024
Department: Justice Department
Entry Type: Proposed Rule
Action: Proposed rule; request for comments.
Document Number: 2024-24582
Dates: Written comments on this notice of proposed rulemaking (NPRM) must be received by November 29, 2024.
Pages: 86116-86227 (112 pages)
Docket Numbers: Docket No. NSD 104
RINs: 1124-AA01: Provisions Regarding Access to Americans' Bulk Sensitive Personal Data and Government-Related Data by Countries of Concern
RIN Links: https://www.federalregister.gov/regulations/1124-AA01/provisions-regarding-access-to-americans-bulk-sensitive-personal-data-and-government-related-data-by
Topics: Computer technology, Health records, Incorporation by reference, Investments, Military personnel, Personally identifiable information, Privacy, Reporting and recordkeeping requirements, Security measures
PDF File: 2024-24582.pdf
Supporting Documents:
» DOJ and CISA Engagement on Data Security Proposed Rule (10.30.2024) (Life Sciences.Genomics.Healthcare)
» DOJ and CISA Engagement on Data Security Proposed Rule (10.30.2024) (Financial Services.Banking.and Insurance)
» DOJ and CISA Engagement on Data Security Proposed Rule (11.05.2024) (Civil Society.Tech.Cloud.and Software)
CFR: 28 CFR 202