AGENCY:

Federal Trade Commission.

ACTION:

Advance notice of proposed rulemaking; request for public comment; public forum.

SUMMARY:

The Federal Trade Commission (“FTC”) is publishing this advance notice of proposed rulemaking (“ANPR”) to request public comment on the prevalence of commercial surveillance and data security practices that harm consumers. Specifically, the Commission invites comment on whether it should implement new trade regulation rules or other regulatory alternatives concerning the ways in which companies collect, aggregate, protect, use, analyze, and retain consumer data, as well as transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive.

DATES:

Comments due date: Comments must be received on or before October 21, 2022.

Meeting date: The Public Forum will be held virtually on Thursday, September 8, 2022, from 2 p.m. until 7:30 p.m. eastern time. Members of the public are invited to attend at the website https://www.ftc.gov/news-events/events/2022/09/commercial-surveillance-data-security-anpr-public-forum.

ADDRESSES:

Interested parties may file a comment online or on paper by following the instructions in the Comment Submissions part of the SUPPLEMENTARY INFORMATION section below. Write “Commercial Surveillance ANPR, R111004” on your comment, and file your comment online at https://www.regulations.gov. If you prefer to file your comment on paper, mail your comment to the following address: Federal Trade Commission, Office of the Secretary, 600 Pennsylvania Avenue NW, Suite CC-5610 (Annex B), Washington, DC 20580.

FOR FURTHER INFORMATION CONTACT:

James Trilling, 202-326-3497; Peder Magee, 202-326-3538; Olivier Sylvain, 202-326-3046; or .

I. Overview

Whether they know it or not, most Americans today surrender their personal information to engage in the most basic aspects of modern life. When they buy groceries, do homework, or apply for car insurance, for example, consumers today likely give a wide range of personal information about themselves to companies, including their movements,[1] prayers,[2] friends,[3] menstrual cycles,[4] web-browsing,[5] and faces,[6] among other basic aspects of their lives.

Companies, meanwhile, develop and market products and services to collect and monetize this data. An elaborate and lucrative market for the collection, retention, aggregation, analysis, and onward disclosure of consumer data incentivizes many of the services and products on which people have come to rely. Businesses reportedly use this information to target services—namely, to set prices,[7] curate newsfeeds,[8] serve advertisements,[9] and conduct research on people's behavior,[10] among other things. While, in theory, these personalization practices have the potential to benefit consumers, reports note that they have facilitated consumer harms that can be difficult if not impossible for any one person to avoid.[11]

Some companies, moreover, reportedly claim to collect consumer data for one stated purpose but then also use it for other purposes.[12] Many such firms, for example, sell or otherwise monetize such information or compilations of it in their dealings with advertisers, data brokers, and other third parties.[13] These practices also appear to exist outside of the retail consumer setting. Some employers, for example, reportedly collect an assortment of worker data to evaluate productivity, among other reasons [14] —a practice that has become far more pervasive since the onset of the COVID-19 pandemic.[15]

Many companies engage in these practices pursuant to the ostensible consent that they obtain from their consumers.[16] But, as networked devices and online services become essential to navigating daily life, consumers may have little choice but to accept the terms that firms offer.[17] Reports suggest that consumers have become resigned to the ways in which companies collect and monetize their information, largely because consumers have little to no actual control over what happens to their information once companies collect it.[18]

In any event, the permissions that consumers give may not always be meaningful or informed. Studies have shown that most people do not generally understand the market for consumer data that operates beyond their monitors and displays.[19] Most consumers, for example, know little about the data brokers and third parties who collect and trade consumer data or build consumer profiles [20] that can expose intimate details about their lives and, in the wrong hands, could expose unsuspecting people to future harm. [21] Many privacy notices that acknowledge such risks are reportedly not readable to the average consumer.[22] Many consumers do not have the time to review lengthy privacy notices for each of their devices, applications, websites, or services,[23] let alone the periodic updates to them. If consumers do not have meaningful access to this information, they cannot make informed decisions about the costs and benefits of using different services.[24]
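
The readability problem lends itself to measurement. As a rough illustration (ours, not the Commission's), the widely used Flesch Reading Ease formula scores text by sentence length and syllable density, and privacy-policy studies often report scores in the "difficult" range. The sketch below assumes a crude heuristic syllable counter, adequate only for illustration.

```python
# Rough sketch of the Flesch Reading Ease score, one measure often used to
# assess whether a privacy notice is readable to a lay audience.
import re

def count_syllables(word: str) -> int:
    # Approximate syllables as runs of vowels, with a floor of one per word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

# A fabricated notice-style sentence; scores below roughly 50 are generally
# considered difficult for the average reader.
sample = ("The licensee may effectuate onward disclosure of aggregated "
          "identifiers to affiliated entities notwithstanding revocation.")
print(round(flesch_reading_ease(sample), 1))
```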

This information asymmetry between companies and consumers runs even deeper. Companies can use the information that they collect to direct consumers' online experiences in ways that are rarely apparent—and in ways that go well beyond merely providing the products or services for which consumers believe they sign up.[25] The Commission's enforcement actions have targeted several pernicious dark pattern practices, including burying privacy settings behind multiple layers of the user interface [26] and making misleading representations to “trick or trap” consumers into providing personal information.[27] In other instances, firms may misrepresent or fail to communicate clearly how they use and protect people's data.[28] Given the reported scale and pervasiveness of such practices, individual consumer consent may be irrelevant.

The material harms of these commercial surveillance practices may be substantial, moreover, given that they may increase the risks of cyberattack by hackers, data thieves, and other bad actors. Companies' lax data security practices may impose enormous financial and human costs. Fraud and identity theft cost both businesses and consumers billions of dollars, and consumer complaints are on the rise.[29] For some kinds of fraud, consumers have historically spent an average of 60 hours per victim trying to resolve the issue.[30] Even the nation's critical infrastructure is at stake, as evidenced by the recent attacks on the largest fuel pipeline,[31] meatpacking plants,[32] and water treatment facilities [33] in the United States.

Companies' collection and use of data have significant consequences for consumers' wallets, safety, and mental health. Sophisticated digital advertising systems reportedly automate the targeting of fraudulent products and services to the most vulnerable consumers.[34] Stalking apps continue to endanger people.[35] Children and teenagers remain vulnerable to cyber bullying, cyberstalking, and the distribution of child sexual abuse material.[36] Peer-reviewed research has linked social media use with depression, anxiety, eating disorders, and suicidal ideation among kids and teens.[37]

Finally, companies' growing reliance on automated systems is creating new forms and mechanisms for discrimination based on statutorily protected categories,[38] including in critical areas such as housing,[39] employment,[40] and healthcare.[41] For example, some employers' automated systems have reportedly learned to prefer men over women.[42] Meanwhile, a recent investigation suggested that lenders' use of educational attainment in credit underwriting might disadvantage students who attended historically Black colleges and universities.[43] And the Department of Justice recently settled its first case challenging algorithmic discrimination under the Fair Housing Act for a social media advertising delivery system that unlawfully discriminated based on protected categories.[44] Critically, these kinds of disparate outcomes may arise even when automated systems consider only unprotected consumer traits.[45]
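
The last point, that disparate outcomes may arise even when a system never sees a protected trait, can be made concrete with a small simulation (our illustration with synthetic data, not drawn from the ANPR): when an unprotected input is correlated with a protected category, a decision rule based on that input alone still produces divergent outcomes across groups.

```python
# Synthetic illustration of proxy discrimination: the decision rule uses only
# an unprotected feature (a zip code), yet outcomes diverge by group because
# the feature is correlated with group membership.
import random

random.seed(0)

def make_applicant():
    group = random.choice(["A", "B"])
    # Correlated proxy: group B applicants mostly live in zip code 2.
    zip_code = 2 if group == "B" and random.random() < 0.8 else 1
    return group, zip_code

def approves(zip_code: int) -> bool:
    # The rule never consults the group label.
    return zip_code == 1

applicants = [make_applicant() for _ in range(10_000)]
for g in ("A", "B"):
    decisions = [approves(z) for grp, z in applicants if grp == g]
    print(g, f"approval rate: {sum(decisions) / len(decisions):.0%}")
```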

The Commission is issuing this ANPR pursuant to Section 18 of the Federal Trade Commission Act (“FTC Act”) and the Commission's Rules of Practice [46] because recent Commission actions, news reporting, and public research suggest that harmful commercial surveillance and lax data security practices may be prevalent and increasingly unavoidable.[47] These developments suggest that trade regulation rules reflecting these current realities may be needed to ensure Americans are protected from unfair or deceptive acts or practices. New rules could also foster a greater sense of predictability for companies and consumers and minimize the uncertainty that case-by-case enforcement may engender.

Countries around the world and states across the nation have been alert to these concerns. Many accordingly have enacted laws and regulations that impose restrictions on companies' collection, use, analysis, retention, transfer, sharing, and sale or other monetization of consumer data. In recognition of the complexity and opacity of commercial surveillance practices today, such laws have reduced the emphasis on providing notice and obtaining consent and have instead stressed additional privacy “defaults” as well as increased accountability for businesses and restrictions on certain practices.

For example, European Union (“EU”) member countries enforce the EU's General Data Protection Regulation (“GDPR”),[48] which, among other things, limits the processing of personal data to six lawful bases and provides consumers with certain rights to access, delete, correct, and port such data. Canada's Personal Information Protection and Electronic Documents Act [49] and Brazil's General Law for the Protection of Personal Data [50] contain some similar rights.[51] Laws in California,[52] Virginia,[53] Colorado,[54] Utah,[55] and Connecticut,[56] moreover, include some comparable rights, and numerous state legislatures are considering similar laws. Alabama,[57] Colorado,[58] and Illinois,[59] meanwhile, have enacted laws related to the development and use of artificial intelligence. Other states, including Illinois,[60] Texas,[61] and Washington,[62] have enacted laws governing the use of biometric data. All fifty U.S. states have laws that require businesses to notify consumers of certain breaches of consumers' data.[63] And numerous states require businesses to take reasonable steps to secure consumers' data.[64]

Through this ANPR, the Commission is beginning to consider the potential need for rules and requirements regarding commercial surveillance and lax data security practices. Section 18 of the FTC Act authorizes the Commission to promulgate, modify, and repeal trade regulation rules that define with specificity acts or practices that are unfair or deceptive in or affecting commerce within the meaning of Section 5(a)(1) of the FTC Act.[65] Through this ANPR, the Commission aims to generate a public record about prevalent commercial surveillance practices or lax data security practices that are unfair or deceptive, as well as about efficient, effective, and adaptive regulatory responses. These comments will help to sharpen the Commission's enforcement work and may inform reform by Congress or other policymakers, even if the Commission does not ultimately promulgate new trade regulation rules.[66]

The term “data security” in this ANPR refers to breach risk mitigation, data management and retention, data minimization, and breach notification and disclosure practices.

For the purposes of this ANPR, “commercial surveillance” refers to the collection, aggregation, analysis, retention, transfer, or monetization of consumer data and the direct derivatives of that information. These data include both information that consumers actively provide—say, when they affirmatively register for a service or make a purchase—as well as personal identifiers and other information that companies collect, for example, when a consumer casually browses the web or opens an app. This latter category is far broader than the first.

The term “consumer” as used in this ANPR includes businesses and workers, not just individuals who buy or exchange data for retail goods and services. This approach is consistent with the Commission's longstanding practice of bringing enforcement actions against firms that harm companies [67] as well as workers of all kinds.[68] The FTC has frequently used Section 5 of the FTC Act to protect small businesses or individuals in contexts involving their employment or independent contractor status.[69]

This ANPR proceeds as follows. Item II outlines the Commission's existing authority to bring enforcement actions and promulgate trade regulation rules under the FTC Act. Item III sets out the wide range of actions against commercial surveillance and data security acts or practices that the Commission has pursued in recent years as well as the benefits and shortcomings of this case-by-case approach. Item IV sets out the questions on which the Commission seeks public comment. Finally, Item V provides instructions on the comment submission process, and Item VI describes a public forum that is scheduled to take place to facilitate public involvement in this rulemaking proceeding.

II. The Commission's Authority

Congress authorized the Commission to propose a rule defining unfair or deceptive acts or practices with specificity when the Commission “has reason to believe that the unfair or deceptive acts or practices which are the subject of the proposed rulemaking are prevalent.” [70] A determination about prevalence can be made either on the basis of “cease-and-desist” orders regarding such acts or practices that the Commission has previously issued, or when it has “any other information” that “indicates a widespread pattern of unfair or deceptive acts or practices.” [71]

Generally, a practice is unfair under Section 5 if (1) it causes or is likely to cause substantial injury, (2) the injury is not reasonably avoidable by consumers, and (3) the injury is not outweighed by benefits to consumers or competition.[72] A representation, omission, or practice is deceptive under Section 5 if it is likely to mislead consumers acting reasonably under the circumstances and is material to consumers—that is, it would likely affect the consumer's conduct or decision with regard to a product or service.[73] Under the statute, this broad language is applied to specific commercial practices through Commission enforcement actions and the promulgation of trade regulation rules.
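
Read as a decision procedure, the two standards reduce to conjunctive tests. The sketch below is our shorthand restatement, not Commission guidance; in practice each prong involves fact-intensive legal analysis rather than a boolean input.

```python
# Shorthand restatement of the Section 5 standards as conjunctive predicates.

def is_unfair(substantial_injury: bool,
              reasonably_avoidable: bool,
              outweighed_by_benefits: bool) -> bool:
    # All three prongs must hold: injury, non-avoidability, and no
    # countervailing benefit to consumers or competition.
    return substantial_injury and not reasonably_avoidable and not outweighed_by_benefits

def is_deceptive(likely_to_mislead: bool, material: bool) -> bool:
    # Likely to mislead a reasonable consumer, and material to their decision.
    return likely_to_mislead and material

# Example: a breach caused by lax security that consumers could not avoid.
print(is_unfair(True, False, False))  # True
```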

In addition to the FTC Act, the Commission enforces a number of sector-specific laws that relate to commercial surveillance practices, including: the Fair Credit Reporting Act,[74] which protects the privacy of consumer information collected by consumer reporting agencies; the Children's Online Privacy Protection Act (“COPPA”),[75] which protects information collected online from children under the age of 13; the Gramm-Leach-Bliley Act (“GLBA”),[76] which protects the privacy of customer information collected by financial institutions; the Controlling the Assault of Non-Solicited Pornography and Marketing (“CAN-SPAM”) Act,[77] which allows consumers to opt out of receiving commercial email messages; the Fair Debt Collection Practices Act,[78] which protects individuals from harassment by debt collectors and imposes disclosure requirements on related third parties; the Telemarketing and Consumer Fraud and Abuse Prevention Act,[79] under which the Commission implemented the Do Not Call Registry; [80] the Health Breach Notification Rule,[81] which applies to certain health information; and the Equal Credit Opportunity Act,[82] which protects individuals from discrimination on the basis of race, color, religion, national origin, sex, marital status, receipt of public assistance, or good faith exercise of rights under the Consumer Credit Protection Act and requires creditors to provide to applicants, upon request, the reasons underlying decisions to deny credit.

III. The Commission's Current Approach to Privacy and Data Security

a. Case-By-Case Enforcement and General Policy Work

For more than two decades, the Commission has been the nation's privacy agency, engaging in policy work and bringing scores of enforcement actions concerning data privacy and security.[83] These actions have alleged that certain practices violate Section 5 of the FTC Act or other statutes to the extent they pose risks to physical security, cause economic or reputational injury, or involve unwanted intrusions into consumers' daily lives.[84] For example, the Commission has brought actions for:

• the surreptitious collection and sale of consumer phone records obtained through false pretenses; [85]

• the public posting of private health-related data online; [86]

• the sharing of private health-related data with third parties; [87]

• inaccurate tenant screening; [88]

• public disclosure of consumers' financial information in responses to consumers' critical online reviews of the publisher's services; [89]

• pre-installation of ad-injecting software that acted as a man-in-the-middle between consumers and all websites with which they communicated, and that collected and transmitted consumers' internet browsing data to the software developer; [90]

• solicitation and online publication of “revenge porn”—intimate pictures and videos of ex-partners, along with their personal information—and the collection of fees to take down such information; [91]

• development and marketing of “stalkerware” that purchasers surreptitiously installed on others' phones or computers in order to monitor them; [92]

• retroactive application of material privacy policy changes to personal information that businesses previously collected from users; [93]

• distribution of software that caused or was likely to cause consumers to unwittingly share their files publicly; [94]

• surreptitious activation of webcams in leased computers placed in consumers' homes; [95]

• sale of sensitive data such as Social Security numbers to third parties who did not have a legitimate business need for the information,[96] including known fraudsters; [97]

• collection and sharing of sensitive television-viewing information to target advertising contrary to reasonable expectations; [98]

• collection of phone numbers and email addresses to improve social media account security, and the subsequent deceptive use of that data to allow companies to target advertisements in violation of an existing consent order; [99]

• failure to implement reasonable measures to protect consumers' personal information,[100] including Social Security numbers and answers to password reset questions,[101] and the later cover-up of an ensuing breach; [102] and

• misrepresentations of the safeguards employed to protect data.[103]

This is just a sample of the Commission's enforcement work in data privacy and security.[104]

The orders that the Commission has obtained in these actions impose a variety of remedies, including prohibiting licensing, marketing, or selling of surveillance products,[105] requiring companies under order to implement comprehensive privacy and security programs and obtain periodic assessments of those programs by independent third parties,[106] requiring deletion of illegally obtained consumer information [107] or work product derived from that data,[108] requiring companies to provide notice to consumers affected by harmful practices that led to the action,[109] and mandating that companies improve the transparency of their data management practices.[110] The Commission may rely on these orders to seek to impose further sanctions on firms that repeat their unlawful practices.[111]

The Commission has also engaged in broader policy work concerning data privacy and security. For example, it has promulgated rules pursuant to the sector-specific statutes enumerated above.[112] It also has published reports and closely monitored existing and emergent practices, including data brokers' activities,[113] “dark patterns,” [114] facial recognition,[115] Internet of Things,[116] big data,[117] cross-device tracking,[118] and mobile privacy disclosures.[119] The Commission, furthermore, has invoked its authority under Section 6(b) to require companies to prepare written reports or answer specific questions about their commercial practices.[120]

b. Reasons for Rulemaking

The Commission's extensive enforcement and policy work over the last couple of decades on consumer data privacy and security has raised important questions about the prevalence of harmful commercial surveillance and lax data security practices. This experience suggests that enforcement alone without rulemaking may be insufficient to protect consumers from significant harms. First, the FTC Act limits the remedies that the Commission may impose in enforcement actions on companies for violations of Section 5.[121] Specifically, the statute generally does not allow the Commission to seek civil penalties for first-time violations of that provision.[122] The fact that the Commission does not have authority to seek penalties for first-time violators may insufficiently deter future law violations. This may put firms that are careful to follow the law, including those that implement reasonable privacy-protective measures, at a competitive disadvantage. New trade regulation rules could, by contrast, set clear legal requirements or benchmarks by which to evaluate covered companies. They also would incentivize all companies to invest in compliance more consistently because, pursuant to the FTC Act, the Commission may impose civil penalties for first-time violations of duly promulgated trade regulation rules.[123]

Second, while the Commission can enjoin conduct that violates Section 5, as a matter of law and policy, such relief may be inadequate in the context of commercial surveillance and lax data security practices. For instance, after a hacker steals personal consumer data from an inadequately secured database, an injunction stopping the conduct and requiring the business to take affirmative steps to improve its security going forward can help prevent future breaches but does not remediate the harm that has already occurred or is likely to occur.[124]

Third, even in those instances in which the Commission can obtain monetary relief for violations of Section 5, such relief may be difficult to apply to some harmful commercial surveillance or lax data security practices that may not cause direct financial injury or, in any given individual case, do not lend themselves to broadly accepted ways of quantifying harm.[125] This is a problem that is underscored by commercial surveillance practices involving automated decision-making systems where the harm to any given individual or small group of individuals might affect other consumers in ways that are opaque or hard to discern in the near term,[126] but are potentially no less unfair or deceptive.

Finally, the Commission's limited resources today can make it challenging to investigate and act on the extensive public reporting on data security practices that may violate Section 5, especially given how digitized and networked all aspects of the economy are becoming. A trade regulation rule could provide clarity and predictability about the statute's application to existing and emergent commercial surveillance and data security practices that, given institutional constraints, case-by-case enforcement may be hard-pressed to keep up with.[127]

IV. Questions

The commercial surveillance and lax data security practices that this ANPR describes above are only a sample of what the Commission's enforcement actions, news reporting, and published research have revealed. Here, in this Item, the Commission invites public comment on (a) the nature and prevalence of harmful commercial surveillance and lax data security practices, (b) the balance of costs and countervailing benefits of such practices for consumers and competition, as well as the costs and benefits of any given potential trade regulation rule, and (c) proposals for protecting consumers from harmful and prevalent commercial surveillance and lax data security practices.

This ANPR does not identify the full scope of potential approaches the Commission might ultimately undertake by rule or otherwise. It does not delineate a boundary on the issues on which the public may submit comments. Nor does it constrain the actions the Commission might pursue in an NPRM or final rule. The Commission invites comment on all potential rules, including those currently in force in foreign jurisdictions, individual U.S. states, and other legal jurisdictions.[128]

Given the significant interest this proceeding is likely to generate, and in order to facilitate an efficient review of submissions, the Commission encourages but does not require commenters to (1) submit a short Executive Summary of no more than three single-spaced pages at the beginning of all comments, (2) provide supporting material, including empirical data, findings, and analysis in published reports or studies by established news organizations and research institutions, (3) consistent with the questions below, describe the relative benefits and costs of their recommended approach, (4) refer to the numbered question(s) to which the comment is addressed, and (5) tie their recommendations to specific commercial surveillance and lax data security practices.

a. To what extent do commercial surveillance practices or lax security measures harm consumers?

This ANPR has alluded to only a fraction of the potential consumer harms arising from lax data security or commercial surveillance practices, including those concerning physical security, economic injury, psychological harm, reputational injury, and unwanted intrusion.

1. Which practices do companies use to surveil consumers?

2. Which measures do companies use to protect consumer data?

3. Which of these measures or practices are prevalent? Are some practices more prevalent in some sectors than in others?

4. How, if at all, do these commercial surveillance practices harm consumers or increase the risk of harm to consumers?

5. Are there some harms that consumers may not easily discern or identify? Which are they?

6. Are there some harms that consumers may not easily quantify or measure? Which are they?

7. How should the Commission identify and evaluate these commercial surveillance harms or potential harms? On which evidence or measures should the Commission rely to substantiate its claims of harm or risk of harm?

8. Which areas or kinds of harm, if any, has the Commission failed to address through its enforcement actions?

9. Has the Commission adequately addressed indirect pecuniary harms, including potential physical harms, psychological harms, reputational injuries, and unwanted intrusions?

10. Which kinds of data should be subject to a potential trade regulation rule? Should it be limited to, for example, personally identifiable data, sensitive data, data about protected categories and their proxies, data that is linkable to a device, or non-aggregated data? Or should a potential rule be agnostic about kinds of data?

11. Which, if any, commercial incentives and business models lead to lax data security measures or harmful commercial surveillance practices? Are some commercial incentives and business models more likely to protect consumers than others? On which checks, if any, do companies rely to ensure that they do not cause harm to consumers?

12. Lax data security measures and harmful commercial surveillance injure different kinds of consumers (e.g., young people, workers, franchisees, small businesses, women, victims of stalking or domestic violence, racial minorities, the elderly) in different sectors (e.g., health, finance, employment) or in different segments or “stacks” of the internet economy. For example, harms arising from data security breaches in finance or healthcare may be different from those concerning discriminatory advertising on social media, which may be different from those involving education technology. How, if at all, should potential new trade regulation rules address harms to different consumers across different sectors? Which commercial surveillance practices, if any, are unlawful such that new trade regulation rules should set out clear limitations or prohibitions on them? To what extent, if any, is a comprehensive regulatory approach better than a sectoral one for any given harm?

b. To what extent do commercial surveillance practices or lax data security measures harm children, including teenagers?

13. The Commission here invites comment on commercial surveillance practices or lax data security measures that affect children, including teenagers. Are there practices or measures to which children or teenagers are particularly vulnerable or susceptible? For instance, are children and teenagers more likely than adults to be manipulated by practices designed to encourage the sharing of personal information?

14. What types of commercial surveillance practices involving children and teens' data are most concerning? For instance, given the reputational harms that teenagers may be characteristically less capable of anticipating than adults, to what extent should new trade regulation rules provide teenagers with an erasure mechanism similar to the one COPPA provides for children under 13? Which measures beyond those required under COPPA would best protect children, including teenagers, from harmful commercial surveillance practices?

15. In what circumstances, if any, is a company's failure to provide children and teenagers with privacy protections, such as not providing privacy-protective settings by default, an unfair practice, even if the site or service is not targeted to minors? For example, should services that collect information from large numbers of children be required to provide them enhanced privacy protections regardless of whether the services are directed to them? Should services that do not target children and teenagers be required to take steps to determine the age of their users and provide additional protections for minors?

16. Which sites or services, if any, implement child-protective measures or settings even if they do not direct their content to children and teenagers?

17. Do techniques that manipulate consumers into prolonging online activity (e.g., video autoplay, infinite or endless scroll, quantified public popularity) facilitate commercial surveillance of children and teenagers? If so, how? In which circumstances, if any, is a company's use of those techniques on children and teenagers an unfair practice? For example, is it an unfair or deceptive practice when a company uses these techniques despite evidence or research linking them to clinical depression, anxiety, eating disorders, or suicidal ideation among children and teenagers?

18. To what extent should trade regulation rules distinguish between different age groups among children (e.g., 13 to 15, 16 to 17, etc.)?

19. Given the lack of clarity about the workings of commercial surveillance behind the screen or display, is parental consent an efficacious way of ensuring child online privacy? Which other protections or mechanisms, if any, should the Commission consider?

20. How extensive is the business-to-business market for children and teens' data? In this vein, should new trade regulation rules set out clear limits on transferring, sharing, or monetizing children and teens' personal information?

21. Should companies limit their uses of the information that they collect to the specific services for which children and teenagers or their parents sign up? Should new rules set out clear limits on personalized advertising to children and teenagers irrespective of parental consent? If so, on what basis? What harms stem from personalized advertising to children? What, if any, are the prevalent unfair or deceptive practices that result from personalized advertising to children and teenagers?

22. Should new rules impose differing obligations to protect information collected from children depending on the risks of the particular collection practices?

23. How would potential rules that block or otherwise help to stem the spread of child sexual abuse material, including rules that rely on content-matching techniques, affect consumer privacy?
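
For context on what "content-matching techniques" can involve, the sketch below shows the simplest form: comparing a cryptographic hash of each upload against a blocklist of known material. This is our illustration, not a practice the ANPR endorses; deployed systems typically use perceptual hashing, which tolerates re-encoding, and the hash value here is a placeholder. The privacy tension the question raises is visible in the structure: every upload must be inspected.

```python
# Minimal hash-based content matching against a (hypothetical) blocklist.
import hashlib

KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder
}

def matches_blocklist(upload: bytes) -> bool:
    digest = hashlib.sha256(upload).hexdigest()
    return digest in KNOWN_HASHES

print(matches_blocklist(b"example upload"))  # False against the placeholder list
```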

c. How should the Commission balance costs and benefits?

24. The Commission invites comment on the relative costs and benefits of any current practice, as well as those for any responsive regulation. How should the Commission engage in this balancing in the context of commercial surveillance and data security? Which variables or outcomes should it consider in such an accounting? Which variables or outcomes are salient but hard to quantify as a material cost or benefit? How should the Commission ensure adequate weight is given to costs and benefits that are hard to quantify?

25. What is the right time horizon for evaluating the relative costs and benefits of existing or emergent commercial surveillance and data security practices? What is the right time horizon for evaluating the relative benefits and costs of regulation?

26. To what extent would any given new trade regulation rule on data security or commercial surveillance impede or enhance innovation? To what extent would such rules enhance or impede the development of certain kinds of products, services, and applications over others?

27. Would any given new trade regulation rule on data security or commercial surveillance impede or enhance competition? Would any given rule entrench the potential dominance of one company or set of companies in ways that impede competition? If so, how and to what extent?

28. Should the analysis of cost and benefits differ in the context of information about children? If so, how?

29. What are the benefits or costs of refraining from promulgating new rules on commercial surveillance or data security?

d. How, if at all, should the Commission regulate harmful commercial surveillance or data security practices that are prevalent?

i. Rulemaking Generally

30. Should the Commission pursue a Section 18 rulemaking on commercial surveillance and data security? To what extent are existing legal authorities and extralegal measures, including self-regulation, sufficient? To what extent, if at all, are self-regulatory principles effective?

ii. Data Security

31. Should the Commission commence a Section 18 rulemaking on data security? The Commission specifically seeks comment on how potential new trade regulation rules could require or help incentivize reasonable data security.

32. Should, for example, new rules require businesses to implement administrative, technical, and physical data security measures, including encryption techniques, to protect against risks to the security, confidentiality, or integrity of covered data? If so, which measures? How granular should such measures be? Is there evidence of any impediments to implementing such measures?
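
As one example of the kind of technical measure question 32 contemplates, encryption of covered data at rest might look like the following sketch. It is illustrative only, using the third-party Python "cryptography" package, and it assumes the key would live in a key-management service rather than alongside the data.

```python
# Encrypting a record at rest with a symmetric key (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, held in a key-management service
cipher = Fernet(key)

record = b"ssn=123-00-4567"      # fabricated example record
token = cipher.encrypt(record)   # ciphertext is what gets written to storage
assert cipher.decrypt(token) == record
```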

33. Should new rules codify the prohibition on deceptive claims about consumer data security, accordingly authorizing the Commission to seek civil penalties for first-time violations?

34. Do the data security requirements under COPPA or the GLBA Safeguards Rule offer any constructive guidance for a more general trade regulation rule on data security across sectors or in other specific sectors?

35. Should the Commission take into account other laws at the state and federal level (e.g., COPPA) that already include data security requirements? If so, how? Should the Commission take into account other governments' requirements as to data security (e.g., GDPR)? If so, how?

36. To what extent, if at all, should the Commission require firms to certify that their data practices meet clear security standards? If so, who should set those standards, the FTC or a third-party entity?

iii. Collection, Use, Retention, and Transfer of Consumer Data

37. How do companies collect consumers' biometric information? What kinds of biometric information do companies collect? For what purposes do they collect and use it? Are consumers typically aware of that collection and use? What are the benefits and harms of these practices?

38. Should the Commission consider limiting commercial surveillance practices that use or facilitate the use of facial recognition, fingerprinting, or other biometric technologies? If so, how?

39. To what extent, if at all, should the Commission limit companies that provide any specifically enumerated services (e.g., finance, healthcare, search, or social media) from owning or operating a business that engages in any specific commercial surveillance practices like personalized or targeted advertising? If so, how? What would the relative costs and benefits of such a rule be, given that consumers generally pay zero dollars for services that are financed through advertising?

40. How accurate are the metrics on which internet companies rely to justify the rates that they charge to third-party advertisers? To what extent, if at all, should new rules limit targeted advertising and other commercial surveillance practices beyond the limitations already imposed by civil rights laws? If so, how? To what extent would such rules harm consumers, burden companies, stifle innovation or competition, or chill the distribution of lawful content?

41. To what alternative advertising practices, if any, would companies turn in the event new rules somehow limit first- or third-party targeting?

42. How cost-effective is contextual advertising as compared to targeted advertising?

43. To what extent, if at all, should new trade regulation rules impose limitations on companies' collection, use, and retention of consumer data? Should they, for example, institute data minimization requirements or purpose limitations, i.e., limit companies from collecting, retaining, using, or transferring consumer data beyond a certain predefined point? Or, similarly, should they require companies to collect, retain, use, or transfer consumer data only to the extent necessary to deliver the specific service that a given individual consumer explicitly seeks or those that are compatible with that specific service? If so, how? How should it determine or define which uses are compatible? How, moreover, could the Commission discern which data are relevant to achieving certain purposes and no more?
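
One way purpose limitations of the kind described in question 43 are operationalized in practice is to tag each data element with the purposes for which it was collected and to check every downstream use against those tags. The sketch below is our assumption about such a mechanism, not a proposal in this ANPR; the field names and purposes are hypothetical.

```python
# Purpose-limitation check: each field carries its collection purposes, and
# any use outside that set is refused.
ALLOWED_PURPOSES = {
    "email": {"account_login", "security_alerts"},
    "location": {"ride_dispatch"},
}

def use_permitted(field: str, purpose: str) -> bool:
    return purpose in ALLOWED_PURPOSES.get(field, set())

print(use_permitted("location", "ride_dispatch"))         # True
print(use_permitted("location", "targeted_advertising"))  # False: beyond stated purpose
```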

44. By contrast, should new trade regulation rules restrict the period of time that companies collect or retain consumer data, irrespective of the different purposes to which they put that data? If so, how should such rules define the relevant period?

45. Pursuant to a purpose limitation rule, how, if at all, should the Commission discern whether data that consumers give for one purpose has been used only for that specified purpose? To what extent, moreover, should the Commission permit use of consumer data that is compatible with, but distinct from, the purpose for which consumers explicitly give their data?

46. Or should new rules impose data minimization or purpose limitations only for certain designated practices or services? Should, for example, the Commission impose limits on data use for essential services such as finance, healthcare, or search—that is, should it restrict companies that provide these services from using, retaining, or transferring consumer data for any other service or commercial endeavor? If so, how?

47. To what extent would data minimization requirements or purpose limitations protect consumer data security?

48. To what extent would data minimization requirements or purpose limitations unduly hamper algorithmic decision-making or other algorithmic learning-based processes or techniques? To what extent would the benefits of a data minimization or purpose limitation rule be out of proportion to the potential harms to consumers and companies of such a rule?

49. How administrable are data minimization requirements or purpose limitations given the scale of commercial surveillance practices, information asymmetries, and the institutional resources such rules would require the Commission to deploy to ensure compliance? What do other jurisdictions have to teach about their relative effectiveness?

50. What would be the effect of data minimization or purpose limitations on consumers' ability to access services or content for which they are not currently charged out of pocket? Conversely, which costs, if any, would consumers bear if the Commission does not impose any such restrictions?

51. To what extent, if at all, should the Commission require firms to certify that their commercial surveillance practices meet clear standards concerning collection, use, retention, transfer, or monetization of consumer data? If promulgated, who should set those standards: the FTC, a third-party organization, or some other entity?

52. To what extent, if at all, do firms that now, by default, enable consumers to block other firms' use of cookies and other persistent identifiers impede competition? To what extent do such measures protect consumer privacy, if at all? Should new trade regulation rules forbid the practice by, for example, requiring a form of interoperability or access to consumer data? Or should they permit or incentivize companies to limit other firms' access to their consumers' data? How would such rules interact with general concerns and potential remedies discussed elsewhere in this ANPR?

iv. Automated Decision-Making Systems

53. How prevalent is algorithmic error? To what extent is algorithmic error inevitable? If it is inevitable, what are the benefits and costs of allowing companies to employ automated decision-making systems in critical areas, such as housing, credit, and employment? To what extent can companies mitigate algorithmic error in the absence of new trade regulation rules?

54. What are the best ways to measure algorithmic error? Is it more pronounced, or does it occur more frequently, in some sectors than in others?
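
One candidate answer to question 54, offered here only as an illustration, is to compare simple error rates on labeled outcomes sector by sector; the data below are fabricated.

```python
# Per-sector error rates from (sector, predicted, actual) decision records.
from collections import defaultdict

decisions = [
    ("housing", 1, 1), ("housing", 0, 1), ("credit", 1, 1),
    ("credit", 1, 0), ("employment", 0, 0), ("employment", 1, 0),
]

tallies = defaultdict(lambda: [0, 0])  # sector -> [mistakes, total]
for sector, predicted, actual in decisions:
    tallies[sector][0] += int(predicted != actual)
    tallies[sector][1] += 1

for sector, (mistakes, total) in sorted(tallies.items()):
    print(sector, f"error rate: {mistakes / total:.0%}")
```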

55. Does the weight that companies give to the outputs of automated decision-making systems overstate their reliability? If so, does that have the potential to lead to greater consumer harm when there are algorithmic errors?

56. To what extent, if at all, should new rules require companies to take specific steps to prevent algorithmic errors? If so, which steps? To what extent, if at all, should the Commission require firms to evaluate and certify that their reliance on automated decision-making meets clear standards concerning accuracy, validity, reliability, or error? If so, how? Who should set those standards, the FTC or a third-party entity? Or should new rules require businesses to evaluate and certify that the accuracy, validity, or reliability of their commercial surveillance practices are in accordance with their own published business policies?

57. To what extent, if at all, do consumers benefit from automated decision-making systems? Who is most likely to benefit? Who is most likely to be harmed or disadvantaged? To what extent do such practices violate Section 5 of the FTC Act?

58. Could new rules help ensure that firms' automated decision-making practices better protect non-English speaking communities from fraud and abusive data practices? If so, how?

59. If new rules restrict certain automated decision-making practices, which alternatives, if any, would take their place? Would these alternative techniques be less prone to error than the automated decision-making they replace?

60. To what extent, if at all, should new rules forbid or limit the development, design, and use of automated decision-making systems that generate or otherwise facilitate outcomes that violate Section 5 of the FTC Act? Should such rules apply economy-wide or only in some sectors? If the latter, which ones? Should these rules be structured differently depending on the sector? If so, how?

61. What would be the effect of restrictions on automated decision-making in product access, product features, product quality, or pricing? To what alternative forms of pricing would companies turn, if any?

62. Which, if any, legal theories would support limits on the use of automated systems in targeted advertising given potential constitutional or other legal challenges?

63. To what extent, if at all, does the First Amendment bar the Commission from promulgating or enforcing rules concerning the ways in which companies personalize services or deliver targeted advertisements?

64. To what extent, if at all, does Section 230 of the Communications Act, 47 U.S.C. 230, bar the Commission from promulgating or enforcing rules concerning the ways in which companies use automated decision-making systems to, among other things, personalize services or deliver targeted advertisements?

v. Discrimination Based on Protected Categories

65. How prevalent is algorithmic discrimination based on protected categories such as race, sex, and age? Is such discrimination more pronounced in some sectors than others? If so, which ones?

66. How should the Commission evaluate or measure algorithmic discrimination? How does algorithmic discrimination affect consumers, directly and indirectly? To what extent, if at all, does algorithmic discrimination stifle innovation or competition?

67. How should the Commission address such algorithmic discrimination? Should it consider new trade regulation rules that bar or somehow limit the deployment of any system that produces discrimination, irrespective of the data or processes on which those outcomes are based? If so, which standards should the Commission use to measure or evaluate disparate outcomes? How should the Commission analyze discrimination based on proxies for protected categories? How should the Commission analyze discrimination when more than one protected category is implicated (e.g., pregnant veteran or Black woman)?
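
One existing standard the Commission might look to, noted here as an illustration rather than a recommendation, is the "four-fifths rule" from federal employment-selection guidelines: a group's selection rate below 80 percent of the most favored group's rate is treated as evidence of adverse impact. The data below are fabricated.

```python
# Four-fifths (80%) adverse-impact screen over per-group selection decisions.
def selection_rate(outcomes):  # list of 0/1 decisions for one group
    return sum(outcomes) / len(outcomes)

groups = {"group_a": [1, 1, 1, 0, 1], "group_b": [1, 0, 0, 0, 1]}
rates = {g: selection_rate(o) for g, o in groups.items()}
benchmark = max(rates.values())

for g, r in sorted(rates.items()):
    flag = "adverse-impact flag" if r / benchmark < 0.8 else "ok"
    print(g, f"rate={r:.0%}", flag)
```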

68. Should the Commission focus on harms based on protected classes? Should the Commission consider harms to other underserved groups that current law does not recognize as protected from discrimination (e.g., unhoused people or residents of rural communities)?

69. Should the Commission consider new rules on algorithmic discrimination in areas where Congress has already explicitly legislated, such as housing, employment, labor, and consumer finance? Or should the Commission consider such rules addressing all sectors?

70. How, if at all, would restrictions on discrimination by automated decision-making systems based on protected categories affect all consumers?

71. To what extent, if at all, may the Commission rely on its unfairness authority under Section 5 to promulgate antidiscrimination rules? Should it? How, if at all, should antidiscrimination doctrine in other sectors or federal statutes relate to new rules?

72. How can the Commission's expertise and authorities complement those of other civil rights agencies? How might a new rule ensure space for interagency collaboration?

vi. Consumer Consent

73. The Commission invites comment on the effectiveness and administrability of consumer consent to companies' commercial surveillance and data security practices. Given the reported scale, opacity, and pervasiveness of existing commercial surveillance today, to what extent is consumer consent an effective way of evaluating whether a practice is unfair or deceptive? How should the Commission evaluate its effectiveness?

74. In which circumstances, if any, is consumer consent likely to be effective? Which factors, if any, determine whether consumer consent is effective?

75. To what extent does current law prohibit commercial surveillance practices, irrespective of whether consumers consent to them?

76. To what extent should new trade regulation rules prohibit certain specific commercial surveillance practices, irrespective of whether consumers consent to them?

77. To what extent should new trade regulation rules require firms to give consumers the choice of whether to be subject to commercial surveillance? To what extent should new trade regulation rules give consumers the choice of withdrawing their duly given prior consent? How demonstrable or substantial must consumer consent be if it is to remain a useful way of evaluating whether a commercial surveillance practice is unfair or deceptive? How should the Commission evaluate whether consumer consent is meaningful enough?

78. What would be the effects on consumers of a rule that required firms to give consumers the choice of being subject to commercial surveillance or withdrawing that consent? When or how often should any given company offer consumers the choice? And for which practices should companies provide these options, if not all?

79. Should the Commission require different consent standards for different consumer groups (e.g., parents of teenagers (as opposed to parents of pre-teens), elderly individuals, individuals in crisis or otherwise especially vulnerable to deception)?

80. Have opt-out choices proved effective in protecting against commercial surveillance? If so, how and in what contexts?

81. Should new trade regulation rules require companies to give consumers the choice of opting out of all or certain limited commercial surveillance practices? If so, for which practices or purposes should the provision of an opt-out choice be required? For example, to what extent should new rules require that consumers have the choice of opting out of all personalized or targeted advertising?

82. How, if at all, should the Commission require companies to recognize or abide by each consumer's respective choice about opting out of commercial surveillance practices—whether it be for all commercial surveillance practices or just some? How would any such rule affect consumers, given that they do not all have the same preference for the amount or kinds of personal information that they share?
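
An existing mechanism relevant to questions 81 and 82 is the Global Privacy Control signal, which participating browsers transmit as an HTTP header. The sketch below, a hypothetical server-side handler written with Flask, shows how a site might detect and honor that signal; it is our illustration, not a requirement contemplated in this ANPR.

```python
# Detecting the Global Privacy Control opt-out header ("Sec-GPC: 1").
from flask import Flask, request

app = Flask(__name__)

@app.route("/page")
def page():
    opted_out = request.headers.get("Sec-GPC") == "1"
    # When the signal is present, suppress sale/sharing-based ad targeting.
    return {"targeted_ads_enabled": not opted_out}
```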

vii. Notice, Transparency, and Disclosure

83. To what extent should the Commission consider rules that require companies to make information available about their commercial surveillance practices? What kinds of information should new trade regulation rules require companies to make available and in what form?

84. In which contexts are transparency or disclosure requirements effective? In which contexts are they less effective?

85. Which, if any, mechanisms should the Commission use to require or incentivize companies to be forthcoming? Which, if any, mechanisms should the Commission use to verify the sufficiency, accuracy, or authenticity of the information that companies provide?

a. What are the mechanisms for opacity?

86. The Commission invites comment on the nature of the opacity of different forms of commercial surveillance practices. On which technological or legal mechanisms do companies rely to shield their commercial surveillance practices from public scrutiny? Intellectual property protections, including trade secrets, for example, limit the involuntary public disclosure of the assets on which companies rely to deliver products, services, content, or advertisements. How should the Commission address, if at all, these potential limitations?

b. Who should administer notice or disclosure requirements?

87. To what extent should the Commission rely on third-party intermediaries (e.g., government officials, journalists, academics, or auditors) to help facilitate new disclosure rules?

88. To what extent, moreover, should the Commission consider the proprietary or competitive interests of covered companies in deciding what role such third-party auditors or researchers should play in administering disclosure requirements?

c. What should companies provide notice of or disclose?

89. To what extent should trade regulation rules, if at all, require companies to explain (1) the data they use, (2) how they collect, retain, disclose, or transfer that data, (3) how they choose to implement any given automated decision-making system or process to analyze or process the data, including the consideration of alternative methods, (4) how they process or use that data to reach a decision, (5) whether they rely on a third-party vendor to make such decisions, (6) the impacts of their commercial surveillance practices, including disparities or other distributional outcomes among consumers, and (7) risk mitigation measures to address potential consumer harms?

90. Disclosures such as these might not be comprehensible to many audiences. Should new rules, if promulgated, require plain-spoken explanations? How effective could such explanations be, no matter how plain? To what extent, if at all, should new rules detail such requirements?

91. Disclosure requirements could vary depending on the nature of the service or potential for harm. A potential new trade regulation rule could, for example, require different kinds of disclosure tools depending on the nature of the data or practices at issue (e.g., collection, retention, or transfer) or the sector (e.g., consumer credit, housing, or work). Or the agency could impose transparency measures that require in-depth accounting (e.g., impact assessments) or evaluation against externally developed standards (e.g., third-party auditing). How, if at all, should the Commission implement and enforce such rules?

92. To what extent should the Commission, if at all, make regular self-reporting, third-party audits or assessments, or self-administered impact assessments about commercial surveillance practices a standing obligation? How frequently, if at all, should the Commission require companies to disclose such materials publicly? If it is not a standing obligation, what should trigger the publication of such materials?

93. To what extent do companies have the capacity to provide any of the above information? Given the potential cost of such disclosure requirements, should trade regulation rules exempt certain companies due to their size or the nature of the consumer data at issue?

viii. Remedies

94. How should the FTC's authority to implement remedies under the Act determine the form or substance of any potential new trade regulation rules on commercial surveillance? Should new rules enumerate specific forms of relief or damages that are not explicit in the FTC Act but that are within the Commission's authority? For example, should a potential new trade regulation rule on commercial surveillance explicitly identify algorithmic disgorgement, a remedy that forbids companies from profiting from unlawful practices related to their use of automated systems, as a potential remedy? Which, if any, other remedial tools should new trade regulation rules on commercial surveillance explicitly identify? Is there a limit to the Commission's authority to implement remedies by regulation?

ix. Obsolescence

95. The Commission is alert to the potential obsolescence of any rulemaking. As important as targeted advertising is to today's internet economy, for example, it is possible that its role may wane. Companies and other stakeholders are exploring new business models.[129] Such changes would have notable collateral consequences for companies that have come to rely on the third-party advertising model, including and especially news publishing. These developments in the online advertising marketplace are just one example. How should the Commission account for changes in business models in advertising as well as other commercial surveillance practices?

V. Comment Submissions

You can file a comment online or on paper. For the Commission to consider your comment, it must receive it on or before October 21, 2022. Write “Commercial Surveillance ANPR, R111004” on your comment. Your comment—including your name and your state—will be placed on the public record of this proceeding, including, to the extent practicable, on the https://www.regulations.gov website. The Commission strongly encourages you to submit your comments online through the https://www.regulations.gov website. To ensure the Commission considers your online comment, please follow the instructions on the web-based form.

If you file your comment on paper, write “Commercial Surveillance ANPR, R111004” on your comment and on the envelope, and mail your comment to the following address: Federal Trade Commission, Office of the Secretary, 600 Pennsylvania Avenue NW, Suite CC-5610 (Annex B), Washington, DC 20580.

Because your comment will be placed on the public record, you are solely responsible for making sure that your comment does not include any sensitive or confidential information. In particular, your comment should not contain sensitive personal information, such as your or anyone else's Social Security number; date of birth; driver's license number or other state identification number or foreign country equivalent; passport number; financial account number; or credit or debit card number. You are also solely responsible for making sure your comment does not include any sensitive health information, such as medical records or other individually identifiable health information. In addition, your comment should not include any “[t]rade secret or any commercial or financial information which . . . is privileged or confidential”—as provided in Section 6(f) of the FTC Act, 15 U.S.C. 46(f), and FTC Rule 4.10(a)(2), 16 CFR 4.10(a)(2)—including in particular competitively sensitive information such as costs, sales statistics, inventories, formulas, patterns, devices, manufacturing processes, or customer names.

Comments containing material for which confidential treatment is requested must be filed in paper form, must be clearly labeled “Confidential,” and must comply with FTC Rule 4.9(c). In particular, the written request for confidential treatment that accompanies the comment must include the factual and legal basis for the request and must identify the specific portions of the comment to be withheld from the public record. See FTC Rule 4.9(c). Your comment will be kept confidential only if the General Counsel grants your request in accordance with the law and the public interest. Once your comment has been posted publicly at https://www.regulations.gov—as legally required by FTC Rule 4.9(b)—we cannot redact or remove your comment, unless you submit a confidentiality request that meets the requirements for such treatment under FTC Rule 4.9(c), and the General Counsel grants that request.

Visit the FTC website to read this document and the news release describing it. The FTC Act and other laws that the Commission administers permit the collection of public comments to consider and use in this proceeding as appropriate. The Commission will consider all timely and responsive public comments it receives on or before October 21, 2022. For information on the Commission's privacy policy, including routine uses permitted by the Privacy Act, see https://www.ftc.gov/​site-information/​privacy-policy.

VI. The Public Forum

The Commission will hold a public forum on Thursday, September 8, 2022, from 2 p.m. until 7:30 p.m. eastern time. In light of the ongoing COVID-19 pandemic, the forum will be held virtually; members of the public are encouraged to attend by visiting https://www.ftc.gov/news-events/events/2022/09/commercial-surveillance-data-security-anpr-public-forum. The public forum will address in greater depth the topics that are the subject of this document, as well as the rulemaking process, with the goal of facilitating broad public participation in response to this ANPR and any future rulemaking proceedings the Commission undertakes. A complete agenda will be posted at the aforementioned website and announced in a press release at a future date. Individuals or entities that would like to participate in the public forum by offering two-minute public remarks should email . Please note that this email address is only for requests to participate in the public forum and is not a means of submitting comments in response to this ANPR. Please see Item V above for instructions on submitting public comments.

Forum panelists will be selected by FTC staff, and slots for public remarks are first come, first served. The Commission will place a recording of the proceeding on the public record. Requests to participate in the public remarks must be received on or before August 31, 2022. Individuals or entities selected to participate will be notified on or before September 2, 2022. Because disclosing sources of funding promotes transparency, ensures objectivity, and maintains the public's trust, prospective participants, if chosen, will be required to disclose the source of any support they received in connection with their participation at the forum. This funding information will be included in the published biographies as part of the forum record.

By direction of the Commission.

Joel Christie,

Acting Secretary.

Note:

The following statements will not appear in the Code of Federal Regulations:

Statement of Chair Lina M. Khan

Today, the Federal Trade Commission initiated a proceeding to examine whether we should implement new rules addressing data practices that are unfair or deceptive.

The Commission brought its first internet privacy case 24 years ago against GeoCities, one of the most popular websites at the time.[1] In the near quarter-century since, digital technologies and online services have rapidly evolved, with transformations in business models, technical capabilities, and social practices. These changes have yielded striking advancements and dazzling conveniences—but also tools that enable entirely new forms of persistent tracking and routinized surveillance. Firms now collect personal data on individuals on a massive scale and in a stunning array of contexts, resulting in an economy that, as one scholar put it, “represents probably the most highly surveilled environment in the history of humanity.” [2] This explosion in data collection and retention, meanwhile, has heightened the risks and costs of breaches—with Americans paying the price.[3]

As the country's de facto law enforcer in this domain, the FTC is charged with ensuring that our approach to enforcement and policy keeps pace with these new market realities. The agency has built a wealth of experience in the decades since the GeoCities case, applying our century-old tools to new products in order to protect Americans from evolving forms of data abuses.[4] Yet the growing digitization of our economy—coupled with business models that can incentivize endless hoovering up of sensitive user data and a vast expansion of how this data is used [5] —means potentially unlawful practices may be prevalent, with case-by-case enforcement failing to adequately deter lawbreaking or remedy the resulting harms.

Indeed, a significant majority of Americans today feel they have scant control over the data collected on them and believe the risks of data collection by commercial entities outweigh the benefits.[6] Evidence also suggests the current configuration of commercial data practices does not actually reveal how much users value privacy or security.[7] For one, the use of dark patterns and other conduct that seeks to manipulate users underscores the limits of treating present market outcomes as reflecting what users desire or value.[8] More fundamentally, users often seem to lack a real set of alternatives and cannot reasonably forego using technologies that are increasingly critical for navigating modern life.[9]

The data practices of today's surveillance economy can create and exacerbate deep asymmetries of information—deepening, in turn, imbalances of power. And the expanding contexts in which users' personal data is used—from health care and housing to employment and education—mean that what's at stake with unlawful collection, use, retention, or disclosure is not just one's subjective preference for privacy, but one's access to opportunities in our economy and society, as well as core civil liberties and civil rights.

The fact that current data practices can have such consequential effects heightens both the importance of wielding the full set of tools Congress has given us and the responsibility we have to do so. In particular, Section 18 of the FTC Act grants us clear authority to issue rules that identify specific business practices that are unlawful by virtue of being “unfair” or “deceptive.” [10] Doing so could provide firms with greater clarity about the scope of their legal obligations. It could also strengthen our ability to deter lawbreaking, given that first-time violators of duly promulgated trade regulation rules—unlike most first-time violators of the FTC Act [11] —are subject to civil penalties. This would also help dispense with competitive advantages enjoyed by firms that break the law: all companies would be on the hook for civil penalties for law violations, not just repeat offenders.

Today's action marks the beginning of the rulemaking proceeding. In issuing an advance notice of proposed rulemaking (ANPR), the Commission is seeking comments from the public on the extent and effects of various commercial surveillance and data security practices, as well as on various approaches to crafting rules to govern these practices and the attendant tradeoffs. Our goal at this stage is to begin building a rich public record to inform whether rulemaking is worthwhile and the form potential proposed rules should take. Robust public engagement will be critical—particularly for documenting specific harmful business practices and their prevalence, the magnitude and extent of the resulting consumer harm, the efficacy or shortcomings of rules pursued in other jurisdictions, and how to assess which areas are or are not fruitful for FTC rulemaking.

Because Section 18 lays out an extensive series of procedural steps, we will have ample opportunity to review our efforts in light of any new developments. If Congress passes strong federal privacy legislation—as I hope it does—or if there is any other significant change in applicable law, then the Commission would be able to reassess the value-add of this effort and whether continuing it is a sound use of resources. The recent steps taken by lawmakers to advance federal privacy legislation are highly encouraging, and our agency stands ready to continue aiding that process through technical assistance or otherwise sharing our staff's expertise.[12] At minimum, the record we will build through issuing this ANPR and seeking public comment can serve as a resource to policymakers across the board as legislative efforts continue.

The ANPR poses scores of broad and specific questions to help elicit and encourage responses from a diverse range of stakeholders. I look forward to engaging with and learning from the record we develop on the wide range of issues covered. Highlighted below are a few topics from the ANPR on which I am especially eager for us to build a record:

• Procedural protections versus substantive limits: Growing recognition of the limits of the “notice and consent” framework prompts us to reconsider more generally the adequacy of procedural protections, which tend to create process requirements while sidestepping more fundamental questions about whether certain types of data collection and processing should be permitted in the first place.[13] Are there contexts in which our unfairness authority reaches a greater set of substantive limits on data collection? [14] When might bans and prohibitions on certain data practices be most appropriate? [15]

• Administrability: Information asymmetries between enforcers and market participants can be especially stark in the digital economy. How can we best ensure that any rules we pursue can be easily and efficiently administered and that these rules do not rest on determinations we are not well positioned to make or commitments we are not well positioned to police? How have jurisdictions successfully managed to police obligations such as “data minimization”? [16]

• Business models and incentives: How should we approach business models that are premised on or incentivize persistent tracking and surveillance, especially for products or services consumers may not be able to reasonably avoid? [17]

• Discrimination based on protected categories: Automated systems used by firms sometimes discriminate based on protected categories—such as race, color, religion, national origin, or sex—including in contexts where this discrimination is unlawful.[18] How should we consider whether new rules should limit or forbid discrimination based on protected categories under our Section 5 unfairness authority? [19]

• Workplace surveillance: Reports suggest extensive tracking, collection, and analysis of consumer data in the workplace have expanded exponentially.[20] Are there particular considerations that should govern how we consider whether data abuses in the workplace may be deceptive or unfair? [21]

To facilitate wide-ranging participation, we are seeking to make this process widely accessible. Our staff has published a “frequently asked questions” resource to demystify the rulemaking process and identify opportunities for the public to engage.[22] We will also host a virtual public forum on September 8, where people will be able to provide oral remarks that will be part of the ANPR record.[23]

I am grateful to our agency staff for their work on this ANPR and my colleagues on the Commission for their engagement and input. Protecting Americans from unlawful commercial surveillance and data security practices is critical work, and I look forward to undertaking this effort with both the necessary urgency and rigor.

Statement of Commissioner Rebecca Kelly Slaughter

Three years ago, I gave a speech outlining why I believed that case-by-case enforcement in the space of data abuses was not effective; how I hoped to see Congress pass a long-overdue federal privacy law; and why, until such a law is signed, the Commission should use its authority under Section 18 to initiate a rulemaking process.[1] I am delighted that Congress appears to be making substantial and unprecedented progress toward a meaningful privacy law, which I am eager to see pass.[2] Nonetheless, given the uncertainty of the legislative process and the time a Section 18 rulemaking necessarily takes, the Commission should not wait any longer than it already has to develop a public record that could support enforceable rules. So I am equally delighted that we are now beginning the Section 18 process by issuing this advance notice of proposed rulemaking (“ANPR”) on commercial surveillance and data security.[3]

It is indisputable that the Federal Trade Commission has expertise in regulating this sector; it is widely recognized as the nation's premier “privacy enforcer.” [4] I commend agency staff for their dogged application of our nearly 100-year-old consumer-protection statute (and a handful of sector-specific privacy laws) to build that reputation.

Historically, much of that work operated through the straightforward application of those basic consumer-protection principles to privacy. The FTC ensured that companies told users what they were doing with the users' data, insisted that they secure users' consent, and policed companies' promises. But case-by-case enforcement has not systemically deterred unlawful behavior in this market. As our own reports make clear, the prevailing notice-and-choice regime has failed to protect users,[5] and the modes by which sensitive information can be discovered, derived, and disclosed have only grown in number and complexity.[6]

Data abuses such as surreptitious biometric or location tracking,[7] unaccountable and discriminatory algorithmic decision-making,[8] or lax data security practices [9] have been either caused by, exacerbated by, or are in service of nearly unfettered commercial data collection, retention, use, and sharing. It is up to the Commission to use the tools Congress explicitly gave us, however rusty we are at wielding them, to prevent these unlawful practices. That is why I have consistently, for years, called for the Commission to begin the process to consider clear, bright-line rules against unfair or deceptive data practices pursuant to our Section 18 authority.[10]

Section 18 rulemaking's virtue lies in being open, iterative, and public. By the same token it is, by congressional design, laborious and time-consuming. But we intend to follow the record where it leads and, if appropriate, issue Trade Regulation Rules to proscribe unlawful conduct. The Commission has proactively taken steps to use this authority as Congress directed. During my time as Acting Chair, we created a Rulemaking Group within the Office of General Counsel, which has already been indispensable in building the agency's capacity during this process.[11] Working with that Group, the Commission updated our Rules of Practice to enhance transparency and shed self-imposed roadblocks to avoid unnecessary and costly delay in these proceedings.[12]

As happy as I am to see us finally take this first step of opening this record, it is not something I take lightly. An initiative like this entails some risk, though I believe further inaction does as well. I have heard arguments, including from my fellow Commissioners, that conducting a rulemaking in the data space is inappropriate, either because Congress is currently debating privacy legislation or even because the topic is simply too consequential or the issues too vast for the Commission to appropriately address. In this statement, I challenge some of these assumptions and then raise some of the issues in which I am especially interested.

On Timing

The best time to initiate this lengthy process was years ago, but the second-best time is now. Effective nationwide rules governing the collection and use of data are long overdue. As the nation's principal consumer-protection agency, we have a responsibility to act.

Restoring Effective Deterrence

The question of effective enforcement is central to this proceeding. Case-by-case enforcement, while once considered a prudent expression of our statutory authority, has not proved effective at deterring illegal conduct in the data space. Trade Regulation Rules can help remedy this problem by providing clear and specific guidance about what conduct the law proscribes and attaching financial consequences to violations of the law.

Providing a financial penalty for first-time lawbreaking is now, in the wake of the loss of our Section 13(b) authority, a particular necessity. Last year, the Supreme Court ruled that we can no longer seek monetary relief in federal court for violations of the FTC Act under our 13(b) authority.[13] I have testified in Congress that the loss of this authority is devastating for consumers who now face a significantly steeper uphill battle to be made whole after suffering a financial injury stemming from illegal conduct.[14] But the loss of 13(b) also hampers our ability to deter unlawful conduct in the first place. In its absence, and without a statutory fix, first-time violators of the FTC Act are unlikely to face monetary consequences for their unlawful practices.[15] Trade Regulation Rules enforced under Section 19 can enable such consequences.[16]

Rulemaking in the Time of ADPPA

For years, Congress has nibbled around the edges of comprehensive federal privacy legislation; it is now engaged in the advanced stages of consideration of such legislation. All members of the Commission have repeatedly called on Congress to act in this space. I have advocated for legislation that sets clear rules regarding data minimization, use restrictions, and secondary uses; that gives us the ability to seek civil penalties for law violations; that gives us flexible APA rulemaking authority so we can act swiftly to address new conduct; and, most importantly, that gives the agency the resources to meaningfully enforce the law.

The House may be the closest it has been in years to seeing legislation like this reach the finish line.[17] I not only welcome it—I prefer Congressional action to strengthen our authority. But I know from personal experience that the road for a bill to become a law is not a straight or easy one.[18] In the absence of that legislation, and while Congress deliberates, we cannot sit idly by or press pause indefinitely on doing our jobs to the best of our ability. As I mentioned above, I believe that we have a duty to use the authorities Congress has already given us to prevent and address these unfair or deceptive practices as we best see fit.

I am certain that action by the Federal Trade Commission will not clip the wings of Congressional ambition. Our work here is complementary to Congress' efforts.[19] The bills supported by the leaders of both Commerce Committees empower the FTC to be a more effective privacy regulator,[20] as will the record we develop pursuant to this ANPR. Section 18 rulemaking, even more so than more common APA rulemaking, gives members of the public the opportunity to be active participants in the policy process. The open record will allow us to hear from ordinary people about the data economy harms they have experienced. We can begin to flex our regulatory muscle by evaluating which of those harms meet the statutory prohibitions on unfair or deceptive conduct and which of those are prevalent in the market. The study, public commentary, and dialogue this proceeding will launch can meaningfully inform any superseding rulemaking Congress eventually directs us to undertake, as well as the Congressional debate should the current legislative progress stall.

Our Authority and the Scope of This Proceeding

Some have balked at this ANPR as overly ambitious for an agency that has not previously issued rules in this area, or as coloring outside the lines of our statute in the topics it addresses, especially in light of the Supreme Court decision in West Virginia v. EPA. But our authority is as unambiguous as it is limited, and so our regulatory ambit is rightfully constrained—the questions we ask in the ANPR and the rules we are empowered to issue may be consequential, but they do not implicate the “major questions doctrine.” [21]

Section 18 Rulemaking

In its grant of Section 18 rulemaking authority to the Commission in 1975 under the Magnuson-Moss Warranty—Federal Trade Commission Improvement Act, Congress explicitly empowered the FTC to “define with specificity acts or practices which are unfair or deceptive acts or practices in or affecting commerce . . . .” [22] Those terms, and therefore our delegated authority, are not defined by “modest words,” “vague terms,” “subtle devices,” or “oblique or elliptical language.” [23] Determining what acts “in commerce” are unfair or deceptive is central to our statutory mission and their meaning is prescribed by our statutes and nearly 100 years of judicial interpretation.

It is worth reiterating these standards, both as a matter of legal principle and as a note for those participating in this process. A “deceptive” act is one that (1) makes a “representation, omission, or practice that is likely to mislead the consumer” (2) who is “acting reasonably in the circumstances” and (3) is “material,” meaning it would “affect the consumer's conduct or decision with regard to a product or service.” [24]

Congress updated the FTC Act in 1994, adopting into statute the Commission's policy statement on “unfairness.” An act may be “unfair” and in violation of the FTC Act if that act (1) “causes or is likely to cause substantial injury to consumers,” (2) “is not reasonably avoidable by consumers themselves,” and (3) is “not outweighed by countervailing benefits to consumers or to competition.” [25]

Even after finding that a practice is unfair or deceptive, we face an additional hurdle to issuing a notice of proposed rulemaking leading to a possible Trade Regulation Rule. We may issue proposed rules to prevent unfair or deceptive practices only if we find that such practices are “prevalent.” We can find a practice prevalent if the FTC has “issued cease and desist orders regarding such acts or practices,” or we can determine prevalence through “any other information available to the Commission” that “indicates a widespread pattern of unfair or deceptive acts or practices.” [26]

We cannot invent the law here. I want to underscore this. In this rulemaking we can address only unfair or deceptive practices that we could have otherwise found unlawful in the ordinary enforcement of our Section 5 authority on a case-by-case basis. But the purpose of Section 18 rulemaking is not merely to memorialize unlawful activity that we have already fully adjudicated.[27] The ANPR allows us to look at harms systematically and address the root of that unlawful activity. The limiting principle for the scope of conduct we may regulate is the contours of the law itself: acts that are both deceptive or unfair and prevalent.

Scope of the ANPR

The scope of the ANPR is reflective of the broad set of issues that arise from unfettered commercial data collection and use. That a public inquiry into this market asks a wide range of questions—inquiring about issues like collection and consent, algorithms, ad-delivery, demographic data, engagement, and the ecosystem's effects on kids and teens—should not be surprising. This is broadly the same scope of issues the Commission is currently examining in our social media and video streaming study initiated under Chair Simons in 2020.[28]

I believe it is appropriate to ask those questions, and more, in this ANPR. I expect that the record will alert us, and Congress, to widespread harms that may otherwise not have reached our attention. Some of those harms may be better addressed under our other sector-specific privacy authorities or under our competition authority. A holistic look at the data economy allows us to better understand the interplay between our consumer protection and competition missions and, should we get to that stage, to propose better and more effective rules.

Are Data Abuse Rules Different?

Some have argued that this exercise of our rulemaking authority is permissible to address some unfair or deceptive practices in some other sector of the market but not this one.[29] The rules the agency has historically issued already touch hundreds of millions of Americans' lives. FTC rules cover business conduct in funerals,[30] the marketing of business opportunities to consumers,[31] the eyeglasses market,[32] and unfair credit practices.[33] These rules cover sectors with hundreds of billions in economic output. The Franchise Rule,[34] for example, helps govern the business conduct of a sector that employs over 8 million people and contributes over 3% to the country's GDP.[35] This is all to say that the “bigness” of an industry, or the potential significance of rulemaking in that industry, should have little bearing on the legal question about the scope of our authority.[36] As a policy matter, “bigness,” if anything, should compel extra scrutiny of business practices on our part, not a free pass, kid gloves, or a punt to Congress. Though their products and services touch all our lives, technology companies are not exempt from generally applicable laws. If we have the authority to police their business practices by case-by-case enforcement to protect the public from potentially unfair or deceptive practices, and we do, then we have the authority to examine how ex ante rules may also govern those practices.

Issues of Particular Interest

I want to encourage public participation in this comment period, especially from the voices we hear less often at the Commission. Having information in the record from a diverse set of communities and commenters will strengthen the record and help lay a firm foundation for potential agency action. I encourage the public to engage with all the issues we have teed up in the ANPR and to think about how commercial surveillance and abusive data practices affect them not only as consumers of products and services but also as workers, small business owners, and potential competitors to dominant firms.[37] I'm eager to see and evaluate the record in its entirety, but there are some issues I have had a particular interest in during my time at the Commission. I've highlighted some of them below.

Minimization and Purpose and Use Specifications

I have spoken at length about my interest in ideas around data minimization.[38] The ANPR asks several questions related to the concept, and I am eager to see comments about potentially unlawful practices in this area, the state of data collection in the industry, and how that relates to user expectations of the products or services on offer.[39]

Civil Rights, Vulnerable Populations, and Discriminatory Algorithms

Data abuses are a civil rights issue, and commercial surveillance can be especially harmful from a civil rights and equity perspective. The FTC's own reports have explored these issues for years.[40] The FTC's mission to protect consumers from unfair or deceptive practices in commerce must include examining how commercial practices affect the marginalized and vulnerable. Discrimination based on protected-class status is obviously unfair in the colloquial sense and may sometimes be unfair in Section 5 terms as well.[41] As I have written, failure to closely scrutinize the impact of data-driven decision-making tools can create discriminatory outcomes.[42] The ANPR asks several questions about the prevalence of such practices, the extent of our authority in this area, and how the FTC, working with other enforcement agencies, may ameliorate those potential harms.[43]

Kids and Teens

As I remarked at COPPA's 20th anniversary, our experience enforcing the Children's Online Privacy Protection Act (“COPPA”) surely has lessons for any potential rulemaking.[44] What can the statutory scheme in COPPA tell us about how to structure potential rules? As a parent, I also have concerns for children as they age out of COPPA's under-13 safety zone. Are there harms we should examine that affect young teenagers in particular? [45]

Conclusion

The path the Commission is heading down by opening this rulemaking process is not an easy one. But it is a necessary one. The worst outcome, as I said three years ago, is not that we get started and then Congress passes a law; it is that we never get started and Congress never passes a law. People have made it clear that they find this status quo unacceptable.[46] Consumers and businesses alike deserve to know, with real clarity, how our Section 5 authority applies in the data economy. Using the tools we have available benefits the whole of the Commission's mission; well-supported rules could facilitate competition, improve respect for and compliance with the law, and relieve our enforcement burdens.

I have an open mind about this process and no certainty about where our inquiry will lead or what rules the record will support, as I believe is my obligation. But I do know that it is past time for us to begin asking these questions and to follow the facts and evidence where they lead us. I expect that the Commission will take this opportunity to think deeply about people's experiences in this market and about how to ensure that the benefits of progress are not built on an exploitative foundation. Clear rules have the potential for making the data economy more fair and more equitable for consumers, workers, businesses, and potential competitors alike.

I am grateful to the Commission staff for their extensive work leading up to the issuance of this ANPR,[47] as well as to the Chair for her leadership in pushing this project across the starting line, and to my fellow Commissioners for their thoughtful engagement with the document. Both the Chair and Commissioner Bedoya brought their expertise and vision to this endeavor, which is reflected throughout the final product. And, although I do not agree with my dissenting colleagues Commissioners Phillips and Wilson, I very much appreciate their constructive engagement, which has helped improve not only my own thinking but also the substance of the ANPR. I look forward to continued dialogue with all of them.

Statement of Commissioner Alvaro M. Bedoya

Our nation is the world's unquestioned leader on technology. We are the world's unquestioned leader in the data economy. And yet we are almost alone in our lack of meaningful protections for this infrastructure. We lack a modern data security law. We lack a baseline consumer privacy rule. We lack civil rights protections suitable for the digital age. This is a landscape ripe for abuse.

Now it is time to act. Today, we are beginning the hard work of considering new rules to protect people from unfair or deceptive commercial surveillance and data security practices.

My friend Commissioner Phillips argues that this advance notice of proposed rulemaking (“ANPR”) “recast[s] the Commission as a legislature,” and “reaches outside the jurisdiction of the FTC.” [1] I respectfully disagree. Today, we're just asking questions, exactly as Congress has directed us to do.[2] At this most preliminary step, breadth is a feature, not a bug. We need a diverse range of public comments to help us discern whether and how to proceed with notices of proposed rulemaking. There is much more process to come.

In 1975, Congress passed the Magnuson-Moss Warranty—Federal Trade Commission Improvement Act (the “Magnuson-Moss Act”).[3] That Act made explicit the Commission's authority to prescribe rules prohibiting unfair or deceptive trade practices. It also set out steps for doing so, including providing informal oral hearings with a limited right of cross-examination, which were consistent with best practices of that time.[4] In the decade following its passage, the Magnuson-Moss Act was viewed as “substantially increasing the agency's rulemaking powers.” [5]

Together with Congress's modest amendments to this process in 1980 [6] and 1994,[7] federal law now gives us a clear roadmap for this work.[8] We will follow it to the letter.

The bipartisan American Data Privacy and Protection Act (ADPPA) is the strongest privacy bill that has ever been this close to passing. I hope it does pass. I hope it passes soon. What Chairman Frank Pallone, Ranking Member Cathy McMorris Rodgers, Senator Roger Wicker and their colleagues have accomplished is formidable and promising. This ANPR will not interfere with that effort. I want to be clear: Should the ADPPA pass, I will not vote for any rule that overlaps with it. There are no grounds to point to this process as reason to delay passage of that legislation.

Turning finally to the substance of the ANPR itself: It is a priority for me that the Commission, throughout this rulemaking process, stay focused on the needs of people who are most at risk of being left behind by new technology in the modern economy.[9] So while I will be interested in answers to all of our questions, I am keenly interested to learn about:

1. Emerging discrimination issues (Questions 65-72), especially from civil rights experts and affected communities. I agree with Commissioner Slaughter and Chair Khan that our unfairness authority is a powerful tool for combatting discrimination.[10] It clearly is.[11] Given significant gaps in federal antidiscrimination laws, especially related to internet platforms and technology companies,[12] I believe the Commission must act to protect people's civil rights.

2. The mental health of kids and teens (Question 17), especially from youth development experts and psychologists. A growing body of evidence suggests that teenagers, particularly teenage girls, who spend more than two or three hours daily on social media suffer from increased rates of depression, anxiety, and thoughts of suicide and self-harm.[13] This is a nuanced issue, and peer-reviewed research is still developing.[14] But this nuance does not diminish the urgency of this work, and in fact heightens our need for comments on it. I appreciate especially the partnership of Commissioner Wilson in this area.

3. How to protect non-English speaking communities from fraud and other abusive data practices (Question 58), especially from affinity groups, internet platforms, and experts in fraud prevention practices. We know that many non-English language communities are disproportionately targeted in the offline world, and I am worried the story is even worse online. I'd like to hear more about how new rules might encourage more effective enforcement by both the Commission and private firms against scams and fraud.

4. How to protect against unfair or deceptive practices related to biometrics (Questions 37-38). A new generation of remote biometric technology is eroding our ability to move in public with some semblance of privacy. I'd welcome proposals for how rules may address and prevent abuse and harmful invasions of privacy.

I want to recognize Commissioner Slaughter for her early vision on this rulemaking process,[15] Chair Khan for her leadership in moving this effort forward, and all the agency staff who worked on it. Although my Republican colleagues are voting against this ANPR, I want them and the public to know I'll still seek their input throughout the process that follows.

I am most grateful to the members of the public, civil society, and the small business community who will take the time to comment on this ANPR. We need your input. We will read it carefully and with interest.

Dissenting Statement of Commissioner Noah Joshua Phillips

Legislating comprehensive national rules for consumer data privacy and security is a complicated undertaking. Any law our nation adopts will have vast economic significance. It will impact many thousands of companies, millions of citizens, and billions upon billions of dollars in commerce. It will involve real trade-offs between, for example, innovation, jobs, and economic growth on the one hand and protection from privacy harms on the other. (It will also require some level of social consensus about which harms the law can and should address.) Like most regulations, comprehensive rules for data privacy and security will likely displace some amount of competition. Reducing the ability of companies to use data about consumers, which today facilitates the provision of free services, may result in higher prices—an effect that policymakers would be remiss not to consider in our current inflationary environment.[1]

National consumer privacy laws pose consequential questions, which is why I have said, repeatedly,[2] that Congress—not the Federal Trade Commission (“FTC” or “Commission”)—is where national privacy law should be enacted. I am heartened to see Congress considering just such a law today,[3] and hope this Commission process does nothing to upset that consideration.

So I don't think we should do this. But if you're going to do it, do it right. The Commercial Surveillance and Data Security advance notice of proposed rulemaking (“ANPR”) issued today by a majority of commissioners provides no notice whatsoever of the scope and parameters of what rule or rules might follow, thereby undermining the public input and congressional notification processes. It is the wrong approach to rulemaking for privacy and data security.

What the ANPR does accomplish is to recast the Commission as a legislature, with virtually limitless rulemaking authority where personal data are concerned. It contemplates banning or regulating conduct the Commission has never once identified as unfair or deceptive. That is a dramatic departure even from recent Commission rulemaking practice. The ANPR also contemplates taking the agency outside its bailiwick. At the same time, the ANPR virtually ignores the privacy and data security concerns that have animated our enforcement regime for decades. A cavalcade of regulations may be on the way, but their number and substance are a mystery.

The ANPR Fails To Provide Notice of Anything and Will Not Elicit a Coherent Record

The ANPR fails to live up to the promise in its name, to give advance notice to the public (and Congress) of what the Commission might propose. The FTC Act requires an ANPR to “contain a brief description of the area of inquiry under consideration, the objective which the Commission seeks to achieve, and possible regulatory alternatives under consideration by the Commission.” [4] This ANPR flunks even that basic test. The areas of inquiry are vast and amorphous, and the objectives and regulatory alternatives are just not there. It is impossible to discern from this sprawling document—which meanders in and out of the jurisdiction of the FTC and goes far afield from traditional data privacy and security—the number and scope of rules the Commission envisions.[5] The document stands in stark contrast to the focus that characterizes recent ANPRs issued by the Commission, which addressed far more limited topics like impersonating a government entity or private business, deceptive earnings claims, or the scope of the Telemarketing Sales Rule.[6] I supported each of those.

A well-crafted ANPR is calibrated to develop a thorough record. But this ANPR addresses too many topics to be coherent. It requests information ranging from what practices companies currently use to “surveil consumers” [7] to whether there should be a rule granting teens an “erasure mechanism,” [8] to what extent any new commercial surveillance rule would impede or enhance innovation,[9] the administrability of any data minimization or purpose limitation requirements,[10] the “nature of the opacity of different forms of commercial surveillance practices,” [11] and whether the Commission has “adequately addressed indirect pecuniary harms, including . . . psychological harms.” [12]

The ANPR provides no clue what rules the FTC might ultimately adopt. In fact, the Commission expressly states that the ANPR does not identify the full scope of approaches it could undertake, does not delineate a boundary on issues on which the public can comment, and in no way constrains the actions it might take in an NPRM or final rule.[13] This scattershot approach creates two obvious problems: stakeholders cannot discern how to engage meaningfully and provide comment, and the lack of focus for their comments will give the Commission a corollary ability to proceed in any direction it chooses. I earnestly cannot see how this document furthers an effort to fashion discrete and durable privacy and data security rules.

The ANPR poses some 95 questions about the myriad topics it purports to address, but many simply fail to provide the detail necessary for commenters to prepare constructive responses. Take the ANPR's blanket request for cost-benefit analyses:

[T]he Commission invites public comment on (a) the nature and prevalence of harmful commercial surveillance and lax data security practices, (b) the balance of costs and countervailing benefits of such practices for consumers and competition, as well as the costs and benefits of any given potential trade regulation rule, and (c) proposals for protecting consumers from harmful and prevalent commercial surveillance and lax data security practices.[14]

This question asks the public to comment on the costs and benefits of any business practice and any possible regulation involving “commercial surveillance,” a term defined so broadly (and with such foreboding [15] ) that it captures any collection or use of consumer data.[16] It goes on to ask commenters how the Commission should evaluate the answers, as if the FTC Act does not provide a framework for fashioning such regulations (it does) and the Commission does not know how to apply it (I hope we do).[17]

These kinds of questions are not conducive to stakeholders submitting data and analysis that can be compared and considered in the context of a specific rule. The Commission would be more likely to receive helpful data if it asked commenters for the costs and benefits of some defined kind of conduct, or a particular rule to regulate it—say, information collected by exercise apps, or a rule limiting the use of third-party analytics by those apps.[18] Without specific questions about business practices and potential regulations, the Commission cannot hope for tailored responses providing a full picture of particular practices. Determining the appropriateness and scope of any subsequent proposed rule will prove difficult.

The ANPR Recasts the FTC as a Legislature

The ANPR kickstarts the circumvention of the legislative process and the imposition upon the populace of the policy preferences of a majority of unelected FTC commissioners. The Supreme Court recently noted “a particular and recurring problem [of] agencies asserting highly consequential power beyond what Congress could reasonably be understood to have granted.” [19] Apparently, the FTC is next up to the plate. Our Section 18 authority to regulate “unfair or deceptive acts or practices” [20] goes only so far, and the ANPR contemplates reaching well beyond it, including to common business practices we have never before even asserted are illegal. Reading the FTC Act to provide the Commission with the “sweeping and consequential authority” [21] to mandate changes across huge swaths of the economy will test the limits of our congressional delegation.

The ANPR's many references to international and state privacy laws signal the majority's view that the scope of the rules passed by the unelected commissioners of an independent agency should be on par with statutes passed by elected legislators. Even as we vote, Congress is actively considering legislation concerning the very matters the ANPR purports to address.[22] I sincerely hope that this ill-advised process does not upset that very much needed one.

The ANPR colors well outside the lines of conduct that has been the subject of many (or, in a number of prominent cases, any) [23] enforcement actions, where real world experience provides a guide.[24] Unlike our December 2021 ANPR targeting fraudsters that impersonate the government, for example, the Commission does not have 20 years of cases covering the same conduct.[25] The Auto Rule NPRM issued last month also targeted conduct that was the basis of repeated Commission enforcement.[26]

This ANPR, meanwhile, attempts to establish the prevalence necessary to justify broad commercial surveillance rulemaking by citing an amalgam of cases concerning very different business models and conduct.[27] Under Section 18, the agency must show that the unfair acts or practices in question are prevalent, a determination that can only be made if the Commission has previously “issued cease and desist orders regarding such acts or practices,” or if it has any other information that “indicates a widespread pattern of unfair or deceptive acts or practices.” [28] Where the agency has little (or no) experience, prudence counsels in favor of investigation to explore costs and benefits and to determine illegality. The ANPR aims for regulation without any such experience, to say nothing of court decisions ratifying the application of Section 5 to the business conduct in question. As this process moves forward, the Commission would do well to keep in mind that “[a]gencies have only those powers given to them by Congress, and `enabling legislation' is generally not an `open book to which the agency [may] add pages and change the plot line.'” [29]

Take, for example, the ANPR's treatment of “personalized” or “targeted” advertising.[30] The majority seems open to banning—ahem, “limiting”—targeted advertising. Limiting or banning targeted advertising will be a heavy lift for many reasons, not the least of which is that we have never brought a case alleging that targeted advertising is unfair. The Commission has brought cases where companies deceptively collected, used, or shared personal data for purposes including targeted advertising, but that is not the same.[31] Perhaps in recognition of these potential difficulties, the ANPR requests ideas on what potential legal theories might support limits on the use of automated systems in targeted advertising.[32]

Consider also the ANPR's discussion of consent, one of the traditional bedrocks of privacy policy. Whether notice and consent is the optimal approach to consumer privacy in every context is worthy of serious debate. Instead of discussing the merits and shortcomings of transparency and choice, the majority simply concludes that “consent may be irrelevant.” [33] The ANPR bolsters this view with claims that other privacy regimes are moving away from an emphasis on consent. Really? While there are certainly privacy laws that include data minimization requirements or restrict secondary uses of data, many still allow for consent. For example, the Children's Online Privacy Protection Act of 1998 requires parents to give verified parental consent before a business collects information from a child.[34] The European Union's General Data Protection Regulation (“GDPR”) allows businesses to process data if they have the consumer's consent, which must be freely given, specific, informed, and unambiguous.[35]

The ANPR appears skeptical that consumers can be trusted to make their own choices, seeking information on what “commercial surveillance” practices are illegal, “irrespective of whether consumers consent to them.” [36] Should the majority be thwarted in its quest to make consent passé, the ANPR contemplates at least having different consent standards for individuals “in crisis” or “especially vulnerable to deception.” [37] This is paternalistic to say the least: Heaven forfend adults make decisions and permit companies to use their data to serve them targeted ads. But even if you disagree with that view, the point is that a consequential decision to take away that choice from individuals—like many of the decisions that need to be weighed in creating a national privacy law—is best left to Congress. The FTC is not a legislature.

The ANPR also contemplates rewriting the Children's Online Privacy Protection Act (“COPPA”).[38] Consistent with its dismissal of consent as a legal basis for collecting data, its discussion of children and teens is hostile to the idea that parents can consent to the collection, use, or sharing of data about their children.[39] In enacting COPPA, with its explicit provision for verifiable parental consent, Congress determined that parents can make decisions about the collection and sharing of their children's personal data.[40] The FTC cannot and should not attempt to overrule Congress through rulemaking—or parents, who routinely have to make all sorts of decisions about our children.

To be fair, the ANPR raises the important issue of whether there should be more rules that protect the privacy of teenagers. COPPA only covers children under thirteen, and there are plenty of data privacy and security issues that impact youth ages 13 to 16 online. But here the ANPR is out of order. Just days ago, the Senate Commerce Committee considered legislation to amend COPPA, including to extend protections to minors up to age 16.[41] Congress is working on these answers. And, lest we forget, so are we. The privacy of children was a central concern of the social media 6(b)s, a project we have not yet completed.[42] The Commission has also had a review of the COPPA Rule ongoing for years, receiving over 170,000 comments on it, the most of any request for input issued in the history of the agency. This ANPR threatens to supersede that process. We should first complete our homework on those projects before starting the process of writing new rules.

The ANPR Is FTC Overreach

The ANPR reaches outside the jurisdiction of the FTC. It seeks to recast the agency as a civil rights enforcer, contemplating policing algorithms for disparate impact without a statutory command.[43] This raises immediate concerns. First, do we have the authority? When Congress seeks to ban discrimination, it says so directly.[44] The FTC Act does not mention discrimination. Second, the civil rights laws Congress has adopted to fight discrimination delineate the bases upon which discrimination is illegal.[45] The FTC Act does not. Third, our antidiscrimination laws cover aspects of commerce where Congress has expressed concern about the impact of discrimination, for example housing, employment, and the extension of credit.[46] The FTC Act applies broadly to any unfair or deceptive act or practice in or affecting commerce. Finally, the FTC Act does not specify whether it is a regime of disparate treatment or disparate impact.

When determining what conduct violates an antidiscrimination law, all of these questions are critical. The FTC Act, which is not such a law, answers none of them. All of that raises the prospect of interpreting the FTC Act to bar disparate impact, including on bases that most would regard as perfectly reasonable or at the very least benign. So, for example, an algorithm resulting in ads for concert tickets being shown more often to music lovers would constitute illegal discrimination against those who are not music lovers. So might a dating app that uses an algorithm to help users find people of the same faith. Under the theory presupposed in the ANPR, such conduct would be illegal.

The ANPR seeks comment on whether the Commission might bar or limit the deployment of any system that produces disparate outcomes, irrespective of the data or processes on which the outcomes were based. (Is this what people mean when they say “algorithmic justice”? [47] ) This could very well mean barring or limiting any technology that uses algorithms to make decisions that apply to people. The ANPR requests comment on whether the FTC should “forbid or limit the development, design, and use of automated decision-making systems that generate or otherwise facilitate outcomes that violate Section 5.” [48] In other words, the Commission wonders if it should put the kibosh on the development of artificial intelligence. Stopping American innovation in its tracks seems to me neither to reflect the law nor to be sound public policy.

The Chair's statement suggests that, through this process, we can and should regulate the relations between employers and employees where data are concerned.[49] The only related question in the ANPR asks “[h]ow, if at all, should potential new trade regulation rules address harms to different consumers across different sectors.” [50] That question does not seem designed to obtain the information that would be necessary to regulate employers' use of data concerning their employees, so perhaps the concept is off the table right out of the gate. But if not, I disagree with the premise that the FTC Act confers upon us jurisdiction to regulate any aspect of the employer-employee relationship that happens to involve data.[51]

But wait, there's more. The Commission is also apparently considering prohibiting social media, search, or other companies from owning or operating any business that engages in activities such as personalized advertising.[52] The ANPR seeks comment on whether we should limit finance, healthcare, and search services from cross-selling commercial products.[53] It contemplates requiring companies to disclose their intellectual property and trade secrets.[54] How any of these naked restraints on competition fall within our ken of policing “unfair or deceptive acts or practices” is completely unclear.

My preference would be that before we draft an ANPR, we be clear about the scope of our legal authority and that our proposal be guided by those limitations. The ANPR looks instead like a mechanism to fish for legal theories that might justify outlandish regulatory ambition outside our jurisdiction and move far beyond where Commission enforcement has tread. Any ideas of how we might have the authority to ban targeted advertising? [55] Are we constrained by the First Amendment or Section 230 of the Communications Decency Act? [56] The ANPR is open to all creative ideas.[57]

The ANPR Gives Short Shrift to Critical Policy Issues Within Its Scope

The ANPR lavishes attention on areas that have not been a focus of our enforcement and policy work, but shortchanges data security, one area ripe for FTC rulemaking. Over the past 20 years, the Commission has brought around 80 data security cases, hosted workshops, and done significant outreach to the business community on the topic of data security. A data security rule could protect consumers from the harms stemming from data breaches and provide businesses with greater clarity about their obligation to protect personal data. It could incentivize better data security by increasing the cost of bad security. I would welcome such a rulemaking if fashioned well. Instead of focusing on this important area, the ANPR gives data security short shrift. Six questions. That's it. A data security ANPR would surely have asked more than six questions, a good indication that this ANPR is not enough to support a data security rule. For example, our ANPR on impersonation fraud asked 13 questions about a far narrower topic. This is a missed opportunity to develop the record needed for a rule requiring companies to implement data security safeguards to protect consumers' personal data.

Perhaps the most shocking aspect of this ANPR is not what it contains, but what it leaves out: privacy. Missing from this document is any meaningful discussion of whether there should be different rules based on the sensitivity of data, a traditional privacy concern reflected in particular federal laws that provide greater protection for data considered more sensitive, like health data, financial data, and data collected from children.[58] Almost as an afterthought, the ANPR asks “which kinds of data” might be subject to any potential rules, but there is no attempt at real engagement on the topic.[59] There is no question asking how “sensitive data” should be defined. The ANPR seeks information about whether the Commission should restrict fingerprinting,[60] but is incurious about whether a rule should treat a medical history or a Social Security number differently from an IP address or a zip code.[61] ANPR questions focused on treating data differently based on sectors rather than on the sensitivity of the data itself fail to recognize that health data, for example, is collected and held across multiple sectors. One of the first steps in any serious attempt to develop a baseline privacy standard should be to determine what information is sensitive and might justify higher levels of protection.
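
To illustrate the kind of engagement that is missing, any sensitivity-based rule would have to start from something like the rough tiering sketched below. The categories, tier assignments, and protection levels here are hypothetical placeholders of my own, not definitions drawn from the ANPR or from any statute.

```python
# A hypothetical, illustrative tiering of data elements by sensitivity.
# Tier assignments are assumptions for the example; a real rule would
# have to define and justify each category.
SENSITIVITY_TIERS = {
    "medical_history": "high",         # cf. HIPAA-style protections
    "financial_account": "high",       # cf. GLBA-style protections
    "childrens_data": "high",          # cf. COPPA-style protections
    "social_security_number": "high",
    "precise_geolocation": "medium",
    "ip_address": "low",
    "zip_code": "low",
}

def required_protections(data_element: str) -> str:
    """Map a data element to an illustrative protection level."""
    tier = SENSITIVITY_TIERS.get(data_element, "unclassified")
    return {
        "high": "opt-in consent, strict use limits, enhanced security",
        "medium": "notice plus opt-out, purpose limitation",
        "low": "baseline notice and reasonable security",
    }.get(tier, "not defined in this sketch")
```

Even this toy example shows the definitional work a sensitivity-based rule requires, and the ANPR never asks the questions that would let commenters do it.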

In another departure from most privacy frameworks, the ANPR includes little discussion of how a rule should incorporate important principles like access, correction, deletion, and portability. The majority is so focused on justifying limits or bans on conduct now apparently disfavored that it spares no thought for how best to empower consumers. If you were hoping that the FTC would use its expertise and experience to develop rules giving consumers greater transparency and control over their personal data, you must be very disappointed.
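
For comparison, most privacy frameworks reduce these principles to a small set of operations a covered business must support. The interface below is a minimal sketch of what access, correction, deletion, and portability amount to in practice; the class and method names are inventions of mine for illustration, not anything proposed in the ANPR.

```python
from abc import ABC, abstractmethod
from typing import Any

class ConsumerDataRights(ABC):
    """Illustrative interface for the individual rights most privacy
    frameworks require; names here are hypothetical."""

    @abstractmethod
    def access(self, consumer_id: str) -> dict[str, Any]:
        """Return all personal data held about the consumer."""

    @abstractmethod
    def correct(self, consumer_id: str, field: str, value: Any) -> None:
        """Fix inaccurate personal data at the consumer's request."""

    @abstractmethod
    def delete(self, consumer_id: str) -> None:
        """Erase the consumer's personal data, subject to legal holds."""

    @abstractmethod
    def export(self, consumer_id: str) -> bytes:
        """Provide the data in a portable, machine-readable format."""
```

An ANPR serious about empowering consumers would at least have asked what each of these operations should require of covered businesses.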

Conclusion

In regulation, clarity is a virtue. But the only thing clear in this ANPR is a rather dystopic view of modern commerce. This document will certainly spark some spirited conversations, but the point of an ANPR is not simply to pose provocative questions. This is not an academic symposium. It is the first step in a rulemaking process, and the law entitles the public to some sense of where the FTC is going.

I would have supported an ANPR for a data security rule. I would have been more sympathetic to an ANPR focused on consumer privacy as reflected in our long record of enforcement and policy advocacy: say, a rule that would require transparency, or that would put some limits on the collection and use of consumer information depending on the sensitivity of the information or the purposes for which it was collected. These ideas would be consistent with, among other things, Commission enforcement experience. I cannot support an ANPR that is the first step in a plan to go beyond the Commission's remit and outside its experience to issue rules that fundamentally alter the internet economy without a clear congressional mandate. That's not “democratizing” the FTC or using all “the tools in the FTC's toolbox.” It's a naked power grab. I dissent.

Dissenting Statement of Commissioner Christine S. Wilson

Throughout my tenure as an FTC Commissioner, I have encouraged Congress to pass comprehensive privacy legislation.[1] While I have great faith in markets to produce the best results for consumers, Econ 101 teaches that the prerequisites of healthy competition are sometimes absent. Markets do not operate efficiently, for example, when consumers lack complete and accurate information about the characteristics of the products and services they are evaluating.[2] Neither do markets operate efficiently when the costs and benefits of a product are not fully borne by its producer and consumers, that is, when a product creates what economists call externalities.[3] Both of these shortcomings are on display in privacy and data security: in the language of economists, information asymmetries and externalities lead to inefficient outcomes.
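
The standard textbook statement of the externality problem, included here only as an illustrative sketch, makes the point compactly. Assuming downward-sloping demand and a positive external cost (for example, the breach and misuse risk a firm's data practices impose on consumers), the market quantity exceeds the socially efficient one.

```latex
% Illustrative textbook model, not drawn from the ANPRM itself.
% MPC = marginal private cost, MEC = marginal external cost,
% MSC = marginal social cost, D = inverse demand.
\[
  MSC(q) = MPC(q) + MEC(q), \qquad MEC(q) > 0.
\]
% The market equates demand with private cost, while efficiency requires
% equating it with social cost; with downward-sloping demand, the market
% quantity q_m therefore exceeds the efficient quantity q^*:
\[
  D(q_m) = MPC(q_m), \qquad D(q^{*}) = MSC(q^{*})
  \;\Longrightarrow\; q_m > q^{*}.
\]
```

On this account, firms collect and retain more data than is socially optimal because they do not internalize the full expected cost of breach and misuse.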

Federal privacy legislation would provide transparency to consumers regarding the full scope of data collection and how collected data are used, shared, sold, and otherwise monetized. In addition, a comprehensive privacy law would give businesses much-needed clarity and certainty regarding the rules of the road in this important area, particularly given the emerging patchwork of state laws. And Congressional action would help fill the gaps in sector-specific approaches created by evolving technologies and growing demands for information. Perhaps most importantly, a national privacy law would help curb violations of our civil liberties.[4]

While I have long been concerned about data collection and usage, the events of 2020 laid bare new dangers and served only to deepen my concerns. During that tumultuous year, I wrote and spoke on several occasions regarding pressing privacy and civil liberties issues.[5] In the face of continued Congressional inaction, I became willing to consider whether the Commission should undertake a Section 18 rulemaking to address privacy and data security. But even then, I emphasized that an FTC rulemaking would be vastly inferior to federal privacy legislation.[6] And I continue to believe that Congressional action is the best course.

I am heartened that Congress is now considering a bipartisan, bicameral bill that employs a sound, comprehensive, and nuanced approach to consumer privacy and data security. The American Data Privacy and Protection Act (ADPPA) rightly has earned broad acclaim in the House Committee on Energy and Commerce and the Subcommittee on Consumer Protection and Commerce, and is moving to a floor vote in the House.[7] I am grateful to Ranking Member Roger Wicker, Chairman Frank Pallone, Chair Jan Schakowsky, Ranking Member Cathy McMorris Rodgers, and Ranking Member Gus Bilirakis for their thoughtful work, and I hope to see this bill become a law. The momentum of ADPPA plays a significant role in my “no” vote on the advance notice of proposed rulemaking (ANPRM) announced today. I am gravely concerned that opponents of the bill will use the ANPRM as an excuse to derail the ADPPA.

While the potential to derail the ADPPA plays a large role in my decision to dissent, I have several other misgivings about proceeding with the ANPRM. First, in July 2021, the Commission made changes to the Section 18 Rules of Practice that decrease opportunities for public input and vest significant authority over the rulemaking proceedings solely in the Chair.[8] Second, the Commission is authorized to issue a notice of proposed rulemaking when it “has reason to believe that the unfair or deceptive acts or practices which are the subject of the proposed rulemaking are prevalent.” [9] Many practices discussed in this ANPRM are presented as clearly deceptive or unfair even though they stretch far beyond practices with which we are familiar from our extensive law enforcement experience. Indeed, the ANPRM wanders far afield of areas for which we have clear evidence of a widespread pattern of unfair or deceptive practices. Third, regulatory [10] and enforcement [11] overreach has increasingly drawn sharp criticism from courts, and recent Supreme Court decisions indicate that FTC rulemaking overreach likely will not fare well when subjected to judicial review. And fourth, Chair Khan's public statements [12] give me no basis to believe that she will seek to ensure that proposed rule provisions fit within the Congressionally circumscribed jurisdiction of the FTC. Neither has Chair Khan given me reason to believe that she harbors any concerns about the harms that will befall the agency (and ultimately consumers) as a consequence of such overreach.

While baseline privacy legislation is important, I am pleased that Congress also is considering legislation that would provide heightened privacy protections for children.[13] Recent research reveals that platforms use granular data to track children's online behavior, serve highly curated feeds that increase engagement, and (in some instances) push kids toward harmful content.[14] More broadly, the research reveals a “catastrophic wave of mood disorders (anxiety and depression) and related behaviors (self-harm and suicide)” among minors, particularly teenage girls, who spend significant time on social media daily.[15] The Kids Online Safety Act makes particularly noteworthy contributions, and I applaud Senators Richard Blumenthal and Marsha Blackburn for their work.

I appreciate that my newest colleague, Commissioner Alvaro Bedoya, brings to the Commission deep experience in the field of privacy and data security and shares my concerns about protecting children online.[16] I look forward to working with him, FTC staff, and our fellow Commissioners to take constructive steps in this area, including advancing key research, heightening awareness, bringing enforcement actions, and concluding the Commission's ongoing review of the Children's Online Privacy Protection Act.


Who is responsible for regulating advertising?

Established by the Federal Trade Commission Act (1914), the Federal Trade Commission (FTC) regulates advertising, marketing, and consumer credit practices, and also polices anticompetitive agreements and other unfair methods of competition.

Which organizations are considered key players in the self regulation of Internet advertising?

The industry self-regulatory rules are administered and enforced by the Digital Advertising Alliance (DAA). The Council of Better Business Bureaus (CBBB) and the Direct Marketing Association (DMA) work cooperatively to ensure accountability and enforcement of the DAA Self-Regulatory Program Principles.

Which federal agency regulates advertising activities?

The federal agency with the broad goal and authority to prevent unfair and deceptive advertising is the Federal Trade Commission (FTC).

What regulates advertising in the US?

The major regulatory body for advertising in the United States is the Federal Trade Commission (FTC).