Article | Computer & Internet Law
On Electronic Word of Mouth and Consumer Protection: A Legal and Economic Analysis
by Jens Dammann*

From Vol. 94, No. 3
94 S. Cal. L. Rev. 423 (2021)

Keywords: Internet and Technology Law, Product Reviews, Consumer Protection

The most fundamental challenge in consumer protection law lies in the information asymmetry that exists between merchants and consumers. Merchants typically know far more about their products and services than consumers do, and this imbalance threatens the fairness of consumer contracts.

However, some scholars now argue that online consumer reviews play a crucial role in bridging the information gap between merchants and consumers. According to this view, consumer reviews are an adequate substitute for some of the legal protections that consumers currently enjoy.

This Article demonstrates that such optimism is unfounded. Consumer reviews are—and will remain—a highly flawed device for protecting consumers, and their availability therefore cannot justify dismantling existing legal protections.

This conclusion rests on three main arguments. First, there are fundamental economic reasons why even well-designed consumer review systems cannot eliminate information asymmetries between merchants and consumers.

Second, unscrupulous merchants undermine the usefulness of reviews by manipulating the review process. While current efforts to stamp out fake reviews may help to eliminate some of the most blatant forms of review fraud, sophisticated merchants can easily resort to more refined forms of manipulation that are much more difficult to address.

Third, even if the firms operating consumer review systems were able to remedy all the various shortcomings that such systems have, it is highly unlikely that they would choose to do so: by and large, the firms using review systems lack the right incentives to optimize them.

*. Ben H. and Kitty King Powell Chair in Business and Commercial Law, The University of Texas School of Law. For research assistance or editing, or both, I am grateful to Jael Dammann, Elizabeth Hamilton, Stella Fillmore-Patrick, and Jean Raveney.


Strategic Law Avoidance Using the Internet: A Short History – Postscript (Response) by Tim Wu

From Volume 90, Number 3 (March 2017)

 

We are now some twenty years into the story of the Internet’s bold challenge to law and the legal system. In the early 2000s, Jack Goldsmith and I wrote Who Controls the Internet, a book that might be understood as a chronicle of some of the early and more outlandish stages of the story. Professors Pollman and Barry’s excellent article, Regulatory Entrepreneurship, adds to and updates that story with subsequent chapters and a sophisticated analysis of the strategies more recently employed to avoid law using the Internet. While Pollman and Barry’s article stands on its own, I write this Article to connect these two periods. I also wish to offer a slightly different normative assessment of the legal avoidance efforts described here, along with my opinion as to how law enforcement should conduct itself in these situations.

Behind regulatory entrepreneurship lies a history, albeit a short one, and one that has much to teach us about the very nature of law and the legal system as it interacts with new technologies. Viewed in context, Pollman and Barry’s “regulatory entrepreneurs” can be understood as, in fact, a second generation of entrepreneurs who learned lessons from an earlier generation that was active in the late 1990s and early 2000s. What both generations have in common is the idea that the Internet might provide profitable opportunities at the edges of the legal system. What has changed is the abandonment of so-called “evasion” strategies—ones that relied on concealment or geography (described below)—and a migration to strategies depending on “avoidance,” that is, avoiding the law’s direct application. In particular, the most successful entrepreneurs have relied on what might be called a mimicry strategy: they shape potentially illegal or regulated conduct to make it look like legal or unregulated conduct, in the hope of avoiding the weight of laws and regulatory regimes.

I take a different, though not necessarily inconsistent, normative position than do Pollman and Barry. Law avoidance is a complex phenomenon. Some of it is undignified avoidance of burdens faced by others, and it is not much different, normatively, from securities fraud or tax evasion. But it is also true that, over the long history of the Anglo-American system, efforts to avoid the law have played an important, and sometimes essential, role in the process of legal evolution; that is, in the process of the salutary adaptation of our legal system to our current normative and technological environment. Sometimes technologies may genuinely make laws obsolete or unnecessary. Sometimes it is changing social norms that prompt challenges to the law: the best of such efforts, like forms of legal disobedience during the civil rights era, have become understood as dignified and justified.

But laws do not challenge themselves: someone or something must prompt a reevaluation of an existing regime, which I think is the strongest normative case for some tolerance of regulatory entrepreneurship and other forms of law avoidance. That said, for such a reexamination to provoke a full debate, I think it essential that law enforcement play its part in the dialogue. Sometimes it should vigorously enforce “old laws,” unless the law in question is so obviously moribund that doing so would be ridiculous. Enforcement creates an adversarial process where we, the public, can reexamine whether the values and goals that motivated the law’s enactment remain important or valuable today. This is, of course, necessarily an imperfect process, but one that I think is part of the poorly understood path of legal evolution. The struggle surrounding the Internet’s challenge to law provides a good opportunity to consider these questions afresh.

 

 


Cyberstalking, Twitter, and the Captive Audience: A First Amendment Analysis of 18 U.S.C. 2261A(2) – Note by John B. Major

From Volume 86, Number 1 (November 2012)

This Note will analyze how the cyberstalking statute applies to a particular form of new media, Twitter, within the framework of a First Amendment analysis. While the analysis within this Note is limited to the interplay between Twitter and the cyberstalking statute, the principles discussed, policies weighed, and doctrines explored also apply to the regulation of distressing speech on the Internet generally. Part II examines Twitter, focusing on how Twitter users interact and the effect this has on First Amendment principles. Part III looks closely at the crime of cyberstalking and the cyberstalking statute. It explores the definition of cyberstalking, the difficult nature of cyberstalking regulation, and the harms cyberstalking can cause. It then discusses the cyberstalking statute (including the 2006 amendment at issue in Cassidy), how courts have construed the statute, and what speech the statute criminalizes. Part IV applies First Amendment doctrine to the cyberstalking statute’s regulation of Twitter. This part analyzes the following: how the First Amendment applies to Internet fora; vagueness and overbreadth challenges; the protection of speech covered by the statute; what level of scrutiny should apply to the statute; whether the statute serves to protect a “captive audience”; and how the statute holds up under each level of scrutiny. Further, after laying out these First Amendment principles, Part IV critiques the district court opinion issued in the Cassidy case. Part V proposes potential changes to the statute to ensure it does not run afoul of the First Amendment. Part VI concludes by refocusing on general First Amendment principles and the interests at issue in this case, and it emphasizes that protecting the captive audience may be the most appropriate role for cyberstalking laws to serve.


 


From Bombs and Bullets to Botnets and Bytes: Cyber War and the Need for a Federal Cybersecurity Agency – Postscript (Note) by Danielle Warner

From Volume 85, Number 1 (November 2011)

In September 2010, Iranian engineers detected that a sophisticated computer worm, known as Stuxnet, had infected and damaged industrial sites across Iran, including its uranium enrichment site, Natanz. In just a few days, a sophisticated computer code was able to accomplish what six years of United Nations Security Council resolutions could not. Not a single missile was launched, nor any tanks deployed, yet the computer worm effectively set back the Islamic Republic’s nuclear program by two years and destroyed roughly one-fifth of its nuclear centrifuges. The worm itself included two major components. One was designed to send Iran’s nuclear centrifuges spinning out of control, damaging them. The other component seemed right out of the movies; “the computer program . . . secretly recorded what normal operations at the nuclear plant looked like, then played those readings back to plant operators, like a pre-recorded security tape in a bank heist, so that it would appear that everything was operating normally while the centrifuges were actually tearing themselves apart.”

Stuxnet, to date, is the most sophisticated cyber weapon ever deployed. It acted as a “collective digital Sputnik moment,” bringing to light the important cybersecurity challenges the world faces. What makes cyber attacks so destructive is their ability to travel through the Internet and attack the structures it rests upon. Governments, industrial and financial companies, research institutions, and billions of citizens worldwide heavily populate these global networks. In fact, much of public and private life depends on functioning telecommunications and information-technology infrastructures. Thus, what we deemed to be one of the greatest successes of the twenty-first century, a global communication infrastructure, has now become our biggest vulnerability.


 


Depictions of the Pig Roast: Restricting Violent Speech Without Burning the House – Note by Michael Reynolds

From Volume 82, Number 2 (January 2009)

Pornography dominates the discussion about free speech on the Internet. Congress has twice enacted legislation aimed at preventing minors from getting access to online pornography. Federal and local law enforcement agencies have dramatically increased efforts to combat the spread of child pornography. The Department of Justice has renewed attempts to crack down on obscene material after years of lax enforcement.

Yet the debate about online pornography has overshadowed another disturbing Internet phenomenon. The Internet has facilitated growth in the availability of extremely violent images and videos. A little online searching reveals depictions of torture, of both humans and animals; videos depicting murders and executions, including beheadings by Islamic militants; videos of brutal amateur street fights, some consensual, but many not; videos of minors engaged in schoolyard fights and beatings, some posted to humiliate the victims; and videos of cockfighting. Online retailers have sold videos of dog fights and extremely violent video games, including one in which the player is tasked with making graphic snuff videos and another which allows the player to play fetch with dogs using human heads.


 


Clearing a Path for Digital Development: Taking Patents in Eminent Domain Through the Adoption of Mandatory Standards – Note by Brian Cook

From Volume 82, Number 1 (November 2008)

Though largely unnoticed by the public, March 1, 2007, marked the transition from traditional analog television to digital broadcast television (“DTV”), a move some have characterized as the most significant change to the television broadcast industry since color replaced black and white. On that date, Federal Communications Commission (“FCC”) regulations went into effect mandating that all televisions sold in the United States contain a digital tuner capable of receiving DTV broadcast signals. If consumers are unaware of the change now, it will not escape their attention on February 17, 2009, when their old analog sets go dark as broadcasters comply with further FCC regulations mandating the cessation of all analog television broadcasts. Ultimately, the government intends to profit by auctioning off the additional frequency spectrum freed up by the more efficient digital use of the broadcast spectrum.


 


Cyber Crime 2.0: An Argument to Update the United States Criminal Code to Reflect the Changing Nature of Cyber Crime – Note by Charlotte Decker

From Volume 81, Number 5 (July 2008)

In 1945, two engineers at the University of Pennsylvania invented the first general-purpose electronic computing device—the Electronic Numerical Integrator and Computer (“ENIAC”). The ENIAC was capable of 5000 simple calculations a second, yet it took up the space of an entire room, “weighed 30 tons, and contained over 18,000 vacuum tubes, 70,000 resistors, and almost 5 million hand-soldered joints.” This machine cost over $1 million, equivalent to roughly $9 million today. Over the next thirty years, integrated circuits shrank, yielding microprocessors able to perform millions and billions of calculations per second with new storage media able to hold megabits and gigabits of data. As a result, computers became smaller, more advanced, and dramatically less expensive. Still, prior to the late 1980s, these and other computers were “solely the tool[s] of a few highly trained technocrats.” In the mid-1980s, only 8.2 percent of American households contained computers. American public businesses, universities, and research organizations used only 56,000 large “general purpose” computers and 213,000 smaller “business computers”; private businesses used another 570,000 “mini-computers” and 2.4 million desktop computers; and the federal government employed between 250,000 and 500,000 computers.



Reservoirs of Danger: The Evolution of Public and Private Law at the Dawn of the Information Age – Article by Danielle Keats Citron

From Volume 80, Number 2 (January 2007)

A defining problem of the Information Age is securing computer databases of ultrasensitive personal information. These reservoirs of data fuel our Internet economy but endanger individuals when their information escapes into the hands of cyber-criminals. This juxtaposition of opportunities for rapid economic growth and novel dangers recalls similar challenges society and law faced at the outset of the Industrial Age. Then, reservoirs collected water to power textile mills: the water was harmless in repose but wrought havoc when it escaped. After initially resisting Rylands v. Fletcher’s strict-liability standard as undermining economic development, American courts and scholars embraced it once the economy matured and catastrophes such as the Johnstown Flood made those hazards impossible to ignore.

Public choice analysis suggests that a meaningful public law response to insecure databases is as unlikely now as it was in the early Industrial Age. The Industrial Age’s experience can, however, help guide us to an appropriate private law remedy for the new risks and new types of harm of the early Information Age. Just as the Industrial Revolution’s maturation tipped the balance in favor of early tort theorists arguing that America needed, and could afford, a Rylands solution, so too the Information Revolution’s deep roots in American society and many strains of contemporary tort theory support strict liability for bursting cyber-reservoirs of personal data instead of a negligence regime overmatched by fast-changing technology. More broadly, the early Industrial Age offers valuable lessons for addressing other important Information Age problems.


 


Pacifica is Dead. Long Live Pacifica: Formulating a New Argument Structure to Preserve Government Regulation of Indecent Broadcasts – Note by Joshua B. Gordon

From Volume 79, Number 6 (September 2006)

At 9:00 PM on April 7, 2003, Fox Broadcasting (“Fox”) aired the penultimate episode of Married by America, a reality television show that allowed the public to select potential spouses for its contestants. Six minutes of the episode detailed the remaining two couples’ bachelor and bachelorette parties, during which strippers attempted to “lure participants into sexual activities.” Of the five million people who watched the broadcast, ninety filed complaints with the Federal Communications Commission (“FCC” or “Commission”), the government agency that regulates television communications. In October 2004, the FCC determined that the six-minute segment contained explicit and patently offensive depictions of sexual activities. It thus determined that the content was indecent and in violation of federal law. For this violation, the FCC penalized both Fox and 169 Fox affiliates by issuing a Notice of Apparent Liability for $1,183,000 in fines. At the time, this was the largest proposed fine, or “forfeiture,” in FCC history.


 


Wireless Telecommunications: Spectrum as a Critical Resource – Article by Gerald R. Faulhaber

From Volume 79, Number 3 (March 2006)

Telecommunications services have always been a mix of wireline services, such as wireline telephone, cable television, and Internet access, and wireless services, such as AM/FM radio, broadcast television, and microwave-satellite transmission of electronic signals. Each mode of service has certain properties, both beneficial and detrimental. Wireline has the potential for almost unlimited capacity, such as the use of multigigabit fiber optics, but requires that the service be delivered to a particular location. Wireless frees the customer from being tied to a specific location, allowing service to be rendered wherever the customer is, but suffers from fading or nonexistent connections and possible privacy concerns. The mix between wireline mode and wireless mode is in constant flux; recently, however, the focus of the market has been shifting toward wireless. Cellular telephony has exploded worldwide, and after a slow start, the market penetration has increased dramatically. Meanwhile, the number of wired access lines in the United States has been declining for the first time since the Great Depression.

The ability of engineers and innovative firms to bring new and compelling wireless telecommunications applications to an ever-communicating market is very impressive, and bodes well for even greater applications in the future. But even the cleverest of engineers cannot escape the one critical resource absolutely required for wireless services to be deployed: electromagnetic spectrum. Wireless services and devices are all radios, emitting electromagnetic radiation into free space and receiving such radiation. If other nearby transmitters are emitting radiation at the same frequency, the intended receivers will be unable to disentangle the signal they wish to receive from the spurious “interfering” signal. Fundamental to wireless technology is the need to solve this potential interference problem. Since the birth of radio in the 1920s, the interference problem has been solved by government licensing of transmission rights; each licensee is permitted to transmit from a particular place at a particular frequency at a maximum power for a particular application (for example, broadcast radio or police dispatch). Licensing has traditionally been a highly bureaucratic and political process. The outcome, all agree, has been a hugely inefficient use of spectrum resulting from this “command and control” regulatory allocation of a scarce resource.


 
