Recent articles:

CCPA claim against Apple thrown out on Section 230 grounds

Plaintiffs sued Apple after downloading a malicious app from the App Store. The claims included violation of the Computer Fraud and Abuse Act (“CFAA”), the Electronic Communications Privacy Act (“ECPA”), and the California Consumer Privacy Act (“CCPA”). (Alphabet soup, anyone?)

The lower court granted Apple’s motion to dismiss these claims. Plaintiffs sought review with the Ninth Circuit Court of Appeals. On appeal, the court held that the lower court properly applied Section 230 immunity to dismiss these claims.

What Section 230 does

Section 230 (47 U.S.C. § 230) instructs that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” A defendant is not liable if it can show that (1) it is a provider of “interactive computer services” as defined by the statute, (2) the claim relates to “information provided by another information content provider,” and (3) the claim seeks to hold defendant liable as the “publisher or speaker” of that information.

Why the CFAA and ECPA claims were dismissed

Concerning the CFAA and ECPA claims, the court looked to Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009) and concluded that the lower court properly found Section 230 immunity to apply. The duty that plaintiffs alleged Apple violated derived from Apple’s status or conduct as a “publisher or speaker.” The court found that the claims referred, as the basis for culpability, to Apple’s authorization, monitoring, or failure to remove the offending app from the App Store. “Because these are quintessential ‘publication decisions’ under Barnes, 570 F.3d at 1105, liability is barred by section 230(c)(1).”

Section 230 knocked out CCPA claim too

The data privacy count included allegations that Apple violated the duty under Cal. Civ. Code § 1798.100(e) to “implement reasonable security procedures and practices” to protect the personal information of App Store users. The court said that it need not decide whether violations of such duties can be boiled down to publication activities in every instance, or whether implementation of reasonable security policies and practices would always necessarily require an internet company to monitor third-party content. Citing Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), the court found that in this case, at least, plaintiffs failed to plead adequately a theory of injury under the CCPA that was “fully independent of [Apple’s] role in monitoring or publishing third-party content.”

Diep v. Apple, Inc., 2024 WL 1299995 (9th Cir. March 27, 2024)

Negligence claim against Roblox for minors’ gambling moves forward


Plaintiffs sued defendant Roblox asserting various claims, including under RICO and California unfair competition law. Plaintiffs also claimed that Roblox was negligent by providing a system whereby minors were lured into online gambling.

Roblox moved to dismiss the negligence claim for failure to state a claim upon which relief may be granted. The court denied the motion to dismiss, allowing the negligence claim to move forward.

Roblox’s alleged involvement with online casinos

The gaming platform uses a virtual currency called “Robux” for transactions within its ecosystem. Users can purchase Robux for in-game enhancements and experiences created by developers, who can then convert their earned Robux into real money through Roblox. However, plaintiffs allege that online casinos accept Robux for gambling, thereby targeting minors. These casinos allegedly conduct sham transactions on Roblox to access a minor’s Robux, allowing the minors to gamble. When minors lose Robux in these casinos, the lost currency is converted back into cash, with Roblox allegedly facilitating these transactions and profiting from them. Plaintiffs claim Roblox is fully aware that its services are being exploited to enable illegal gambling activities involving minors in this way.

Why the negligence claim survived

The court observed that under California law, there is a fundamental obligation for entities to act with reasonable care to prevent foreseeable harm. Roblox argued that it was exempt from this duty. But the court rejected this argument, holding that Roblox did indeed owe a duty to manage its platform responsibly to avoid harm, including by alerting parents about gambling risks.

Colvin v. Roblox Corporation, — F.Supp.3d —, 2024 WL 1268420 (N.D. Cal. March 26, 2024)


VIDEO: AI and Voice Clones – Tennessee enacts the ELVIS Act


Generative AI enables people to clone other people’s voices. And that can lead to fraud, identity theft, or intellectual property infringement.

On March 21, 2024, Tennessee enacted a new law called the ELVIS Act that seeks to tackle this problem. What does the law say?

The law adds a person’s voice to the list of things over which that person has a property interest. And what is a “voice” under the law? It is any sound that is readily identifiable and attributable to a particular individual. It can be an actual voice or a simulation of the voice. A person can be liable for making available another person’s voice without consent. And one could also be liable for making available any technology having the “primary purpose or function” of producing another’s voice without permission.


Section 230 immunity protected provider of ringless voicemail services to telemarketers

Defendant telecommunications provider offered ringless voicemail and VoIP services to telemarketers. These services enabled telemarketers to deliver prerecorded messages en masse directly to recipients’ voicemail inboxes without causing the recipients’ phones to ring or giving recipients the opportunity to answer or block the call.

The federal government sued a couple of telemarketers and defendant, alleging violation of the FTC Act, which prohibits unfair or deceptive acts or practices in commerce. Defendant moved to dismiss the action, arguing that Section 230 provided it immunity from liability. The court granted the motion.

Section 230 immunity

Section 230(c)(1) (47 U.S.C. § 230(c)(1)) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Defendant asserted it met the criteria for Section 230 immunity because (1) it was a provider of an interactive computer service, (2) the government’s claims sought to treat it as a publisher or speaker of the allegedly unlawful calls, and (3) the potential liability was based on third-party content (the calls being placed by the other telemarketing defendants).

Ringless voicemail services were an “interactive computer service”

The government argued defendant was not an “interactive computer service” because recipients accessed their voicemails through their telephones rather than a computer. The court rejected this argument, finding that defendant had shown that it transmitted content and provided multiple users with access to a computer server, thereby meeting the statutory definition of an interactive computer service.

Lawsuit sought to treat defendant as a publisher or speaker

The government next argued that its claims against defendant did not seek to treat defendant as the publisher or speaker of content, because defendant’s liability did not depend on the content of the transmitted messages. The court rejected this argument as well, because it was indeed the content that gave rise to liability – had the voicemails at issue not been for commercial purposes, they would not have been unlawful, and the matter would not have been brought in the first place.

Allegations related to the content of unlawful voicemails

Finally, as for the third element of Section 230 immunity – the offending content being provided by a third party – the court also sided with defendant. “While [defendant] developed the ringless voicemail technology at issue, that development goes to how the third-party content is distributed rather than the content itself.”

United States v. Stratics Networks Inc., 2024 WL 966380 (S.D. Cal., March 6, 2024)


New Jersey judiciary taking steps to better understand Generative AI in the practice of law

We are seeing the state of New Jersey take strides toward the “safe and effective use of Generative AI” in the practice of law. The state judiciary’s Acting Administrative Director recently sent an email to New Jersey attorneys acknowledging the growth of Generative AI in the practice of law and recognizing its positive and negative uses.

The correspondence included a link to a 23-question online survey designed to gauge New Jersey attorneys’ knowledge about and attitudes toward Generative AI, with the aim of designing seminars and other training.

The questions seek to gather information on topics including the age and experience of the attorneys responding, attitudes toward Generative AI both in and out of the practice of law, levels of experience in using Generative AI, and whether Generative AI should be a part of the future of the practice of law.

This initiative signals the state may be taking a proactive approach toward attorneys’ adoption of these newly available technologies.


Lawyer gets called out a second time for using ChatGPT in court brief

You may recall the case of Park v. Kim, wherein the Second Circuit excoriated an attorney for submitting a brief containing a citation to a non-existent case that she generated using ChatGPT. Well, the same lawyer responsible for that debacle has been found out again, this time in a case where she is the pro se litigant.

Plaintiff sued Delta Air Lines for racial discrimination. She filed a motion for leave to amend her complaint, which the court denied. In discussing the denial, the court observed the following:

[T]he Court maintains serious concern that at least one of Plaintiff’s cited cases is non-existent and may have been a hallucinated product of generative artificial intelligence, particularly given Plaintiff’s recent history of similar conduct before the Second Circuit. See Park v. Kim, 91 F.4th 610, 612 (2d Cir. 2024) (“We separately address the conduct of Park’s counsel, Attorney Jae S. Lee. Lee’s reply brief in this case includes a citation to a non-existent case, which she admits she generated using the artificial intelligence tool ChatGPT.”).

In Park v. Kim, the court referred the attorney for potential disciplinary action. The court in this case was more lenient, merely denying her motion for leave to amend and eventually dismissing the case on summary judgment.

Jae Lee v. Delta Air Lines, Inc., 2024 WL 1230263 (E.D.N.Y. March 22, 2024)


AI and voice clones: Three things to know about Tennessee’s ELVIS Act

On March 21, 2024, the governor of Tennessee signed the ELVIS Act (the Ensuring Likeness, Voice, and Image Security Act of 2024), which is aimed at the problem of people using AI to simulate voices in ways not authorized by the person whose voice is being imitated.

Here are three key things to know about the new law:

(1) Voice defined.

The law adds the following definition to existing Tennessee law:

“Voice” means a sound in a medium that is readily identifiable and attributable to a particular individual, regardless of whether the sound contains the actual voice or a simulation of the voice of the individual;

There are a couple of interesting things to note. One could be liable for generating or using the voice of another without ever using the other person’s name – the voice simply has to be “readily identifiable” and “attributable” to a particular human. Those seem to be pretty open concepts, and we could expect quite a bit of litigation over what it takes for a voice to be identifiable and attributable to another. Would this cover situations where a person naturally sounds like another, or is simply imitating another’s musical style?

(2) Voice is now a property right.

The law amended the existing statute, which now reads as follows (the newly added words are discussed below):

Every individual has a property right in the use of that individual’s name, photograph, voice, or likeness in any medium in any manner.

The word “person’s” was changed to “individual’s” presumably to clarify that this is a right belonging to a natural person (i.e., real human beings and not companies). And of course the word “voice” was added to expressly include that attribute as something in which the person can have a property interest.

(3) Two new things are banned under the law.

The following two paragraphs have been added:

A person is liable to a civil action if the person publishes, performs, distributes, transmits, or otherwise makes available to the public an individual’s voice or likeness, with knowledge that use of the voice or likeness was not authorized by the individual or, in the case of a minor, the minor’s parent or legal guardian, or in the case of a deceased individual, the executor or administrator, heirs, or devisees of such deceased individual.

A person is liable to a civil action if the person distributes, transmits, or otherwise makes available an algorithm, software, tool, or other technology, service, or device, the primary purpose or function of which is the production of an individual’s photograph, voice, or likeness without authorization from the individual or, in the case of a minor, the minor’s parent or legal guardian, or in the case of a deceased individual, the executor or administrator, heirs, or devisees of such deceased individual.

With this language, we see the heart of the new law’s impact. One can sue another for making his or her voice publicly available without permission. Note that this restriction is not only on commercial use of another’s voice. Most states’ laws discussing name, image and likeness restrict commercial use by another. This statute is broader and would make more things unlawful – for example, creating a deepfaked voice simply for fun (or harassment, of course), if the person whose voice is being imitated has not consented.

Note the other interesting new prohibition, the one on making available tools having as their “primary purpose or function” the production of another’s voice without authorization. If you were planning on launching that new app where you can make your voice sound like a celebrity’s voice, consider whether this Tennessee statute might shut you down.


VIDEO: What is the Apple antitrust lawsuit about?

On March 21, 2024, the U.S. government, 15 states and the District of Columbia filed an antitrust lawsuit against Apple. What is the case about?

The government says Apple built a dominant iPhone ecosystem, driving its high valuation. But Apple faced threats from other products, particularly Android devices. And in response, it didn’t lower prices or offer better terms to developers and consumers. Instead, it imposed complex rules and fees through its App Store and developer agreements, stifling innovation and limiting competition.

Apple’s actions have allegedly increased its smartphone dominance and expanded its control to digital wallets and smartwatches by restricting their compatibility with non-Apple products. And this has had broader implications in other industries. The government claims Apple has stifled innovation and competition in areas tied to smartphone technology, such as financial services, entertainment, and more.

So the case seeks to address Apple’s allegedly anticompetitive behavior. It aims to restore competition, lower prices for consumers, reduce fees for developers, and encourage innovation. The case is particularly interesting in how it highlights the contrast between Apple’s early days as an innovative startup and its current status as an alleged monopolist.

The government says that this has drastically hurt market competition and consumers.

Bitcoin miner denied injunction against colocation service provider accused of removing rigs

Plaintiff Bitcoin miner sued defendant colocation hosting provider for breach of contract, conversion, and trespass to chattels under Washington law. After filing suit, plaintiff moved for a temporary restraining order against defendant, seeking to require defendant to restore plaintiff’s access to the more than 1,000 mining rigs that defendant allegedly removed from its hosting facility. The court denied the motion, finding that plaintiff had established only possible economic injury, not the kind of irreparable harm required for the issuance of a temporary restraining order.

The underlying agreement

In July 2021, the parties entered into an agreement whereby plaintiff would colocate 1,610 cryptocurrency mining rigs at defendant’s facility. Plaintiff had obtained a loan to purchase the rigs for over $6 million. Defendant was to operate the rigs at a high hash rate to efficiently mine Bitcoin, with defendant earning a portion of the mined BTC.

In early 2023, however, after plaintiff defaulted on its loan, defendant allegedly reduced the available power to the rigs, despite plaintiff having cured the delinquency. Plaintiff claimed this reduced power likewise reduced the amount of Bitcoin it mined, and that defendant reallocated resources to other miners in its facility from whom it could earn more money.

The discord between the parties continued through late 2023 and early 2024, with 402 rigs being removed, followed by defendant’s eventual termination of the agreement. The parties then disputed the removal of the remaining rigs and fees that plaintiff allegedly had not paid. In early March 2024, plaintiff attempted to retake possession of its rigs, only to allegedly find defendant’s facility empty and abandoned. This lawsuit followed.

No irreparable harm

The court observed that under applicable law, a party seeking injunctive relief must proffer evidence sufficient to establish a likelihood of irreparable harm; mere speculation of irreparable harm does not suffice. Moreover, the court noted, irreparable harm is traditionally defined as harm for which there is no adequate legal remedy, such as an award of damages. Further, the court stated that it is well established that economic injury alone does not support a finding of irreparable harm, because such injury can be remedied by a damage award.

In this situation, the court found no likelihood of irreparable harm to plaintiff. The court distinguished EZ Blockchain LLC v. Blaise Energy Power, Inc., 589 F. Supp. 3d 1102 (D.N.D. 2022), in which a court granted a temporary restraining order against a datacenter provider that had threatened to sell its customer’s rigs. In that case, the court found irreparable harm based on the fact that the miners were sophisticated technology and could not be easily replaced.

The court in this case found there was no evidence defendant was going to sell off plaintiff’s equipment. It was similarly unpersuaded that the upcoming Bitcoin halving (anticipated in April 2024, when the per-block mining reward would be cut in half, making mining less profitable) created extra urgency for plaintiff to have access to its rigs before that time. Instead, the court found that any losses could be compensated via money damages. And since plaintiff had not provided any evidence to support the idea that it would be forced out of business in these circumstances, the court found it appropriate to deny plaintiff’s motion for a temporary restraining order.

Block Mining, Inc. v. Hosting Source, LLC, 2024 WL 1156479 (W.D. Wash. March 18, 2024)
