Now before the Court is Apple Inc.’s motion for a protective order as well as Apple’s
administrative motion to file a supplemental declaration from Isabelle L. Ord. The Court
GRANTS the administrative motion. (Dkt. No. 203.) The Court will address the remaining
pending motions to seal in a separate order.
In this putative class action, Plaintiffs allege that Defendant Apple, Inc. (“Apple”) violated
their rights to privacy and misused their data when Plaintiffs accidentally activated Defendant’s
“Siri” product in a “False Accept” that then led to a recording. (Dkt. Nos. 1, 48, 70.) Specifically,
Apple had represented to users of Apple’s devices that contained Siri that the devices
would only listen to, record, and share their conversations with their
consent, which can be given only: (i) by uttering an activation
command, like “Hey, Siri” (the “hot word”); (ii) by manually
pressing a button on the device; and (iii) in case of the AppleWatch,
by raising the AppleWatch to one’s mouth and beginning to talk.
(Dkt. No. 70 (Second Amended Class Action Complaint) at ¶ 4.) Plaintiffs allege that, despite
these promises, Apple, in an attempt to improve its products, in fact recorded conversations,
listened to them, and then shared them with third party contractors without obtaining consent of
the users, even when users did not utter the activation command (“Hey, Siri”) or follow other steps to activate Siri. (Dkt. No. 70 (Second Amended Class Action Complaint) at ¶ 4.) These events of
“False Accept” occurred when the devices with Siri were “accidentally woken up.” (Dkt. No. 70
at ¶ 70.) For example, instead of being activated by the activation command, the "sound of a zip"
accidentally woke up Siri, which then listened to and recorded the conversation that followed. (Dkt. No. 70 at ¶ 70.)
Plaintiffs also allege that the reason Apple sends these recordings "where no hot word has been
uttered or button pushed" is "to improve the functionality of Siri, and thereby market and sell more Siri
Devices." (Dkt. No. 70 at ¶ 75.) Plaintiffs allege, and Apple does not dispute, that Apple has in the
past given these accidental recordings based on False Accepts to third parties to determine why
Siri was activated without the proper activation phrase or activation method and to improve the
product to prevent future false prompts. Apple contends, with no dispute from Plaintiffs, that the
recordings are anonymized so that there is no way to link them to a specific user. Plaintiffs assert
claims for violation of the Wiretap Act (18 U.S.C. § 2510 et seq.), violation of California
Penal Code § 632, violation of Article I, Section 1 of the California Constitution, breach of
contract, and declaratory relief. (Dkt. Nos. 70, 77 (Order dismissing some claims).)
Apple now seeks a protective order from this Court allowing it to follow its data retention policy of
deleting the recordings from all interactions with Siri from all sources worldwide. Although
Apple records a very short portion of all interactions with Siri, Apple's data retention policy
provides that Apple does not preserve all Siri data but instead preserves only a subset of data that
is collected, among other reasons, to study the issue of false prompts.
Apple argues that, because the recordings at issue in this case constitute only a small part
of all the recordings that Apple makes and because the cost to Apple of maintaining all recordings
is enormous (more than [REDACTED]), the burden of maintaining the recordings outweighs the need
for the recordings. Apple estimates that “the [REDACTED] ingests more than [REDACTED] of data on an average day including more than [REDACTED] Siri audio recordings.” (Dkt.
No. 176-1 (Motion for Protective Order filed as redacted at Dkt. No. 204 and unredacted under
seal at 177-2) (citing Declaration of Isabel Schunemann in Support of Apple’s Motion for
Protective Order, ¶ 10).) “If played consecutively, the estimated length of [REDACTED] Siri audio recordings would exceed [REDACTED] . . . . Apple further estimates that the [REDACTED] independently ingests approximately [REDACTED] of data on an average day.” (Id. (citing
Schunemann Decl., ¶ 12).) The cost for storing this data over two years is [REDACTED]. (Id.
(citing Schunemann Decl., ¶¶ 11-12).) More than [REDACTED] of data is stored every day. (Dkt.
No. 176-1 (Declaration of Isabelle L. Ord in Support of Apple’s Motion for Protective Order filed
as redacted at Dkt. No. 204 and unredacted under seal at 177-2), ¶ 4.)
The class is limited to citizens of the United States, but Apple presents evidence, again
with no conflicting evidence from Plaintiffs, that there is currently no method of segregating the
recordings made by U.S. citizens from the recordings made worldwide. (Schunemann Decl. ¶
5.) Apple also argues that creating such a system would require a massive amount of work by
many people and, again, would not be cost effective. (Id.)
Apple argues that the relevance of the recordings going forward is low, given that Apple
has already provided data about the previously made recordings as part of the sampling method
and proposes to maintain the sample of data going forward, as part of its usual process, to learn
why a false prompt occurred. Plaintiffs argue that Apple bears the burden of showing that all of
the recordings are irrelevant, but Plaintiffs do not explain why they need all the recordings
made by Siri. Here, the relevance of all recordings made by Siri is low. The main issue in this
case is that Apple took a small sampling of the recordings made by Siri and allowed third parties
to listen to them in an attempt to improve the product, and this action allegedly violated the
contract between Apple and its users and violated the users’ rights to privacy. That Apple made
millions of other recordings, some with and some without false prompts, is not relevant to this
case. Thus, forcing Apple to maintain all recordings made by Siri does not satisfy the objectives
of Federal Rule of Civil Procedure 26, which requires the Court to balance the cost against the
relevance of the evidence.[1]
For these reasons, the Court GRANTS Apple's motion for a protective order to allow Apple to continue to use its retention policy for recordings made by Siri.
Finally, Plaintiffs complain that Apple spoliated evidence in the past by adhering to its
retention policy, but this motion merely addresses the future. Apple filed this motion for relief
going forward, and this Order thus does not address any past conduct.
Because this Order cites portions of documents filed under seal, the parties may request
that a redacted version of this Order be filed on the public docket. The parties must file such a
request by March 5, 2024. Failure to do so will result in the filing of the unredacted version of
this Order on the public docket.
IT IS SO ORDERED.
Footnotes
[1] Rule 26(b) allows a party to obtain discovery concerning any nonprivileged matter that is
relevant to any party’s claim or defense and that is “proportional to the needs of the case,
considering the importance of the issues at stake in the action, the amount in controversy, the
parties’ relative access to relevant information, the parties’ resources, the importance of the
discovery in resolving the issues, and whether the burden or expense of the proposed discovery
outweighs its likely benefit.” Fed. R. Civ. P. 26(b)(1).