U.S. v. Tennant
2023 WL 6978405 (N.D.N.Y. 2023)
October 10, 2023
Sannes, Brenda K., United States District Judge
Summary
The Court found that the private search doctrine applied to the electronically stored information flagged by the Electronic Service Providers (“ESPs”) and reported to the National Center for Missing and Exploited Children (NCMEC). The ESPs used hash-matching tools to compare the hash values of images and videos on their platforms with the hash values of “known” Child Sexual Exploitation and Abuse Imagery. The Court denied the Defendant's motion to suppress, as suppression of the evidence would exact a heavy toll on society and potentially allow a pernicious crime against children to go unpunished.
UNITED STATES OF AMERICA,
v.
AUSTIN TENNANT, Defendant
5:23-cr-79 (BKS)
United States District Court, N.D. New York
Filed October 10, 2023
Counsel
For the United States of America: Carla B. Freedman, United States Attorney, Adrian S. LaRochelle, Paul J. Tuck, Assistant United States Attorneys, 100 South Clinton Street, Syracuse, NY 13261
For Defendant: Lisa A. Peebles, Federal Public Defender, Randi J. Bianco, Supervising Assistant Federal Public Defender, 4 Clinton Square, 3rd Floor, Syracuse, NY 13202
Sannes, Brenda K., United States District Judge
MEMORANDUM-DECISION AND ORDER
I. INTRODUCTION
*1 Defendant Austin Tennant is charged with four counts of Distribution of Child Pornography in violation of 18 U.S.C. § 2252A(a)(2)(A), three counts of Receipt of Child Pornography in violation of 18 U.S.C. § 2252A(a)(2)(A), and one count of Possession of Child Pornography in violation of 18 U.S.C. § 2252A(a)(5)(B). (Dkt. No. 40). Defendant moves to suppress all evidence against him that was allegedly obtained through warrantless searches of his social media accounts. (Dkt. No. 38). The Government opposes the motion, (Dkt. No. 46), and Defendant has submitted a reply brief, (Dkt. No. 49). Defendant's motion to suppress is denied, for the reasons that follow.
II. FACTS[1]
A. Background
The charges in this case stem from Defendant's alleged use of social media accounts with three Electronic Service Providers (“ESPs”): SnapChat, Instagram, and Discord. From March 2021 to November 2022, these ESPs flagged numerous files of suspected child pornography that were uploaded to their respective platforms, and as required by law[2], they sent a series of related reports (“CyberTips”) to the CyberTipline of the National Center for Missing and Exploited Children (“NCMEC”). (Dkt. Nos. 46-2–46-13). According to Susan LaFontant, a Records Specialist at NCMEC, the organization is “a private, nonprofit corporation, whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization.” (Dkt. No. 46-14, ¶ 2). NCMEC operates a CyberTipline to allow persons to make online reports of activities including enticement of children for sexual acts, child sexual molestation, child pornography, child sex tourism, child sex trafficking, and unsolicited obscene materials sent to a child. (Id., ¶ 3). The majority of CyberTipline reports NCMEC receives relate to “apparent child pornography” and are from ESPs. (Id.). “Except for the incident type, date, and time, NCMEC does not direct or mandate the type of information that a member of the public or an ESP may choose to submit in a CyberTipline report.” (Id., ¶ 4). Rather, “NCMEC provides voluntary reporting fields that a member of the public or an ESP may choose to populate with information.” (Id.).
B. The ESPs
Snapchat is a “visual messaging app that allows users to communicate by sending text messages, photos and short videos taken within the Snapchat app (‘Snaps’), and photos and videos taken outside the app with the camera native to a mobile device (‘Camera Roll media’).” (Dkt. No. 46-16, at 1). In order to use Snapchat, an individual must first register for a Snapchat account; SnapChat “requires all users to agree to and comply with its Terms of Service and associated Community Guidelines when they create an account.” (Id., at 2).
During the relevant time, SnapChat's Community Guidelines stated the following: “We report child sexual exploitation to authorities. Never post, save, or send nude or sexually explicit content involving anyone under the age of 18 — even of yourself. Never ask a minor to send explicit imagery or chats.” (Dkt. No. 46-16, at 23). SnapChat's Terms of Service also stated that if a user failed to comply with the Community Guidelines, “we reserve the right to remove any offending content, terminate or limit the visibility of your account, and notify third parties—including law enforcement—and provide those third parties with information relating to your account.” (Id., at 12). In addition, SnapChat reserved the right to “access, review, screen, and delete your content at any time and for any reason, including to provide and develop the Services or if we think your content violates these Terms.” (Id., at 9).
*2 According to SnapChat employee Alexander Brian Barczak, SnapChat “strives to ensure that its products are free of illegal content, and in particular, Child Sexual Exploitation and Abuse Imagery (‘CSEAI’).” (Dkt. No. 46-16, at 2). Barczak states that SnapChat “uses software tools such as PhotoDNA and Child Sexual Abuse Imagery (‘CSAI’) Match technology to detect known illegal images and videos of CSEAI.” (Id., at 3). These tools “create a unique digital fingerprint (a ‘hash’ value or ‘hash’) of images and videos and enable the comparison of those results with the hash values of known CSEAI (i.e., images and videos that were previously determined to be CSEAI).” (Id., at 3 n.1).
Instagram is an image and video sharing and messaging application owned by Meta, Inc.[3] During the relevant time period, Meta's Community Standards stated that: “We do not allow content that sexually exploits or endangers children. When we become aware of apparent child exploitation, we report it to [NCMEC] in compliance with applicable law.” (Dkt. No. 46-15, at 1). Meta does not indicate how users agree to these Community Standards.
According to Meta employee Tyler Harmon, “Meta identifies content that might violate its Community Standards in various ways,” including “proprietary hash technology to find exact or near exact copies of images or videos that a Meta employee or contractor previously viewed and confirmed as apparent child pornography as provided in 18 U.S.C. § 2256.” (Id., at 2). Harmon states that “[a] hash is a unique string of letters and numbers that reflects the content of an image or video file,” which can be used “to identify exact copies or, nearly exact copies, of the image or video by comparing the hash value of an image or video that a Meta employee or contractor previously viewed and labeled with the hash value of other image or video files.” (Id.). “Before Meta creates a hash for an image or video, at least one Meta employee or contractor who is a member of its content review team must view and verify that the image or video is apparent, reportable child pornography as provided in 18 U.S.C. § 2256.” (Id., at 2–3).
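The exact-match hash comparison the ESP declarants describe can be illustrated in outline. The sketch below is purely illustrative and is not the ESPs' actual code: the providers' pipelines are proprietary, and perceptual tools such as PhotoDNA can also detect near-exact copies, which a plain MD5 comparison cannot. The function names and the sample byte strings are hypothetical; the sketch shows only how an MD5 digest (the “unique string of letters and numbers” Harmon describes) is computed and checked against a repository of previously reviewed hash values.

```python
import hashlib

def md5_hash(data: bytes) -> str:
    """Return the MD5 hex digest of a file's contents -- a fixed-length
    string of letters and numbers derived from the file's bytes."""
    return hashlib.md5(data).hexdigest()

def matches_known(data: bytes, known_hashes: set[str]) -> bool:
    """Flag an upload whose exact MD5 digest appears in a repository of
    hash values for files a human reviewer previously examined.
    Note: this catches only exact copies; changing a single byte of the
    file changes the digest entirely."""
    return md5_hash(data) in known_hashes

# Hypothetical repository of hashes for previously reviewed files.
known = {md5_hash(b"previously-reviewed-file-bytes")}

print(matches_known(b"previously-reviewed-file-bytes", known))  # True: exact copy
print(matches_known(b"different-file-bytes", known))            # False: no match
```

Because an exact copy of a file always yields the same digest, a match lets a provider identify a re-uploaded known file without a human viewing it again, which is why many of the CyberTips below note a “Hash Match” rather than a fresh review.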
Discord is a voice, video, and text communication platform that allows its users the ability to interact with one another about common interests. (Dkt. No. 50-1, at 1). Discord's Terms of Service, “which a user must accept as part of registering a Discord account, among other things, prohibits users from using Discord's services to do anything illegal.” (Id.). In addition, the Terms of Service provide Discord “the right to monitor, block, remove, and/or permanently delete users’ content if it is in breach of the terms, Community Guidelines, other policies, or any applicable law or regulation.” (Id., at 1–2). Discord's Community Guidelines state in part: “You cannot share content or links which depict children in a pornographic, sexually suggestive, or violent manner, including illustrated or digitally altered pornography that depicts children (such as lolicon, shotacon, or cub) and conduct grooming behaviors. We report illegal content and grooming to [NCMEC].” (Dkt. No. 50-2, at 24).
According to Discord employee Rolando Vega, Discord “contributes to the NCMEC-hosted Industry list of known CSAM (‘Child Sexual Abuse Material’).” (Dkt. No. 50-1, at 2). Vega indicates that Discord uses PhotoDNA, a hash matching technology, to search its platform for known CSAM. (Id.). “When apparent CSAM is reported to Discord, either through an automated scan of content, by a user or otherwise, Discord undertakes a manual, human review, to confirm that the image contains apparent CSAM.” (Id., at 3). Vega further states that Discord “trains a team of employees on the legal obligation to report apparent CSAM,” and the “statutory definition of CSAM and how to recognize it.” (Id.).
C. The CyberTips
*3 On March 3, 2021, NCMEC received a CyberTip from SnapChat regarding an Incident Type of “Child Pornography,” involving seven “jpg” image files that were uploaded to its platform. (Dkt. No. 46-2, at 3–4). In the section of the CyberTip form for “Reported Information,” SnapChat identified a “Suspect” by date of birth, screen/username, and IP Address.[4] (Id., at 3). SnapChat also indicated that the “entire contents” of the uploaded files were “publicly available,” and provided an “MD5” hash value for each file.[5] (Id., at 3–4). In the section of the CyberTip form for “Automated Information Added by NCMEC Systems,” NCMEC provided “categorizations” for each file: “Apparent Child Pornography” for two files; “CP (Unconfirmed)” for two files; “Child Unclothed” for two files; and “Child Clothed” for one file. (Id., at 6). These categorizations were based on a “Hash Match” of “one or more uploaded files to visually similar files that were previously viewed and categorized by NCMEC[.]” (Id.). NCMEC also indicated the files were linked to an IP Address in the Syracuse, New York metropolitan area. (Id.). NCMEC added the following classification: “Apparent Child Pornography Files Not Reviewed by NCMEC, Hash Match.” (Id., at 7). On March 22, 2021, NCMEC forwarded a CyberTipline Report (#87232205) with the above information to the New York State Police Department. (Id., at 1–9).
On April 13, 2021, NCMEC received a CyberTip from SnapChat regarding two more image files of “Child Pornography” that were uploaded to its platform. (Dkt. No. 46-3, at 3–4). SnapChat indicated that the entire contents of these files were “publicly available,” and provided MD5 hash values for each file. (Id., at 3–4). The files were linked to an IP Address in the Syracuse area. (Id., at 5). NCMEC did not view the uploaded files; they were classified as “Apparent Child Pornography Not Reviewed by NCMEC, Hash Match.” (Id., at 6). On April 19, 2021, NCMEC forwarded the CyberTipline Report (#88898921) to the New York State Police. (Id., at 1–8).
On June 10, 2021, NCMEC received a CyberTip from SnapChat regarding another image file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-4, at 3–4). SnapChat viewed the entire contents of the uploaded file and provided an MD5 hash value; the entire contents of the file were “publicly available.” (Id., at 3). The file was linked to an IP Address in the Syracuse area. (Id., at 6). NCMEC staff did not view the uploaded file; it was classified as “Apparent Child Pornography (Unconfirmed) Files Not Reviewed by NCMEC, Hash Match.” (Id., at 7). On June 22, 2021, NCMEC forwarded the CyberTipline Report (#92616570) to the New York State Police. (Id., at 1–9).
On January 25, 2022, NCMEC received a CyberTip from Instagram regarding an “mp4” video file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-5, at 3–4). Instagram listed an MD5 hash value for the file. (Id., at 4). According to Meta, the file was an “exact match” to a hash in Meta's repository. (Dkt. No. 46-15, at 2). Instagram did not view the contents of the file and did not indicate whether the file was publicly available. (Dkt. No. 46-5, at 4). NCMEC staff did not view the uploaded file and classified it as “Apparent Child Pornography (Unconfirmed) Files Not Reviewed by NCMEC.” (Id., at 8). On December 21, 2022, NCMEC forwarded the CyberTipline Report (#116612117) to the New York State Police. (Id., at 1–9).
On April 29, 2022, NCMEC received a CyberTip from SnapChat regarding three video files of “Child Pornography” that were uploaded to its platform. (Dkt. No. 46-6, at 3–4). SnapChat listed MD5 hash values for each file and viewed the entire contents of each file; the entire contents were not “publicly available.” (Id., at 3–4). The files were linked to an IP Address in the Syracuse area. (Id., at 5). NCMEC viewed one of the uploaded files and classified all three as “Apparent Child Pornography.” (Id., at 7). On May 13, 2022, NCMEC forwarded the CyberTipline Report (#123623081) to the New York State Police. (Id., at 1–8).
*4 On May 3, 2022, NCMEC received a CyberTip from SnapChat regarding one video file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-7, at 3). SnapChat listed the MD5 hash value of the file and viewed the entire contents of the file; the contents were not “publicly available.” (Id., at 3). The file was linked to an IP Address in the Syracuse area. (Id., at 5). NCMEC did not view the contents of the file; it was classified as “Apparent Child Pornography Files Not Reviewed by NCMEC, Hash Match.” (Id., at 7). On May 13, 2022, NCMEC forwarded the CyberTipline Report (#123940318) to the New York State Police. (Id., at 1–8).
On May 5, 2022, NCMEC received a CyberTip from Instagram regarding one video file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-8, at 3–4). Instagram listed the MD5 hash value of the file; Instagram did not view the contents of the file and did not indicate whether the contents were publicly available. (Id.). According to Meta, the file was an “exact match” to a hash in Meta's repository. (Dkt. No. 46-15, at 2). NCMEC did not view the contents of the file; it was classified as “Apparent Child Pornography Files Not Reviewed by NCMEC, Hash Match.” (Dkt. No. 46-8, at 7). On May 13, 2022, NCMEC forwarded the CyberTipline Report (#124080432) to the New York State Police. (Id., at 1–8).
On May 29, 2022, NCMEC received a CyberTip from SnapChat regarding one video file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-9, at 3). SnapChat listed the MD5 hash value of the file and viewed its entire contents; the entire contents were not “publicly available.” (Id.). The file was linked to an IP Address in the Syracuse area. (Id., at 5). NCMEC also viewed the contents of the file and classified it as “Apparent Child Pornography.” (Id., at 7). On June 17, 2022, NCMEC forwarded the CyberTipline Report (#126172992) to the New York State Police Department. (Id., at 1–8).
On October 10, 2022, NCMEC received a CyberTip from Discord regarding one image file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-10, at 3–4). Discord listed the MD5 hash value of the file and viewed its entire contents, which were “publicly available.” (Id.). According to Discord, when it states in a CyberTip that a file was publicly available, “it means that, in general, anyone that was a member of the server or direct message in which the content was posted or was provided with the CDN link for the content would have access to view the image.” (Dkt. No. 50-1, at 4). “When Discord includes a statement or indication in a CyberTip that an image was viewed or reviewed by Discord, it is referring to a viewing of that image by a human reviewer concurrent to or immediately preceding making the report.” (Id.). The file was linked to an IP Address in the Syracuse area. (Dkt. No. 46-10, at 5). NCMEC also viewed the contents of the file and classified it as “Apparent Child Pornography.” (Id., at 7). On November 2, 2022, NCMEC forwarded the CyberTipline Report (#136275941) to the New York State Police. (Id., at 1–8).
On October 10, 2022, NCMEC received a CyberTip from Discord regarding another image file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-11, at 3–4). Discord listed the MD5 hash value of the file and viewed its entire contents, which were “publicly available.” (Id.). The file was linked to an IP Address in the Syracuse area. (Id., at 5). NCMEC also viewed the contents of the file and classified it as “Apparent Child Pornography.” (Id., at 7–8). On November 2, 2022, NCMEC forwarded the CyberTipline Report (#136275942) to the New York State Police. (Id., at 1–9).
*5 On November 1, 2022, NCMEC received a CyberTip from SnapChat regarding another video file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-12, at 3). SnapChat listed the file's MD5 hash value and viewed its entire contents; the contents were not “publicly available.” (Id.). The file was linked to an IP Address in the Syracuse area. (Id., at 5). NCMEC also viewed the contents of the file and classified it as “Apparent Child Pornography.” (Id., at 6). On November 2, 2022, NCMEC forwarded the CyberTipline Report (#138051748) to the New York State Police. (Id., at 1–7).
On November 22, 2022, NCMEC received a CyberTip from Discord regarding one image file of “Child Pornography” that was uploaded to its platform. (Dkt. No. 46-13, at 3–4). Discord listed the file's MD5 hash value and viewed its entire contents, which were “publicly available.” (Id.). The file was linked to an IP Address in the Syracuse area. (Id., at 5). NCMEC did not view the file; it was classified as “Apparent Child Pornography Not Reviewed by NCMEC, Hash Match.” (Id., at 7). On November 22, 2022, NCMEC forwarded the CyberTipline Report (#139232954) to the New York State Police. (Id., at 1–8).
D. The Law Enforcement Investigation
On December 9, 2021, Oswego County Sheriff's Office Criminal Investigator Robert J. Obrist reviewed three CyberTips from Snapchat (#87232205, #88898921, and #92616570); Investigator Obrist opened and viewed several attached image files which appeared to show child pornography. (Dkt. No. 38-1, at 7–8). Specifically, for CyberTip #87232205, Investigator Obrist viewed the files and found:
Two of the images were duplicates of a naked preteen girl with her legs spread, showing her vagina, in front of a camera. One image was of two naked preteen girls, one of whom was being vaginally penetrated by what appears to be a naked preteen boy. Only the boy's bottom half is visible in the image. One image was of a preteen girl, naked from the waist down, performing oral sex on what appears to be an adult male. The other three images depict preteen girls partially clothed with their vaginas exposed.
(Id., at 7). For CyberTip #88898921, Investigator Obrist viewed the files and found “the same images of the naked preteen girl spreading her legs and the half-naked preteen girl performing oral sex that had been reported in the previous cybertip.” (Id.). The file associated with CyberTip #92616570 was “the same image of [a] naked preteen girl spreading her legs that had been seen in the two previous cybertips.” (Id., at 8). Investigator Obrist tracked the IP Address associated with these files to a residence in Fulton, New York belonging to Sean Tennant and learned that Defendant was also associated with the address. (Id.). On February 14, 2022, Investigator Obrist applied for and obtained a search warrant for Defendant's cellphone, as well as several other electronic devices. (Id., at 9).
On March 21, 2022, Investigator Obrist was assigned two more CyberTips from Snapchat; he opened and viewed several attached image files which appeared to show child pornography, specifically “two prepubescent girls spreading their legs and exposing their vaginas,” and another image “of a prepubescent girl naked from the waist down with her legs spread in front of the camera.”[6] (Id., at 10). The IP address for these files was tracked to the same residence in Fulton. (Id., at 11). On April 29, 2022, Investigator Obrist reviewed numerous images of suspected child pornography that had been extracted from Defendant's cellphone. (Id., at 11). On May 1 and 2, 2022, Investigator Obrist found additional images on Defendant's cellphone that had been reported from the earlier CyberTips. (Id.). On June 2, 2022, Investigator Obrist applied for and obtained a search warrant for records from Meta as to three Instagram accounts he previously found referenced on Defendant's cellphone. (Id., at 12).
*6 On July 27, 2022, Investigator Obrist reviewed two additional CyberTips from SnapChat (#123623081 and #123940318), which attached videos depicting suspected child pornography. (Id., at 14–15). For CyberTip #123623081, Investigator Obrist opened and viewed the files and found: “three videos of prepubescent girls in sexual performances. One was of an underage girl performing oral sex, another was of an underage girl masturbating, and the third was of an underage girl lewdly displaying her vagina and anus for the camera.” (Id., at 14). For CyberTip #123940318, Investigator Obrist opened and viewed the file and found “the same video of an underage girl masturbating as the other cybertip.” (Id., at 15). Both CyberTips were associated with IP addresses tracked to the residence in Fulton. (Id.). On August 3, 2022, Investigator Obrist applied for and obtained a search warrant for records from nine SnapChat accounts, including those associated with the CyberTips that Investigator Obrist had already opened and viewed. (Id.). In reviewing the returned records, Investigator Obrist saw that the accounts matched up with the sender of suspected child pornography from the CyberTips. (Id., at 16).
On August 9, 2022, Investigator Obrist applied for and obtained additional search warrants for SnapChat records. (Id.). On August 11, 2022, Defendant was arrested and charged with seven counts of Promoting a Sexual Performance by a Child, one count for each of the CyberTips reviewed by Investigator Obrist so far. (Id., at 16–17). On August 15, 2022, Investigator Obrist was assigned another CyberTip from SnapChat (#126172992); he opened and viewed a video of suspected child pornography that had been included with two previous CyberTips. (Id., at 17). The IP address associated with this file was tracked to the same residence in Fulton. (Id.). On August 30, 2022, Investigator Obrist reviewed records obtained from the search warrant to Snapchat and found nude pictures and videos of a girl who appeared to be a minor. (Id.).
On January 9, 2023, Investigator Obrist was assigned another CyberTip from Discord (#136275942); he opened and viewed an attached image which appeared to be child pornography, specifically “a pre-pubescent female resting her face on the groin of an underage male, with his penis against her lips.” (Id., at 20). The IP address associated with this file was tracked to the same residence in Fulton. (Id., at 21). On January 10, 2023, Investigator Obrist reviewed additional CyberTips from Instagram, SnapChat, and Discord linked to the previous one (#116612117, #124080432, #136275941, #138051748, #139232954); he opened and viewed attached images and videos depicting suspected child pornography. (Id., at 21–22). For CyberTip #116612117, Investigator Obrist found a video of “a girl masturbating and playing with her vagina in front of a camera.” (Id., at 21). For CyberTip #124080432, Investigator Obrist found a video of “a girl approximately 6–8 years old in a blue dress performing oral sex on an adult male.” (Id.). For CyberTip #136275941, Investigator Obrist found an image of an “underage girl, naked from the waist down, performing oral sex on adult male, who is also nude from the waist down.” (Id., at 22). For CyberTip #138051748, Investigator Obrist found a video of “an underage boy being anally penetrated by an adult penis.” (Id.). And for CyberTip #139232954, Investigator Obrist found the same image from another CyberTip. (Id.).
These CyberTips were all linked to Defendant by either email address or IP Address. (Id.). On January 12, 2023, Investigator Obrist applied for and obtained search warrants for three accounts on SnapChat and Discord which had been associated with previous CyberTips. (Id., at 22). On January 17, 2023, Investigator Obrist reviewed the returned records from Discord, which contained information linking Defendant to the accounts. (Id., at 22–23).
On January 24, 2023, Investigator Obrist executed a search warrant at Defendant's residence and seized a tablet electronic device, a search of which yielded numerous images and videos depicting suspected child pornography. (Dkt. No. 1, at 3–7). On February 16, 2023, a Criminal Complaint was filed in this Court alleging that Defendant had violated 18 U.S.C. §§ 2252A(a)(2)(A) (Distribution of Child Pornography) and 2252A(a)(5)(B) (Possession of Child Pornography). (Dkt. No. 1). The Indictment and Superseding Indictment followed. (Dkt. Nos. 25, 40).
III. DISCUSSION
*7 Defendant seeks suppression of evidence that was obtained through warrantless searches of his social media accounts on the grounds that these searches violated his rights under the Fourth Amendment. (Dkt. No. 38-2, at 2). Specifically, Defendant claims that Investigator Obrist violated the Fourth Amendment when he opened and viewed images from the CyberTips reported by the ESPs and NCMEC without first obtaining a warrant. (Id., at 3). Defendant's suppression theory rests on three arguments: 1) the ESPs, NCMEC, and Investigator Obrist acted on behalf of the Government; 2) they effected searches by intruding on Defendant's reasonable expectation of privacy and/or protected interest in his “private messages”[7]; and 3) the searches were unreasonable because the alleged Government actors did not obtain warrants. (Id., at 3–9). Finally, Defendant argues that the Government cannot meet its burden of showing an applicable exception to the warrant requirement. (Id., at 9).
In response, the Government argues that Defendant's Fourth Amendment rights were not violated. (Dkt. No. 46, at 9). Among other things, the Government contends that Defendant did not have a reasonable expectation of privacy in the child pornography files, the ESPs and NCMEC are not Government actors, and their private searches were simply reviewed by law enforcement. (Id., at 9–19). Thus, the Government asserts that the private party exception applies to all of the CyberTips. (Id., at 19). Further, the Government suggests that suppression is not warranted even if there was a Fourth Amendment violation because Investigator Obrist acted in good faith. (Id., at 22–23).
In general, the Fourth Amendment prohibits “unreasonable searches” by the government. U.S. CONST. amend. IV. On a motion to suppress, a defendant bears the burden of establishing, by a preponderance of the evidence, “that his own Fourth Amendment rights were violated by the challenged search or seizure.” United States v. Osorio, 949 F.2d 38, 40 (2d Cir. 1991). “A defendant can establish that their Fourth Amendment rights were violated by showing they had a ‘reasonable expectation of privacy’ in the area searched, or that the Government has ‘physically intruded on constitutionally protected areas’ to which they have a property entitlement.” United States v. Lewis, 62 F.4th 733, 741 (2d Cir. 2023) (cleaned up) (citing Katz v. United States, 389 U.S. 347, 360 (1967) (Harlan, J., concurring) and Florida v. Jardines, 569 U.S. 1, 11 (2013)). “Once the movant establishes some basis for the suppression motion, for example a search or seizure conducted without a warrant, the burden of proof shifts to the Government.” United States v. Murphy, 778 F. Supp. 2d 237, 240 (N.D.N.Y. 2011) (citing cases). “The Government then carries the burden to demonstrate by a preponderance of the evidence that the search or seizure did not violate the Fourth Amendment.” Id.
Here, as a threshold matter, Defendant argues that the Government effectuated a search for purposes of the Fourth Amendment when it opened and viewed his “private messages” because doing so: 1) invaded his reasonable expectation of privacy; and 2) trespassed on a constitutionally protected area. (Dkt. No. 38-2, at 7–8). The Court will address each argument in turn.
A. Expectation of Privacy
To establish a Fourth Amendment violation via this theory, Defendant must make a two-part showing: 1) that the searches at issue “invade[d] an object or area where [he] has a subjective expectation of privacy,” and 2) “that society is prepared to accept [that expectation] as objectively reasonable.” United States v. Hayes, 551 F.3d 138, 143 (2d Cir. 2008). The Court finds that Defendant has failed to meet his burden. First, Defendant has failed to adduce any evidence claiming ownership of the social media accounts or any evidence that he believed his communications on the ESPs were private. Notably, Defendant has not submitted an affidavit in support of his motion.[8] Defendant argues that “[p]rivate messages on social media are, like email, ‘inherently private’ because such messages ‘are not readily accessible to the public.’ ” (Dkt. No. 38-2, at 7) (citation omitted). But Defendant carries the burden of production on a motion to suppress evidence, and he cannot substitute legal conclusions for evidence. See also Rawlings v. Kentucky, 448 U.S. 98, 104 (1980) (recognizing that the defendant bears the burden of proving that he had a legitimate expectation of privacy).
*8 For example, Defendant has not attested as to how he used the social media accounts, what if any privacy settings he employed on the ESPs, whether he read and understood the Terms of Service and Community Standards, and whether or not he believed his communications were public, private, or subject to monitoring. Thus, there is no factual basis to conclude that he had a subjective expectation of privacy. See also United States v. Ulbricht, No. 14-CR-68, 2014 WL 5090039, at *6, 13, 2014 U.S. Dist. LEXIS 145553, at *16, 36 (S.D.N.Y. Oct. 10, 2014) (recognizing a privacy interest “must be established by a declaration or other affirmative statement of the person seeking to vindicate his or her personal Fourth Amendment interest in the thing or place searched,” and that “[t]he Court cannot just assume a subjective expectation of privacy”), aff'd, 858 F.3d 71 (2d Cir. 2017); see also United States v. Weber, 599 F. Supp. 3d 1025, 1038 (D. Mont. 2022) (finding that the defendant failed to show that he had a subjective expectation of privacy in the content of his Instagram accounts where, among other things, “he has not introduced any evidence regarding whether the image and video files found on his accounts by Instagram were in public or private parts of his accounts,” and he failed to show “whether or not they had been shared with other users, either through a direct message or posting”).
Second, the Court finds that Defendant has failed to show that he had a reasonable expectation of privacy in the images and videos at issue.[9] While Defendant asserts that his expectation of privacy was reasonable, he has not provided any supporting evidence to suggest that society would agree. Once again, such evidence would logically include a declaration from Defendant attesting to his use of the social media accounts and his understanding of how the ESPs operate and handle images and videos shared on their platforms, including privacy settings. The only evidence cited by Defendant are the search warrant applications completed by Investigator Obrist, which describe his review of the CyberTips and the related investigation targeting Defendant. (Dkt. No. 38-1). Notably, Defendant does not explain how these applications support finding a reasonable expectation of privacy. Therefore, Defendant has once again not carried his burden. See United States v. Westley, No. 17-CR-171, 2018 WL 3448161, at *7, 2018 U.S. Dist. LEXIS 118571, at *20 (D. Conn. July 17, 2018) (“Because Defendants have not submitted any information regarding steps they took to keep their Facebook content private, they have not met their burden to demonstrate that they had a reasonable expectation of privacy in any of the information searched.”); see also United States v. Watson, 404 F.3d 163, 166 (2d Cir. 2005) (finding that a defendant could not show a legitimate expectation of privacy “merely because he anticipated that the Government will link the objects recovered in that search to [him] at trial”).
In any event, the Government has submitted evidence that undermines any reasonable expectation of privacy in the images and videos uploaded to the ESPs in this case. First, the Government has submitted copies of the CyberTip reports from the ESPs to NCMEC. (Dkt. No. 46-2–46-13). Crucially, these reports show that six of the CyberTips from Snapchat and Discord (#87232205, #88898921, #92616570, #136275941, #136275942, #139232954) involved uploaded image files whose “entire contents” were “publicly available.” (Dkt. No. 46-2, at 3–4; Dkt. No. 46-3, at 3–4; Dkt. No. 46-4, at 3; Dkt. No. 46-10, at 4; Dkt. No. 46-11, at 4; Dkt. No. 46-13, at 3–4). According to Discord, that means that “anyone that was a member of the server or direct message in which the content was posted or was provided with the CDN link for the content would have access to view the image.” (Dkt. No. 50-1, at 4). The public nature of these uploaded image files strongly indicates that they could not be reasonably viewed as private, and Defendant has not provided any evidence to the contrary.[10] See also Katz, 389 U.S. at 351 (“What a person knowingly exposes to the public, even in his own home or office, is not a subject of Fourth Amendment protection”); United States v. Whitcomb, No. 18-CR-00123, 2021 WL 5150040, at *6, 2021 U.S. Dist. LEXIS 214485, at *16–17 (D. Vt. Nov. 5, 2021) (finding that the defendant did not meet his burden to show that he had a reasonable expectation of privacy where, among other things, he “failed to establish that the information sought pursuant to the Facebook warrant was not publicly available”). Therefore, the Court finds that Defendant lacked a reasonable expectation of privacy in “publicly available” image files uploaded to the ESPs.
*9 Further, the Government has submitted copies of the Terms of Service and Community Standards of the ESPs, which users must agree to before joining the ESPs, all of which contain clear language that content depicting child sexual exploitation is prohibited and will be reported to the authorities or NCMEC. (Dkt. No. 46-15, at 1; Dkt. No. 46-16, at 23; Dkt. No. 50-2, at 24). The ESPs also inform their users that they may screen and monitor communications on their platforms for prohibited content. (Dkt. No. 46-16, at 9; Dkt. No. 50-1, at 1–2).[11] Defendant argues that he retained a reasonable and legitimate expectation of privacy despite the ESPs’ Terms of Service. (Dkt. No. 49, at 3). But given the prohibitions and reservations of rights in the Terms of Service and Community Standards, even for the CyberTips involving uploaded images and videos whose contents were not “publicly available,” a reasonable person would not have viewed files containing prohibited content as private. And Defendant has not submitted any evidence to the contrary.
In sum, based on all the facts and circumstances, the Court finds that Defendant has failed to show that he had a reasonable expectation of privacy in the uploaded images and videos at issue in this case.[12] See also United States v. Thompson, No. 21-CR-190, 2023 WL 424212, at *4, 2023 U.S. Dist. LEXIS 13650, at *10–11 (D.N.D. Jan. 26, 2023) (finding that the defendant did not have a reasonable expectation of privacy in SnapChat files that were “publicly available”); United States v. Sporn, No. 21-CR-10016, 2022 WL 656165, at *10, 2022 U.S. Dist. LEXIS 39070, at *25–26 (D. Kan. Mar. 4, 2022) (finding that the defendant did not have a reasonable expectation of privacy in his Twitter account where the Terms of Service prohibited child sexual exploitation and reserved the right to access, read, preserve, and remove content); cf. United States v. Coyne, 387 F. Supp. 3d 387, 396 (D. Vt. 2018) (finding that the defendant retained a reasonable expectation of privacy in Microsoft and Yahoo accounts because “general statements” in these ESPs’ user agreements failed “to describe the monitoring or the disclosure of content – without legal process such as a warrant or subpoena – to NCMEC and its law enforcement partners,” and “cannot serve as wholesale waivers of rights arising under the Fourth Amendment”).
B. Protected Area
The Court also finds that Defendant has failed to show that the Government trespassed on a constitutionally protected area. This theory traces back to the Fourth Amendment's baseline protection against physical intrusions on “persons, houses, papers, or effects.” Jardines, 569 U.S. at 5. Thus, separate from the expectation of privacy question, when the Government engages “in a physical intrusion of a constitutionally protected area in order to obtain information, that intrusion may constitute a violation of the Fourth Amendment.” United States v. Jones, 565 U.S. 400, 406 (2012) (quoting United States v. Knotts, 460 U.S. 276, 286 (1983) (Brennan, J., concurring)).
*10 According to Defendant, “social media messages” should be considered a constitutionally protected area akin to one's papers or effects. (Dkt. No. 38-2, at 8). The Government does not dispute this assertion, but argues that even if there was a search, it was done by private parties, and therefore, Defendant's Fourth Amendment rights were not implicated. (Dkt. No. 46, at 8). The Court need not jump to the private search doctrine, however, because Defendant has not demonstrated that his papers or effects are at issue. As discussed above, Defendant has not submitted any sworn evidence asserting ownership of the social media accounts and image and video files referenced in the CyberTips. And it is well-established that “Fourth Amendment rights are personal rights that may not be asserted vicariously.” Rakas v. Illinois, 439 U.S. 128, 133 (1978).
Accordingly, absent any showing that Defendant has a possessory or property interest in the uploaded images and videos at issue here, the Court cannot find that there was an intrusion on a constitutionally protected area. See also United States v. White, No. 17 CR. 611, 2018 WL 4103490, at *8, 2018 U.S. Dist. LEXIS 146444, at *22–23 (S.D.N.Y. Aug. 28, 2018) (“White has failed to establish standing as required to bring this motion to suppress because neither he nor a person with personal knowledge has demonstrated by sworn evidence that White had any property or possessory interest in the Facebook account.”).
C. Private Search Doctrine
Assuming arguendo that a search did occur, the next question is whether it was effectuated by the Government so as to implicate the Fourth Amendment. In this case, the parties dispute whether the ESPs and NCMEC acted privately or on behalf of the Government. The distinction is crucial because it is well-established that Fourth Amendment protection does not apply to a search “effected by a private individual not acting as an agent of the Government or with the participation or knowledge of any governmental official.” United States v. Jacobsen, 466 U.S. 109, 113 (1984).
In Jacobsen, FedEx employees opened a damaged package, observed a white powdery substance, and called law enforcement. Id. at 111. A federal agent removed a trace of the powder, performed a field test, and determined that it was cocaine. Id. at 111–12. The Supreme Court recognized that the initial search of the package was a private action and thus did not violate the Fourth Amendment. Id. at 115. As to the agent's actions, the Supreme Court stated that they must be “tested by the degree to which they exceeded the scope of the private search.” Id. Ultimately, the Supreme Court found that “[t]he agent's viewing of what a private party had freely made available for his inspection did not violate the Fourth Amendment.” Id. at 119. Further, the Supreme Court found that “the removal of the plastic bags from the tube and the agent's visual inspection of their contents enabled the agent to learn nothing that had not previously been learned during the private search.” Id. at 120. The Supreme Court also found that the field test “could disclose only one fact previously unknown to the agent—whether or not a suspicious white powder was cocaine.” Id. at 122. In sum, the Supreme Court concluded that “the federal agents did not infringe any constitutionally protected privacy interest that had not already been frustrated as the result of private conduct.” Id. at 126.[13]
*11 In a contrasting decision, the Supreme Court found that the private search doctrine did not apply to law enforcement's viewing of films that were opened but not viewed by employees of a private company. Walter v. United States, 447 U.S. 649, 659 (1980). In Walter, the private employees opened packages containing boxes of film; on the boxes were suggestive drawings and explicit descriptions of the contents. Id. at 651–52. When they called in the FBI, agents viewed the films with a projector, revealing sexual activities that resulted in obscenity charges against the defendants. Id. at 652. The Supreme Court ruled that “notwithstanding that the nature of the contents of these films was indicated by descriptive material on their individual containers ... the unauthorized exhibition of the films constituted an unreasonable invasion of their owner's constitutionally protected interest in privacy.” Id. at 654. The Supreme Court explained that: 1) the private employees had not viewed the films; 2) “[p]rior to the Government screening one could only draw inferences about what was on the films”; and 3) “[t]he projection of the films was a significant expansion of the search that had been conducted previously by a private party and therefore must be characterized as a separate search.” Id. at 657. Further, the Supreme Court found that the private employees’ search only frustrated the defendants’ expectation of privacy in part, and that the additional search by the FBI, screening the films, amounted to a further intrusion on privacy. Id. at 659.
As an initial matter, Defendant suggests that the burden is on the Government to show that Jacobsen’s private search doctrine applies in this case, i.e. that the ESPs and NCMEC acted as private parties, as opposed to government entities or agents. (Dkt. No. 38-2, at 9). The Government does not appear to dispute that it carries the burden, characterizing the private search doctrine as an exception to the warrant requirement, (Dkt. No. 46, at 18), in which case it would indeed carry the burden.
The Second Circuit has not indicated who bears the burden of proof in this situation. Arguably, the private search doctrine is not properly characterized as an “exception” to the Fourth Amendment's warrant requirement because exceptions only apply in situations where the Fourth Amendment has already been triggered, e.g., exigent circumstances. When a private party conducts a search, on the other hand, the Fourth Amendment does not come into play. Therefore, some courts have held that it is the defendant who “bear[s] the burden to establish that a private party acted as a government instrument or agent.” United States v. Dupree, 781 F. Supp. 2d 115, 158 (E.D.N.Y. 2011) (quoting United States v. Feffer, 831 F.2d 734, 739 (7th Cir. 1987)); see also United States v. Couch, 378 F. Supp. 2d 50, 55 (N.D.N.Y. 2005) (“The party objecting to the search has the burden to establish by a preponderance of the evidence that the government involvement was significant enough to change the character of the search.”) (citing Feffer). Authorities outside the Second Circuit appear to be divided as to who bears the burden. Compare United States v. Wilson, 13 F.4th 961, 971 (9th Cir. 2021) (“The government bears the burden to prove Agent Thompson's warrantless search was justified by the private search exception to the Fourth Amendment's warrant requirement.”) with United States v. Miller, 982 F.3d 412, 425 (6th Cir. 2020) (“Miller failed to show that Google acted as a government agent under this test.”).
Since the Government does not dispute that it bears the burden of proving application of the private search doctrine, the Court will assume without deciding that is the case and hold it to that standard. Therefore, it is up to the Government to show that each entity involved in the searches here, the ESPs and NCMEC, acted privately and not as an actor or agent of the Government. In reviewing the Government's showing, the Court is mindful that “private actions are generally ‘attributable to’ the government only where ‘there is a sufficiently close nexus between the State and the challenged action of the ... entity so that the action of the latter may be fairly treated as that of the State itself.’ ” United States v. DiTomasso, 932 F.3d 58, 67–68 (2d Cir. 2019) (quoting United States v. Stein, 541 F.3d 130, 146 (2d Cir. 2008)). “The requisite nexus is not shown merely by government approval of or acquiescence in the activity, or by the fact that the entity is subject to government regulation.” Id. The purpose of the nexus requirement “is to assure that constitutional standards are invoked only when it can be said that the [Government] is responsible for the specific conduct of which the [defendant] complains.” Id. (emphasis in original) (cleaned up) (citing cases); see also Skinner v. Ry. Lab. Executives’ Ass'n, 489 U.S. 602, 614 (1989) (“Whether a private party should be deemed an agent or instrument of the Government for Fourth Amendment purposes necessarily turns on the degree of the Government's participation in the private party's activities.”).
1. The ESPs
*12 Defendant argues that Snapchat, Discord, and Instagram functioned as government actors/agents in this case. (Dkt. No. 38-2, at 6). According to Defendant, “the government knew of and acquiesced to Snapchat, Discord, and Instagram monitoring [his] online activity.” (Id.). Defendant also asserts that the ESPs “intended to assist NCMEC's law enforcement efforts when they monitored [his] accounts.” (Id.). However, Defendant does not cite to any specific evidence supporting these claims. Defendant suggests that the CyberTips indicate that the ESPs “fully intended to assist NCMEC and law enforcement,” (id., at 7), but he does not explain how so. Defendant also points out that ESPs are required by statute to report suspected child pornography to NCMEC. (Id., at 6) (citing 18 U.S.C. § 2258A(a), (e), (f)). But Defendant does not cite any authority for the proposition that a reporting duty automatically transforms a private entity into a government actor or agent, and the Government points to caselaw saying just the opposite, (Dkt. No. 46, at 17–18). And importantly, the reporting statute does not require that the ESPs monitor, screen, or search content on their platforms for suspected child pornography—the action that animates this case. See 18 U.S.C. § 2258A(f). At most, Defendant has shown that the ESPs are regulated by the Government, which acquiesced to their conduct—neither of which is a sufficient nexus to transform the ESPs into government actors or agents. Therefore, if Defendant had borne the burden of proof, he would not have met it.
In any event, the Government has submitted affidavits from the ESPs which refute Defendant's unsubstantiated claims. According to SnapChat's Alexander Brian Barczak, Snapchat “has a strong business interest in enforcing its Terms of Service and associated Community Guidelines and strives to ensure that its products are free of illegal content, and in particular, Child Sexual Exploitation and Abuse Imagery (‘CSEAI’).” (Dkt. No. 46-16, at 2). SnapChat uses various methods to detect illegal images and videos of CSEAI on its platform, and SnapChat “did not implement these policies and procedures at the request or direction of law enforcement or any government agency.” (Id., at 3). Rather, “preventing, detecting, and eradicating CSEAI on Snapchat is a top priority for Snap, and Snap continually evolves its capabilities to combat these and other crimes.” (Id.).
According to Meta's Tyler Harmon, “Meta has a private, independent business interest in keeping its platform safe and free from harmful content and conduct, including that which sexually exploits children.” (Dkt. No. 46-15, at 1). And Discord's Rolando Vega states that “Discord has a strong business interest in enforcing our terms of service and ensuring that our products are free of illegal content, and in particular, child sexual abuse material (‘CSAM’).” (Dkt. No. 50-1, at 2). Vega further states that Discord “independently and voluntarily takes steps to monitor and safeguard [its] platform,” and “[t]aking steps to rid our services of CSAM is important to protecting our users, our brand, and our business interests.” (Id.).
Based on the above, the Court finds that the Government has shown by a preponderance of the evidence that the ESPs did not operate as government actors or agents because: 1) they acted in their own business interests to detect and combat child pornography on their platforms; and 2) they did so without significant participation by law enforcement. Thus, the Court concludes that the ESPs conducted private searches in uncovering the images and videos reported in the CyberTips, which did not implicate Defendant's Fourth Amendment rights.[14] See also Coyne, 387 F. Supp. 3d at 396 (rejecting the argument that ESPs “were acting as government agents when they monitored the Internet traffic and reviewed the matches provided by PhotoDNA,” noting evidence that the ESPs “monitor their customer traffic for child pornography for business reasons and not to satisfy any government mandate”); United States v. DiTomasso, 81 F. Supp. 3d 304, 309–10 (S.D.N.Y. 2015) (finding that ESP was not a government actor based on evidence that it monitored content “of its own accord” and the lack of evidence that the ESP “intended its monitoring program to assist law enforcement”), aff'd, 932 F.3d 58 (2d Cir. 2019); see also United States v. Bebris, 4 F.4th 551, 562 (7th Cir. 2021) (finding that “a company which automatically scans electronic communications on its platform does ‘not become a government agent merely because it had a mutual interest in eradicating child pornography from its platform’ ”) (citing United States v. Ringland, 966 F.3d 731, 736 (8th Cir. 2020)), cert. denied, 142 S. Ct. 489 (2021).
2. NCMEC
*13 Next, the Court must consider whether the Government has shown that NCMEC's searches were private in nature or whether it operated as a government actor or agent. Defendant urges the Court to adopt the analysis and holding of the Tenth Circuit Court of Appeals in United States v. Ackerman, which found that NCMEC was acting as both a government entity and a government agent. 831 F.3d 1292, 1295–1304 (10th Cir. 2016). The Government argues that NCMEC is not a government actor and points to contrary authority. (Dkt. No. 46, at 18 n. 13) (citing United States v. Meals, 21 F.4th 903, 908 (5th Cir. 2021)). But the Government insists that the Court need not resolve this issue because “NCMEC did not exceed the scope of the [ESPs’] private search.” (Id.). This argument aligns with Jacobsen’s holding that a private search may be replicated by law enforcement without implicating the Fourth Amendment, so long as law enforcement does not exceed the scope of the private search. 466 U.S. at 115. According to Jacobsen, if law enforcement merely replicates a private search, it has not performed its own search for purposes of the Fourth Amendment. Id. Therefore, the Court will assume without deciding for purposes of this motion that NCMEC is a government actor or agent and proceed to the question of whether it (and law enforcement) exceeded the scope of the ESPs’ private searches.
3. Scope of Searches
To answer this question, the Court must compare the searches performed by the ESPs with those of NCMEC and law enforcement. If they are materially the same, the private search doctrine continues to hold. For most of the CyberTips, a comparison is straightforward. In these instances, the record shows that a human being at the ESPs viewed the “entire contents” of the uploaded images and videos, discovered apparent child pornography, and reported it to NCMEC. (Dkt. No. 46-4, at 3; Dkt. No. 46-6, at 3–4; Dkt. No. 46-7, at 3; Dkt. No. 46-9, at 3; Dkt. No. 46-10, at 4; Dkt. No. 46-11, at 4; Dkt. No. 46-12, at 3; Dkt. No. 46-13, at 3–4). For some of these files, NCMEC viewed the contents as well; for others, NCMEC did not view the contents and used hash matching to confirm the presence of child pornography. (Id.). And Investigator Obrist opened and viewed all the images and videos during his investigation. (Dkt. No. 38-1, at 7–22). But because the ESPs had already viewed the entire contents of these images and videos, any reasonable expectation of privacy had been extinguished, and NCMEC and Investigator Obrist viewing the images and videos again did not exceed the scope of the ESPs’ private searches. Therefore, NCMEC's and Investigator Obrist's actions did not amount to new searches and Defendant's Fourth Amendment rights were not violated with respect to these previously viewed images and videos.
A more nuanced comparison is necessary for the CyberTips where the ESPs did not view the contents of the images and videos but instead relied on hash matching software tools to detect apparent child pornography. (Dkt. No. 46-2, at 3–4; Dkt. No. 46-3, at 3–4; Dkt. No. 46-5, at 4; Dkt. No. 46-8, at 4). Defendant argues that the private search doctrine does not apply to these files “because no one at the social media company ever viewed [the] images,” and Investigator Obrist was the first person to do so. (Dkt. No. 38-2, at 9). Put another way, Defendant contends that Investigator Obrist's viewing of the contents of the images and videos was a new search because it exceeded the scope of the ESPs’ private (hash matching) searches of these files. In response, the Government argues that, assuming a reasonable expectation of privacy, there was no Fourth Amendment violation because the ESPs’ hash matching tools had already identified the files as child pornography and “Investigator Obrist's viewing of the files to confirm their character did not exceed the scope of the ESPs’ searches.” (Dkt. No. 46, at 20).
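As a technical aside, the exact-match comparison underlying hash-matching tools of the kind described above can be sketched in a few lines. This sketch is purely illustrative and uses a cryptographic SHA-256 digest; the function names are invented, and production tools (such as PhotoDNA, mentioned in Coyne) rely on proprietary perceptual hashing that also catches near-exact copies:

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Hex digest of an uploaded file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes, known_hashes: set[str]) -> bool:
    """Flag an upload whose digest exactly matches a digest previously
    computed from a confirmed file; no human views the new file's contents."""
    return sha256_digest(data) in known_hashes
```

The key point for the legal analysis is visible in the sketch: the tool reports only that the digests match, not what the file depicts.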
a. Relevant Decisions
Although the Second Circuit has not addressed this sort of comparison between a private actor's hash matching of files and law enforcement's visual inspection, several other appellate decisions are instructive. In United States v. Reddick, the Fifth Circuit faced a similar situation, where the defendant uploaded files to a Microsoft cloud service, which were flagged by hash matching as child pornography and then reported to NCMEC and law enforcement. 900 F.3d 636, 637–38 (5th Cir. 2018). The Fifth Circuit found that law enforcement's actions in reviewing and opening the flagged files did not violate the Fourth Amendment because: 1) “whatever expectation of privacy [defendant] might have had in the hash values of his files was frustrated by Microsoft's private search”; 2) when the detective received the files, “he already knew that their hash values matched the hash values of child pornography images known to NCMEC”; 3) “[h]is visual review of the suspect images—a step which merely dispelled any residual doubt about the contents of the files—was akin to the government agents’ decision to conduct chemical tests on the white powder in Jacobsen”; and 4) “opening the file merely confirmed that the flagged file was indeed child pornography, as suspected.” Id. at 638–39. Thus, the court concluded that there was no “significant expansion” of Microsoft's private search sufficient to constitute a “separate search.” Id. at 639 (quoting Walter, 447 U.S. at 657).
*14 In United States v. Miller, the Sixth Circuit Court of Appeals addressed law enforcement's review of files that had been flagged as child pornography by Google's hash-matching technology and reported to NCMEC; the court agreed with Reddick and found that the private search doctrine applied. 982 F.3d 412, 429 (6th Cir. 2020). In so finding, the court noted that: 1) “Google's technology ‘opened’ and ‘inspected’ the files, revealing that they had the same content as files that Google had already found to be child pornography”; 2) the reliability of hash-matching had not been challenged; and 3) available evidence showed that hash-matching provided “virtual certainty” that law enforcement would discover no more information by opening the files than Google had already learned via hash match.[15] Id. at 428–30.
Most recently, the Ninth Circuit Court of Appeals disagreed with Reddick and Miller and found that the Government failed to prove that the private search doctrine applied to law enforcement's viewing of files that had been flagged as child pornography by Google's hash-matching technology. See Wilson, 13 F.4th at 972. The court reasoned that law enforcement's search exceeded the scope of Google's private search because: 1) “it allowed the government to learn new, critical information that it used first to obtain a warrant and then to prosecute [the defendant]”; and 2) “the government agent viewed [the defendant's] email attachments even though no Google employee—or other person—had done so, thereby exceeding any earlier privacy intrusion.” Id. The court explained that the “government learned at least two things above and beyond the information conveyed by the CyberTip by viewing [the defendant's] images: First, Agent Thompson learned exactly what the image showed. Second, Agent Thompson learned the image was in fact child pornography. Until he viewed the images, they were at most ‘suspected’ child pornography.” Id. at 973.
b. CyberTips #87232205 and #88898921 (SnapChat)
First, it is worth noting that the “entire contents” of the images from these CyberTips were “publicly available,” meaning that Defendant did not have a reasonable expectation of privacy for these images in the first place. (Dkt. No. 46-2, at 3–4; Dkt. No. 46-3, at 3–4). Assuming that Defendant retained a reasonable expectation of privacy as to these images, however, the Court finds that the record does not support application of the private search doctrine. While the Government argues that a hash value shows that an individual at some point viewed the same images in question, there is no evidence to support such a finding. Indeed, there is no indication that a human being at SnapChat ever viewed the contents of the images, nor is there any indication where the original hash values came from, such as a SnapChat repository of hashes or a NCMEC-hosted list.[16]
For CyberTip #87232205, the record merely shows that SnapChat reported the Incident Type as “Child Pornography” and listed the hash values for seven images, indicating that SnapChat had used hash-matching software tools to scan the contents of these images. (Dkt. No. 46-2, at 3–4).[17] SnapChat did not provide any further description of the contents of the files and listed the filenames as simply “file.jpg.” (Id.). When Investigator Obrist opened and viewed the files, however, he learned a great deal more:
*15 Two of the images were duplicates of a naked preteen girl with her legs spread, showing her vagina, in front of a camera. One image was of two naked preteen girls, one of whom was being vaginally penetrated by what appears to be a naked preteen boy. Only the boy's bottom half is visible in the image. One image was of a preteen girl, naked from the waist down, performing oral sex on what appears to be an adult male. The other three images depict preteen girls partially clothed with their vaginas exposed.
(Dkt. No. 38-1, at 7). Similarly, for CyberTip #88898921, SnapChat again reported the Incident Type as Child Pornography and listed the hash values for two image files. (Dkt. No. 46-3, at 3–4). SnapChat did not provide any description of the contents of the files and listed the filenames as “file.jpg.” (Id.). Upon opening and viewing the files, Investigator Obrist learned that: “[t]hese files were the same images of the naked preteen girl spreading her legs and the half-naked preteen girl performing oral sex that had been reported in the previous cybertip.” (Dkt. No. 38-1, at 7). Thus, by viewing the actual images, Investigator Obrist was able to ascertain the sex acts involved, the number of minors, and potentially their identities. Further, Investigator Obrist was able to see that the images depicted not only minors but specifically preteens. And Investigator Obrist was able to determine with certainty that these images were child pornography under the applicable legal standard.
Given the gaps in the record regarding the hash values assigned by SnapChat and whatever search led to that assignment, the Government has failed to establish that the private search doctrine applies to these CyberTips. And it cannot be said that Investigator Obrist's “visual inspection of [the files] enabled [him] to learn nothing that had not previously been learned during the private search.” Jacobsen, 466 U.S. at 120. Rather, the “breadth of essential information” Investigator Obrist obtained by opening and viewing the files went significantly beyond what SnapChat communicated to NCMEC. Wilson, 13 F.4th at 979. For these reasons, assuming that Defendant retained a reasonable expectation of privacy, the Court finds that the Government has failed to show that Investigator Obrist's viewing of the images referenced in CyberTips #87232205 and #88898921 did not exceed SnapChat's private searches.[18]
c. CyberTips #116612117 and #124080432 (Instagram/Meta)
The private searches related to these CyberTips present a harder case. For CyberTip #116612117, Instagram reported the incident as Child Pornography and listed the hash value of one video file. (Dkt. No. 46-5, at 4). Instagram did not describe the contents of the file but listed an “Industry Classification” for the file, “B1,” which, according to NCMEC, indicates a pubescent minor involved in a sex act. (Dkt. No. 46-5, at 4–5). Meta states that it uses “proprietary hash technology to find exact or near exact copies of images or videos that a Meta employee or contractor previously viewed and confirmed as apparent child pornography as provided in 18 U.S.C. § 2256.” (Dkt. No. 46-15, at 1). “Before Meta creates a hash for an image or video, at least one Meta employee or contractor who is a member of its content review team must view and verify that the image or video is apparent, reportable child pornography as provided in 18 U.S.C. § 2256.” (Id., at 1–2). The Meta employee or contractor also applied a classification label when reviewing the file. (Id., at 2). According to Meta, it has confirmed that the video from CyberTip #116612117 “was an exact match to a hash added to Meta's hash repository prior to February 6, 2020.” (Id.). Thus, it can be inferred that a human being at Meta had previously viewed the same video reported in this CyberTip when it was uploaded to Instagram on a different occasion. Upon opening and viewing the file, Investigator Obrist learned that: “The video was of a girl masturbating and playing with her vagina in front of a camera. The female is nude from the waist down. Based on the lack of any discernible pubic hair or breast development, I estimate the girl could be no older than 11 years old.” (Dkt. No. 38-1, at 21).
*16 Similarly, for CyberTip #124080432, Instagram reported the incident as Child Pornography and listed the hash value of one video file. (Dkt. No. 46-8, at 3–4). Instagram did not describe the contents of the file but classified it as “A1,” which, according to NCMEC, indicates a prepubescent minor involved in a sex act. (Id., at 4–5). Meta confirmed that the video from CyberTip #124080432 “was an exact match to a hash added to Meta's hash repository prior to June 6, 2017.” (Dkt. No. 46-15, at 2). Again, it can be inferred that a human being at Meta had previously viewed the same video when it was uploaded to Instagram on a different occasion. Upon opening and viewing the file, Investigator Obrist saw “a girl approximately 6–8 years old in a blue dress performing oral sex on an adult male.” (Dkt. No. 38-1, at 21).
Thus, in contrast with the SnapChat images described above, the record shows that someone employed at Meta had previously viewed the same two videos in these CyberTips and “verif[ied]” that they were apparent child pornography under federal law. (Dkt. No. 46-15, at 1–2). Nonetheless, Defendant argues in his reply brief that the private search doctrine does not apply because no one at Meta viewed the actual uploaded files before submitting these CyberTips. (Dkt. No. 49, at 3–4) (citing Wilson, 13 F.4th 961). In Wilson, the Ninth Circuit noted that Fourth Amendment rights are personal rights and that a defendant retains an expectation of privacy “in his files, even if others had identical files.” 13 F.4th at 975 (emphasis in original). The Ninth Circuit found that “whether Google had previously reviewed, at some earlier time, other individuals’ files is not pertinent to whether a private search eroded [the defendant's] expectation of privacy.” Id.[19] The Government has not addressed this analysis. (Dkt. No. 46, at 20) (noting only that the Ninth Circuit does not apply the private search doctrine “in this context”).
Following Wilson’s logic, Defendant retained at least some expectation of privacy in his uploaded videos even though identical videos (presumably shared by others) had been previously viewed by Meta.[20] Conversely, if the Court applied Miller’s reasoning, Defendant's privacy interest in the videos had already been extinguished because someone at Meta had viewed them. See 982 F.3d at 429 (finding that the defendant's privacy interest in images was frustrated because “[a]t some point, Google employees who are trained on the federal definition of child pornography viewed two images to confirm that they are illegal child pornography before adding them to its child-pornography repository”). Ultimately, the Court declines to address this issue since it has not been fully briefed by the parties, and moreover, Defendant failed to show that he had a reasonable expectation of privacy at all, much less one that remained after Meta viewed the videos in question.
D. Exclusion is Not Warranted
Finally, the Government argues that even if there was a Fourth Amendment violation, suppression is not warranted. (Dkt. No. 46, at 22–23). According to the Government, Investigator Obrist acted in good faith and “a reasonably well-trained officer would not have known that his review of a file attached to the ESPs’ CyberTips was illegal.” (Id., at 22). The Government thus contends that Investigator Obrist did not act with culpable intent and that suppression would not serve the deterrent purpose of the exclusionary rule. (Id., at 23). In response, Defendant asserts that “[a] well-trained officer would have been aware that the warrantless search was illegal because of the authority cited in this reply and the moving papers.” (Dkt. No. 49, at 5).
*17 In general, the exclusionary rule operates as “a deterrent sanction that bars the prosecution from introducing evidence obtained by way of a Fourth Amendment violation.” Davis v. United States, 564 U.S. 229, 231–32 (2011). Beyond serving a deterrent purpose, the rule also recognizes that “exclusion exacts a heavy toll on both the judicial system and society at large.” Id. at 237. Thus, exclusion is only appropriate as a “last resort,” when the “deterrence benefits of suppression [ ] outweigh its heavy costs.” Id. (quoting Hudson v. Michigan, 547 U.S. 586, 591 (2006)). For instance, the Supreme Court has held that exclusion is not appropriate when the police “conduct a search in objectively reasonable reliance on binding judicial precedent,” because there is a clear absence of police culpability. Id. at 239. As another example, the Supreme Court has held that the exclusionary rule does not apply when the police conduct a search in “objectively reasonable reliance” on a warrant later held invalid, because any “marginal or nonexistent benefits produced by suppression cannot justify the substantial costs of exclusion.” United States v. Leon, 468 U.S. 897, 922 (1984).
In this case, the Court finds that even if there was a Fourth Amendment violation involved in Investigator Obrist's review of the CyberTips, suppression of the files attached to those CyberTips (and files obtained via warrants secured by reference to those CyberTip files) is not warranted. As the above discussion shows, courts have only begun to explore the application of the private search doctrine to searches of images and videos uploaded to ESPs, and those that have done so have reached differing results. The Second Circuit has not spoken on the specific issue and thus there was no binding appellate precedent. But at the time of Investigator Obrist's searches, it was objectively reasonable for him to rely on the private search doctrine established in Jacobsen and conclude that his review of the ESPs’ flagged files (as conveyed to him by NCMEC) did not require a search warrant.[21] Therefore, the Court finds that Investigator Obrist did not act with a culpable state of mind and suppression would serve little if any deterrent effect. On the other side of the ledger, suppression of the evidence in this case would exact a heavy toll on society, as a pernicious crime against children would potentially go unpunished. Here, the costs of suppression far outweigh any benefits, and therefore, the Court would decline to apply the exclusionary rule even if there was a Fourth Amendment violation. See also Coyne, 387 F. Supp. 3d at 402 (declining to exclude evidence despite Fourth Amendment violation involving the private search doctrine where law enforcement's conduct fell within the good faith exception).
E. Request for Evidentiary Hearing
As an alternative to suppression, Defendant requests that the Court conduct an evidentiary hearing. (Dkt. No. 38-2, at 2). However, Defendant has not identified any factual issues that are disputed and require further development. As discussed above, Defendant has not submitted an affidavit in support of his motion that raises any factual issues, and the existing record and undisputed facts are sufficient to resolve the dispositive legal issues before the Court. Therefore, Defendant is not entitled to a hearing and the request is denied. See United States v. Kirk Tang Yuk, 885 F.3d 57, 77 (2d Cir. 2018) (noting that “an evidentiary hearing is required ‘if the moving papers are sufficiently definite, specific, detailed, and nonconjectural to enable the court to conclude that contested issues of fact going to the validity of the search are in question’ ”) (quoting In re Terrorist Bombings of U.S. Embassies in E. Afr., 552 F.3d 157, 165 (2d Cir. 2008)); see also United States v. Stein, 440 F. Supp. 2d 315, 330 (S.D.N.Y. 2006) (denying request for evidentiary hearing as to the defendants’ suppression motion where no factual issue had been raised).
IV. CONCLUSION
*18 For these reasons, it is hereby
ORDERED that Defendant Austin Tennant's motion to suppress (Dkt. No. 38) is DENIED.
IT IS SO ORDERED.
Footnotes
The facts herein are drawn from the exhibits submitted with the parties’ submissions, including the affidavit and exhibit submitted by the Government following Defendant's reply brief. (Dkt. No. 50).
See Instagram, https://about.meta.com/technologies/instagram (last visited October 10, 2023).
The Court notes that these data fields for “Suspect” appear in each CyberTip. The information that fills these fields has been redacted in the copies of the CyberTips provided to the Court.
“A hash value is a number that is often represented as a sequence of characters and is produced by an algorithm based upon the digital contents of a drive, medium, or file.” United States v. Miller, 982 F.3d 412, 418 (6th Cir. 2020) (quoting 2017 Advisory Committee Note to Fed. R. Evid. 902(14)). MD5 is a common algorithm used to produce hash values. Id.
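For illustration only, the hash-value computation described in this footnote can be sketched in Python using the standard library's MD5 implementation. The function name, file path, and chunk size are arbitrary choices by this editor; this sketch is not the code any ESP or court actually uses.

```python
import hashlib


def md5_hash_value(path: str) -> str:
    """Compute the MD5 hash value of a file's digital contents.

    Reading in fixed-size chunks means large video files need not
    fit in memory; identical files always yield identical values.
    """
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    # The hash value is conventionally represented as a sequence of
    # characters -- here, a 32-character hexadecimal string.
    return digest.hexdigest()
```

Because the value is derived solely from the file's bits, changing even one byte of the file produces a completely different hash value, which is why an exact hash match identifies only bit-for-bit identical copies.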
Investigator Obrist did not identify the CyberTip numbers related to these files.
There is no evidence that the images and videos at issue in this case were accompanied by any text or words.
While Defendant may have wished to avoid claiming ownership of the social media accounts and uploaded image and video files, he could have done so without fear of the Government using such a statement against him, as it would not be admissible as substantive evidence of guilt at trial. See Simmons v. United States, 390 U.S. 377, 394 (1968).
The Court notes that the parties have not addressed whether any reasonable expectation of privacy was impacted by the third-party doctrine. In some instances, the Supreme Court has found that an individual does not have a reasonable expectation of privacy in information voluntarily turned over to third parties. See United States v. Miller, 425 U.S. 435, 442–43 (1976) (financial records held by a bank); Smith v. Maryland, 442 U.S. at 745 (dialed telephone numbers held by a telephone company); cf. Carpenter v. United States, 585 U.S. ___, 138 S. Ct. 2206, 2216–17, 2220 (2018) (finding that detailed cell phone location information is “qualitatively different” from telephone numbers and bank records, and is not subject to the third-party doctrine). Because the parties have not addressed the third-party doctrine, the Court declines to do so.
The Government has also cited persuasive caselaw from other Circuits finding that defendants did not have a reasonable expectation of privacy in files that were shared with the public on peer-to-peer networks, a situation analogous to the “publicly available” files in this case. (Dkt. No. 46, at 10) (citing cases).
Although the Government has not submitted Meta's Terms of Service for Instagram, Meta reserves the right to “detect, prevent, and combat harmful or unlawful behavior.” See Privacy Policy, Promoting safety, security, and integrity, available at https://privacycenter.instagram.com/policy (last visited October 10, 2023).
This finding is strictly limited to the facts of this case and is not meant to suggest that users of SnapChat, Instagram, and/or Discord do not have a reasonable expectation of privacy in their messages in general or under other circumstances. Further, the Court need not and does not take a position on the Government's argument that individuals lack any reasonable expectation of privacy in contraband. (Dkt. No. 46, at 13). The Court also notes again that there is no evidence that the images and videos at issue in this case were accompanied by any text or words that might raise additional privacy concerns.
As this summary indicates, the Jacobsen decision applied the Katz formulation of Fourth Amendment rights, i.e. analyzing whether the search infringed a person's legitimate expectation of privacy. Here, Defendant argues that Jacobsen’s private search doctrine does not apply to searches under the intrusion on a constitutionally protected area test in United States v. Jones, 565 U.S. 400 (2012). (Dkt. No. 38-2, at 9). The Government disagrees. (Dkt. No. 46, at 8). The Second Circuit has not addressed this issue. However, logically the private search doctrine can be applied to a Jones search, since a trespass on a constitutionally protected area by a private actor would not implicate the Fourth Amendment. See United States v. Phillips, 32 F.4th 865, 874 (9th Cir. 2022) (“Jacobsen thus establishes that law enforcement officers do not violate the Fourth Amendment when ... they mimic the trespass a private individual visited on another's possessions after being alerted to the information uncovered pursuant to that trespass.”). Nonetheless, the facts in this case are clearly analogous to those in Jacobsen, and therefore, the Court will apply the private search doctrine as set forth therein. See also Miller, 982 F.3d at 433 (declining to apply Jones’s property-based approach and finding that Jacobsen was controlling for its Fourth Amendment analysis).
To the extent Defendant argues that the ESPs should be considered agents of law enforcement unless the Government shows that they had “no intention” to assist law enforcement, (Dkt. No. 38-2, at 6–7), the Court disagrees with this reading of United States v. Ackerman, 831 F.3d 1292, 1303 (10th Cir. 2016), which regardless is not binding precedent.
SnapChat states that it uses hash matching tools that enable the comparison of hash values of images and videos on its platform with the hash values of “known” Child Sexual Exploitation and Abuse Imagery, that is, images and videos that “were previously determined to be” Child Sexual Exploitation and Abuse Imagery. (Dkt. No. 46-16, at 3 n. 1). SnapChat does not indicate who determined that the images or videos were Child Sexual Exploitation and Abuse Imagery (i.e., SnapChat, NCMEC, or law enforcement), how they did so, what standard they used, or what is meant by “Child Sexual Exploitation and Abuse Imagery.”
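The comparison described in this footnote can be illustrated with a minimal, hypothetical sketch: a repository of hash values of files previously determined to be prohibited imagery, against which the hash value of a newly uploaded file is checked. The repository contents, the use of MD5, and the function names here are this editor's illustrative assumptions; the ESPs' actual systems are proprietary and may use near-exact (perceptual) rather than exact hash matching.

```python
import hashlib

# Hypothetical repository of hash values of files previously determined
# (by someone -- the footnote notes the record does not say whom) to be
# prohibited imagery. This entry is the MD5 of the placeholder bytes
# b"hello", standing in for a real repository entry.
KNOWN_HASH_VALUES = {
    "5d41402abc4b2a76b9719d911017c592",
}


def matches_known_hash(file_bytes: bytes) -> bool:
    """Return True if the file's MD5 hash value is in the repository.

    An exact hash match of this kind flags only bit-for-bit identical
    copies of previously reviewed files; the match itself reveals
    nothing else about the file's contents.
    """
    return hashlib.md5(file_bytes).hexdigest() in KNOWN_HASH_VALUES
```

Under this sketch, a flagged upload means only that its hash value equals one in the repository, which is why the provenance of the repository entries (who reviewed them, and by what standard) matters to the private search analysis.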
While SnapChat reported seven images of “Child Pornography,” NCMEC only characterized two of the images as “Apparent Child Pornography,” whereas two others were “CP (Unconfirmed),” two were “Child Unclothed,” and one was “Child Clothed.” (Dkt. No. 46-2, at 6).
The Court notes a recent decision in this district found that the private search doctrine applied to law enforcement's review of hash-matched child pornography files flagged by Google. See United States v. Maher, Case No. 5:21-cr-00275-GTS, Dkt. No. 48, at *29 (N.D.N.Y. Aug. 22, 2022). That case is distinguishable for various reasons, including that the Government presented evidence that an employee at the ESP, Google, had previously viewed the images in question. (Id., at *25–27). Here, SnapChat simply indicates that it uses hash-matching tools, not that any employee viewed the images in CyberTips #87232205 and #88898921. (Dkt. No. 46-16, at 1–3).
In Wilson, the court noted that the San Diego Internet Crimes Against Children Task Force “now obtains a search warrant before opening a CyberTip when the provider has not viewed the images,” 13 F.4th at 965 n.3, an approach that would avoid many of the thorny issues presented in these types of cases.
For example, a search of images uploaded by other individuals would not, under Wilson’s analysis, erode a defendant's privacy interest in his own image files because those files might reveal personal information in addition to child pornography, such as showing him as the abuser or abuse occurring at his residence.
Defendant has not pointed to any authority indicating that Jacobsen’s private search doctrine remains anything less than good law. While some courts have expressed uncertainty about the doctrine's viability after United States v. Jones, 565 U.S. 400, 406 (2012), see United States v. Ackerman, 831 F.3d 1292, 1307 (10th Cir. 2016), the Second Circuit has not addressed the issue and continues to apply the doctrine. See United States v. DiTomasso, 932 F.3d 58, 67 (2d Cir. 2019).