U.S. v. Wilbert
2018 WL 6729659 (W.D.N.Y. 2018)
August 20, 2018

Feldman, Jonathan W., United States Magistrate Judge

Summary
The court found that the two screenshots of the video chat were properly obtained by Omegle and NCMEC, and that the evidence was not subject to suppression. The court also found that NCMEC's use of a publicly available search tool to pinpoint the geographical location of the IP address was not a search. The court also discussed the difficulty of applying Fourth Amendment principles to ESI and the potential implications of the third party doctrine in the context of ESI.
UNITED STATES of America,
v.
Scott T. WILBERT, Defendant
16-CR-6084-DGL-JWF
United States District Court, W.D. New York
Signed August 20, 2018

Counsel

Kyle P. Rossi, Melissa M. Marangola, U.S. Attorney's Office, Rochester, NY, for United States of America.
Robert A. Napier, Napier & Napier, Rochester, NY, for Defendant.
Feldman, Jonathan W., United States Magistrate Judge

REPORT AND RECOMMENDATION

Procedural Background
*1 In August 2016, the Federal Grand Jury returned an Indictment, charging Scott T. Wilbert (“Wilbert” or “the defendant”) with receipt of child pornography. See Docket # 11. The defendant filed omnibus motions in November 2016 (Docket # 17), which included a challenge to the search of his residence. The Court resolved most of the defendant’s motions on the record (Docket # 22), but reserved decision on the defendant’s motion to suppress the search of his home. On May 22, 2017, the Court recommended that the defendant’s motions to suppress and for a Franks hearing be denied. Docket # 32. The defendant filed objections to the Court’s Report and Recommendation (Docket # 36) and on May 22, 2017, Judge Larimer denied the objections and denied the motion to suppress. Docket # 37.
On August 16, 2017, after the government had made a motion to set a trial date, the defendant filed the instant motion to suppress and motion to dismiss on grounds not previously raised in his omnibus motions. Docket # 42. The government responded on September 5, 2017 (Docket # 44) and the defendant replied on September 19, 2017 (Docket # 48), prompting another response from the government (Docket # 49). The Court scheduled an evidentiary hearing, which the government moved to cancel, arguing that there was no factual basis for such a hearing. Docket # 52. The Court denied the government’s request (Docket # 57) and proceeded with the evidentiary hearing on January 17, 2018, at which three witnesses testified. Docket # 58. The Court ordered post-hearing briefing, which the defendant filed on March 26, 2018 (Docket # 65) and the government filed on April 23, 2018 (Docket # 69). Wilbert replied on April 30, 2018 (Docket # 69). The Court requested additional briefing on several issues. Docket # 70. After an extension, the parties filed their supplemental briefs on June 1, 2018. Docket ## 72, 73.
Relevant Facts
In this motion, the defendant argues that two screenshots of a video chat uploaded from his IP address[1] to a chat site and forwarded to the National Center for Missing and Exploited Children (“NCMEC”) and then to law enforcement must be suppressed as the result of an illegal warrantless search of his computer. He also argues that other evidence subsequently obtained as a result of that illegal search should be suppressed as fruit of the poisonous tree. Finally, the defendant asserts that the indictment itself must be dismissed for failure to preserve evidence. Three witnesses testified at the evidentiary hearing: (1) Leif K-Brooks (“K-Brooks”), owner and founder of Omegle; (2) John Shehan (“Shehan”), Vice President of the Exploited Child Division at NCMEC; and (3) Investigator Cerretto (“Cerretto”) of the New York State Police (“NYSP”).
*2 Leif K-Brooks: K-Brooks testified that he is the owner and founder of Omegle, a chat website that connects users randomly and anonymously to other users. Jan. 17, 2018 Hr'g Tr., Docket # 61, (“Tr.”) at 21. K-Brooks founded the Omegle site in 2009 and described his invention as follows:
So most instant messaging or chat sites are for talking to people you already know. So you would say I want to talk to Bob and you would enter Bob’s user name and you'd have a conversation with Bob. Omegle is for meeting new people so the site connects you to someone random, someone you don't select and it’s anonymous, meaning there are no names associated with it. You don't know the other person’s name and they don't know your name.
Tr. at 21. If the user does not want to chat with the person they are randomly connected with, they simply press a “button” to disconnect and connect to a different random user. Omegle supports both video chats and text chats. Tr. at 27. Unlike text chats, video chats do not pass over Omegle’s servers. Tr. at 27-28. Rather, video chats are conducted peer-to-peer or computer-to-computer, i.e. directly between the two users. Tr. at 27-28. Although Omegle does not possess or retain any of the chat itself, it does log “chat history metadata,” including the time when the chat began and the IP addresses associated with the chat. Tr. at 29. Today, the Omegle site averages one million distinct users per day. Tr. at 47.
Omegle is free and users need not register before using the site. Omegle does not collect or share users' identifying information. Tr. at 21-22. Upon entering the site, but before engaging in a chat, the platform displays a link to Omegle’s privacy policy. Tr. at 22-23. There is “also a warning in large text about the video chat moderation” policy utilized by Omegle. Tr. at 22-23, 60. At the bottom of the initial Omegle screen, users are notified that they agree to Omegle’s terms of service (“TOS”) by using the site. Tr. at 87. However, users are not required to read the TOS or affirmatively agree to them before engaging in a video or text chat and Omegle does not track whether users actually accessed the privacy policy. Tr. at 33.
K-Brooks explained the moderation system Omegle uses to monitor video chats. In order to discourage nudity, sexual behavior and otherwise illegal conduct during video chats, Omegle uses proprietary software that automatically captures snapshot images from the user’s computer and instantly uploads those images to Omegle’s servers “for moderation purposes.” Tr. at 39. The software randomly captures four frames of the video chat during the first few seconds of the chat. No other images are monitored except for these initial still frame captures. Tr. at 62. After the images are uploaded to Omegle’s servers they are immediately “screened” with an automated software program called “Computer Vision.” Tr. at 36-37. According to K-Brooks, the program uses an algorithm that looks “at an image in sort of the same way a human would” by trying to recognize “shapes, colors and trying to detect features in images that it’s never seen before.” Tr. at 35. If the software detects that the snapshots contain “good things,” like a face or a person’s upper body, then the image will not be screened further by the software. Tr. at 36. However, if the program detects “bad things” - “things that are more likely to be something other than just a person sitting in front of a webcam talking to someone else while fully clothed,” then the software will flag the images for “review by human moderators.” Tr. at 36-37.
*3 Images that the program determines may contain nudity, sexual conduct or other unknown or unwanted content are flagged to be inspected by a human moderator contracted by Omegle through Gracall, a third party company that staffs a moderator review force 24 hours a day, seven days a week. Tr. at 36-37, 40-41. Gracall staffs monitoring centers in various countries around the world and K-Brooks estimated that three to four moderators are actively viewing Omegle’s screenshots at any time. Tr. at 75. K-Brooks designed a system where the snapshots are displayed in a “big grid” and the human moderators constantly scan the grid for offensive images. Tr. at 75. If a snapshot appears to be illegal or require further review, the human moderator “can just press buttons to flag them.” Tr. at 75. K-Brooks testified that hundreds of thousands - if not millions - of screenshots are flagged for review every day. Tr. at 46-47. Omegle bans IP addresses in the “tens of thousands” daily due to unwanted conduct. Tr. at 47.
If a moderator determines that the image contains suspected child pornography, a program developed by K-Brooks automatically compiles information in Omegle’s system into a report and electronically submits it to NCMEC. Tr. at 47. The NCMEC report, known as a “cyber-tip,” is generated and transmitted “[w]ithin a few minutes” after human review. Tr. at 55. The snapshots themselves typically only remain on Omegle’s system for a few hours. Tr. at 64. However, if the image requires generation of a cyber tip report, Omegle preserves the images for 90 days. Tr. at 71.
K-Brooks testified that on October 25, 2015, Omegle’s automated software flagged as suspicious two images from IP address 50.49.31.78. Tr. at 38. For purposes of this hearing, the defendant has admitted that IP address 50.49.31.78 was assigned to his computer. Docket # 60, at ¶ 6. The first image was uploaded at 2:30:21 UTC[2] and is a jpeg file whose name ended in “a9e7” (“image a9e7”). Hr'g Ex. 6. The second image was uploaded at 2:36:12 UTC and is a jpeg file whose name ended in “c6d0” (“image c6d0”). Hr'g Ex. 6. K-Brooks testified that he is certain a third party moderator viewed image c6d0 because it was flagged as unwanted content, but he is not certain whether the moderator viewed image a9e7.[3] Tr. at 48-49. This is so because, by looking at the records, K-Brooks could tell that the moderator had flagged image c6d0 as containing unwanted material, but did not flag image a9e7. That could mean that image a9e7 was viewed but did not contain unwanted material or that it was not reviewed at all. Tr. at 48-49, 53. However, because the two images came from the same chat session at around the same time, both were grouped together and sent on to NCMEC even though K-Brooks can only confirm that one was reviewed by a moderator and flagged as containing apparent child pornography. Tr. at 53-54.
Consistent with K-Brooks's testimony, a report[4] was automatically generated by Omegle’s software and sent to NCMEC’s “CyberTipline” at 2:38:58 UTC, indicating that two files were uploaded from IP address 50.49.31.78 and confirming that image c6d0 was reviewed by Omegle. Hr'g Ex. 7, at 2-3; Tr. at 57-58. As part of that report, Omegle attached both images, even though the report is silent with respect to whether Omegle reviewed image a9e7. Tr. at 59. In accordance with Omegle’s image preservation policy, because law enforcement did not request the snapshots within 90 days, Omegle deleted them. Tr. at 71. K-Brooks did not interview the Gracall employee who performed the review at issue here. Tr. at 73.
*4 John Shehan: Shehan testified that he is the Vice President of the Exploited Child Division at NCMEC. Tr. at 97-98. According to Shehan, NCMEC’s “mission is to help reunite families with missing children, reduce child sexual exploitation and prevent child victimization.” Tr. at 99. NCMEC is a private not-for-profit organization that receives significant funding from the federal government, as well as from various other sources. Tr. at 101. As part of his duties as Vice President, Shehan is responsible for NCMEC’s CyberTipline, to which members of the public or internet service providers (“ISPs”) can submit tips on suspected child pornography activity. Tr. at 98-103. Shehan testified that he understands 18 U.S.C. § 2258A to require ISPs to report apparent child pornography on their systems to the CyberTipline. Tr. at 104. In doing so, ISPs must indicate – either manually or through automated software, like Omegle’s – what conduct they are reporting and select the date and time of the incident; all other information they provide is voluntary. Tr. at 106. In turn, the law requires NCMEC to provide the collected information to law enforcement. Tr. at 106. NCMEC makes CyberTipline reports available to law enforcement from relevant jurisdictions via a virtual private network (“VPN”). Tr. at 109. In other words, NCMEC does not transmit the CyberTipline reports; rather, the appropriate law enforcement members may access the database of tips through the VPN. Tr. at 109. In 2017, NCMEC received 10 million reports into the CyberTipline. Tr. at 105.
Shehan testified that NCMEC received the tip report at issue here on October 25, 2015 at approximately 2:38:58 UTC. Tr. at 111. The information reported by Omegle appears as Section A in NCMEC’s CyberTipline Report 6928493 (“the Report”), which was ultimately provided to law enforcement. Tr. at 111-112; Hr'g Ex. 7. That section indicates that two images were uploaded from IP address 50.49.31.78 on October 25, and that image c6d0 was viewed by a moderator; the Report is silent with respect to whether image a9e7 was viewed by a moderator. Tr. at 114. However, based on the omission of any notation regarding Omegle’s viewing of image a9e7, Shehan “would assume it was not” viewed. Tr. at 114. NCMEC’s software blocks its staff from viewing any image that the ISP does not expressly state it viewed, but images viewed by the ISP are available to NCMEC staff to inspect. Tr. at 120. In other words, because Omegle did not specify that it viewed image a9e7, that image was not available for NCMEC staff to view.[5] Shehan testified that NCMEC engages in this practice so that it cannot be said to expand the scope of a previous search conducted by a private entity. Tr. at 122-23. Rather, the image resided on Omegle’s servers in a locked file so that it could be passed on to law enforcement. Tr. at 122-23; see Hr'g Ex. 9. A NCMEC staff member, however, was also able to view image c6d0, which Shehan testified appeared to be a series of four screenshots of an animal performing oral sex on a young girl. Tr. at 130-31. NCMEC classified this image as “child pornography unconfirmed” because the NCMEC employee was not able to determine the age of the child depicted in image c6d0. Tr. at 133, 156.
Shehan testified that sections B and C of the Report include information added by NCMEC. Once NCMEC received the Report, a NCMEC staff member used a publicly available search tool called MaxMind to pinpoint the geographical location of the IP address so that it could send the Report to the appropriate law enforcement agencies. Tr. at 116. Using that tool, the NCMEC employee determined that the IP address came from Rochester, New York and that the ISP was Frontier Communications. Tr. at 116.
Section D of the Report is a list of law enforcement to whom the Report was made available. All of the images – even those not viewable by NCMEC – were made available to law enforcement. Tr. at 152. The Report also documents which images, if any, were viewed by the original reporting ISP. Tr. at 148-49. Based on the geographic location of the IP address from which the images were uploaded, NCMEC made the Report available to the New York State Police Internet Crimes Against Children (“ICAC”) task force. Tr. at 133. NCMEC’s involvement ends once the report is made available to law enforcement; each agency decides whether to review the report and what to do with the information contained in it. Tr. at 136.
*5 Investigator David Cerretto: Cerretto, an investigator with the NYSP, testified that once NCMEC makes a report available to ICAC, members of the NYSP are able to retrieve it and review its contents. Tr. at 164. After being reviewed by ICAC, the Report here was forwarded to Keith Becker at the Computer Crimes Unit and then eventually[6] to Cerretto on December 22, 2015. Tr. at 165-66.
Cerretto testified that he accessed and viewed the Report and both images associated with the Report, even though he was aware that only image c6d0 had been viewed by Omegle and NCMEC.[7] Tr. at 168-69, 180. According to Cerretto, it is the NYSP’s practice to open every image they obtain regardless of whether the image had been previously opened by a private entity. Tr. at 190. Cerretto determined that image c6d0 (Hr'g Ex. 1) was of an unclothed female child and a dog who was performing oral sex on the child. Tr. at 170. He was unable to tell whether image a9e7 (Hr'g Ex. 11) – which had not previously been viewed by Omegle or NCMEC – contained child pornography. Tr. at 170. Based on these two images and further investigation, Cerretto sought and obtained a search warrant for 634 Garson Avenue, the physical location where his investigation indicated was associated with the IP address included in the Report. Tr. at 171.
Cerretto testified that because he could not tell what was happening in image a9e7, the search warrant application was based on the image of the girl and the canine, i.e. image c6d0. Tr. at 170. In a section of the warrant application entitled “Facts Providing Reasonable Cause,” Cerretto indicated that “at approximately 02:30:21, username ‘WCP5MWA3’ uploaded an image of a prepubescent female (between four (4) years of age and seven (7) years of age), who was engaged in oral sex with a K9, to the Internet through the website identified as Omegle.com.” Docket # 17-2, at 5. Cerretto testified that in the search warrant application he indicated incorrectly that the image with the girl and the canine was uploaded at 2:30:21 UTC – instead of at 2:36 UTC - because he believed both images derived from one continuous incident that began at 2:30:21 UTC. Tr. at 173, 192-93.
Discussion
Motion to Suppress: The defendant argues that all evidence seized from the search of 634 Garson Avenue must be suppressed because the warrant for that residence was based on law enforcement viewing image a9e7, which Omegle had not previously reviewed. Wilbert insists that law enforcement significantly and illegally expanded Omegle’s private search by conducting a warrantless search of image a9e7, for which the remedy is suppression of the fruits of the resulting search warrant.
Standing in the way of the relief Wilbert seeks is the difficulty the Court and the parties face in applying well established Fourth Amendment principles to processes and forms of evidence that did not even exist until fairly recently. This Court is not alone in struggling with how to apply Fourth Amendment formulas developed and grounded in physical places and objects to a virtual world in which evidence, objects and locations exist only as electronic impulses momentarily displayed on a computer screen. The Fourth Amendment issues implicated in Wilbert’s suppression motion are not new in the sense that they raise novel constitutional concepts, but rather pay tribute to the difficulty in fitting the square peg of the Fourth Amendment into the rounded hole of ESI - electronically stored information.
*6 To prevail in this suppression motion, Wilbert has to overcome, inter alia, issues of standing and reasonable expectations of privacy, as well as navigate application of the third party doctrine, the consent to search doctrine and the private search doctrine. Complicating his challenge is the fact that all of these issues originate in the bewildering “new world” of ESI. After careful consideration of the arguments of counsel, I conclude that resolution of Wilbert’s suppression motion does not require the Court to untangle most of these difficult Fourth Amendment issues. For, as discussed below, even if the defendant were able to “run the gauntlet” and have each issue decided in his favor, suppression of the evidence he seeks would not result.
While it may not be necessary to resolve many of the Fourth Amendment issues Wilbert’s motion raises, it is helpful for the Court’s analysis to at least identify them. An initial issue is whether Wilbert had a reasonable expectation of privacy in the place searched sufficient to afford him standing to contest the search in the first place. “The party moving to suppress bears the burden of establishing that his own Fourth Amendment rights were violated by the challenged search or seizure.” United States v. Osorio, 949 F.2d 38, 40 (2d Cir. 1991) (citing Rakas v. Illinois, 439 U.S. 128, 131 n.1 (1978) ). Here, the parties disagree as to exactly what the constitutionally protected space at issue is. The defendant asserts that he had a reasonable expectation of privacy in the contents of his personal computer. See United States v. Lifshitz, 369 F.3d 173, 190 (2d Cir. 2004) (“Individuals generally possess a reasonable expectation of privacy in their home computers.”). The government responds by arguing that Wilbert is not entitled to any expectation of privacy without first admitting that he had an expectation of privacy in the images in question. See Gov't Reply, Docket # 55, at 2 (to obtain standing, the defendant must assert that he had “a subjective expectation of privacy in transmitting child pornography through the video chat”). While the Court is skeptical of the government’s position that Wilbert may not challenge the search of his computer without first admitting he was using Omegle at the time the offending images were found, it raises another expectation of privacy argument that might be more problematic for Wilbert. K-Brooks testified that Omegle’s video chat component “is actually a peer-to-peer system” where the video stream goes directly from one user to the other. Tr. at 27-28. 
There is support in the case law for a diminished expectation of privacy for information an individual chooses to “share” on a peer-to-peer network, even if the sharing is not public, but limited to “friends”. See, e.g., United States v. Brooks, No. 12-cr-166, 2012 WL 6562947, at *2 (E.D.N.Y. Dec. 17, 2012) (discussing expectation of privacy in private peer-to-peer sharing networks)(citing cases).
The government also points out that Wilbert, like all users of the Omegle website, was advised that video chats are subject to monitoring for offensive content.[8] According to the government, such a warning (1) eliminates Wilbert’s expectation of privacy in the content of his chats and (2) constitutes a “binding” consent by Wilbert to allow Omegle to search and reveal the content of his video communications. See Gov't Suppl. Br., Docket # 72, at 13-18. K-Brooks testified that upon entering the site, but before engaging in a chat, the platform displays a link to the privacy policy. Tr. at 22-23. At the bottom of the login screen, users are notified that they agree to Omegle’s terms of service by using the site. Tr. at 87.
*7 The issue of how one’s privacy rights are impacted by website “warnings” or by specific agreement to a posted TOS is a difficult one. Wilbert relies on United States v. DiTomasso, 56 F. Supp. 3d 584 (S.D.N.Y. 2014), a Southern District of New York case involving Omegle in which the court expressly rejected the argument the government makes now. While that case dealt with a text chat rather than a video chat, the court analyzed the language of Omegle’s privacy policy and concluded that
it would subvert the purpose of the Fourth Amendment to understand its privacy guarantee as “waivable” in the sense urged by the government. In today’s world, meaningful participation in social and professional life requires using electronic devices–and the use of electronic devices almost always requires acquiescence to some manner of consent-to-search terms. If this acquiescence were enough to waive one’s expectation of privacy, the result would either be (1) the chilling of social interaction or (2) the evisceration of the Fourth Amendment. Neither result is acceptable.
DiTomasso, 56 F. Supp. 3d at 592. The court in DiTomasso held that the language contained in Omegle’s privacy policy was not so clear and explicit as to completely destroy the defendant’s expectation of privacy:
Omegle took snapshots of DiTomasso’s chats and parsed them for content. Although that form of monitoring is referenced in the policy, it is mentioned exclusively as a means of “monitoring for misbehavior”–by which the policy clearly means violations of Omegle’s rules, not criminal activity–and of improving Omegle’s internal monitoring system.
A reasonable person, having read carefully through the policy, would certainly understand that by using Omegle’s chat service, he was running the risk that another party–including Omegle–might divulge his sensitive information to law enforcement. But this does not mean that a reasonable person would also think that he was consenting to let Omegle freely monitor his chats if Omegle was working as an agent of law enforcement. When Omegle’s policy refers to the “law enforcement [purpose]” behind maintaining IP address records, it is unclear whether this “purpose” is motivated (1) by Omegle’s independent desire to aid criminal investigations, or (2) by Omegle’s obligations under state or federal law. In other words, it is plausible to interpret the policy as implying that Omegle is required to keep IP address records. So construing the policy, a reasonable user would be unlikely to conclude that Omegle intended to act as an agent of law enforcement. And such a user would be even less likely to conclude that he had agreed to permit such conduct.
Id. at 596-97 (emphasis supplied) (footnotes omitted).
The government obviously disagrees with the DiTomasso holding that Omegle users have privacy rights in their online chats. See Docket #72, at 11-12. The government points the Court to the so-called “third-party doctrine” which provides that “[p]eople who share information with third parties assume the risk that their information would be shared with law enforcement.” Docket # 72, at 12; see Smith v. Maryland, 442 U.S. 735, 743-44 (1979) (“This Court consistently has held that a person has no legitimate expectation of privacy in information he voluntarily turns over to third parties.”). But, as with other Fourth Amendment principles, courts are now grappling with whether the downloading of an “app” or the use of a website in today’s world of ESI reflects the same relinquishment of privacy rights in which the third party doctrine was grounded. Indeed, in Carpenter v. United States, ––– U.S. ––––, 138 S. Ct. 2206 (2018), Chief Justice Roberts referred to “the seismic shifts in digital technology” as a basis for rejecting the government’s argument that the third party doctrine allows law enforcement access to cell phone records created and stored by the phone carrier. 138 S. Ct. at 2219. “Given the unique nature of cell phone location records, the fact that the information is held by a third party does not by itself overcome the user’s claim to Fourth Amendment protection.” Id. at 2217. The vast amount and personal nature of information stored on our electronic devices through websites and apps expands every day and includes not only location information, but our pulse rates, blood pressure, calorie consumption, credit card numbers, prescription and medical information, music and podcast choices, child monitoring cameras, thermostat controls, travel plans and airline tickets, shopping interests and purchases, diary entries and new year resolutions, prayer books, photographs and real time conversations with friends and family members. 
The list goes on and on and continues to exponentially explode with new technology and applications. Do we automatically relinquish Fourth Amendment protections to this highly personal data because the information is no longer physically stored in a file cabinet or a bookshelf or a desk drawer, but instead is digitally captured on an “app” we downloaded on our phone or other internet-connected device from a “third party”? Obviously, this case involves only Omegle, a website inviting users to privately and anonymously communicate with another person. But whether simply posting a “warning banner” on a website that refers a user to the website’s privacy statement unequivocally “binds” the user to anything and everything contained in the third party’s terms of service may not be as cut and dry as the government asserts.
*8 As it turns out, what matters most in deciding the defendant’s suppression motion is the “private search doctrine.” The “Fourth Amendment’s guarantee to be free from unreasonable search and seizure is directed at [g]overnment activity.” United States v. Heleniak, No. 14-cr-42A, 2015 WL 521287, at *4 (W.D.N.Y. Feb. 5, 2015). The Fourth Amendment’s protections are “wholly inapplicable ‘to a search or seizure, even an unreasonable one, effected by a private individual not acting as an agent of the [g]overnment or with the participation or knowledge of any governmental official.’ ” United States v. Jacobsen, 466 U.S. 109, 113-114 (1984) (quoting Walter v. United States, 447 U.S. 649, 662 (1980)(Blackmun, J., dissenting) ).
Omegle is obviously a private company and Wilbert does not claim otherwise. Thus, the flagging of the offending images by Omegle’s surveillance algorithm and the viewing of the images by the private moderators employed by Omegle by themselves raise no Fourth Amendment concerns. Where Wilbert’s motion gets tricky, however, is the transmittal of the images by Omegle to NCMEC and NCMEC’s subsequent transmittal to law enforcement. In Jacobsen the Supreme Court held that where a governmental search expands the scope of a private one “[t]he additional invasions of [a person’s] privacy by the government agent must be tested by the degree to which they exceeded the scope of the private search.” Id. at 115. “A private party acting as a government agent also may not expand upon a previously private search without running afoul of the Fourth Amendment.” United States v. Knoll, 16 F.3d 1313, 1320 (2d Cir. 1994). Here, the two image files captured by Omegle (c6d0 and a9e7) were passed on to NCMEC and then to the NYSP. The latter is obviously a law enforcement agency subject to the Fourth Amendment. But what about NCMEC?
In United States v. Ackerman, 831 F.3d 1292 (10th Cir. 2016), the Tenth Circuit held that NCMEC acts as a government entity, or at least an agent of the government, when it creates and maintains CyberTipline reports for Congress and reports suspected illegal content to law enforcement. In an opinion written by now Supreme Court Justice Gorsuch, the court analyzed the statutory structure governing NCMEC and its obligations to collaborate with law enforcement agencies. See 42 U.S.C. § 5773(b); 18 U.S.C. § 2258A. The court held that because Congress statutorily required Electronic Service Providers (“ESPs”) to report content containing suspected child pornography to NCMEC, required NCMEC to maintain the CyberTipline to receive illegal content, permitted NCMEC to review suspected child pornography, and statutorily required NCMEC to forward CyberTipline reports to any appropriate law enforcement agency, “NCMEC qualifies as a governmental entity” for purposes of the Fourth Amendment. Ackerman, 831 F.3d at 1297.[9]
For purposes of this Report and Recommendation, the Court will assume without deciding that the Tenth Circuit’s decision in Ackerman is correct and NCMEC is a governmental entity, or at least an agent of the government. Even with that assumption, however, the hearing testimony does not support a finding that NCMEC, as a governmental entity, expanded the scope of the private search conducted by either Omegle or its private video chat monitors. K-Brooks testified that although he could not be certain, as far as he could tell it appears only image c6d0 was definitely flagged as containing suspected child pornography and reviewed by a monitor. Tr. at 53-54. The report uploaded by Omegle to the CyberTipline website (Hr'g Ex. 7) also supports a finding that only image c6d0 was viewed by the monitor. And finally, Shehan testified that based on his review of the CyberTipline report received from Omegle, he believes only image c6d0 was viewed by NCMEC staff and image a9e7 was never even made available for viewing. Tr. at 120-22.
*9 Nevertheless, there is no question that both images were attached to the CyberTipline Report that was uploaded and submitted to the NYSP by NCMEC. Thus, the next question is: Even if NCMEC did not expand the private search of Wilbert’s video chat, did the NYSP? As to this issue, the evidence was unequivocal. NYSP Investigator David Cerretto confirmed that he accessed and viewed both images in connection with his investigation of Wilbert and he was aware that only image c6d0 had been viewed by Omegle and NCMEC. Indeed, according to Cerretto, it was the policy of the NYSP to open and view every image submitted to them through the CyberTipline regardless of whether the image or its content had been previously viewed by a private party or entity. Tr. at 168-69, 190. Thus, it appears certain that the NYSP did expand the scope of Omegle’s private search of Wilbert’s video chat by opening image a9e7.
Of course, to even get to this point in the analysis Wilbert would have had to run the gauntlet of various Fourth Amendment issues the Court has briefly touched upon. Assuming Wilbert was able to overcome issues of standing and reasonable expectations of privacy, as well as successfully navigate application of the consent to search doctrine and the private search doctrine, he would appear to have a strong argument that law enforcement violated the Fourth Amendment by exceeding the scope of the private search. And Investigator Cerretto testified that, based on the images and further investigation, he applied for and obtained a state search warrant (Hr'g Ex. 12) for the upstairs apartment at 634 Garson Avenue in Rochester, New York and discovered what we now know was Wilbert’s laptop computer. If the NYSP did improperly expand the scope of Omegle’s private search of Wilbert’s video chat by opening image a9e7 and then relied on that image as part of their factual justification to obtain a search warrant to search 634 Garson Avenue for computers containing child pornography, does the inclusion of that tainted evidence entitle Wilbert to suppression of the evidence obtained from his computer?
Unfortunately for Wilbert, it is at this juncture where his suppression argument falls apart.
[T]he inclusion in an affidavit of indisputably tainted allegations does not necessarily render the resulting warrant invalid. The ultimate inquiry on a motion to suppress evidence seized pursuant to a warrant is not whether the underlying affidavit contained allegations based on illegally obtained evidence, but whether, putting aside all tainted allegations, the independent and lawful information stated in the affidavit suffices to show probable cause.
United States v. Giordano, 416 U.S. 505, 555 (1974) (Powell, J., concurring) (emphasis added); see United States v. Trzaska, 111 F.3d 1019, 1026 (2d Cir. 1997) (“[A] reviewing court should excise the tainted evidence and determine whether the remaining, untainted evidence would provide a neutral magistrate with probable cause to issue a warrant.”) (quoting United States v. Vasey, 834 F.2d 782, 788 (9th Cir. 1987)). Investigator Cerretto testified that image c6d0 was an image of a dog performing oral sex on a naked female child, but he “was not able to tell” what was happening in image a9e7.[10] Tr. at 170. Image c6d0 was not “tainted,” as it was flagged, opened, and viewed by Omegle and Omegle’s private monitors before being transmitted to law enforcement. It was only the indecipherable image, image a9e7, that was arguably tainted by law enforcement’s expansion of the private search to view this second image.
If the “ultimate inquiry” for suppression is whether, “putting aside all tainted allegations, the independent and lawful information stated in the affidavit suffices to show probable cause” (Giordano, 416 U.S. at 555), then the answer is self-evident. Image c6d0 depicts child pornography. The “tainted” file, image a9e7, does not depict child pornography and hence could not have added anything to the judge’s probable cause determination. Excising the tainted image from the probable cause calculation yields the same result as including it: On October 25, 2015, someone using an IP address assigned to Scott Wilbert at 634 Garson Avenue uploaded an image of a “prepubescent” female child between the ages of 4 and 7 “who was engaged with oral sex with a K-9” through the Omegle website, and Wilbert was identified as a level 2 registered sex offender who in 2014 had been investigated by the NYSP for uploading images of child pornography. See Hr'g Ex. 12. Even if the “tainted” evidence were excised, the untainted evidence would provide a “neutral magistrate” with ample probable cause to issue the search warrant.
*10 Request for a Franks Hearing: Alternatively, Wilbert seeks a hearing pursuant to Franks v. Delaware, 438 U.S. 154 (1978). The Second Circuit has held that:
To be entitled to a Franks hearing, a defendant must make a “substantial preliminary showing” that: (1) the claimed inaccuracies or omissions are the result of the affiant’s deliberate falsehood or reckless disregard for the truth; and (2) the alleged falsehoods or omissions were necessary to the judge’s probable cause finding. If, after setting aside the allegedly misleading statements or omissions, the affidavit, nonetheless, presents sufficient information to support a finding of probable cause, the district court need not conduct a Franks hearing.
United States v. Salameh, 152 F.3d 88, 113 (2d Cir. 1998) (internal citations omitted).
Here, I find the defendant has failed to make the required showing necessary for the Court to order a Franks hearing. Wilbert claims that the affidavit in support of the search warrant was deliberately and materially inaccurate because (1) Cerretto falsely described what image c6d0 (Hr'g Ex. 1) depicted and (2) Cerretto falsely stated the time image c6d0 was uploaded. Neither allegation merits the convening of a Franks hearing. As to Cerretto’s description of image c6d0, the Court finds that it was reasonably accurate and certainly provided probable cause to believe that the image constituted child pornography. As to any inaccuracy in the time image c6d0 was uploaded, Cerretto credibly testified that he included the upload time of 2:30:12 simply because he viewed that time as the beginning of a continuous event that encompassed the upload of the two images. Moreover, even assuming this was inaccurate, there is no evidence to suggest that the alleged inaccuracies were deliberate or material. United States v. Longo, 70 F. Supp. 2d 225, 254 (W.D.N.Y. 1999) (where alleged misrepresentations and omissions in the supporting affidavit were “inconsequential to the finding of probable cause,” Franks hearing unnecessary).
Motion to Dismiss Indictment: Finally, Wilbert asks this Court to dismiss the indictment because the government did not request the defendant’s “actual chats.” Def.'s Br. (Docket # 42-1), at ¶ 26. However, the government could not have requested the “actual chats” because, as the testimony at the hearing established, Omegle does not keep the video chats themselves; it only keeps the metadata, which was provided to the government and the defense. Copies of the video chats simply never existed, and there was never anything to preserve. Accordingly, it is my Report and Recommendation that Wilbert’s motion to dismiss the indictment should be denied.
Conclusion
For the foregoing reasons, it is my Report and Recommendation that the defendant’s motion to suppress evidence (Docket # 42) during the search of Wilbert’s computer be denied. It is my further Report and Recommendation that Wilbert’s request for a “Franks hearing” be denied and that his motion to dismiss the Indictment for failure to preserve evidence be denied.
SO ORDERED.
Pursuant to 28 U.S.C. § 636(b)(1), it is hereby
*11 ORDERED, that this Report and Recommendation be filed with the Clerk of the Court.
ANY OBJECTIONS to this Report and Recommendation must be filed with the Clerk of this Court within fourteen (14) days after receipt of a copy of this Report and Recommendation in accordance with the above statute and Rule 59(b)(2) of the Local Rules of Criminal Procedure for the Western District of New York.[11]
The district court will ordinarily refuse to consider on de novo review arguments, case law and/or evidentiary material which could have been, but was not, presented to the magistrate judge in the first instance. See, e.g., Paterson-Leitch Co., Inc. v. Mass. Mun. Wholesale Elec. Co., 840 F.2d 985 (1st Cir. 1988).
Failure to file objections within the specified time or to request an extension of such time waives the right to appeal the District Court’s Order. Thomas v. Arn, 474 U.S. 140 (1985); Wesolek v. Canadair Ltd., 838 F.2d 55 (2d Cir. 1988).
The parties are reminded that, pursuant to Rule 59(b)(2) of the Local Rules of Criminal Procedure for the Western District of New York, “[w]ritten objections ... shall specifically identify the portions of the proposed findings and recommendations to which objection is made and the basis for each objection, and shall be supported by legal authority.” Failure to comply with the provisions of Rule 59(b)(2) may result in the District Court’s refusal to consider the objection.
Let the Clerk send a copy of this Order and a copy of the Report and Recommendation to the attorneys for the Plaintiff and the Defendant.
SO ORDERED.

The defendant maintains that he did not use the Omegle website to upload any images and that the offending conduct must have been committed by a roommate who used the defendant’s computer. Docket # 60. The government, obviously, disagrees and intends to prove it was the defendant who was participating in the video chat on the Omegle website. Regardless, there is no dispute that it was an IP address subscribed to and paid for by Wilbert that was utilized in the relevant video chat.
Also known as Universal Coordinated Time, Coordinated Universal Time, or Greenwich Mean Time. Tr. at 78.
On cross-examination, K-Brooks clarified that first-in-time images are not always reviewed before other images captured later, or at all. He explained, “[t]here’s a lot of complexity to the logic of how queues work, so things can drain from the end of the queue if it gets too big, and so on and so on.” Tr. at 77-78. He admitted that “not all images end up getting viewed. We try to review most of them, but sometimes they don't.” Tr. at 78.
What Omegle sent to NCMEC appears as Section A in Hearing Exhibit 7.
A staff member could obtain supervisor override authorization to view the image but that did not happen here. Shehan testified that image a9e7 was not viewed by NCMEC staff. Tr. at 122.
Investigator T. Northrup – who did not testify – viewed both images before Cerretto viewed the images. Cerretto was aware that Northrup had viewed both images before viewing them himself. Tr. at 175-83.
Cerretto stated that he makes his own assessment of whether an image contains child pornography rather than relying on any assessment made by a non-law enforcement officer. Tr. at 166-67.
The relevant parts of Omegle’s privacy policy provide:
Chat messages are screened by an automated system for spam. In general, messages are not stored, but messages which are flagged by a as [sic] suspicious may be stored indefinitely, and select messages may be read by a human being to improve Omegle’s anti-spam software, or for other quality control purposes.
At the beginning of every chat, a record is made of the fact that a chat has occurred between you and your chat partner. These records may be used for the purpose of tracking spammers, hackers, and others who pose harm to the site; and may also be used for law enforcement purposes....
Webcam images may be captured from Omegle video chats, uploaded to Omegle’s servers, and monitored for misbehavior as part of Omegle’s moderation process....
The records Omegle keeps may be shared with third parties for the purpose of law enforcement, to monitor and enforce compliance with Omegle’s rules, or to improve Omegle’s monitoring and enforcement process.
Hr'g Ex. 3.
In fact, according to NCMEC’s John Shehan, and probably in response to the Ackerman decision, NCMEC has a policy to not open electronic files containing suspected child pornography that have not been previously opened or viewed by the private party submitting the materials through NCMEC’s CyberTipline. Shehan testified that this policy is followed to foreclose any claim that NCMEC expanded the scope of a search initially conducted by a private entity.
The Court reviewed the two images during the hearing (Hr'g Exs. 1 & 2) and agrees with Cerretto’s description of the images.
Counsel is advised that a new period of excludable time pursuant to 18 U.S.C. § 3161(h)(1)(D) commences with the filing of this Report and Recommendation. Such period of excludable delay lasts only until objections to this Report and Recommendation are filed or until the fourteen days allowed for filing objections has elapsed. United States v. Andress, 943 F.2d 622 (6th Cir. 1991), cert. denied, 502 U.S. 1103 (1992); United States v. Long, 900 F.2d 1270 (8th Cir. 1990).