Monday, October 10, 2005
Grokster: A Rocky Shoal In The DMCA's Safe Harbor?
David Maizenberg did a great podcast (MP3) a while back in which he interviewed some of the panelists at this conference ("Copyright After MGM v. Grokster: Understanding The Supreme Court's Decision And Its Impact On Law And Business"). The comments from Matt Neco and William Patry are not to be missed. Neco gives what I fear is a depressingly accurate account of the case's impact on technology businesses, and Professor Patry picks up on the tension between Grokster and the DMCA highlighted in JD Lasica's video (Patry's discussion touching on this point starts at about 18:00 into the recording).
Professor Patry is of course absolutely right that we need law that encourages people to take common-sense, reasonable steps to foster lawful activities and discourage unlawful ones, not law that perversely incentivizes turning a blind eye to potential infringement or other problems. Under the twin messages of Grokster and the DMCA safe harbor provisions, I'm not sure that's what we've got. Instead, what I think we may have is the kind of situation that ultimately gave rise to the adoption of Section 230(c) of the Communications Decency Act. This discussion from Batzel v. Smith (PDF), 333 F.3d 1018, 1029-30 (9th Cir. 2003), illustrates what I mean:
Without the immunity provided in Section 230(c), users and providers of interactive computer services who review material could be found liable for the statements of third parties, yet providers and users that disavow any responsibility would be free from liability. Compare Stratton Oakmont, 1995 WL 323710 (holding a service provider liable for speech appearing on its service because it generally reviewed posted content) with Cubby, Inc. v. CompuServe, Inc., 776 F.Supp. 135 (S.D.N.Y.1991) (holding a service provider not liable for posted speech because the provider was simply the conduit through which defamatory statements were distributed).
In particular, Congress adopted § 230(c) to overrule the decision of a New York state court in Stratton Oakmont, 1995 WL 323710. Stratton Oakmont held that Prodigy, an Internet access provider that ran a number of bulletin boards, could be held responsible for libelous statements posted on its "Money Talk" bulletin board by an unidentified person. Id. The court relied on the fact that Prodigy held itself out as a service that monitored its bulletin boards for offensive content and removed such content. Id. at *2, *4. Prodigy used filtering software and assigned board leaders to monitor the postings on each bulletin board. Id. at *1-*2. Because of Prodigy's active role in monitoring its bulletin boards, the court found, Prodigy was a publisher for purposes of state libel law and therefore could be held liable for any defamatory statements posted on the website. Id. at *4.
Although Stratton was a defamation case, Congress was concerned with the impact such a holding would have on the control of material inappropriate for minors. If efforts to review and omit third-party defamatory, obscene or inappropriate material make a computer service provider or user liable for posted speech, then website operators and Internet service providers are likely to abandon efforts to eliminate such material from their site. See S.Rep. No. 104-230, at 194 (1996) ("One of the specific purposes of [Section 230] is to overrule Stratton-Oakmont v. Prodigy and any other similar decisions...."); H.R. Conf. Rep. No. 104-458, at 194 (1996) ("The conferees believe that [decisions such as Stratton Oakmont] create serious obstacles to the important federal policy of empowering parents to determine the content of communications their children receive through interactive computer services."); 141 Cong. Rec. H8469-70 (statement of Rep. Cox) (referring to disincentives created by Stratton Oakmont decision); see also Zeran, 129 F.3d at 331 (emphasizing that § 230 was adopted to overrule Stratton Oakmont, and to provide incentives to self-regulate the dissemination of offensive material); Harvey L. Zuckman et al., Modern Communication Law 615 (1999) (observing that it is "crystal clear that [Section 230 was] designed to change the result in future cases like Stratton Oakmont"). [FN 14]
[FN 14] Although not relevant to the current case, § 230(c)(2) further encourages good samaritans by protecting service providers and users from liability for claims arising out of the removal of potentially "objectionable" material from their services. See § 230(c)(2). This provision insulates service providers from claims premised on the taking down of a customer's posting such as breach of contract or unfair business practices. Cf. 17 U.S.C. § 512(g)(1) (providing similar protection for service providers who take down material alleged to violate copyright laws); H.R.Rep. No. 105-551, at 25 (1998).
(Emphasis added.) This conflict between the DMCA safe harbor and Grokster doesn't seem to be on the legislative radar (at least, it doesn't appear to have come up at the Senate Judiciary Committee's September 28 hearing, Protecting Copyright and Innovation in a Post-Grokster World). But the problem is as follows. Grokster says affirmative policing is not required to avoid indirect liability for user infringement (the much-discussed Footnote 12), but there's no denying that those anxious to avoid liability under the decision will want to do everything in their power, including filtering and monitoring where possible. Indeed, I wrote not long ago that "preemptive warnings about, and active filtering, monitoring, and control (as much as may be possible or practical) of, infringements by users will help reduce the risk of indirect liability." While that may be true, I'm now more than a little concerned about a possible consequence of such a strategy: sacrificing the protections of the DMCA safe harbor.

Specifically, for online service providers to be protected from liability for user infringement under the DMCA, they must not have actual knowledge of infringement (17 U.S.C. Section 512(c)(1)(A)(i)), and must not be aware of facts or circumstances from which infringing activity is apparent (17 U.S.C. Section 512(c)(1)(A)(ii)). If you watch the video referenced earlier, you'll see that at least one practitioner who routinely advises companies on DMCA issues has told JD Lasica that "the ISP MUST bury its head in order to receive the protections of DMCA unless it receives actual knowledge of an infringing post or file. Otherwise, it creates a duty for itself to monitor and it loses safe harbor." A similar summing-up appears in Ian Ballon's E-Commerce and Internet Law treatise, which characterizes the safe harbor as available "where a Service Provider expeditiously removes or disables access to material upon learning of infringement or neither knew that the material was there in the first place nor was aware of facts that would have raised a 'red flag' for a reasonable person."
It seems to me that, with Grokster now on the books, it will be especially important for Congress to look to its CDA enactments as a model and take steps to ensure that online service providers who choose to actively monitor and filter potentially infringing material need not worry that they are doing so at the expense of protections to which they would otherwise be entitled under the DMCA.