Don’t leave developers behind in the Section 230 debate
We shared our thoughts in TechCrunch on Section 230’s protections for developers and the importance of innovation. You can read the full article below.
Last week, the US Supreme Court heard oral arguments in Gonzalez v. Google, the first time the Court has considered Section 230 of the Communications Decency Act of 1996. The case raises important questions about platform liability and the risks of viral content.
As the courts grapple with these issues, it is an opportunity to reflect on why Section 230 was created in the first place, how it encourages innovation, and what we all stand to lose if its protections are narrowed.
Section 230, whose core provision Jeff Kosseff dubbed “the twenty-six words that created the Internet,” established a liability shield for platforms that host third-party content. In the early days of the Internet, Section 230 created favorable legal conditions for startups and entrepreneurs to thrive, cementing the United States as a world leader in software.
Today’s technology landscape is dramatically different from the fledgling Internet of the ’90s, but the reasoning behind Section 230 still applies: the legal system can create the conditions for innovation, and it can also chill it.
Yet an understanding of how Section 230 supports the broader online ecosystem, especially software developers, seems lost in discussions focused on the outsized influence of large social media platforms.
Developers are at the center of the online world and at the forefront of creating solutions to global challenges, working to make the software that underpins our digital infrastructure safer, more reliable, and more secure.
Developers rely on Section 230 to collaborate on platforms like GitHub, and to build and operate new platforms that are rethinking social media. Narrowing 230’s protections could have far-reaching consequences, creating legal uncertainty for software developers, for start-ups, and for the important work of providing the tools that bring their platform visions to life. As policymakers consider how to address new frontiers of intermediary liability, it is imperative to put developers at the center of decisions that will shape the future of the Internet.
Software developers make significant contributions to US economic competitiveness and innovation, and they are important stakeholders in platform policy. GitHub’s platform is home to 17 million American developers, more than any other country, and their open source work alone contributes over $100 billion annually to the US economy.
These developers maintain the invisible yet essential software infrastructure that underpins our daily lives. Nearly all software (97%) contains open source components, which are often developed and maintained on GitHub.
As Chief Legal Officer of GitHub, home to a global community of over 100 million software developers collaborating on code, I know firsthand the importance of keeping Section 230 intact. While GitHub is far from a general-purpose social media platform, it relies on 230’s protections to host third-party content and to engage in good-faith content moderation.
This is especially important for a platform with over 330 million software repositories. Intermediary liability protection is what has allowed GitHub to grow while maintaining the health of its platform. GitHub takes a robust, developer-first approach to content moderation: it works to keep the platform safe, healthy, and inclusive, while tailoring its approach to the unique environment of code collaboration, where the takedown of a single project can ripple out to affect thousands of other software projects.
As to the specifics of Gonzalez v. Google, which asks the Court to consider whether algorithmically recommended third-party content should be covered by Section 230’s liability protections, a decision in favor of the petitioners could have unintended consequences for developers. Recommendation algorithms are used throughout software development in myriad ways that differ from general-purpose social media platforms.
As GitHub’s contribution to Microsoft’s amicus brief explains, GitHub uses algorithmic recommendations to connect users with similar interests, help them discover related software projects, and recommend ways to improve code and fix software vulnerabilities. One such example is GitHub’s CodeQL, a semantic code analysis engine that enables developers to find vulnerabilities and errors in open source code.
Developers also use GitHub to maintain open source projects that employ algorithmic recommendations to block hate speech and remove malicious code. A court decision narrowing Section 230 to exclude protections for recommendation algorithms could quickly undermine a range of socially valuable services, including tools that maintain the quality and security of the software supply chain.
While Gonzalez v. Google targets protections that benefit social media platforms, the ruling could have implications for a far broader community. Heading into oral arguments, a range of amicus filers emphasized its far-reaching implications: from non-profit organizations (the Wikimedia Foundation), to community content moderators (Reddit and its moderators), to small businesses and start-ups (Engine).
Calls to narrow Section 230 are primarily aimed at reining in Big Tech, but doing so would unintentionally stifle competition and innovation, further raising barriers to entry for the next generation of developers and emerging providers.
These concerns are not exaggerated. In How Law Made Silicon Valley, Anupam Chander notes that elsewhere “concerns about copyright infringement and strict privacy protection have held back Internet startups,” and that “web companies in Asia are not only bound by copyright and privacy restrictions, but also by strict intermediary liability rules.”
Narrowing Section 230 would not only undermine US international competitiveness; it would also hinder technological progress within the US. GitHub has come a long way since its own startup beginnings, and we are committed to leveling the playing field so that anyone, anywhere can become a developer.
As we await the Court’s ruling in Gonzalez v. Google, it is important to recognize that whatever the outcome, further efforts to narrow Section 230 are sure to follow, whether aimed at algorithmic recommendations, AI, or other innovations. These new technologies raise important questions about the future of intermediary liability, and policymakers must strive to chart a course that creates a legal environment supporting developers, start-ups, small businesses, and non-profits.
Policymakers interested in reducing harmful content can look to how developers are leading on content moderation. Developers use GitHub to build valuable software projects, including open source content moderation algorithms that answer policymakers’ calls for algorithmic transparency on platforms, as in the Algorithmic Accountability Act of 2022 and the Algorithmic Justice and Online Platform Transparency Act.
Platforms including Twitter, Bumble, and Wikimedia use GitHub to share the source code of algorithms that flag misinformation, filter obscene images, and block spam, respectively. Open source is driving innovation in content moderation while offering new models of community participation, oversight, and transparency.
Faced with new frontiers of intermediary liability, policymakers should recognize the critical role of developers and strive to support, rather than stifle, innovation.
Source: https://github.blog/2023-03-10-dont-leave-developers-behind-in-the-section-230-debate/