The internet has brought about remarkable opportunities in our modern world. It allows us to keep in touch with family and friends thousands of miles away, order goods and services with the click of a button, and access all sorts of information. And it’s an incredible tool for a robust exchange of ideas. But unfortunately, the internet – and social media platforms in particular – can also be used to spread extremism, violence, hate speech, and pornography.

There is currently a robust debate on the role that government should have in regulating these platforms. How much should the government be stepping in to prevent the spread of harmful content? How can we still allow for free speech and a free exchange of ideas? What is the proper balance?

Thankfully, that’s where Section 230 of the Communications Decency Act comes in.

Section 230 is one of the most important pieces of internet law because it protects websites from being held liable for content that is generated by the website’s users. In other words, without this legal protection, it would be difficult for companies like Facebook, Twitter, or YouTube to allow for third-party content to be posted without being held legally responsible for it.

But it also includes a so-called “Good Samaritan” provision, which means that companies cannot be held liable for taking down objectionable content – content that is considered obscene, lascivious, or excessively violent, for example – whether that content is constitutionally protected or not. This is what makes it possible for all of us to visit YouTube, Facebook, and Twitter without being inundated with obscene content.

During this week’s Commerce Committee hearing exploring these issues, I heard from executives from Facebook, Twitter, and Google on just how beneficial Section 230 and the Good Samaritan provision are in providing a fundamental legal framework for this debate.

Because it’s not just Congress that wants to prevent objectionable and hateful content on online platforms. Tech companies themselves have incentives to provide safe and positive experiences for their users. Section 230 is critical for providing security and safety on their websites, because it allows them to directly and swiftly remove objectionable content from their platforms themselves.

Not only that, but it’s been an important part of maintaining a competitive ecosystem as well. In empowering these companies to take a proactive role in these efforts, Section 230 has also been a huge part of the reason that America has been a leader in innovation and technological development in this sector. It’s been particularly beneficial in the way that it has empowered small companies and tech-startups.

While Section 230 has come under much criticism lately, removing it from the law would create serious problems – particularly for smaller companies. Large, established firms would have the technological tools to navigate changes to the law – and an army of lawyers at their disposal – but it would be much more difficult for start-ups to adapt to such a significant change in the legal framework of the internet.

As the debate on government regulation of the internet continues, we ought to protect measures like Section 230 and the Good Samaritan provision that strike the right balance in protecting both our security and our free speech.