In 1996, before the widespread public use of the internet, Congress recognized the need to regulate improper content online while also encouraging the growth of the then-nascent industry. From these competing values emerged Section 230 of the Communications Decency Act. Now there is a growing movement in Congress and among the judiciary to strike a different balance. Section 230 has been interpreted to protect entities that host platforms online from liability for harms perpetrated on, and allegedly facilitated by, their platforms. Among those harms are human trafficking and child exploitation: wrongdoers are known to use interactive platforms to identify, connect with, and exploit victims. Recent cases show that some courts are no longer inclined to allow Section 230 to fully shield platform hosts from liability. These cases have begun to shift responsibility onto hosts, suggesting that they have some obligation to protect their users from these harms. Section 230 has also been the subject of high-profile political discussion, with demonstrated bipartisan interest in legislative change. This article provides an overview of Section 230 as it stands today and reviews the cases and legislative proposals that together demonstrate that the law's broad liability shield is already shrinking and may undergo dramatic change in the near future. We discuss the implications for entities operating online and conclude that although Section 230 still protects them in some instances involving trafficking and child exploitation material, that protection may soon erode, and entities should proactively implement appropriate safeguards.