November 5, 2014
On October 16, the Digital Policy Institute at Ball State University hosted a webinar entitled, “Net Neutrality: FCC and Congressional Options and Alternatives for an Open Internet.” Experts from across the nation discussed key issues before the FCC and the Congress as they address the future of “net neutrality” – issues also being addressed in capitals around the world. What are the right decisions? What are the consequences of the wrong legislative or regulatory choices? DPI senior fellow and senior policy analyst Barry Umansky moderated the discussion of how the FCC and the Congress should proceed as they assess government’s next steps in the wake of the January 2014 court rejection of the FCC’s open Internet rules.
In his opening remarks, Umansky provided an overview of the FCC’s recent history on net neutrality. He noted that the FCC chose – eleven years ago – to treat Internet service providers (“ISPs”) as so-called Title I “information services” rather than as Title II “telecommunications services,” thus exempting the Internet from the kind of common carrier regulations that long had applied to traditional telephone service. Umansky said the agency had concluded that it was essential to allow the Internet to continue to grow, and for innovation to thrive, unencumbered by the chilling impact of federal regulation.
Since then, Umansky observed, there has been increased interest in some formal federal oversight of the Internet, but FCC attempts to regulate the Internet have consistently been struck down by the federal appeals courts. Each time, he said, the courts found that the agency lacked the statutory authority to adopt and enforce the network neutrality rules it had created.
As the most recent example, he pointed to how this past January the appeals court said that certain parts of the FCC’s 2010 “open Internet” rules – those calling for no blocking of websites and barring speed and priority discrimination among Internet traffic – amounted to the kind of Title II constraints the FCC earlier had vowed not to employ.
The FCC’s ongoing proceeding, he observed, is aimed at restoring net neutrality rules, but under somewhat different terms, with a new approach to demonstrating the Commission’s legal ability to adopt them, and including FCC chairman Tom Wheeler’s concept of allowing some “paid prioritization” among websites. Umansky noted that the proceeding has been the subject of nearly four million public comments – signifying widespread interest among the general public and many stakeholders.
He also underscored how there still are wide philosophical and generally party line-related differences among the FCC commissioners as to how the agency should proceed. Similarly, he said, varying forms of legislation have been introduced in Congress – some that would bar Title II regulation, some that would require it, and some that would offer a hybrid approach.
Randolph May, President of the Free State Foundation, noted that, given the present state of competition in the broadband marketplace and the absence of documented consumer harm from current ISP practices, the costs of implementing new net neutrality regulations would outweigh the benefits. He said that in the 1970s, our current large ISPs began to form in the face of only emerging competition and a fairly monopolistic communications marketplace. But, over a decade ago, he said the FCC ruled that emerging broadband services should not be regulated under the old Title II common carrier, public utility regime.
Richard Bennett, a visiting fellow at the American Enterprise Institute Center for Internet, observed that, across the nation, there is a collective fear of regulating a major driver of our economy. He questioned whether the regulatory framework for the communications networks that economy-driving websites and the public rely upon should be any different. Bennett stated that we already have “fast lanes” in our content distribution networks and that we should be careful with the rules we implement, allowing for further flexibility in managing our networks.
Jim Prieger, Associate Professor of Public Policy at Pepperdine University, noted that if any such net neutrality regulations bring some sort of short-term benefit, it would come at the expense of long-term investment and innovation. He pointed to the 1990s, when the Baby Bells, which were starting to introduce new offerings like voicemail and audiotex services, had to obtain regulatory approval before launching them. He suggested that if then-existing regulations had not been in place, consumers could have enjoyed new services 62% faster.
When asked if there are areas of the law already in place to safeguard consumers and business, Prieger gave an emphatic “yes.” He observed that two federal agencies are entrusted with enforcing antitrust laws. Prieger posed an intriguing comparison to airline regulation: since deregulation in the 1970s, prices have fallen, the number of airline flights has risen, and flying has become immensely more common. Prieger admitted that there have been a few instances of anticompetitive behavior; but, he noted, most observers agree that deregulation was a good move.
According to Richard Bennett, the net neutrality debate can boil down to a simple question: Do we want a static Internet or a dynamic Internet? He said the FCC has very good reasons to allow carriers to sell prioritization and that public utility regulation results in a resistive, slow-changing system. Jim Prieger agreed that innovation does not arise from a static industry. Prieger argued that the FCC should let companies experiment with prioritization so long as there is a minimum level of available service. Randolph May concluded that the FCC should exercise the option to sit back and observe the marketplace to avoid making shortsighted policy decisions that could result in long-term negative impacts. Under Section 706, he argued, the FCC has the authority to act, and by simply watching the marketplace, it reserves its authority to act later.