The DCIA believes that the Federal Communications Commission (FCC) exceeded its authority and acted without proper Congressional approval by promulgating net neutrality regulations that took effect last year after being narrowly passed by a 3-2 vote in December 2010.
The Court of Appeals for the DC Circuit previously ruled in a case involving Comcast that the FCC lacked authorization to regulate broadband, which is currently considered an information service rather than a telecommunications service.
Instead of proceeding with caution after the court ruling, the Commission moved ahead with its open Internet order, considered by many at the time to be a “reckless power grab,” imposing additional restrictions and discriminating between wireline and wireless access providers, while excluding major portals, app store operators, search engines, and others.
The FCC’s regulations also require broadband Internet providers to disclose information about their network management practices, which in itself could be an important component of a more fully developed and properly sanctioned program to ensure transparency.
Now Verizon and MetroPCS have advanced their litigation against the FCC with a brief filed Monday with the DC Circuit, the latest step in the operators’ ongoing 18-month legal challenge to the regulations.
The two operators originally filed suit against the Commission’s rules in early 2011, only to have their complaint dismissed on a technicality. The suit was re-filed, and in March the court allowed the challenge to proceed after dismissing the FCC’s request for a delay.
The companies argue that the rules should be vacated because they conflict with the Communications Act, are outside the FCC’s authority, and violate constitutional rights.
In a more difficult argument, the telecoms’ appellate brief also contends that the regulations violate their free speech rights because they strip providers of control over what they transmit and how they transmit it, and compel carriage without compensation.
Their point is that other major gatekeepers to Internet-based content are excluded from this anti-discrimination requirement. Whether others should be included, or whether different standards should apply to different participants in the web ecosystem, remains an unanswered but important issue at this juncture.
This also opens a more complex set of considerations, including placement of liability for copyright infringement and the question of censorship, which need far more discussion in the context of an acceptable regulatory process.
In addition, the concerns voiced by both independent (as opposed to carrier-owned) content providers and public advocacy groups regarding fair and equitable treatment of Internet data remain unanswered.
From the DCIA’s perspective, which is focused on commercial advancement of distributed computing over the Internet and other networks, the FCC’s movements here have been in no way beneficial.
Without a more comprehensive approach, too much marketplace uncertainty remains, and private sector interests and the public at large would be better served by vacating the Commission’s current rules.
The FCC’s regulations, which ban all Internet access providers from blocking sites or competing applications and impose greater restrictions on those that provide access through wireline networks, may have been well intentioned, but were clearly premature and incomplete. The basic problem, as noted above (and at the time it was issued), is that the FCC’s order imposes classic common-carrier obligations on broadband providers, which is prohibited by the Communications Act.
Further complicating this issue, the advocacy group Free Press, which also sued the FCC contending that it acted arbitrarily in adopting different standards for wireless and wireline providers, this week withdrew its lawsuit rather than file a brief in the case.
The group isn’t satisfied with the neutrality regulations, but decided to drop the litigation, which it had been pursuing in order to improve the rules rather than to contest the FCC’s authority or its basis for imposing them.
Free Press and a coalition of more than 100 organizations, academics, start-up founders, and tech innovators instead launched the Declaration of Internet Freedom — five principles outlining the basic freedoms that all Internet users should enjoy. This effort is meant to spark a passionate, global discussion among Internet users and communities about the Internet and our role in protecting it.
What is needed is a complete reworking of the now-outdated Communications Act in light of today’s Internet and the business realities of providing access and fostering continued investment and innovation.
If the court simply overturns the FCC’s regulations, wireless and broadband Internet providers could be free to block online content and competing services, which would not be a good outcome. But if the court leaves the rules intact, the potential for consumers and digital content providers to benefit from more advanced and flexible services could be curtailed, which would be equally bad.
The FCC’s response to the telecoms’ brief is due in September. Meanwhile, if you agree that Expression, Access, Openness, Innovation, and Privacy are principles that should be secured for the Internet globally, please sign the Declaration of Internet Freedom. Share wisely, and take care.
The DCIA commends Chairman Greg Walden (R-OR) and the US House Energy and Commerce Committee’s Communications Subcommittee for holding a hearing on Wednesday, June 27th focused on the future of video.
Most of the regulations under which current television programming distributors operate were put in place before the advent of the Internet, video downloading, over-the-top (OTT) streaming, and now IPTV and cloud-based storage and delivery.
“The Federal Communications Commission (FCC) regulates based on a bygone era,” Chairman Walden noted, referencing the 1992 Cable Act. “It was meant to spur competition and it worked. But the act does not apply to YouTube, iTunes, Netflix, Amazon, Hulu, Roku and Sky Angel.”
In addition to conflicts among broadcasters and multichannel video programming distributors (MVPDs) over carriage deals, new issues for online video providers, which depend on Internet service providers (ISPs) to reach their viewers, are emerging.
ISPs are often also MVPDs themselves, and are therefore subject to scrutiny of the even-handedness of their treatment of independent video services versus those they own and operate.
Indeed, the Department of Justice (DoJ) has recently initiated an antitrust investigation into whether cable companies are using broadband data caps to steer consumers to their own Internet video services and to use pricing to discourage the use of video services they do not control.
Witnesses in the two-hour hearing included David Barrett, President, Hearst Television; Charlie Ergen, Chairman, Dish Network; Jim Funk, Vice President, Roku; David Hyman, General Counsel, Netflix; Robert Johnson, CEO, Sky Angel; Michael O’Leary, EVP, Motion Picture Association of America (MPAA); Michael Powell, President, National Cable and Telecommunications Association (NCTA); and Gigi Sohn, President, Public Knowledge.
If anything, the panel focused more on today’s disputes among competing distribution technologies and business models than on the divergent interests’ views of what the future may hold.
Predictably, Hearst Television’s Barrett defended existing retransmission consent rules, arguing that they should also be applied to new entrants in the video marketplace, including IPTV providers, while Dish Network’s Ergen characterized them as outdated regulations that have led to an increasing number of station blackouts.
“Local broadcasters are a government-sponsored monopoly,” Ergen said at one point, and also complained of station group owners unfairly leveraging their market power to demand higher fees.
Barrett maintained that retransmission dollars are critical for a “21st century media company” to fund local programming, including multicast channels and newscasts.
Ergen, whose DVR offering introduced an ad-skipping AutoHop feature in May, which is being met with legal challenges from CBS, FOX, and NBC, defended that capability as doing nothing more than improving on “existing, legally accepted and widely available technologies” in response to consumer desires.
Congressman John Dingell (D-MI) referred to Ergen as “Mr. Hopper,” asking if he understood why Subcommittee Members would object to a service that skips over political ads. “I understand the consumer very well, but I am not a politician, so I cannot say I understand your concerns as well,” Ergen responded.
The hearing also addressed data caps, the limits ISPs place on the amount of data consumers can use in a single month. Netflix’s David Hyman argued that platforms and networks should not use their leverage to “stifle video providers” from independent sources.
“When you couple limited broadband competition with a strong desire to protect a legacy video distribution business, you have both the means and motivation to engage in anti-competitive behavior,” he said. “Add to this mix a regulatory and legislative framework largely crafted before the modern Internet era, and you have the makings for confusion and gamesmanship.”
“Netflix is the largest provider of subscription video in the country,” The NCTA’s Powell (formerly an FCC Commissioner) responded. “We sell broadband. Their services help stimulate the services we sell.”
In March, Comcast announced that videos viewed through its Xfinity application on the Microsoft Xbox game console wouldn’t count toward the monthly data limit, while broadband used for Netflix and other on-demand streaming services would continue to count against it.
Public Knowledge’s Gigi Sohn further warned of the abuse of data caps as a way for cable operators to favor their own video services and stressed that there has to be robust development of Internet video competitors for the industry to advance in the public’s interest.
Sohn said the traditional media industry is “trying to limit the online distribution of independent programming.”
“I know it’s too late to do a bill in this Congress,” said Congressman Joe Barton (R-TX). “In general, I think we need less regulation rather than more. I look forward to big things happening in the next Congress, but it has to be done on a bipartisan basis.” Congressman Steve Scalise (R-LA), meanwhile, has co-authored a bill to wipe away many communications regulations.
There’s no doubt this issue will remain a subject for US lawmaker consideration; it will take more than an introductory hearing to determine which rules should be eliminated, which revised, and whether new ones are needed. Share wisely, and take care.
The DCIA commends the US House Energy & Commerce Committee’s leadership for its bipartisan approval this week of a resolution opposing the attempt by the United Nations’ International Telecommunication Union (ITU) to assert and impose unprecedented governmental regulation over the Internet.
The Internet’s current multi-stakeholder governance model fosters continuing investment and innovation absent heavy-handed regulatory controls.
Beyond the substantial growth that the Internet and related distributed computing technologies are contributing to the global economy, unprecedented advances in political freedom can also be attributed to the current model.
Congresswoman Mary Bono Mack’s leadership of this initiative has been particularly laudable: “In many ways, we’re facing a referendum on the future of the Internet. A vote for my resolution is a vote to keep the Internet free from government control and to prevent giving the UN unprecedented power over Web content and infrastructure. That’s the quickest way for the Internet to one day become a wasteland of unfilled hopes, dreams, and opportunities.”
We strongly urge the timely support of this resolution by the full US House of Representatives, and similar actions by other responsible legislative bodies around the world, in advance of the upcoming World Conference on International Telecommunications (WCIT) in Dubai this December, where 190 nations are expected to participate.
At WCIT, the International Telecommunication Regulations, an international treaty developed nearly 25 years ago to govern the global telephone and telegraph systems of that era, will be opened for revision.
And while any amended treaty would only be binding in the US if ratified by the Senate, the implications of currently proposed changes, if adopted elsewhere around the world, would have profoundly damaging effects on the operation of the Internet everywhere.
The secret drafting of ITU proposals in preparation for WCIT has been widely and rightly criticized by public interest groups for a serious lack of transparency. But our concerns go deeper than that.
If the ITU is successful in taking power over the Internet with the proposed amendments, such technologically valuable activities as the current flexibility of Internet-connected devices to perform as both clients and servers would be jeopardized.
Certain communications among devices would be hampered based on jurisdictional considerations and governmental security intervention measures, including repressive surveillance of Internet users and sanctioned censorship of the Internet.
Eli Dourado, a researcher at George Mason University, articulated this aspect of the looming battle well:
“It’s really one between Internet users worldwide and their governments. Who benefits from increased ITU oversight of the Internet? Certainly not ordinary users in foreign countries, who would then be censored and spied upon by their governments with full international approval. The winners would be autocratic regimes, not their subjects.”
In addition, a sending-party tax paid by content providers would upend longstanding principles of Internet architecture and take us back to the days of the extortionary charges once imposed on long-distance phone calls. Some of the most promising cloud-based content delivery applications and systems would be made economically unfeasible.
The ongoing and smoothly proceeding transition to IPv6 would come to a grinding halt.
We join the Internet Society, representing engineering groups that develop and maintain core Internet technologies, in objecting to these proposals on principle and as a practical matter.
Independent organizations including the Society, as well as the Internet Corporation for Assigned Names and Numbers and the World Wide Web Consortium, already deal much more effectively than the ITU possibly could with such fundamental tasks as network and domain name registrations, allowing the Internet to develop and evolve with relatively fast responses to changes in technology, business practices, and consumer behavior.
We also agree with Philip Verveer, Deputy Assistant Secretary of State and US Coordinator for International Communications and Information Policy, who said, “It is important that when we have values, as we do in the area of free speech and the free flow of information, that we do everything that we can to articulate and sustain those values.”
And the negative economic impacts of the proposed treaty changes on expansion of Internet-based services as well as job creation would be devastating.
Verveer called the proposals unworkable and said they would have unintended consequences that would seriously harm the Internet. We concur, and urge DCINFO readers everywhere to join us in their opposition. Share wisely, and take care.
As the Distributed Computing Industry Association (DCIA) and the Cloud Computing Association (CCA) ramp up for our inaugural strategic summit, CLOUD COMPUTING WEST 2012, we pause to celebrate the success of SYS-CON’s tenth international developers’ conference, Cloud Expo 2012.

Two unstoppable enterprise information technology (IT) trends - cloud computing and big data - were the central themes of this event, held June 11th-14th at the Javits Convention Center in New York, NY with an estimated 10,000 attendees. The expo featured industry keynotes, technical breakout sessions, and “power panels,” as well as a busy exhibit floor where leading solutions vendors displayed their latest offerings.

The State of Cloud Computing was the topic of a power panel recorded the day before the event opened. The preview highlighted recent IDC research showing that worldwide spending on cloud services will grow almost threefold, reaching $44.2 billion by 2013, and a recent Gartner report predicting that the volume of enterprise data overall will increase by a phenomenal 650% over the next five years.

It was clear at Cloud Expo that the cloud is now being adopted by mainstream companies, organizations, and even national governments to leverage the power of data on demand at a scale and pace never before seen in the history of the Internet.

Cloud Computing Bootcamp, led by Larry Carvalho, helped make sense of this hot technology, which is still rapidly evolving and continuously peppered with hype. With prospective customers finding it hard to determine which aspects of the technology will yield the greatest benefits, the bootcamp offered a practical understanding of it.
Citrix VP Peder Ulander cut through the hype and clarified the ontology of cloud computing in his Crash Course in Open Source Cloud Computing, focusing on the open-source software that can be used to build compute clouds for infrastructure-as-a-service (IaaS) deployments, and the complementary open-source management tools that can be combined to automate the management of cloud-computing environments.

Hadoop, MapReduce, Hive, HBase, Lucene, Solr? The only thing growing faster than enterprise data is the landscape of big data tools. These tools are designed to help organizations gain deeper insight into massive volumes of information and turn big data into opportunities. The time is now for IT decision makers to determine which big data tools are the best - and most cost-effective - for their organizations. In The Growing Big Data Tools Landscape, David Lucas, Chief Strategy Officer at GCE, ran through what enterprises need to know about this growing set of big data tools, including those being leveraged by organizations today as well as new and innovative ones just arriving on the scene.

Blake Yeager, Product Manager Lead for IaaS at HP Cloud Services, in Run and Operate Your Web Services at Scale, took attendees through Hewlett-Packard’s (HP) next public cloud infrastructure, platform services, and cloud solutions, showing how easy it can be to spin up instances of compute, storage, and content delivery networking (CDN).

In Cloud Computing and Big Data - It’s the Applications, Tom Leyden, Director of Alliances and Marketing at Amplidata, noted that, “While there is still a lot of interest in Big Data Analytics, we see an increasing focus on Big Unstructured Data. Object storage is the new paradigm to store those massive amounts of free-form data.”

IT cloud management strategies enable organizations to maximize the business value of their private clouds.
Joe Fitzgerald, Chief Product Officer & Co-founder of ManageIQ, discussed customer experiences and how these tactical approaches increase agility, improve service delivery levels, and reduce operating expenses in Cloud Computing: Enterprise Cloud Management.

Shannon Williams, VP of Market Development for the Cloud Platforms Group at Citrix and a Co-founder of Cloud.com, in Architecting Your Cloud, discussed how CloudStack has been the platform of choice for more than a hundred public and private production clouds, and provided an insider’s view of the company’s experiences in designing the right architecture to meet customers’ cloud requirements.

IT departments are seeing storage capacity needs double every 12-18 months, with 50x the amount of information and 75x the number of files to manage, and IT managers are dealing with growing constraints on space, power, and costs in their data center infrastructures. The Growth and Consolidation of Big Data in the Cloud explored how Intel is helping businesses and users realize the benefits of cloud computing technology by working to develop open standards that operate across disparate IT infrastructures and by delivering cloud-based architectures.

Securing Big Data Input addressed one of the most widely asked questions about big data today: “How do we get valuable analytics from big data?” As data continues to grow exponentially, so does its variety (structured and unstructured), coming from humans, machines, and applications. To pull valuable information from it all, proper data gathering is critical, and the data itself needs to be timely and accurate.
And in The Ever-Expanding Role of Big Data, William Bain, Founder & CEO of ScaleOut Software, observed that, “Security standards for moving data into and out of the cloud and for hosting it within the cloud will dramatically help accelerate adoption of the cloud as a secure computing platform, and additional standards for creating elastic clusters that are physically co-located and use high-speed networking will also help in hosting applications.”

There is no longer any question that the cloud computing model will be the prevailing style of delivery for computing over the coming decades. Forrester Research predicts that the global market for cloud computing will grow to more than $241 billion in 2020. Cloud - Vision to Reality explored how greenfield application development projects can be designed from the outset to benefit from cloud-computing features such as elastic scalability, automated provisioning, and infrastructure-level APIs.

SHI, a $4 billion+ global provider of IT products, and Rackspace Hosting, a services leader in cloud computing, were Platinum Plus Sponsors of SYS-CON’s expo. For developers, it was a must-attend event. According to IBM’s 2011 Tech Trends Report, 75% of respondents said their organizations will begin to build cloud infrastructure over the next two years, and that in the next 24 months “developing new applications” will be the top cloud adoption activity, overtaking the current top investment areas of virtualization and storage.

Huge cloud-driven opportunities for wealth creation exist today - but the race is to the swift. The cloud-computing industry is one in which even a few months can make all the difference. DCINFO readers are encouraged to sign up now for the CLOUD COMPUTING WEST 2012 (CCW:2012) summit being presented November 8th-9th in Santa Monica, CA by the Cloud Computing Association (CCA) and the Distributed Computing Industry Association (DCIA).
CCW:2012 features three co-located conferences geared for management charged with addressing the key strategies and business decisions critical to cloud computing adoption in the entertainment, telecom, and investment sectors. Share wisely, and take care.
Wednesday, June 6th was World IPv6 Launch Day (WILD), a historic day on which the Internet, and the cloud computing companies that rely on it, gained significant growing room.
Internet pioneer Vint Cerf called WILD the start of the 21st century Internet because of the vast implications of advancing to a new system for assigning Internet Protocol (IP) addresses.
IP addresses, which identify computers and other connected devices on the global network, are essential to the Internet’s operation, and the IPv6 protocol is coming not a moment too soon.
Its predecessor, IPv4, could handle 4.3 billion possible IP addresses. While that may seem like a lot, the last unreleased block was assigned by the Internet Assigned Numbers Authority (IANA) last year. (IPv5, for its part, was an experimental streaming protocol that never took off.)
By contrast, IPv6 spans roughly 340,000,000,000,000,000,000,000,000,000,000,000,000 (2^128) unique addresses, which makes it virtually unlimited.
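The scale difference is easy to verify with plain arithmetic; a quick illustration (nothing protocol-specific, just the sizes of 32-bit and 128-bit address spaces):

```python
# Compare the IPv4 (32-bit) and IPv6 (128-bit) address spaces.
ipv4_space = 2 ** 32   # ~4.3 billion addresses
ipv6_space = 2 ** 128  # ~3.4 x 10^38 addresses

print(f"IPv4: {ipv4_space:,} addresses")
print(f"IPv6: {ipv6_space:,} addresses")
print(f"IPv6 space is {ipv6_space // ipv4_space:,} times larger")
```

The ratio itself (2^96) is so large that every IPv4 address could be expanded into an address space billions of billions of times the size of today's entire Internet.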
It’s not only the larger pool of IP addresses that makes IPv6 better than IPv4. Address assignment is streamlined, connectivity recovers more gracefully when networks change, and the derivation of interface identifiers from MAC addresses is standardized. IPsec support is also baked in, one of several improvements in overall network security.
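One concrete piece of that standardization is the modified EUI-64 scheme (RFC 4291), which stateless address autoconfiguration can use to derive a 64-bit interface identifier from a device's 48-bit MAC address. A minimal sketch of the transformation (illustrative code, not a production implementation):

```python
def mac_to_eui64_interface_id(mac: str) -> str:
    """Derive a modified EUI-64 interface identifier from a 48-bit MAC.

    Per RFC 4291, Appendix A:
      1. Split the MAC into its two 24-bit halves.
      2. Insert the bytes 0xFF, 0xFE between them.
      3. Flip the universal/local bit (bit 1) of the first octet.
    """
    octets = [int(b, 16) for b in mac.split(":")]
    assert len(octets) == 6, "expected a 48-bit MAC address"
    octets[0] ^= 0x02  # flip the universal/local bit
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]
    # Group into four 16-bit hextets, IPv6 style.
    hextets = [f"{(eui64[i] << 8) | eui64[i + 1]:04x}" for i in range(0, 8, 2)]
    return ":".join(hextets)

print(mac_to_eui64_interface_id("00:11:22:33:44:55"))  # 0211:22ff:fe33:4455
```

The resulting identifier forms the lower 64 bits of an autoconfigured IPv6 address, appended to a network prefix advertised by the local router.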
If you use Android or the iPhone, or a version of Windows or Mac OS that was released in the past five years, it probably supports IPv6 as well as IPv4. The big problem has been that websites, household routers, and consumer Internet service providers (ISPs) have not supported it.
And for IPv6 to work Internet-wide, everybody needs to get on board — PCs, networks, routers, and websites, too.
A year ago, some companies switched on IPv6 temporarily, just to test it out. But this week, to ensure that the Internet can continue to grow and connect billions more people and devices around the world, thousands of companies and literally millions of websites permanently enabled IPv6 for their products and services as part of WILD.
Participants in WILD included many DCIA Member companies and other web-based businesses in more than 100 countries.
By making IPv6 the new norm, these companies enabled millions of end-users to enjoy its benefits without having to do anything. There’s more on IPv6 at Wikipedia.
WILD was organized by the Internet Society as part of its mission to ensure that the Internet remains open and accessible for everyone – including the five billion people not yet connected to the web.
“The support of IPv6 from these organizations delivers a critical message to the world: IPv6 is not just a ‘nice to have;’ it is ready for business today and will very soon be a ‘must have,’” said Leslie Daigle, Chief Internet Technology Officer, Internet Society.
“We believe that the commitment of these companies to deploy IPv6 will ensure that they remain industry leaders. Any company wishing to be effective in the new Internet should do the same.”
At some point, the entire Internet infrastructure has to move to using the newer address space, since the differences in the protocols mean that computers with IPv4 addresses cannot communicate with machines with IPv6 addresses.
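Python's standard `ipaddress` module makes that family split concrete: the two protocols use different, mutually unintelligible address formats, and transition mechanisms bridge them by embedding IPv4 addresses inside IPv6 notation. (The addresses below are from the reserved documentation ranges, purely for illustration.)

```python
import ipaddress

v4 = ipaddress.ip_address("192.0.2.1")
v6 = ipaddress.ip_address("2001:db8::1")

# Distinct address families: an IPv4-only host has no direct route to an
# IPv6-only host without a dual-stack or translation layer in between.
print(v4.version, type(v4).__name__)  # 4 IPv4Address
print(v6.version, type(v6).__name__)  # 6 IPv6Address

# Transition mechanisms embed IPv4 addresses in IPv6 notation, e.g. the
# IPv4-mapped form used internally by dual-stack sockets:
mapped = ipaddress.ip_address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped)  # 192.0.2.1
```

Dual-stack deployment, in which hosts carry both an IPv4 and an IPv6 address during the transition, is what lets the two populations interoperate in the meantime.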
“IPv6 is critical to the future of the Internet’s underlying architecture, and to supporting the billions of devices that will connect to the Internet over the coming years,” said Tom Leighton, Chief Scientist and Co-Founder, Akamai.
“Having expanded our global IPv6 footprint to over 50 countries, Akamai enables websites to reach a growing audience over IPv6 with the performance and reliability that they have come to expect and demand from IPv4.”
Cisco SVP Engineering and General Manager Service Provider Business, Pankaj Patel, added, “The Internet has fueled remarkable economic growth and innovation that would have never happened without a network.”
“Today, we face an explosion of connected devices moving data and content, especially video, and of applications and services coming from the Cloud. IPv6 enables the network — the platform on which innovation is built — to scale and make more growth more possible, today and into the future.”
John Schanz, Chief Network Officer, Comcast, concluded, “We at Comcast take great pride in being an innovator and technical leader. As a result of our team’s hard work, enabling IPv6 in over a third of our network, I am happy to report that by today we have exceeded our goal of 1% of our customer base being enabled with IPv6 for WILD!”
“Thank you to the Internet Society and others for organizing and participating in this important event!”
The World IPv6 Day in June 2011 was a 24-hour “stress test” that focused on websites. It also served as a wake-up call that it was time to upgrade the World Wide Web.
At NANOG 52, the Internet Society’s Phil Roberts provided an introduction to World IPv6 Day and moderated a panel of key participants. Panelists from Akamai, Cisco, and Comcast presented their companies’ results including how they prepared for the event, issues that arose, lessons learned, and the current status of IPv6 in their networks.
World IPv6 Day Observations at the Technical Plenary at IETF81 further outlined the overwhelming industry response and several additional reports were delivered by participants in the IPv6 Operations Group at IETF81.
World IPv6 Day Operators Review described the traffic growth Hurricane Electric saw on World IPv6 Day, including significantly higher IPv6 traffic. In Comcast Experience with IPv6 Deployment, John Brzozowski presented results from Comcast’s trials, including increased traffic over Teredo, 6to4, 6rd, and native IPv6 access. He noted that 50% continued to publish AAAA records (the DNS records that map hostnames to IPv6 addresses) after World IPv6 Day.
In Investigating IPv6 Traffic: What happened at the World IPv6 Day, authors compared IPv6 activity before, during, and after World IPv6 Day. They examined traffic traces recorded at a large European Internet Exchange Point (IXP) and on the campus of a major US university; analyzing volume, application mix, and the use of tunneling protocols for transporting IPv6 packets.
Comparing IPv6 and IPv4 Performance shared the results of performance measurements between IPv4 and IPv6, taken from the presenters’ vantage points in the network against 46 of the websites that turned up IPv6 on that day.
And finally, in World IPv6 Day, Phil Roberts summarized the rationale for the 2011 event.
The June 6th, 2012 WILD was a permanent commitment across the distributed computing industry, laying the foundation to accelerate the deployment of IPv6 across the global Internet.
Major Internet companies and ISPs permanently enabled IPv6 on their websites and across a significant portion of their current and all new residential wireline subscribers. Home networking equipment manufacturers enabled IPv6 by default through their router products, and additional commitments to IPv6 by companies beyond websites demonstrated broad support of the new Internet Protocol.
This move was imperative as the last of 4.3 billion IP addresses enabled by the current protocol IPv4 were assigned to the Regional Internet Registries in February 2011.
Already there is no remaining IPv4 address space to be distributed in the Asia Pacific region, and very soon the rest of the globe will follow. IPv4 address space is expected to run out in Europe this year, in the US next year, and in Latin America and Africa in 2014.
IPv6 provides an essentially unlimited number of addresses, which will help connect the billions of people that are not connected today, allow a wide range of new devices to connect directly with one another, and help ensure that the Internet can continue its current growth rate indefinitely.
For more information about WILD and the participating companies, as well as links to useful information for users and how other companies can participate in the continued deployment of IPv6, please click here. Share wisely, and take care.
The National Institute of Standards and Technology (NIST) has just issued Special Publication 800-146 reiterating previously published NIST classifications of the various types of cloud computing implementations and their benefits, while also identifying what NIST sees as “23 open issues” regarding cloud computing technology overall.
In the DCIA’s view, most of NIST’s “issues” are well-known aspects of distributed computing that have been “open” for years, but have received much greater prominence as a result of the emergence of cloud computing.
The document itself claims that only some of the issues “appear to be unique to cloud computing.”
The NIST “issues” are organized under five general categories: computing performance, cloud reliability, economic goals, compliance, and information security, with privacy integrated into the last two of these.
An example of the computing performance issue is off-line data synchronization. When users are disconnected from the network, their documents and data obviously don’t stay in sync with versions hosted in the cloud, making version control an important consideration, especially within group collaboration activities.
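One common way such version control is handled is optimistic versioning: each offline edit records the server version it was based on, and the edit only fast-forwards if the server copy hasn't moved on in the meantime. A minimal sketch (the function and field names are illustrative, not any particular cloud provider's API):

```python
# Illustrative optimistic version-based conflict detection for offline edits.

def sync(server_doc, local_doc):
    """Return the merged state, or flag a conflict for manual resolution."""
    if local_doc["base_version"] == server_doc["version"]:
        # Server unchanged since we went offline: fast-forward our edit.
        return {"version": server_doc["version"] + 1,
                "text": local_doc["text"]}
    # Server moved on while we were offline: a true conflict that needs
    # user or policy-driven resolution.
    return {"conflict": True,
            "server": server_doc["text"],
            "local": local_doc["text"]}

server = {"version": 3, "text": "hello, cloud"}

offline_edit = {"base_version": 3, "text": "hello, cloud!"}
print(sync(server, offline_edit))  # fast-forwards to version 4

stale_edit = {"base_version": 2, "text": "helo cloud"}
print(sync(server, stale_edit))    # conflict flagged
```

Group collaboration compounds the problem, since several participants may each return from offline work with edits based on different historical versions.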
Cloud reliability was one of two concerns the DCIA addressed at our CLOUD COMPUTING CONFERENCE at the 2012 NAB Show (CCC at NAB).
The conclusion of our keynote speakers and panelists was that cloud-based solutions can be configured to deliver the level of reliability that a customer desires. As in many other computing approaches and technological processes generally, it’s a question of how much redundancy and what level of fail-prevention features are built into the given deployment.
NIST notes that “for the cloud, reliability is broadly a function of the reliability of four individual components: (1) the hardware and software facilities offered by providers, (2) the provider’s personnel, (3) connectivity to the subscribed services, and (4) the consumer’s personnel.”
We wouldn’t disagree that a serious deficiency in any of these areas could impact the overall performance of a cloud-based system, and therefore each needs to be properly and carefully configured to match the desired level of quality of service (QoS).
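The redundancy point above can be made concrete with a simple availability model. Assuming independent replicas, each available a given fraction of the time, a system with n-way redundancy is down only when every replica is down at once - so combined availability is 1 - (1 - a)^n. This is an illustrative sketch of the general principle, not a formula from the NIST publication:

```python
def combined_availability(a: float, n: int) -> float:
    """Availability of a system with n independent redundant replicas,
    where each replica is individually available with probability a.
    The system fails only when all n replicas fail simultaneously."""
    return 1 - (1 - a) ** n

# A single 99% replica vs. two and three replicas in parallel.
for n in (1, 2, 3):
    print(f"{n} replica(s): {combined_availability(0.99, n):.6f}")
```

Each added replica multiplies the downtime probability by another factor of (1 - a), which is why the level of redundancy built into a deployment translates so directly into the QoS it can deliver.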
NIST’s commentary on economic goals - or the cost savings that can be achieved by using cloud computing - also suggests that several factors need consideration.
In addition, and probably because a key constituency for NIST is the large and growing group of government agencies that are end-users of cloud computing services, it believes that standardization of cloud service agreements could serve the “achievement of economic goals.”
An agreement template, for example “in a machine-readable format using common ontologies,” could facilitate automated review and potentially foster a greater understanding of the functionality and benefits of cloud computing, rather than distracting agreement participants with ancillary terms-and-conditions and what are, for all practical purposes, boilerplate provisions.
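To illustrate the kind of template NIST has in mind - every field name below is hypothetical, invented for this sketch rather than taken from any actual NIST ontology - a service agreement expressed as structured data lets a program, not a lawyer, perform the first pass of review:

```python
import json

# Hypothetical machine-readable cloud service agreement (illustrative only).
agreement = {
    "provider": "ExampleCloud",       # invented provider name
    "service_model": "IaaS",
    "uptime_percent": 99.9,           # promised availability
    "support_response_hours": 4,
    "data_location": "US",
}

# An agency's minimum requirements, also expressed as data rather than prose.
required_uptime = 99.5
max_response_hours = 8

# Automated review: check the substantive terms mechanically.
meets_requirements = (
    agreement["uptime_percent"] >= required_uptime
    and agreement["support_response_hours"] <= max_response_hours
)

print(json.dumps(agreement, indent=2))
print("Meets requirements:", meets_requirements)
```

With boilerplate standardized away, attention stays on the handful of substantive terms - exactly the benefit NIST suggests.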
The second area of concern that we explored during CCC at NAB was security in the cloud, or what NIST includes along with privacy considerations and “compliance.”
Regarding this subject, there was a consensus among our conference speakers that in many respects cloud computing is actually more secure than the older technologies it supplants, but that the key to delivering the desired levels of security and compliance is the proper application of appropriate tools, such as closed-versus-open networks and encrypted-versus-unencrypted data.
Obviously, the definitional cloud attribute of multi-tenancy - which often translates to a sharing of resources - is a concern, especially among new cloud users.
Depending on the type of implementation, as NIST explains, security concerns will vary.
For software-as-a-service (SaaS), different end-user consumers may share the same application or database. For platform-as-a-service (PaaS), different processes may share an operating system (OS) and supporting data and networking services, creating another level of security concern. And for infrastructure-as-a-service (IaaS) clouds, different virtual machines (VMs) may share hardware via a hypervisor, creating yet another.
NIST sees the potential for flaws in logical separation with this sharing, but as the DCIA has pointed out, with the scale that is possible through cloud computing, greater resources can be dedicated to ensuring that vital components maintain their separate integrity.
NIST’s view of safeguards is that “for clouds that perform computations, mitigation can occur by limiting the kinds of data that are processed in the cloud or by contracting with providers for specialized isolation mechanisms such as the rental of entire computer systems rather than VMs (mono-tenancy), Virtual Private Networks (VPNs), segmented networks, or advanced access controls.”
And of course, because it’s increasingly common for cloud applications to be accessed through their end-users’ browsers, this becomes another vulnerability point. Separate and apart from cloud computing, browser security represents its own concerns.
The bottom line here is that NIST has published another valuable reference, which can serve as a resource for industry to review and in some ways as a voluntary guide to follow as progress continues in this space. NIST’s document is particularly useful for those involved in contracting with and servicing public sector entities.
Having said that, we do not agree with the assertion that reliability and security issues per se are greater with cloud computing than with traditional computing. A growing body of evidence demonstrates that the cloud actually provides more trustworthy solutions. Share wisely, and take care.
We commend US Senator Ron Wyden (D-OR) for introducing legislation this week that would clarify the US Trade Representative’s (USTR) obligation to share information on trade agreements with Members of Congress.
Sen. Wyden, who spoke on the floor of the Senate about why this is necessary, has been a critic of the Administration’s handling of international treaties including the Anti-Counterfeiting Trade Agreement (ACTA) and most recently the Trans Pacific Partnership (TPP).
At the heart of this new practice of skipping Congressional oversight and approval on international treaties like ACTA and TPP is the USTR, which has also been responsible for negotiating these treaties in secret.
In his “Statement for the Record” on the introduction of the Congressional Oversight Over Trade Negotiations Act, Wyden pointed out that the USTR has continually stymied Congress’s efforts to learn more about the negotiations between the USTR and other countries.
According to Wyden, the lack of transparency is beyond the pale because corporations and interest groups know more about the negotiations for this latest treaty than lawmakers do:
“Right now, the Obama Administration is in the process of negotiating what might prove to be the most far-reaching economic agreement since the World Trade Organization was established nearly twenty years ago.
The goal of this agreement - known as the Trans Pacific Partnership (TPP) - is to economically bind together the economies of the Asia Pacific. It involves countries ranging from Australia, Singapore, Vietnam, Peru, Chile and the United States and holds the potential to include many more countries, like Japan, Korea, Canada, and Mexico. If successful, the agreement will set norms for the trade of goods and services and includes disciplines related to intellectual property, access to medicines, Internet governance, investment, government procurement, worker rights and environmental standards.
If agreed to, TPP will set the tone for our nation’s economic future for years to come, impacting the way Congress intervenes and acts on behalf of the American people it represents.
It may be the USTR’s current job to negotiate trade agreements on behalf of the United States, but Article 1 Section 8 of the U.S. Constitution gives Congress - not the USTR or any other member of the Executive Branch - the responsibility of regulating foreign commerce. It was our Founding Fathers’ intention to ensure that the laws and policies that govern the American people take into account the interests of all the American people, not just a privileged few.
And yet, Mr. President, the majority of Congress is being kept in the dark as to the substance of the TPP negotiations, while representatives of US corporations - like Halliburton, Chevron, PHRMA, Comcast, and the Motion Picture Association of America - are being consulted and made privy to details of the agreement. As the Office of the USTR will tell you, the President gives it broad power to keep information about the trade policies it advances and negotiates, secret. Let me tell you, the USTR is making full use of this authority.
As the Chairman of the Senate Finance Committee’s Subcommittee on International Trade, Customs, and Global Competitiveness, my office is responsible for conducting oversight over the USTR and trade negotiations. To do that, I asked that my staff obtain the proper security credentials to view the information that USTR keeps confidential and secret. This is material that fully describes what the USTR is seeking in the TPP talks on behalf of the American people and on behalf of Congress. More than two months after receiving the proper security credentials, my staff is still barred from viewing the details of the proposals that USTR is advancing.
Mr. President, we hear that the process by which TPP is being negotiated has been a model of transparency. I disagree with that statement. And not just because the Staff Director of the Senate subcommittee responsible for oversight of international trade continues to be denied access to substantive and detailed information that pertains to the TPP talks.
Mr. President, Congress passed legislation in 2002 to form the Congressional Oversight Group, or COG, to foster more USTR consultation with Congress. I was a senator in 2002. I voted for that law and I can tell you the intention of that law was to ensure that USTR consulted with more Members of Congress not less.
In trying to get to the bottom of why my staff is being denied information, it seems that some in the Executive Branch may be interpreting the law that established the COG to mean that only the few Members of Congress who belong to the COG can be given access to trade negotiation information, while every other Member of Congress, and their staff, must be denied such access. So, this is not just a question of whether or not cleared staff should have access to information about the TPP talks, this is a question of whether or not the administration believes that most Members of Congress can or should have a say in trade negotiations.
Again, having voted for that law, I strongly disagree with such an interpretation and find it offensive that some would suggest that a law meant to foster more consultation with Congress is intended to limit it. But given that the TPP negotiations are currently underway and I - and the vast majority of my colleagues and their staff - continue to be denied a full understanding of what the USTR is seeking in the agreement, we do not have time to waste on a protracted legal battle over this issue. Therefore, I am introducing legislation to clarify the intent of the COG statute.
The legislation, I propose, is straightforward. It gives all Members of Congress and staff with appropriate clearance access to the substance of trade negotiations. Finally, Members of Congress who are responsible for conducting oversight over the enforcement of trade agreements will be provided information by the Executive Branch indicating whether our trading partners are living up to their trade obligations. Put simply, this legislation would ensure that the representatives elected by the American people are afforded the same level of influence over our nation’s policies as the paid representatives of PHRMA, Halliburton and the Motion Picture Association.
My intent is to do everything I can to see that this legislation is advanced quickly and becomes law, so that elected Members of Congress can do what the Constitution requires and what their constituents expect.”
Share wisely, and take care.
The Distributed Computing Industry Association (DCIA) and the Cloud Computing Association (CCA) have entered into a strategic alliance to better serve expanding industry needs driven by the explosive growth of cloud computing in the software sector.
Along with CCA Executive Director Don Buford, we are very pleased to announce plans this week for our first jointly sponsored event, CLOUD COMPUTING WEST 2012 (CCW: 2012), November 8th-9th in Santa Monica, CA.
CCW: 2012 will feature three co-located conferences focusing on the impact of cloud-based solutions in the industry’s fastest-moving and most strategically important areas.
The CCA is an independent membership organization, founded in 2012, dedicated to building a community of end-users and service providers of cloud-based solutions and products through individual professional memberships and industry conferences. The CCA has quickly amassed a contact list of three hundred thousand industry participants.
As DCINFO readers know, the DCIA is an international trade organization, established in 2003, with more than one hundred industry-leading member companies, including software developers, broadband network operators, and content providers. The DCIA conducts working groups and oversees political initiatives, as well as publishing this weekly online newsletter.
As part of the strategic alliance, DCIA member company employees will be offered the opportunity to become CCA professional members on special terms. And CCA individual members will be offered opportunities to participate in DCIA activities on special terms, as well as receive subscriptions to DCINFO.
The DCIA & CCA together invite companies and individuals to take active roles as exhibitors and speakers in CCW: 2012, a first-of-its-kind cloud-computing summit that will bring together three conferences in a single venue. Interested parties are encouraged to call 410-476-7965 or e-mail email@example.com for details.
CCW: 2012 will encompass the latest advances and key pitfalls associated with cloud computing applied to entertainment distribution, telecommunications infrastructure, and venture financing, which the DCIA & CCA have identified as the three most critical areas for industry development.
The opening plenary session, “Cloud Computing Effects on Media, Telecom, and Investing,” will feature a keynote address answering the questions: How are cloud-based solutions in the entertainment sector evolving? How do these and related developments impact broadband network infrastructure? How should investments in this space be evaluated?
This will be followed by a panel discussion answering questions such as: Is “the cloud” more important to content creation or storage and delivery? What role(s) do Internet service providers (ISPs) play in this arena? What are risk profiles and ROI projections for entities in this space?
From there, each of the three conferences will zero in on issues of importance to its area of focus.
The entertainment conference will start with latest trends in cloud solutions for high-value content production and distribution, then analyze pitfalls to avoid in adopting cloud solutions for content development/delivery, and finally examine ten tactical areas associated with the implementation of cloud-based solutions for media companies.
The telecom conference will start with current impacts of cloud migration on broadband network operations and businesses, then analyze drawbacks to cloud deployments from broadband network operators’ perspectives, and finally examine ten tactical areas associated with external cloud services and those that ISPs themselves offer.
The investment conference will start with new updates on venture capital and mergers-and-acquisitions (M&A) activity in the cloud computing space, then analyze liabilities that need to concern investors regarding cloud-based businesses, and finally examine ten tactical areas associated with criteria used for investing in cloud-based ventures and exit strategies.
The DCIA most recently presented the CLOUD COMPUTING CONFERENCE held within the 2012 NAB Show, focusing on cloud-based solutions for broadcasting. On the same date at a different location, the CCA presented two conferences, hCLOUD and PublicCLOUD, focusing on cloud-based solutions for healthcare and government.
“We’ve been astonished by the responsiveness of software sector representatives to the introduction of CCA and the value they’ve derived from our initial efforts to bring buyers and sellers of cloud-based solutions together. We look forward to leveraging the depth of experience of the DCIA in this space as we team with them to increase the scope and effectiveness of our combined service offerings,” said CCA Executive Director Don Buford.
This very timely partnership will enable us to better serve our rapidly expanding industry by integrating a dual focus on individual participant as well as company level advancement. People as well as enterprises have major stakes in the ongoing cloud-computing revolution, and we are thrilled to be working with the CCA to drive continued success on both fronts. Share wisely, and take care.
Continuing our focus this week on issues surrounding CISPA - a measure more heinous than SOPA that has just passed the US House of Representatives - DCINFO readers now need to alert US Senators to their concerns.
The Senate’s version of the cybersecurity bill strays even further than the House version from what such a measure should cover: protecting critical American infrastructure against attacks in the digital realm.
This is not the bill to attempt to address a host of other Internet-related items that various Senators are seeking to include based on differing political considerations.
That will only make matters worse, and even the special interests pushing for some of the expanded provisions stand to be hurt by unintended consequences of such amendments.
Senator John McCain’s (R-AZ) remark this week, that “unelected Digital Homeland Security bureaucrats could divert resources from actual cybersecurity to compliance with government mandates,” should raise a major red flag for all observers of this process.
Since this bill’s purported raison d’être is to protect security, shouldn’t its “mandatory” provisions be aimed at accomplishing precisely that?
Instead, the way the discussion is heading now, the bill that emerges will be more likely to do nothing to protect vital American interests from cyberattacks, and actually harm privacy - both individual and institutional - as well as add operating expenses to US companies.
And as Preventing Counterfeits (excerpted below) suggests, private sector solutions will leave public sector attempts to legislate remedies here far behind.
Meanwhile, in another example of how challenging this space has become for lawmakers, the Password Protection Act (PPA) was introduced by Democrats this week in both the House and Senate, illustrating the converse of CISPA’s growing loss of focus: the problem of “techno-legislative-micro-management.”
PPA’s stated intent, echoing a Maryland law, is to prevent employers from demanding access to Facebook passwords of employees and job applicants.
Congressmen Ed Perlmutter (D-CO) and Martin Heinrich (D-NM) introduced in the House an identical version of the measure introduced in the Senate by Senator Richard Blumenthal (D-CT), who supported a petition on this subject, which failed to achieve its goal of 60,000 signatures, suggesting that citizens may not want this “help.”
PPA includes provisions intended to prohibit employers from requiring private social network and e-mail account access as a condition of employment and from discriminating against individuals who refuse to provide it. Exceptions include employees with access to national security information and, for inexplicable reasons, students.
Senator Blumenthal’s claim that, “This legislation, which I am proud to introduce, ensures that employees and job seekers are free from these invasive and intrusive practices,” is another indication that legislators are long on seeking politically advantageous credit for their efforts, but in the Internet law arena are short on delivering substantive value.
Moreover, Blumenthal’s assertions that employers requiring such information are perpetrating an “unreasonable and intolerable invasion of privacy” and that “no American should have to provide their confidential personal passwords as a condition of employment,” strike us as demagogic hyperbole.
The bill itself represents an unwarranted “intrusion” by the federal government into the internal workings of private sector organizations. There are numerous instances where the mission or culture of a particular institution wholly justifies heightened transparency and a deepened level of integrity in employer-employee relations, and this should not be prohibited by law.
Circumstances of these relationships vary tremendously; and our point is that, in a free society, neither employers nor employees should have this specific aspect of their association dictated by the federal government.
We tend to agree with Senator Patrick Toomey (R-PA), who said at a related Senate Commerce hearing on privacy protections this week, “It’s premature to begin discussing specific legislative fixes when we don’t fully know whether a problem exists.”
Senator Toomey was speaking against the Federal Trade Commission’s (FTC) bid to expand its powers to interfere with evolving privacy practices of Internet-based companies like Facebook and Google, absent regulatory authorization.
DCINFO readers will recall that the White House last Fall put forward what it called a Privacy Bill of Rights to provide basic online protection guidelines.
Those rights were presented as voluntary codes of conduct, and the DCIA applauded them. Industry in response launched a “Do Not Track” initiative along the lines of the “Do Not Call” list, which even FTC Chairman Jon Leibowitz acknowledged is working.
The seven basic principles included Individual Control, Transparency, Respect for Context (data used consistent with the context in which consumers provided it), Security, Access and Accuracy, Focused Collection (“reasonable limits”), and Accountability (appropriate safeguards for data collection).
These are sufficiently broad not to be overly prescriptive, and companies can readily determine which apply to them and which do not. A firm that voluntarily complies but then violates its commitment will be subject to FTC sanction for unfair or deceptive practices.
The DCIA believes that self-regulation will go a long way here because, among other reasons, social media users are more vocal with their complaints.
“The right to express one’s views, practice one’s faith, peacefully assemble with others to pursue political or social change - these are all rights to which all human beings are entitled, whether they choose to exercise them in a city square or an Internet chat room,” the US Secretary of State, Hillary Rodham Clinton, said at the end of 2011 at an Internet conference in the Netherlands.
“And just as we have worked together since the last century to secure these rights in the material world, we must work together in this century to secure them in cyberspace.” Share wisely, and take care.
The DCIA commends the US Senate Commerce Committee for conducting a hearing Tuesday April 24th on The Emergence of Online Video: Is It the Future?
The session explored the migration of viewing from traditional television to Internet and broadband-enabled video content, and examined the role that disruptive technologies play in facilitating this transition, and the business and legal models that will foster the growth of this sector.
The panel of witnesses comprised Barry Diller, Chairman, IAC; Paul Misener, VP, Amazon; Blair Westlake, VP, Microsoft; and Susan Whiting, Vice Chairman, Nielsen.
Committee Chairman Jay Rockefeller (D-WV) opened with a pair of questions: 1) How will the disruptive technology that online viewing provides lead to better content and more consumer choice? And 2) How do we harness this change for the power of consumers so we can get higher quality programming at lower rates?
He was generally receptive to the notion of over-the-top (OTT) services, like those represented by Amazon and Microsoft, for potentially providing downward pricing pressure on consumer cable bills, which he criticized for outstripping inflation. He also chastised current cable and satellite multichannel video program distributors (MVPDs) for making him pay for 500 channels, while he only views 10.
At one point, Diller responded by saying that major basic cable programmers, such as ESPN, would be “insane to go a la carte” abandoning the traditional model of 100% of cable subs having to pay for these channels whether or not they want to receive them.
The DCIA’s answers to Rockefeller’s first two questions are: 1) Cloud-based Internet protocol television (IPTV) offers virtually unlimited channel capacity as a result of the way the technology distributes video programming - this is drastically different from channel-bound cable and satellite systems. And 2) We harness this change first by replicating on the new platform all that consumers currently receive (albeit with incremental quality improvements), so that conversion does not force them to miss out, and second by introducing unique new services, many of them a la carte.
Cloud-based IPTV is more economical than cable or satellite and therefore able simultaneously to offer programmers greater revenue - from both traditional license fees and new interactive services - and also to offer consumers more attractive pricing.
Diller also said that without network neutrality protections, traditional broadcasters and cable operators would penalize competitors who try to deliver content those legacy distributors do not own - even as a complementary offering. “We have to protect network neutrality,” agreed Senator John Kerry (D-MA).
If the hearing had a major deficiency, it was that it did not scrutinize how broadband network operators control access to content - through caps, proprietary offerings, and pricing/packaging programs that should begin to raise questions. The problems of Internet access providers owning video content arguably should be the greatest concern to Congress as the technological transformation to cloud distribution proceeds.
We agree with the concerns voiced by a coalition of public interest groups in their letter this week to Congress.
Diller noted that first-rate broadband service was also crucial, and there seemed to be a consensus among panelists that universal broadband access was going to be very important to the success of online video businesses - and a future world not divided between video “haves” and “have-nots.”
In an acknowledgment of demand for “TV Everywhere” type services, Misener said that, “Although we recognize that our customers want to watch a variety of high-quality video content at affordable prices from the comfort of their homes, we also realize that they are on the move, and thus they want access to digital video not just anytime, but also anywhere.”
Meanwhile, Westlake touted Microsoft’s delivery of television programming to Xboxes, its work in video-on-demand (VoD), and its development of voice recognition software that integrates search with video delivery.
With cloud-based IPTV, in the fullness of time, it should be possible to access any previously recorded video content on demand, delivered to any device - as well as any live streaming feed.
Susan Whiting testified that viewers find value in access to video content online from any device whenever they want to access it.
Diller, a former top TV and movie studio executive, said copyright law was working and condemned the “ridiculous overreach” of the Stop Online Piracy Act (SOPA), recently abandoned legislation that was strongly opposed by the DCIA. In contrast, we would support coverage of IPTV services by the Telecommunications Act of 1996, which would not require a lengthy rewrite during an election year.
The Next Generation Television Marketplace Act, proposed in December, is another possibility. This bill would further deregulate the broadcast industry to eliminate coverage requirements and allow broadcasters to negotiate retransmission fees more like cable programmers such as AMC or CNN do for their license fees. Key questions would be how broadcasters’ access to public airwaves is addressed and what happens to their requirement to present public-interest programming.
Diller defended his Aereo TV service, which offers subscribers access to a remote digital broadcast antenna and cloud DVR capabilities for $12/month, but which broadcasters are suing for failure to compensate them for retransmission of their signals in violation of copyright law.
With seeming inconsistency, he told Senators that a level playing field was critical and that the same rules and regulations that apply to traditional MVPDs should be applied to online video services. We agree: Aereo needs to pay for the programming it redistributes, as do related services ivi.TV and NimbleTV.
In the most heated part of the hearing, Senator Jim DeMint (R-SC) questioned the legitimacy of Aereo for its interception of station signals and retransmission of them - charging viewers while not paying content rights holders. The Senator got it right.
Indeed, it was refreshing to see how far the Committee has progressed in its recognition of cloud-based IPTV as the coming distribution platform for television and interactive video.
There are many, many more questions to address to ensure that innovation progresses at an optimal pace, and that the interests of all stakeholders are reflected in the process. Share wisely, and take care.