The Distributed Computing Industry Association (DCIA) and Cloud Computing Association (CCA) are very pleased to welcome Microsoft to our all new co-hosted CLOUD DEVELOPERS SUMMIT & EXPO 2014 (CDSE:2014).
According to research firm IDC, by 2017, the cloud computing industry is expected to more than double from its 2013 level of $47.4 billion.
The cloud’s 23.5% compound annual growth rate is five times faster than that of the broader technology market.
The need has never been greater for developers, programmers, and architects to advance their knowledge, capabilities, and skill sets in order to profit from this revolutionary transformation in the business processes of the future.
Microsoft’s cloud platform Azure is helping to drive this growth with services that provide enhanced performance, scalability, and additional security, which conventional web hosts just can’t provide.
For web developers, this means access to hosted applications and data, along with cloud-based development services, enabling them to create web applications that have access to data and services like never before.
To help industry participants learn how to capture their share of this huge opportunity, the DCIA & CCA have partnered to present CDSE:2014 in Austin, TX on October 1st and 2nd.
Keynoting for Microsoft on the topic “Enabling DevOps for the Cloud” will be Haishi Bai, Microsoft’s Technology Evangelist for Azure.
Haishi is an active technical writer with two published books on cloud computing and a blog with 0.5 million views yearly.
He’s been working in the industry for 17 years, and has been accumulating software development skills since he was 12, when he wrote his first programs in BASIC.
Since joining Microsoft, Haishi has focused on promoting Azure adoption by speaking at events; creating samples, guidance, frameworks, and tools; and engaging with customers at different project phases to guide their cloud computing projects.
Haishi’s keynote will highlight such areas as “focus on your application,” “learn and innovate,” and “continuous improvements.”
Microsoft will also offer two workshops at CDSE:2014, which will cover “Getting Started with Microsoft Azure” and “Big Data with Microsoft Azure.”
Joining Haishi in conducting these workshops will be Microsoft Research Connections’ Senior Research Program Manager, Wenming Ye.
After completing his graduate work at University of Colorado Boulder, Wenming joined SRI International, where he focused on design and development of innovative wireless, handheld, and web-based simulation tools and services.
He returned to Boulder as a developer on the commercialization team at Tech-X Corp, where he developed and productized large-scale HPC software.
Wenming is currently responsible for cloud-based big data and big compute projects at Microsoft Research Connections.
Microsoft’s participation and that of the other major cloud brands exemplify how this inaugural summit and expo is co-locating two related but distinct events with broader audience appeal than prior CCA & DCIA offerings.
First, it provides the kind of senior-level strategic business conference we pioneered with the CLOUD COMPUTING EAST and CLOUD COMPUTING WEST conference series, drawing audiences of the same caliber as the decision-maker attendees at those events.
And second, it adds an all-new opportunity for cloud-solution providers and vendors to present hands-on instructional workshops and special seminars, drawing audiences more directly involved in developing, programming, and implementing cloud-computing solutions.
The schedule has been carefully organized so that workshop attendees do not have to miss the conference sessions most directly related to their areas of interest.
During the conference part of CDSE:2014, highly focused business strategy and technical keynotes, breakout panels, and seminars will thoroughly explore cloud computing solutions, and ample opportunities will be provided for one-on-one networking with the major players in this space.
At the co-located instructional workshops and special seminars, facilitated by more than one hundred industry-leading speakers and world-class technical trainers, attendees will see, hear, learn, and master critical skills in sessions devoted to the unique challenges and opportunities facing developers, programmers, and solutions architects.
All aspects of cloud computing will be represented: storage, networking, applications, integration, and aggregation.
Three tracks will cover mobile, logistics, and big data considerations that cut across nearly every enterprise vertical migrating business functions to the cloud.
Three tracks will zero in on the three economic sectors now experiencing the most explosive growth: media and entertainment, government and military, and healthcare and life sciences.
Register now for CDSE:2014 to take advantage of early-bird rates.
To learn more about conducting an instructional workshop, exhibiting, or sponsoring CDSE:2014, contact Don Buford, Executive Director, or Hank Woji, VP Business Development, at the CCA.
If you’d like to speak at this major industry event, please contact me at the DCIA.
Share wisely, and take care.
The DCIA salutes incoming Federal Communications Commission (FCC) Chairman Tom Wheeler and looks forward to working with his new administration to continue broadband development initiatives spearheaded by former Chairman Julius Genachowski and to launch additional endeavors that will help advance the distributed computing industry.
Chairman Wheeler’s carefully chosen staff includes Chief of Staff Ruth Milkman, Senior Counselor Phil Verveer, Head of the Technology Transitions Policy Task Force Jon Sallet, Special Counsel Diane Cornell, Special Counsel for External Affairs Gigi Sohn, Acting Managing Director and Advisor to the Chairman for Management Jon Wilkins, Acting Chief of the Wireless Telecommunications Bureau Roger Sherman, and Legal Advisors to the Chairman Daniel Alvarez, Maria Kirby, and Renee Gregory.
The new Chairman takes over the Commission at a time when the speed of innovation of Internet-based services in the private sector holds promise to continue at a breathtaking pace, and the potentially beneficial impacts to society and the economy of transformative technologies which rely on network connectivity — with cloud computing arguably chief among them — are nothing short of enormous.
At the same time, the challenges to continued advancement have never been greater, ranging from threats to end-users’ trust in their network operators and service providers as a result of old and new abuses, to threats to innovators’ ability to deploy new offerings as a result of unchecked hampering by threatened industry incumbents whose entrenched business models are being disrupted by these advances.
President Obama could not have nominated a better candidate for FCC Chairman than Tom Wheeler, whose breadth of experience and track record of accomplishments uniquely qualify him for this role.
We’re pleased that the Chairman has tasked Diane Cornell with heading a temporary working group over the next two months to identify FCC regulations that are past their prime and FCC procedures that can be improved upon, and to use crowdsourcing, among other contemporary communications techniques, in the group’s work.
DCINFO readers should seize this opportunity to be a part of that crowd.
If successful, this effort will help prioritize matters the Commission needs to address to ensure continued expansion of such promising areas as mobile cloud computing, big data, and the migration to IP transport of high-value multimedia.
The concepts behind “content everywhere available to each person at any time” have an excellent chance of moving closer to reality under Chairman Wheeler’s expert guidance.
We are also heartened by Chairman Wheeler’s acknowledgment of the Commission’s responsibility “to act in the public interest, convenience, and necessity” to assure that innovation and technology advance with speed while preserving the relationship of trust between networks and those connected by them.
And we’re thrilled by his description of himself as “an unabashed supporter of competition because competitive markets produce better outcomes than regulated or uncompetitive markets.”
An increase in wireless broadband spectrum is an obvious need at this time.
But our chief concerns as a new and formative industry center on two other areas: privacy violations by unbridled over-reaching federal agencies; and collusive resistance to change by entrenched industries whose power structures permit and support anti-competitive behavior.
Specifically, we need resolution of the issues that threaten our continued growth internationally as a result of Edward Snowden’s exposures of scandalous NSA practices.
And even more importantly, we need resolution of the real issues at the core of the Aereo broadcast retransmission dispute.
An extension of compulsory licensing to this new technology could provide a stopgap measure to end the current litigation, protect the now necessary dual revenue streams for over-the-air TV stations, and enable innovators like Aereo and FilmOn to emerge from the shadow of copyright infringement.
Our view of the real problem here goes much deeper, however.
And it is that independent IPTV needs to be legitimized as a multichannel video programming distributor (MVPD) and enabled to enter into carriage agreements with television programming services.
It’s time for the real-time distribution of TV channels to break free from the limitation that currently shackles it: being licensed only for exclusive delivery by broadband network operators.
The future can best be secured with the support of a pro-competition FCC: by encouraging and not discouraging investment; by nurturing and not stifling innovation; by increasing and not reducing competitive opportunities; by protecting and not violating the trust of consumers; and by ensuring that the benefits of new communications technologies are accessible to all and not just a few.
We are fully aware that the FCC alone does not have the power unilaterally to address the issues that are posing such serious threats to the further advancement of our industry, the economy, and society at large.
But the Commission’s abilities to advise Congress and to influence other agencies are strong, and its leadership in these areas can be unequalled in the federal government.
And to fulfill its role as the "Optimism Agency," act it must, and as Chairman Wheeler has requested, act nimbly. Share wisely, and take care.
The DCIA believes that the Federal Communications Commission (FCC) exceeded its authority and acted without proper Congressional approval by promulgating net neutrality regulations that took effect last year after being narrowly passed by a 3-2 vote in December 2010.
The Court of Appeals for the DC Circuit previously ruled in a case involving Comcast that the FCC lacked authorization to regulate broadband, which is currently considered an information service rather than a telecommunications service.
Instead of proceeding with caution after the court ruling, the Commission moved ahead with its open Internet order, considered by many at the time to be a “reckless power grab,” imposing additional restrictions and discriminating between wireline and wireless access providers, while excluding major portals, app store operators, search engines, and others.
The FCC’s regulations also require broadband Internet providers to disclose information about their network management practices, which in itself could be an important component of a more fully developed and properly sanctioned program to ensure transparency.
Now Verizon and MetroPCS have escalated their litigation against the FCC with a brief filed Monday with the DC Circuit as part of the operators’ ongoing 18-month legal challenge to the regulations.
The two operators originally filed suit against the Commission’s rules in early 2011, only to have their complaint dismissed on a technicality. The suit was re-filed, and in March the court allowed the challenge to proceed after dismissing the FCC’s request for a delay.
The companies argue that the rules should be vacated because they conflict with the Communications Act, are outside the FCC’s authority, and violate constitutional rights.
In a more difficult argument, the telecoms’ appellate brief also contends that the regulations violate their free speech rights because they strip providers of control over what they transmit and how they transmit it, and compel carriage without compensation.
Their point is that other major gatekeepers to Internet-based content are excluded from this anti-discrimination requirement. Whether others should be included, or whether there should be different standards for different participants in the web ecosystem remains an unanswered but important issue at this juncture.
This also opens a more complex set of considerations, including placement of liability for copyright infringement and the question of censorship, which need far more discussion in the context of an acceptable regulatory process.
In addition, the concerns voiced by both independent (as opposed to carrier-owned) content providers and public advocacy groups regarding fair and equitable treatment of Internet data remain unanswered.
From the DCIA’s perspective, which is focused on commercial advancement of distributed computing over the Internet and other networks, the FCC’s movements here have been in no way beneficial.
Without a more comprehensive approach, too much marketplace uncertainty remains, and private sector interests and the public at large would be better served by vacating the Commission’s current rules.
The FCC’s regulations, which ban all Internet access providers from blocking sites or competing applications and impose greater restrictions on those that do so through wireline networks, may have been well intended, but were clearly premature and incomplete. The basic problem, as noted above (and at the time it was issued) is that the FCC’s order imposes classic common-carrier obligations on broadband providers, which is prohibited by the Communications Act.
Further complicating this issue, the advocacy group Free Press, which also sued the FCC contending that it acted arbitrarily in adopting different standards for wireless and wireline providers, this week withdrew its lawsuit rather than file a brief in the case.
The group isn’t satisfied with the neutrality regulations, but decided to drop the litigation, which it had been pursuing in order to improve the rules rather than to contest the FCC’s authority or basis for imposing them.
Free Press and a coalition of more than 100 organizations, academics, start-up founders, and tech innovators instead launched the Declaration of Internet Freedom — five principles outlining the basic freedoms that all Internet users should enjoy. This effort is meant to spark a passionate, global discussion among Internet users and communities about the Internet and our role in protecting it.
What is needed is a complete reworking of the now-outdated Communications Act in light of today’s Internet and the business realities of providing access and fostering continued investment and innovation.
If the court simply overturns the FCC’s regulations, wireless and broadband Internet providers could be allowed to block online content and competing services, and that would not be a good thing. Yet if the court leaves the rules in place, the potential for consumers and digital content providers to benefit from more advanced and flexible services could be curtailed, and that would be bad, too.
The FCC’s response to the telecoms’ brief is due in September. Meanwhile, if you agree that Expression, Access, Openness, Innovation, and Privacy are principles that should be secured for the Internet globally, please sign the Declaration of Internet Freedom. Share wisely, and take care.
The DCIA commends Chairman Greg Walden (R-OR) and the US House Energy and Commerce Committee Communications Subcommittee for holding a Hearing Wednesday June 27th focused on the Future of Video.
Most of the regulations under which current television programming distributors operate were put in place before the advent of the Internet, video downloading, over-the-top (OTT) streaming, and now IPTV and cloud-based storage and delivery.
"The Federal Communications Commission (FCC) regulates based on a bygone era," Chairman Walden noted, referencing the 1992 Cable Act. "It was meant to spur competition and it worked. But the act does not apply to YouTube, iTunes, Netflix, Amazon, Hulu, Roku and Sky Angel."
In addition to conflicts among broadcasters and multichannel video programming distributors (MVPDs) over carriage deals, new issues for online video providers, which depend on Internet service providers (ISPs) to reach their viewers, are emerging.
ISPs are often also MVPDs themselves, and are therefore subject to scrutiny over the even-handedness of their treatment of independent video services versus those they own and operate.
Indeed, the Department of Justice (DoJ) has recently initiated an antitrust investigation into whether cable companies are using broadband data caps to steer consumers to their own Internet video services and discourage, by pricing, the usage of video services not controlled by them.
Witnesses in the two-hour hearing included David Barrett, President, Hearst Television; Charlie Ergen, Chairman, Dish Network; Jim Funk, Vice President, Roku; David Hyman, General Counsel, Netflix; Robert Johnson, CEO, Sky Angel; Michael O’Leary, EVP, Motion Picture Association of America (MPAA); Michael Powell, President, National Cable and Telecommunications Association (NCTA); and Gigi Sohn, President, Public Knowledge.
If anything, the panel focused more on today’s disputes among competing distribution technologies and business models than on the represented interests’ divergent views of what the future may hold.
Predictably, Hearst Television’s Barrett defended existing retransmission consent rules, saying the rules should also be applied to new entrants in the video marketplace including IPTV providers, while Dish Network chairman Charlie Ergen characterized these as outdated regulations that have led to an increasing number of station blackouts.
"Local broadcasters are a government-sponsored monopoly," Ergen said at one point, and also complained of station group owners unfairly leveraging their market power to demand higher fees.
Barrett maintained that retransmission dollars are critical for a “21st century media company” to fund local programming, including multicast channels and newscasts.
Ergen, whose DVR offering introduced an ad-skipping AutoHop feature in May, which is being met with legal challenges from CBS, FOX, and NBC, defended that capability as doing nothing more than improving on “existing, legally accepted and widely available technologies” in response to consumer desires.
Congressman John Dingell (D-MI) referred to Ergen as “Mr. Hopper,” asking if he understood why Subcommittee Members would object to a service that would skip over political ads. “I understand the consumer very well, but I am not a politician, so I cannot say I understand your concerns as well,” Ergen responded.
The hearing also addressed data caps — ISPs’ limits on the amount of bandwidth consumers can use in a single month. Netflix’s David Hyman argued that platforms and networks should not use their leverage to “stifle video providers” from independent sources.
"When you couple limited broadband competition with a strong desire to protect a legacy video distribution business, you have both the means and motivation to engage in anti-competitive behavior," he said. "Add to this mix a regulatory and legislative framework largely crafted before the modern Internet era, and you have the makings for confusion and gamesmanship."
"Netflix is the largest provider of subscription video in the country," The NCTA’s Powell (formerly an FCC Commissioner) responded. "We sell broadband. Their services help stimulate the services we sell."
In March, Comcast announced that videos viewed through its Xfinity application on the Microsoft Xbox game console wouldn’t count toward its monthly data limit, while broadband used for Netflix and other on-demand streaming services would continue to count.
Public Knowledge’s Gigi Sohn further warned of the abuse of data caps as a way for cable operators to favor their own video services and stressed that there has to be robust development of Internet video competitors for the industry to advance in the public’s interest.
Sohn said the traditional media industry is “trying to limit the online distribution of independent programming.”
"I know it’s too late to do a bill in this Congress," said Congressman Joe Barton (R-TX). "In general, I think we need less regulation than more. I look forward to big things happening in the next Congress, but it has to be done in a bipartisan basis." Congressman Steve Scalise (R-LA) meanwhile has co-authored a bill to wipe away many communications regulations.
There’s no doubt this issue will continue to be a subject of US lawmaker consideration, with more than an introductory hearing needed to guide the process of determining which rules should be eliminated, which revised, and whether new ones are needed. Share wisely, and take care.
The DCIA commends the US House Energy & Commerce Committee’s leadership and bipartisan approval this week of a resolution opposing the attempt by the United Nations and its International Telecommunication Union (ITU) to assert and impose unprecedented governmental regulation over the Internet.
The Internet’s current multi-stakeholder governance model fosters continuing investment and innovation absent heavy-handed regulatory controls.
Beyond the substantial growth that the Internet and related distributed computing technologies are contributing to the global economy, unprecedented advances in political freedom can also be attributed to the current model.
Congresswoman Mary Bono Mack’s leadership of this initiative has been particularly laudable: “In many ways, we’re facing a referendum on the future of the Internet. A vote for my resolution is a vote to keep the Internet free from government control and to prevent giving the UN unprecedented power over Web content and infrastructure. That’s the quickest way for the Internet to one day become a wasteland of unfilled hopes, dreams, and opportunities.”
We strongly urge timely support of this resolution by the full US House of Representatives, and similar actions by other responsible legislative bodies around the world, in advance of the upcoming World Conference on International Telecommunications (WCIT) in Dubai this December, where 190 nations are expected to participate.
At WCIT, the International Telecommunication Regulations, an international treaty developed nearly 25 years ago to deal with the global telephone and telegraph systems of the time, will be opened for revision.
And while any amended treaty would only be binding in the US if ratified by the Senate, the implications of currently proposed changes, if adopted elsewhere around the world, would have profoundly damaging effects on the operation of the Internet everywhere.
The secret drafting of ITU proposals in preparation for WCIT has been widely and rightly criticized by public interest groups for a serious lack of transparency. But our concerns go deeper than that.
If the ITU is successful in taking power over the Internet with the proposed amendments, such technologically valuable activities as the current flexibility of Internet-connected devices to perform as both clients and servers would be jeopardized.
Certain communications among devices would be hampered based on jurisdictional considerations and governmental security intervention measures, including repressive surveillance of Internet users and sanctioned censorship of the Internet.
Eli Dourado, a researcher at George Mason University, articulated this aspect of the looming battle well:
"It’s really one between Internet users worldwide and their governments. Who benefits from increased ITU oversight of the Internet? Certainly not ordinary users in foreign countries, who would then be censored and spied upon by their governments with full international approval. The winners would be autocratic regimes, not their subjects."
In addition, a sending-party tax to be paid by content providers would upend longstanding principles of Internet architecture and take us back to the days of the extortionary charges once imposed on long-distance phone calls. Some of the most promising cloud-based content delivery applications and systems would be made economically unfeasible.
The ongoing and smoothly proceeding transition to IPv6 would come to a grinding halt.
We join the Internet Society, representing engineering groups that develop and maintain core Internet technologies, in objecting to these proposals on principle and as a practical matter.
Independent organizations including the Society, as well as the Internet Corporation for Assigned Names and Numbers and the World Wide Web Consortium, already deal much more effectively than the ITU possibly could with such fundamental tasks as network and domain name registration, allowing the Internet to develop and evolve with relatively fast responses to changes in technology, business practices, and consumer behavior.
We also agree with Philip Verveer, Deputy Assistant Secretary of State and US Coordinator for International Communications and Information Policy, who said, “It is important that when we have values, as we do in the area of free speech and the free flow of information, that we do everything that we can to articulate and sustain those values.”
And the negative economic impacts of the proposed treaty changes on expansion of Internet-based services as well as job creation would be devastating.
Verveer called the proposals unworkable and said they would have unintended consequences that would seriously harm the Internet. We concur, and urge DCINFO readers everywhere to join us in their opposition. Share wisely, and take care.
As the Distributed Computing Industry Association (DCIA) and the Cloud Computing Association (CCA) ramp up for our inaugural strategic summit, CLOUD COMPUTING WEST 2012, we pause to celebrate the success of SYS-CON’s tenth international developers’ conference, Cloud Expo 2012.
Two unstoppable enterprise information technology (IT) trends - cloud computing and big data - were the central themes at the event, which was held June 11th-14th at the Javits Convention Center in New York, NY with an estimated 10,000 attendees. The expo featured industry keynotes, technical breakout sessions, and “power panels,” as well as a busy exhibit floor where leading solutions vendors displayed their latest offerings.
The State of Cloud Computing was the topic of a power panel recorded the day before the event opened. The preview highlighted recent IDC research showing that worldwide spending on cloud services will grow almost threefold, reaching $44.2 billion by 2013, and a recent Gartner report predicting that the volume of enterprise data overall will increase by a phenomenal 650% over the next five years.
It was clear at Cloud Expo that the cloud is now being adopted by mainstream companies, organizations, and even national governments to leverage the power of data on demand at a scale and pace never before seen in the history of the Internet.
Cloud Computing Bootcamp, led by Larry Carvalho, helped make sense of this hot technology, which is still rapidly evolving while being continuously peppered with hype. With prospective customers finding it hard to determine which aspects of the technology will yield the greatest benefits, the bootcamp offered a practical understanding of it.
Citrix VP Peder Ulander cut through the hype and clarified the ontology of cloud computing in his Crash Course in Open Source Cloud Computing, focusing on the open-source software that can be used to build compute clouds for infrastructure-as-a-service (IaaS) deployments, and on the complementary open-source management tools that can be combined to automate the management of cloud-computing environments.
Hadoop, MapReduce, Hive, HBase, Lucene, Solr? The only thing growing faster than enterprise data is the landscape of big data tools, which are designed to help organizations turn big data into opportunities by gaining deeper insight into massive volumes of information. The time is now for IT decision makers to determine which big data tools are the best - and most cost-effective - for their organizations. In The Growing Big Data Tools Landscape, David Lucas, Chief Strategy Officer at GCE, ran through what enterprises need to know about this growing set of tools - including those being leveraged by organizations today as well as new and innovative ones just arriving on the scene.
Blake Yeager, Product Manager Lead for IaaS at HP Cloud Services, in Run and Operate Your Web Services at Scale, took attendees through Hewlett-Packard’s (HP) next public cloud infrastructure, platform services, and cloud solutions, showing how easy it can be to spin up instances of compute, storage, and content delivery networking (CDN).
In Cloud Computing and Big Data - It’s the Applications, Tom Leyden, Director of Alliances and Marketing at Amplidata, noted: “While there is still a lot of interest in Big Data Analytics, we see an increasing focus on Big Unstructured Data. Object storage is the new paradigm to store those massive amounts of free-form data.”
IT cloud management strategies enable organizations to maximize the business value of their private clouds.
Joe Fitzgerald, Chief Product Officer & Co-founder of ManageIQ, discussed customer experiences and how these tactical approaches increase agility, improve service delivery levels, and reduce operating expenses in Cloud Computing: Enterprise Cloud Management.
Shannon Williams, VP of Market Development for the Cloud Platforms Group at Citrix and a Co-founder of Cloud.com, in Architecting Your Cloud, discussed how CloudStack has been the platform of choice for more than a hundred public and private production clouds, and provided an insider’s view of the company’s experience in designing the right architecture to meet customers’ cloud requirements.
IT departments are seeing storage capacity needs double every 12-18 months, with 50x the amount of information and 75x the number of files to manage, while IT managers are dealing with growing constraints on the space, power, and costs of their data center infrastructures. The Growth and Consolidation of Big Data in the Cloud explored how Intel is helping businesses and users realize the benefits of cloud computing technology by working to develop open standards that operate across disparate IT infrastructures and by delivering cloud-based architectures.
Securing Big Data Input addressed one of the most widely asked questions about big data today: “How do we get valuable analytics from big data?” As data continues to grow exponentially, so does the variety of data (structured and unstructured) coming from humans, machines, and applications. To pull valuable information from it all, proper data gathering is critical, and the data itself needs to be timely and accurate.
And in The Ever-Expanding Role of Big Data, William Bain, Founder & CEO of ScaleOut Software, observed: “Security standards for moving data into and out of the cloud and for hosting it within the cloud will dramatically help accelerate adoption of the cloud as a secure computing platform, and additional standards for creating elastic clusters that are physically co-located and use high-speed networking will also help in hosting applications.”
There is no longer any question that the cloud computing model will be the prevailing style of delivery for computing over the coming decades. Forrester Research predicts that the global market for cloud computing will grow to more than $241 billion in 2020. Cloud - Vision to Reality explored how greenfield application development projects can be designed from the outset to benefit from cloud-computing features such as elastic scalability, automated provisioning, and infrastructure-level APIs.
SHI, a $4 billion+ global provider of IT products, and Rackspace Hosting, a services leader in cloud computing, were Platinum Plus Sponsors of SYS-CON’s expo. For developers, it was a must-attend event.
According to IBM’s 2011 Tech Trends Report, 75% of respondents said that over the next two years their organizations will begin to build cloud infrastructure, and that in the next 24 months “developing new applications” will be the top cloud adoption activity, overtaking the current top investment areas of virtualization and storage.
Huge cloud-driven opportunities for wealth creation exist today - but the race is to the swift. The cloud-computing industry is one in which even a few months can make all the difference. DCINFO readers are encouraged to sign up now for the CLOUD COMPUTING WEST 2012 (CCW:2012) summit being presented November 8th-9th in Santa Monica, CA by the Cloud Computing Association (CCA) and the Distributed Computing Industry Association (DCIA).
CCW:2012 features three co-located conferences geared for management charged with addressing the key strategies and business decisions critical to cloud computing adoption in the entertainment, telecom, and investment sectors. Share wisely, and take care.
Wednesday June 6th was World IPv6 Launch Day (WILD), a historic day when the Internet and cloud computing companies gained significant growing room.
Internet pioneer Vint Cerf called WILD the start of the 21st century Internet because of the vast implications of advancing to a new system for assigning Internet Protocol (IP) addresses.
IP addresses, which identify computers and other connected devices on the global network, are essential to the Internet’s operation, and the IPv6 protocol is coming not a moment too soon.
Its predecessor, IPv4, could handle 4.3 billion possible IP addresses. While that may seem like a lot, the last unallocated address blocks were assigned by the Internet Assigned Numbers Authority (IANA) last year. (IPv5 was an experimental streaming protocol that never took off.)
By contrast, IPv6 spans roughly 340,000,000,000,000,000,000,000,000,000,000,000,000 (2^128) unique addresses – which means it’s virtually unlimited for practical purposes.
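As a quick sanity check on those numbers, Python’s standard-library ipaddress module can compute both address spaces directly (a minimal illustration, not from the original article):

```python
import ipaddress

# The entire IPv4 address space: 2^32 addresses.
ipv4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses
print(ipv4_total)  # 4294967296 - the ~4.3 billion figure above

# The entire IPv6 address space: 2^128 addresses.
ipv6_total = ipaddress.ip_network("::/0").num_addresses
print(ipv6_total)  # 340282366920938463463374607431768211456

# IPv6 offers 2^96 (about 7.9 x 10^28) times as many addresses as IPv4.
print(ipv6_total // ipv4_total)
```

The ratio, not just the raw count, is what makes the “virtually unlimited” claim concrete: every IPv4 address could be expanded into an address space 2^96 times the size of today’s entire Internet.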
It’s not only the larger pool of IP addresses that makes IPv6 better than IPv4. IPv6 also streamlines how addresses are assigned and how connectivity is recovered when networks change, and it standardizes how MAC-address-based identifiers are handled. IPsec support is also baked in, one of several improvements to overall network security.
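The standardized handling of MAC-address identifiers refers to the modified EUI-64 scheme used by IPv6 stateless address autoconfiguration: a 48-bit MAC address becomes a 64-bit interface identifier by inserting ff:fe in the middle and flipping the universal/local bit. A minimal sketch, with a made-up example MAC address:

```python
def mac_to_eui64_interface_id(mac: str) -> str:
    """Convert a 48-bit MAC address to a modified EUI-64 interface ID,
    as used by IPv6 stateless address autoconfiguration (RFC 4291)."""
    octets = [int(part, 16) for part in mac.split(":")]
    # Insert 0xff, 0xfe between the vendor (OUI) half and the device half.
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]
    # Flip the universal/local bit (the 0x02 bit of the first octet).
    eui64[0] ^= 0x02
    # Group the 8 octets into four 16-bit hextets, IPv6-style.
    return ":".join(
        f"{(eui64[i] << 8) | eui64[i + 1]:x}" for i in range(0, 8, 2)
    )

# A hypothetical MAC address, for illustration only.
print(mac_to_eui64_interface_id("00:1a:2b:3c:4d:5e"))  # 21a:2bff:fe3c:4d5e
```

Prepended to a link-local prefix, that example MAC would yield the address fe80::21a:2bff:fe3c:4d5e – which is why a host can configure its own IPv6 address without a DHCP server.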
If you use Android or an iPhone, or a version of Windows or Mac OS released in the past five years, your device probably supports IPv6 as well as IPv4. The big problem has been that websites, household routers, and consumer Internet service providers (ISPs) have not.
And for IPv6 to work Internet-wide, everybody needs to get on board — PCs, networks, routers, and websites, too.
A year ago, some companies switched on IPv6 temporarily, just to test it out. But this week, to ensure that the Internet can continue to grow and connect billions more people and devices around the world, thousands of companies and literally millions of websites permanently enabled IPv6 for their products and services as part of WILD.
Participants in WILD included many DCIA Member companies and other web-based businesses in more than 100 countries.
By making IPv6 the new norm, these companies enabled millions of end-users to enjoy its benefits without having to do anything. There’s more on IPv6 at Wikipedia.
WILD was organized by the Internet Society as part of its mission to ensure that the Internet remains open and accessible for everyone – including the five billion people not yet connected to the web.
“The support of IPv6 from these organizations delivers a critical message to the world: IPv6 is not just a ‘nice to have;’ it is ready for business today and will very soon be a ‘must have,’” said Leslie Daigle, Chief Internet Technology Officer, Internet Society.
“We believe that the commitment of these companies to deploy IPv6 will ensure that they remain industry leaders. Any company wishing to be effective in the new Internet should do the same.”
At some point, the entire Internet infrastructure has to move to using the newer address space, since the differences in the protocols mean that computers with IPv4 addresses cannot communicate with machines with IPv6 addresses.
“IPv6 is critical to the future of the Internet’s underlying architecture, and to supporting the billions of devices that will connect to the Internet over the coming years,” said Tom Leighton, Chief Scientist and Co-Founder, Akamai.
“Having expanded our global IPv6 footprint to over 50 countries, Akamai enables websites to reach a growing audience over IPv6 with the performance and reliability that they have come to expect and demand from IPv4.”
Cisco SVP Engineering and General Manager Service Provider Business, Pankaj Patel, added, “The Internet has fueled remarkable economic growth and innovation that would have never happened without a network.”
“Today, we face an explosion of connected devices moving data and content, especially video, and of applications and services coming from the Cloud. IPv6 enables the network — the platform on which innovation is built — to scale and make more growth more possible, today and into the future.”
John Schanz, Chief Network Officer, Comcast, concluded, “We at Comcast take great pride in being an innovator and technical leader. As a result of our team’s hard work, enabling IPv6 in over a third of our network, I am happy to report that by today we have exceeded our goal of 1% of our customer base being enabled with IPv6 for WILD!”
“Thank you to the Internet Society and others for organizing and participating in this important event!”
The World IPv6 Day in June 2011 was a 24-hour “stress test” that focused on websites. It also served as a wake-up call that it was time to upgrade the World Wide Web.
At NANOG 52, the Internet Society’s Phil Roberts provided an introduction to World IPv6 Day and moderated a panel of key participants. Panelists from Akamai, Cisco, and Comcast presented their companies’ results including how they prepared for the event, issues that arose, lessons learned, and the current status of IPv6 in their networks.
World IPv6 Day Observations at the Technical Plenary at IETF81 further outlined the overwhelming industry response and several additional reports were delivered by participants in the IPv6 Operations Group at IETF81.
World IPv6 Day Operators Review described the traffic growth Hurricane Electric saw on World IPv6 Day, including significantly higher IPv6 traffic. In Comcast Experience with IPv6 Deployment, John Brzozowski presented results from Comcast’s trials, including increased traffic over Teredo, 6to4, 6rd, and native IPv6 access. He noted that 50% of participants continued to publish AAAA records after World IPv6 Day.
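The transition mechanisms Brzozowski mentions are distinguishable by their well-known address prefixes, so captured traffic can be classified offline. A minimal sketch using Python’s ipaddress module (the prefixes are the standard ones: Teredo is 2001::/32 per RFC 4380 and 6to4 is 2002::/16 per RFC 3056; 6rd uses ISP-specific prefixes, so it cannot be identified generically, and the sample addresses are for illustration only):

```python
import ipaddress

TEREDO = ipaddress.ip_network("2001::/32")       # RFC 4380
SIX_TO_FOUR = ipaddress.ip_network("2002::/16")  # RFC 3056

def classify_ipv6(address: str) -> str:
    """Label an IPv6 address by the transition mechanism its prefix implies.
    6rd prefixes are ISP-specific, so 6rd traffic is indistinguishable
    from native traffic without out-of-band knowledge."""
    addr = ipaddress.ip_address(address)
    if addr in TEREDO:
        return "teredo"
    if addr in SIX_TO_FOUR:
        return "6to4"
    return "native (or 6rd)"

# Hypothetical sample addresses, for illustration only.
for sample in ("2001:0:53aa:64c:0:1234:abcd:ef12",
               "2002:c000:204::1",
               "2607:f8b0::1"):
    print(sample, "->", classify_ipv6(sample))
```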
In Investigating IPv6 Traffic: What Happened at the World IPv6 Day, the authors compared IPv6 activity before, during, and after World IPv6 Day. They examined traffic traces recorded at a large European Internet Exchange Point (IXP) and on the campus of a major US university, analyzing volume, application mix, and the use of tunneling protocols for transporting IPv6 packets.
Comparing IPv6 and IPv4 Performance shared the results of performance measurements between IPv4 and IPv6, taken from the authors’ vantage points in the network to 46 of the websites that turned on IPv6 that day.
And finally, in World IPv6 Day, Phil Roberts summarized the rationale for the 2011 event.
The June 6th, 2012 WILD was a permanent commitment across the distributed computing industry, laying the foundation to accelerate the deployment of IPv6 across the global Internet.
Major Internet companies and ISPs permanently enabled IPv6 on their websites and across a significant portion of their current and all new residential wireline subscribers. Home networking equipment manufacturers enabled IPv6 by default through their router products, and additional commitments to IPv6 by companies beyond websites demonstrated broad support of the new Internet Protocol.
This move was imperative, as the last of the 4.3 billion IP addresses enabled by the current protocol, IPv4, were assigned to the Regional Internet Registries in February 2011.
Already there is no remaining IPv4 address space to be distributed in the Asia Pacific region, and very soon the rest of the globe will follow. IPv4 address space is expected to run out in Europe this year, in the US next year, and in Latin America and Africa in 2014.
IPv6 provides an essentially unlimited number of addresses, which will help connect the billions of people who are not connected today, allow a wide range of new devices to connect directly with one another, and help ensure that the Internet can continue its current growth rate indefinitely.
For more information about WILD and the participating companies, as well as links to useful information for users and how other companies can participate in the continued deployment of IPv6, please click here. Share wisely, and take care.
The National Institute of Standards and Technology (NIST) has just issued Special Publication 800-146 reiterating previously published NIST classifications of the various types of cloud computing implementations and their benefits, while also identifying what NIST sees as “23 open issues” regarding cloud computing technology overall.
In the DCIA’s view, most of NIST’s “issues” are well-known aspects of distributed computing that have been “open” for years, but have received much greater prominence as a result of the emergence of cloud computing.
The document itself claims that only some of the issues “appear to be unique to cloud computing.”
The NIST “issues” are organized under five general categories: computing performance, cloud reliability, economic goals, compliance, and information security, with privacy integrated into the last two of these.
An example of a computing-performance issue is off-line data synchronization. When users are disconnected from the network, their documents and data obviously can’t stay in sync with the versions hosted in the cloud - making version control an important consideration, especially within group collaboration activities.
Cloud reliability was one of two concerns the DCIA addressed at our CLOUD COMPUTING CONFERENCE at the 2012 NAB Show (CCC at NAB).
The conclusion of our keynote speakers and panelists was that cloud-based solutions can be configured to deliver the level of reliability that a customer desires. As in many other computing approaches and technological processes generally, it’s a question of how much redundancy and what level of fail-prevention features are built into the given deployment.
NIST notes that “for the cloud, reliability is broadly a function of the reliability of four individual components: (1) the hardware and software facilities offered by providers, (2) the provider’s personnel, (3) connectivity to the subscribed services, and (4) the consumer’s personnel.”
We wouldn’t disagree that a serious deficiency in any of these areas could impact the overall performance of a cloud-based system, and therefore each needs to be properly and carefully configured to match the desired level of quality of service (QoS).
NIST’s commentary on economic goals - or the cost savings that can be achieved by using cloud computing - also suggests that several factors need consideration.
In addition, and probably because a key constituency for NIST is the large and growing group of government agencies that are end-users of cloud computing services, it believes that standardization of cloud service agreements could serve the “achievement of economic goals.”
An agreement template - for example, one “in a machine-readable format using common ontologies” - could facilitate automated review, and potentially foster a greater understanding of the functionality and benefits of cloud computing, rather than distracting agreement participants with ancillary terms-and-conditions and what are, for all practical purposes, boilerplate provisions.
The second area of concern that we explored during CCC at NAB was security in the cloud, or what NIST includes along with privacy considerations and “compliance.”
Regarding this subject, there was a consensus among our conference speakers that in many respects cloud computing is actually more secure than the older technologies it supplants, but that key to delivering the desired levels of security and compliance are the proper application of appropriate tools, such as closed-versus-open networks and encrypted-versus-unencrypted data.
Obviously, the definitional cloud attribute of multi-tenancy - which often translates to a sharing of resources - is a concern especially among new cloud users.
Depending on the type of implementation, as NIST explains, security concerns will vary.
For software-as-a-service (SaaS), different end-user consumers may share the same application or database. For platform-as-a-service (PaaS), different processes may share an operating system (OS) and supporting data and networking services creating another level of security concern. And for infrastructure-as-a-service (IaaS) clouds, different virtual machines (VMs) may share hardware via a hypervisor creating yet another.
NIST sees the potential for flaws in logical separation with this sharing, but as the DCIA has pointed out, with the scale that is possible through cloud computing, greater resources can be dedicated to ensuring that vital components maintain their separate integrity.
NIST’s view of safeguards is that “for clouds that perform computations, mitigation can occur by limiting the kinds of data that are processed in the cloud or by contracting with providers for specialized isolation mechanisms such as the rental of entire computer systems rather than VMs (mono-tenancy), Virtual Private Networks (VPNs), segmented networks, or advanced access controls.”
And of course, because it’s increasingly common for cloud applications to be accessed through their end-users’ browsers, this becomes another vulnerability point. Separate and apart from cloud computing, browser security represents its own concerns.
The bottom line here is that NIST has published another valuable reference, which can serve as a resource for industry to review and in some ways as a voluntary guide to follow as progress continues in this space. NIST’s document is particularly useful for those involved in contracting with and servicing public sector entities.
Having said that, we do not agree with the assertion that reliability and security issues per se are greater with cloud computing than with traditional computing. A growing body of evidence demonstrates that the cloud actually provides more trustworthy solutions. Share wisely, and take care.
We commend US Senator Ron Wyden (D-OR) for introducing legislation this week that would clarify the US Trade Representative’s (USTR) obligation to share information on trade agreements with Members of Congress.
Sen. Wyden, who spoke on the floor of the Senate about why this is necessary, has been a critic of the Administration’s handling of international treaties including the Anti-Counterfeiting Trade Agreement (ACTA) and most recently the Trans Pacific Partnership (TPP).
At the heart of this new practice of skipping Congressional oversight and approval on international treaties like ACTA and TPP is the USTR, which has also been responsible for negotiating these treaties in secret.
In his “Statement for the Record” on the introduction of the Congressional Oversight Over Trade Negotiations Act, Wyden pointed out that the USTR has continually stymied Congress’s efforts to learn more about the negotiations between the USTR and other countries.
According to Wyden, the lack of transparency is beyond the pale because corporations and interest groups know more about the negotiations for this latest treaty than lawmakers do:
"Right now, the Obama Administration is in the process of negotiating what might prove to be the most far-reaching economic agreement since the World Trade Organization was established nearly twenty years ago.
The goal of this agreement - known as the Trans Pacific Partnership (TPP) - is to economically bind together the economies of the Asia Pacific. It involves countries ranging from Australia, Singapore, Vietnam, Peru, Chile and the United States and holds the potential to include many more countries, like Japan, Korea, Canada, and Mexico. If successful, the agreement will set norms for the trade of goods and services and includes disciplines related to intellectual property, access to medicines, Internet governance, investment, government procurement, worker rights and environmental standards.
If agreed to, TPP will set the tone for our nation’s economic future for years to come, impacting the way Congress intervenes and acts on behalf of the American people it represents.
It may be the USTR’s current job to negotiate trade agreements on behalf of the United States, but Article 1 Section 8 of the U.S. Constitution gives Congress - not the USTR or any other member of the Executive Branch - the responsibility of regulating foreign commerce. It was our Founding Fathers’ intention to ensure that the laws and policies that govern the American people take into account the interests of all the American people, not just a privileged few.
And yet, Mr. President, the majority of Congress is being kept in the dark as to the substance of the TPP negotiations, while representatives of US corporations - like Halliburton, Chevron, PHRMA, Comcast, and the Motion Picture Association of America - are being consulted and made privy to details of the agreement. As the Office of the USTR will tell you, the President gives it broad power to keep information about the trade policies it advances and negotiates, secret. Let me tell you, the USTR is making full use of this authority.
As the Chairman of the Senate Finance Committee’s Subcommittee on International Trade, Customs, and Global Competitiveness, my office is responsible for conducting oversight over the USTR and trade negotiations. To do that, I asked that my staff obtain the proper security credentials to view the information that USTR keeps confidential and secret. This is material that fully describes what the USTR is seeking in the TPP talks on behalf of the American people and on behalf of Congress. More than two months after receiving the proper security credentials, my staff is still barred from viewing the details of the proposals that USTR is advancing.
Mr. President, we hear that the process by which TPP is being negotiated has been a model of transparency. I disagree with that statement. And not just because the Staff Director of the Senate subcommittee responsible for oversight of international trade continues to be denied access to substantive and detailed information that pertains to the TPP talks.
Mr. President, Congress passed legislation in 2002 to form the Congressional Oversight Group, or COG, to foster more USTR consultation with Congress. I was a senator in 2002. I voted for that law and I can tell you the intention of that law was to ensure that USTR consulted with more Members of Congress not less.
In trying to get to the bottom of why my staff is being denied information, it seems that some in the Executive Branch may be interpreting the law that established the COG to mean that only the few Members of Congress who belong to the COG can be given access to trade negotiation information, while every other Member of Congress, and their staff, must be denied such access. So, this is not just a question of whether or not cleared staff should have access to information about the TPP talks, this is a question of whether or not the administration believes that most Members of Congress can or should have a say in trade negotiations.
Again, having voted for that law, I strongly disagree with such an interpretation and find it offensive that some would suggest that a law meant to foster more consultation with Congress is intended to limit it. But given that the TPP negotiations are currently underway and I - and the vast majority of my colleagues and their staff - continue to be denied a full understanding of what the USTR is seeking in the agreement, we do not have time to waste on a protracted legal battle over this issue. Therefore, I am introducing legislation to clarify the intent of the COG statute.
The legislation I propose is straightforward. It gives all Members of Congress and staff with appropriate clearance access to the substance of trade negotiations. Finally, Members of Congress who are responsible for conducting oversight over the enforcement of trade agreements will be provided information by the Executive Branch indicating whether our trading partners are living up to their trade obligations. Put simply, this legislation would ensure that the representatives elected by the American people are afforded the same level of influence over our nation’s policies as the paid representatives of PHRMA, Halliburton and the Motion Picture Association.
My intent is to do everything I can to see that this legislation is advanced quickly and becomes law, so that elected Members of Congress can do what the Constitution requires and what their constituents expect.”
Share wisely, and take care.