Extreme Reach Sponsors CLOUD COMPUTING WEST 2012

The DCIA and CCA proudly announce that Extreme Reach has signed on as a sponsor of the CLOUD COMPUTING WEST 2012 (CCW:2012) business leadership summit taking place November 8th-9th in Santa Monica, CA.

In 2008, some of the world’s most successful agencies asked Extreme Reach to help them find a better way to deliver and manage their video advertising. With the lines blurring between TV, the web, mobile and countless other video media, it was becoming clear that advertisers needed to think differently about video campaigns.

Extreme Reach was born to unify the execution, viewer experience, rights and tracking of video advertising from screen to shining screen. Today, Extreme Reach is the leading provider of video ad serving, distribution and management solutions.

Its cloud-based platform helps customers unify and simplify every aspect of video ad campaigns across media, including TV, web, cinema, outdoor, and mobile.

To deliver across all of those channels, Extreme Reach operates the world’s largest digital video ad delivery network, connecting hundreds of advertisers, agencies and post-production houses with thousands of video media destinations.

Extreme Reach connects and simplifies the world of video advertising with one platform, one network, one path to video everywhere.

Extreme Reach is proud to employ the most experienced team in the industry and offers the most advanced production service facilities in its offices across North America: New York, Chicago, Burbank, Seattle, Detroit, Dallas, Louisville, Toronto, and its headquarters in Needham, MA.

CCW:2012 will feature three co-located conferences focusing on the impact of cloud-based solutions in the industry’s fastest-moving and most strategically important areas: entertainment, broadband, and venture financing.

Extreme Reach will participate in a panel discussion at the Entertainment Content Delivery conference within CCW:2012.

CCW:2012 registration enables delegates to participate in any session of the three conferences being presented at CCW:2012 — ENTERTAINMENT CONTENT DELIVERY, NETWORK INFRASTRUCTURE, and INVESTING IN THE CLOUD.

At the end of the first full day of co-located conferences, attendees will be transported from the conference hotel in Santa Monica to Marina del Rey Harbor, where they will board a yacht for a sunset cruise and networking reception.

So register today to attend CCW:2012 and don’t forget to add the Sunset Cruise to your conference registration. Registration to attend CCW:2012 includes access to all sessions, central exhibit hall with networking functions, luncheon, refreshment breaks, and conference materials. Early-bird registrations save $200.

Simplifying Cloud Infrastructure Management

Excerpted from Computer Technology Review Report

The Distributed Management Task Force (DMTF), the organization bringing the IT industry together to collaborate on systems management standards development, validation, promotion, and adoption, announced on Tuesday the release of the new Cloud Infrastructure Management Interface (CIMI) specification.

The new specification standardizes interactions between cloud environments to achieve interoperable cloud infrastructure management between service providers and their consumers and developers, enabling users to manage their cloud infrastructure use easily and without complexity.

Cloud computing allows customers to improve the efficiency, availability, and flexibility of their IT systems over time. As companies have adopted cloud computing, vendors have embraced the need to provide interoperability between enterprise computing and cloud services.

DMTF developed CIMI as a self-service interface for infrastructure clouds, allowing users to dynamically provision, configure, and administer their cloud usage with a high-level interface that greatly simplifies cloud systems management.
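A self-service call under CIMI's REST-over-HTTP model can be pictured as an HTTP POST carrying a small JSON resource that asks the provider to create a machine. The sketch below is illustrative only: the endpoint path, schema URI, and field names follow the general shape of CIMI v1.0 but are assumptions here, not verified extracts of the specification.

```python
import json

def build_machine_create(name, template_ref):
    """Return the JSON body for a hypothetical CIMI MachineCreate request.

    The resourceURI and field names approximate the CIMI v1.0 model;
    treat them as placeholders, not normative values.
    """
    return {
        "resourceURI": "http://schemas.dmtf.org/cimi/1/MachineCreate",
        "name": name,
        "machineTemplate": {"href": template_ref},
    }

# A consumer would POST this body to the provider's machine collection,
# e.g. "POST /cimi/machines", to provision a new virtual machine.
body = build_machine_create("web-01", "/cimi/machineTemplates/small")
payload = json.dumps(body)
print("POST /cimi/machines")
print(payload)
```

The point of the standard is that this same request shape would work against any conforming provider, which is what lets users move between clouds without rewriting their management tooling.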

“The CIMI standard is a critical piece for cloud infrastructure management because it alleviates complexity while improving flexibility, portability, and security,” said Winston Bumpus, Chairman of the Board, DMTF. “With the release of the CIMI v1.0 specification, DMTF offers a well-rounded, industry-wide solution for simplifying cloud infrastructure management.”

This release includes two components: the Cloud Infrastructure Management Interface (CIMI) Model and REST Interface over HTTP Specification, and the CIMI Primer. The CIMI specification is the centerpiece of DMTF’s Cloud Management Initiative, and is the first standard created by the Cloud Management Working Group (CMWG).

DMTF’s Cloud Management Initiative includes contributions from additional working groups including the Cloud Auditing Data Federation Working Group (CADF WG), the Network Services Management Working Group (NSM WG), the Software License Management (SLM) Incubator and the System Virtualization, Partitioning, and Clustering Working Group (SVPC WG).

Additional announcements are expected from DMTF cloud-related working groups early next year.

DMTF working groups and incubators collaborate with a number of industry organizations in an effort to unify their cloud management initiatives. These organizations include the Cloud Security Alliance (CSA), the China Communications Standards Association (CCSA), the China Electronics Standardization Institute (CESI), the Open Data Center Alliance (ODCA), the Storage Networking Industry Association (SNIA), the Open Grid Forum (OGF), the Object Management Group (OMG), The Open Group (TOG), the Metro Ethernet Forum (MEF), the Global Inter-Cloud Technology Forum (GICTF) and the TeleManagement Forum (TMF).

“Huawei is proud to have contributed to the development of a new standard for simplifying cloud infrastructure management,” said Jeffrey Wheeler, Chief Architect, Cloud Management and Standards at Huawei. “CIMI allows our customers to easily manage their cloud use, while improving portability and security. We look forward to supporting the CIMI standard in future product offerings, while contributing to its ongoing development.”

Report from CEO Marty Lafferty

Republicans finalized the language of their technology position statement at the Republican National Convention Wednesday in Tampa, FL, making the GOP the first of the two dominant US political parties to fully and officially embrace Internet freedom.

The convention center featured a substantial presence by Facebook, Twitter and, especially, Google.

“Internet Freedom” would entail the removal of “regulatory barriers” for technology businesses, resistance to international governance of the Internet, and the “constitutional protection” of personal data.

“We will remove regulatory barriers that protect outdated technologies and business plans from innovation and competition, while preventing legacy regulation from interfering with new technologies such as mobile delivery of voice and video data as they become crucial components of the Internet ecosystem,” said the finalized draft.

“We will resist any effort to shift control away from the successful multi-stakeholder approach of Internet governance and toward governance by international or other intergovernmental organizations,” it added.

“We will ensure that personal data receives full constitutional protection from government overreach and that individuals retain the right to control the use of their data by third parties,” it went on to say.

The GOP effort is modeled on the Technology Manifesto recently issued by the Ron Paul-founded Campaign for Liberty, as well as on more right-of-center libertarian tech policy voices.

These voices include TechFreedom President Berin Szoka, Mercatus Center Senior Research Fellow Adam Thierer, Associate Director of Technology Studies at the Competitive Enterprise Institute Ryan Radia, and Netcompetition President Scott Cleland.

David Segal, Executive Director of Demand Progress, called the platform plank “a big victory for the Internet” and said that under its terms, lawmakers who abided by the language would have opposed the Stop Online Piracy Act (SOPA) and the recent cybersecurity bill.

Fred Campbell, Director of the Competitive Enterprise Institute’s Communications, Liberty and Innovation Project, said that the significance of the Internet language is that it reflects conservatives’ increased attention to the tech sector.

“Conservative movements haven’t been really focused on technology in recent years,” he said. “The significance of it is that it shows they are focused on the issues surrounding the Internet.”

The Republicans’ “Internet Freedom” is also modeled off of the efforts of GOP politicians in both chambers of Congress.

Members of Congress Marsha Blackburn (R-TN), Darrell Issa (R-CA), Mary Bono Mack (R-CA), Ron Paul (R-TX), Fred Upton (R-MI), and Greg Walden (R-OR) are some of the leaders in the House on Internet issues.

Senators Kay Bailey Hutchison (R-TX), Rand Paul (R-KY), and Marco Rubio (R-FL) have led Republican efforts in the Senate.

The platform language signals that Republicans are looking to modernize what it called the “woefully out of date” Telecommunications Act of 1996, a law which Republicans in the tech policy community believe has kept the Obama administration “frozen in the past.”

The remainder of the potential party plank focuses on the Federal Communications Commission’s (FCC) responsibilities under that law.

“Today’s technology and telecommunications industries are overseen by the FCC, established in 1934 and given the jurisdiction over telecommunications formerly assigned to the Interstate Commerce Commission, which had been created in 1887 to regulate the railroads,” said the finalized draft language.

“An industry that invested $66 billion in 2011 alone needs, and deserves, a more modern relationship with the federal government for the benefit of consumers here and worldwide,” it said, stating that the technology industry was being governed by precedents set in the 19th century.

The language also criticized the Obama administration and the FCC for the way the agency has handled its responsibility to conduct auctions for the licenses to various frequency ranges of electromagnetic spectrum.

It states that the Republican Party would “call for an inventory of federal agency spectrum to determine the surplus that could be auctioned for the taxpayer’s benefit.”

“The current Administration has been frozen in the past,” said the language. “It has conducted no auction of spectrum, has offered no incentives for investment, and through the FCC’s net neutrality rule, is trying to micromanage telecom as if it were a railroad network.”

“It inherited from the previous Republican Administration 95 percent coverage of the nation with broadband. It will leave office with no progress toward the goal of universal coverage — after spending $7.2 billion more,” said the language.

The lack of progress on establishing universal broadband coverage, the draft language said, hurt the farmers and ranchers in rural America, and small business manufacturers. It called for the provision of “predictable support for connecting rural areas so that every American can fully participate in the global economy.”

“With special recognition of the role university technology centers are playing in attracting private investment to the field, we will replace the Administration’s Luddite approach to technological progress with a regulatory partnership that will keep this country the world leader in technology and telecommunications,” it said.

The DCIA commends the GOP for articulating its position and challenges Democrats to respond to these important issues and serious concerns in their platform. Share wisely, and take care.

Cloud Strategy: Choose Wisely

Excerpted from GigaOM Report by Dave Roberts

As executives contemplate the emergence of cloud computing, it’s important that they understand the questions they need to ask about why they’d adopt the new IT paradigm. Those deciding should consider the history and decisions made by Borders, the bookstore chain. Its execs chose poorly.

Choices matter. Just ask Indiana Jones. In The Last Crusade, he was forced to pick the Holy Grail out of a lineup of cups that spanned everything from a crude wooden model to a high-end chalice apparently designed by Faberge. The stakes were high. His adversaries chose poorly; Indy chose wisely, and won the day. Cloud computing strategies are a lot like that.

Every decade or so, we’re confronted with the arrival of a new mega-technology that has the power to shape our businesses in powerful ways. In the 1980s, it was personal computing. In the 1990s, it was the Internet. Now, we’re faced with cloud computing. It’s important that we choose a wise strategy for dealing with this technology. In order to understand the future, let’s take a look at the past.

In 1994, Borders Group was already one of the largest book retailers in the world. That same year, Jeff Bezos founded Amazon, recognizing that physical bookstores were limited in the number of titles they could stock. In contrast, an online storefront could “stock” as many titles as needed, delivering a huge competitive advantage in product selection.

These two organizations had vastly different Internet strategies. Without the Internet, Amazon simply couldn’t exist — the web was central to its business model. In contrast, Borders largely ignored the Internet through the 1990s. Instead, Borders pursued a traditional retail strategy, opening large stores in the US and expanding internationally in Europe and Asia.

Borders finally took note of the Internet in 2001, deciding to open an e-commerce storefront. The company knew that it didn’t understand e-commerce, and so it made the decision to outsource its online operations to an expert — Amazon. In hindsight, the move was foolish, but it reflects a vastly different strategic vision and set of choices about how to view the new Internet technology.

For Borders, the Internet was a way to take orders for books and augment transactions occurring at its retail stores. Amazon viewed the Internet as a competitive weapon that could deliver strategic differentiation through greater selection and ease of purchase.

Borders ended its Amazon alliance in 2008 and finally developed its own online presence, but the company never found its stride. In early 2011, Borders filed for bankruptcy and finally ceased operations in late 2011.

There are some lessons here:

New technologies can significantly change the way that businesses operate, creating new business models and rendering old ones obsolete. Make sure you’re on the right side of that transition before you decide to move slowly with the adoption of a new technology.

The market will often take a wait-and-see approach with new technology, particularly established enterprises. While the Amazon vs. Borders comparison provides one of the starkest examples of Internet success and failure, Borders wasn’t the only company that “didn’t get it.” Barnes & Noble also struggled to incorporate e-commerce into its business model, for instance.

So, choose your technology strategy quickly and wisely. And remember that if you choose not to decide, you still have made a choice.

Today, the Internet transition is behind us, but the cloud computing transition is upon us. The conventional wisdom says that the biggest benefit of cloud computing is the cost reduction that comes from efficient external cloud suppliers operating at massive scale, and that the biggest risk is security.

Do CIOs have their heads in the sand?

That analysis provides a huge mental crutch for people who are comfortable with the status quo. Budget savings are always interesting, but they are rarely compelling. If we’re profitable under the current cost structure, there is less pressure to change our behavior to save money. And that’s particularly true if that means taking additional risk on things like security. Better to pay more for the moment and be safe, the conventional wisdom says, than to be overly aggressive and get burned. Let someone else go first. That’s exactly how Borders approached the Internet.

Here are some questions that might help shape your thinking about a cloud computing strategy:

Is information technology a core input to your business strategy? If you’re a Web 2.0 company, the answer is obviously, “Yes!” If you’re a manufacturing company in a very old, stable market sector, on the other hand, the answer may be “No,” but remember that Borders didn’t think the Internet was core. And they were right as long as book selling remained a brick-and-mortar business.

What is business agility worth to you? Again, if you’re in an inherently slow-moving business sector, the answer might be, “Not much.” But remember that book selling wasn’t very fast-moving in the 1990s either, and Amazon has used the Internet multiple times to evolve its business model, lately innovating with e-books. In 2011, Amazon reached the crossover point where it sold more Kindle e-books than physical books, which really starts to undercut competitors based on brick-and-mortar storefronts.

What is the risk to your company if your competitors embrace cloud computing first? If that’s troubling and you can think of ways the technology can be used against you, then you’ll want to move before they do. It’s important to think outside the box here and consider new market entrants. If you’re Borders in the 1990s, you need to be thinking about young upstart Amazon, not just your historical competitor Barnes & Noble.

The Amazon vs. Borders comparison puts technology adoption strategies in stark perspective. Cloud computing is upon you right now and it’s important that you create a proactive strategy for its adoption in your enterprise.

It may be the case that you can slow-roll your adoption, taking advantage of the wisdom and experience of first-movers, but make sure you aren’t being lulled into a false sense of security, sustaining the status quo just because it’s easy and low risk for the moment. Borders did that and got crushed in the process. They chose poorly and paid the price.

Cloud IT Advances in Japan as Softbank Teams with VCE

Excerpted from CIO Today by Barry Levine

Softbank Telecom and the Virtual Computing Environment Company (VCE) have formed a strategic alliance to boost the use of cloud computing by companies of all sizes in Japan. VCE, created by Cisco and EMC with investments from VMware and Intel, offers converged infrastructure and cloud-based computing solutions to reduce IT costs.

Softbank will act as a service provider and reseller of VCE Vblock Systems, while VCE will be available as a systems consultant for installations by Japanese customers, whether at home or abroad.

Vblock Systems utilize technologies from VMware, Cisco and EMC. Softbank, which uses Vblock in its data centers around the world, says it has standardized all of its cloud systems around this solution. Vblock will be offered as an infrastructure-as-a-service (IaaS), and will also be available as an on-site private installation or via a hybrid cloud offering.

The Series 300 edition of the Vblock Systems is designed for enterprise data center needs for unified storage, high-bandwidth applications, and ease of use, and is optimized for virtual data center environments. It is architected on EMC VNX storage, Cisco UCS blades and switches, EMC Ionix UIM/P for provisioning, and VMware vSphere for virtualization.

The Series 700 is the top-of-the-line offering. It is targeted at mid-sized to large enterprise customers running thousands of virtual machines, with the most demanding requirements for ERP, CRM, database, collaboration and messaging. It utilizes Cisco UCS blade servers, Nexus and MDS network switches, EMC VMAX and VMAXe storage arrays, EMC Ionix UIM/P for provisioning, and VMware vSphere for virtualization.

Softbank also offers Vblock Data Protection, providing enhanced backup and recovery, data replication, business continuity, and workload mobility.

Ken Miyauchi, Representative Director and COO of Softbank Telecom, said that the relationship with VCE will help extend Softbank’s VMware expertise. The alliance is expected to “help more companies lower costs and improve efficiencies through virtualization and cloud computing.”

The Vblock System, he said, is a “globally standardized platform that [Softbank] can offer either as a service,” or as a private or hybrid cloud solution, with equal cost savings and performance.

VCE CEO Praveen Akkiraju told news media that the joint effort will help companies in Japan “take advantage of the cost and business agility benefits of highly virtualized data centers,” while increasing the adoption of cloud computing through use of an intelligent converged infrastructure.

Softbank Telecom provides network communication services, including environments for the mobile Internet, for the Softbank Group of companies. It also provides information and communication technology services to corporate customers through a cloud computing service called White Cloud.

VCE’s prepackaged solutions, which it describes as “the industry’s first completely integrated IT offering with end-to-end vendor accountability,” are available through a partner network.

Vertical markets serviced by VCE include financial, public sector, retail, education, healthcare, manufacturing, and utilities.

APIs Determine Winners in Cloud Computing Wars

Excerpted from Programmable Web Report by Michael Vizard

One of the things that not many IT people fully appreciate is how much scale really matters when it comes to cloud computing. The more applications that run on a particular cloud computing platform, the more the cost of running those applications is distributed across an increasingly larger number of servers and storage systems.

Eventually, a cloud service provider reaches enough critical mass that every new application winds up helping the cloud service provider to drive infrastructure costs down, while at the same time increase overall performance.
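The economics behind that critical mass are simple division: a largely fixed platform cost spread over an ever-larger number of applications. A toy calculation, with entirely made-up numbers, shows how quickly per-application cost falls:

```python
# Toy illustration of cloud economies of scale: a largely fixed platform
# cost divided across a growing number of hosted applications.
# All figures are invented for illustration, not real provider data.
fixed_cost = 1_000_000.0  # hypothetical annual infrastructure cost

for apps in (100, 1_000, 10_000):
    per_app = fixed_cost / apps
    print(f"{apps:>6} apps -> ${per_app:,.2f} per app per year")
```

Each tenfold increase in hosted applications cuts the per-application cost tenfold, which is why every new workload strengthens the largest providers' position.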

This is clearly the case with Amazon, which is now the leading provider of cloud computing services in the industry, so much so that it’s the shadow being cast by Amazon from Seattle, rather than Microsoft, that is being felt most this week at the VMworld 2012 Conference.

At VMworld this week VMware essentially announced an effort to create a federated cloud computing ecosystem based around a set of vCloud offerings that include APIs to make it easier to manage instances of cloud computing across multiple data centers.

In effect, this extends the virtual data center concept that VMware has been promoting over the last year. On top of that infrastructure, VMware hopes more cloud service providers will deploy Cloud Foundry, a toolkit VMware created to foster an open source cloud application development community that supports multiple application development languages running on top of multiple platform-as-a-service (PaaS) offerings from different vendors.

While Cloud Foundry can be deployed on top of Amazon by being configured to support Amazon APIs, it also provides the option of using APIs that are specific to Cloud Foundry.

Cloud service providers such as Piston Cloud Computing see that as a critical capability because Amazon has sole control over its APIs. Piston Cloud Computing CEO Joshua McKenty says service providers prefer to see the adoption of the Cloud Foundry APIs that make it easier for them to differentiate themselves from Amazon, which optimizes its APIs for use in an environment that was never intended to support private cloud computing environments.

To show what might be possible in that context, Piston Cloud Computing demonstrated at the VMworld conference an instance of Cloud Foundry integrated with the OpenStack cloud management platform, which has been gaining a fair amount of support from multiple server manufacturers, application vendors and cloud service providers.

Elsewhere, another cloud service provider, Skytap, showcased the ability to launch a PaaS environment based on Cloud Foundry in under a minute, along with support for deployments that would span hybrid cloud computing environments. According to Brett Goodwin, Skytap Vice President of Marketing, the cross-platform capability will prove to be a key differentiator for cloud service providers such as Skytap compared to Amazon’s approach, which is primarily focused on applications that only run on its cloud service.

To a certain degree VMware and its partners are trying to link vCloud and Cloud Foundry together. But there are other PaaS providers, such as ActiveState, that support multiple infrastructure-as-a-service (IaaS) platforms. According to ActiveState CEO Bart Copeland, as more organizations become aware of PaaS capabilities, the sooner it becomes apparent that IaaS is a commodity service.

The challenge becomes recognizing how Amazon is essentially leveraging its market position to try to lock developers into its service via its proprietary APIs. PaaS offerings, on the other hand, shield developers from that issue because they operate at a higher level of abstraction in the software stack.

AppFog, a rival provider of a PaaS offering that supports multiple application development languages, is pursuing a similar IaaS-neutral approach. According to AppFog CEO Lucas Carlson, it doesn’t make any sense to ask developers to master multiple APIs to invoke hardware services when one API delivered on a PaaS service that can be hosted on multiple IaaS offerings will mask that complexity from developers.
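The idea Carlson describes can be sketched as a single deploy call at the PaaS layer that dispatches to interchangeable infrastructure back ends, so the developer never touches provider-specific APIs. The provider names and methods below are invented for illustration; they are not AppFog's or any vendor's actual interfaces.

```python
# Hedged sketch: one developer-facing deploy() API masking multiple
# hypothetical IaaS drivers. Class and method names are illustrative only.

class IaaSDriver:
    """Provider-specific provisioning hidden behind a common interface."""
    def provision(self, app: str) -> str:
        raise NotImplementedError

class AmazonDriver(IaaSDriver):
    def provision(self, app: str) -> str:
        return f"ec2://{app}"          # stand-in for Amazon-specific calls

class OpenStackDriver(IaaSDriver):
    def provision(self, app: str) -> str:
        return f"nova://{app}"         # stand-in for OpenStack-specific calls

class PaaS:
    def __init__(self, driver: IaaSDriver):
        self.driver = driver

    def deploy(self, app: str) -> str:
        # The developer sees one API regardless of the infrastructure below.
        return self.driver.provision(app)

print(PaaS(AmazonDriver()).deploy("myapp"))
print(PaaS(OpenStackDriver()).deploy("myapp"))
```

Swapping the driver moves the workload to a different IaaS without the application code changing, which is the portability argument in miniature.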

Of course, IT organizations need to better understand the role of the PaaS service. Many of them are concerned they are adding yet another layer of expensive middleware to their operations. In reality, PaaS pricing, says Carlson, is now roughly equivalent to IaaS pricing. The next challenge, says Carlson, is getting IT organizations to differentiate between IaaS, open source toolkits such as Cloud Foundry, and what an actual PaaS does in terms of making application workloads truly portable at the click of a button.

While it’s still early in terms of anything to do with cloud computing, it’s apparent that there is a lot of jockeying for position going on at all layers of the cloud computing stack. The question that developers need to ask themselves is just what parts of that debate they actually need to pay attention to, versus the parts they may ultimately be able to simply ignore.

Octoshape & MediaInternet Partner for LatAm Online Video

Excerpted from RapidTV News Report by Inaki Ferreras

Internet telecommunications software creator Octoshape has forged a partnership with MediaInternet to power broadband TV video distribution services for South American media broadcasters and Internet service providers.

Regarded as a premier provider of web business and web media in South America, MediaInternet is dedicated to improving transactions between broadcast and web media. The collaboration between both parties opens another market for Octoshape and enables MediaInternet to offer a broader range of services to its customer base.

“The expansion of broadband connectivity in the countries of the Southern Cone, namely Chile, Peru, Bolivia, Paraguay, Uruguay and Argentina, is enabling broadcasters to expand into new markets,” said Charlie Deane, CEO, MediaInternet. “Octoshape is helping us accelerate this expansion with its Infinite HD-M platform, which allows us to help broadcasters deliver premium content in high quality to worldwide audiences in multiple devices, broadcast head-ends and operators, over the Internet.”

MediaInternet believes that it will now be able to deliver premium content to viewers via IP connections rather than traditional satellite or cable. The video distribution for MediaInternet’s OTT service will be provided to South American consumers via Octoshape’s Infinite HD-M Federated Multicast Broadband TV platform.

The Great Challenge and Opportunity of Cloud: Interoperability

Excerpted from GigaOM Report by James Urquhart

Cloud computing and distributed applications are part of a greater shift to building out an ecosystem with inter-dependent parts. This may seem obvious, but what is less obvious is how the industry will interoperate and develop systems that let information flow through the ecosystem.

Interoperability, and the challenge of maintaining control of operations in the face of it, is a central issue for those that operate distributed applications on the Internet — or “in the cloud.”

In this case, however, I’m not talking simply about creating and controlling interoperability from the developer level. Tools and services like Dell’s Boomi or IBM’s CastIron have existed for years, and have had some success in delivering more flexibility to integration between applications and services. However, these services are focused on solving the developer’s key issues with integration — how to make sure messages move between components based on a process definition and one or more translations, if needed.

But today application operators see a tangential set of problems, and these problems are increasingly difficult to deal with. For the operators, the problem of interoperability has several parts:

Maintaining interoperability with dependencies. For the developer, the problem of managing dependencies is one of logic — finding the right configuration of code and file dependencies to allow the application to execute successfully. This is largely a static problem, though one that increasingly requires devs to design for resiliency: if one dependency disappears, an alternative method of achieving the task at hand should be attempted instead. For operations, however, the problem is ongoing, as operations has to deal with the reality of why a dependency failed and with the component or components that depended on it.

Maintaining interoperability for dependents. The rapid growth of cloud services and APIs, on the other hand, make it operations’ job to deliver availability, performance and consistency of the software systems they operate to those that depend on that software. If you plan on earning business via services delivered via APIs, your operations team has to ensure that those services are there when your customers need them, without fail. Even if you simply provide data via batch files to a partner or customer, that mechanism has to run as the customer expects it to, every time.

Maintaining interoperability with things operations controls. The other key aspect of operations’ focus on interoperability has to do with control. There is a range of responsibility inherent in operating software systems that interact with one another. The goal of operations, in this case, is to optimize how these systems work together once deployed. Some of that means going back to developers and asking for changes to the applications or data themselves, but often much of that optimization has to do with network and storage configuration, tuning virtualization platforms, ensuring security systems and practices are in place, and so on.

Maintaining interoperability with things operations doesn’t control. Perhaps the most interesting aspect of application operations in the cloud computing era is the increased need to maintain control of one’s applications in the face of losing control over key elements on which those applications depend. Dealing with upgrades of third party services, handling changes to network availability (or billing, for that matter), or even ensuring that data is shipped to the correct location, on the correct media, by the right delivery service, are all tasks in which operations can only affect one side of the equation, and has to trust and/or respond to changes on the other side.
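The design-for-resiliency point in the first item above — try an alternative when a dependency disappears — can be sketched as a simple fallback wrapper. The service functions here are invented placeholders, not a real API; real systems would add retries, timeouts, and alerting on top of this skeleton.

```python
# Minimal sketch of a dependency fallback: call the primary service,
# and if it fails, attempt the alternative. Placeholder services only.

def resilient_call(primary, fallback, request):
    """Try the primary dependency; on failure, use the alternative."""
    try:
        return primary(request)
    except Exception:
        # Operations would also want to log/alert here so the failed
        # dependency gets investigated, not just silently routed around.
        return fallback(request)

def flaky_service(req):
    raise ConnectionError("dependency unavailable")

def backup_service(req):
    return f"handled:{req}"

print(resilient_call(flaky_service, backup_service, "order-42"))
```

The developer's half of the problem ends with this branch; the operator's half — understanding why the primary failed and what else depended on it — is the ongoing work the article describes.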

None of this is a shock to most IT operators, but there is one other element that is creating the rapid expansion of complexity facing operations today, and that is the sheer volume of integrations between software and data elements both within and across organizational boundaries. It’s no longer a good idea to think of individual applications in isolation, or to assume a data element has one customer, or even one set of customers with a common purpose for using that data.

Today we live in a world where almost everything that matters in business is connected by a finite number of degrees of separation from just about everything else in that category. Cloud computing is one driver, but the success of REST APIs is another, as is the explosion of so-called “big data” and analytics across businesses and industries.

We, in business software, exist in large part to automate the economy, in my opinion. The economy is a massive, highly integrated complex adaptive system. Our software is rapidly coming to mimic it.

All of this brings me to the opportunity that this interoperability explosion brings to operators and vendors of operations tools alike. If we are going to manage software and data that interoperates as a system at such a massive scale, we need tools that interoperate in support of that system. We need to begin to implement much of what Chris Hoff called for five years ago from the security software community:

“We all know that what we need is robust protocols, strong mutual authentication, encryption, resilient operating systems and applications that don’t suck.

But because we can’t wait until the Sun explodes to get this, we need a way for these individual security components to securely communicate and interoperate using a common protocol based upon open standards.

We need to push for an approach to an ecosystem that allows devices that have visibility to our data and the network that interconnects them to tap this messaging bus and either enact a disposition, describe how to, and communicate appropriately when we do so.

We have the technology, we have the ability, we have the need. Now all we need is the vendor gene pool to get off their duff and work together to get it done. The only thing preventing this is GREED.”

Amen, Chris. That remains as true today as it was then. Only now the scope has exploded to include all of application and infrastructure operations, not just security software. While everyone is looking for standards that allow one tool to talk to another, we are missing the bigger picture. We need standards that allow every component in the operations arsenal to exchange events with any other component, within understood guidelines. That may be as simple as setting the expectations that any operations software will have both an execution and a notification API set.
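The execution-plus-notification idea above can be sketched in a few lines. This is only an illustrative sketch: the names here (OpsComponent, subscribe, execute) are invented for this example and do not belong to any real product or standard.

```python
from typing import Callable, Dict, List


class OpsComponent:
    """A hypothetical operations tool exposing both API sets."""

    def __init__(self, name: str) -> None:
        self.name = name
        self._subscribers: List[Callable[[Dict], None]] = []

    # --- notification API: other tools register to receive events ---
    def subscribe(self, callback: Callable[[Dict], None]) -> None:
        self._subscribers.append(callback)

    def _emit(self, event: Dict) -> None:
        for callback in self._subscribers:
            callback(event)

    # --- execution API: other tools ask this one to act ---
    def execute(self, action: str) -> str:
        result = f"{self.name}:{action}:ok"
        # Every execution produces a notification, so any other component
        # in the operations arsenal can observe it and react.
        self._emit({"source": self.name, "action": action, "result": result})
        return result


monitor_log: List[Dict] = []
deployer = OpsComponent("deployer")
deployer.subscribe(monitor_log.append)  # a monitoring tool taps the event stream
deployer.execute("restart-web-tier")
```

The point of the sketch is the pairing: any component can be driven (execute) and observed (subscribe) by any other, without a formal event taxonomy standing between them.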

Another option is a formal event taxonomy and protocol, but that option doesn’t interest me very much. Those standards tend to become outdated quickly and are far too restrictive.

One last thing: John Palfrey and Urs Gasser have written a book on interoperability. The most interesting aspect of the model they describe is a multi-tiered view of interoperability that supplements data and software interoperability with human and institutional interoperability. The latter two concepts are incredibly important in the new cloud-based systems world.

It’s not good enough to focus on software, protocols and APIs. We have to begin to work together as an ecosystem to overcome the human and institutional barriers to better IT interoperability. Unfortunately, lack of interoperability often benefits software vendors, and as Hoff noted above, the only thing preventing this is greed.

Elemental Sets-Up Video Cloud Service

Excerpted from CED Magazine Report by Brian Santo

Elemental Technologies is preparing a cloud service designed to provide high-volume, enterprise-class video solutions via Amazon Web Services (AWS).

The company plans to introduce Elemental Cloud at IBC in Amsterdam in two weeks. At the same time, it will begin making available Elemental Server Cloud Edition (CE), a file-based solution for high-speed, multi-format video conversion in the cloud.

The combination of Elemental Technologies’ video processing solutions with the scalability, elasticity and flexibility of the AWS infrastructure is aimed at media and entertainment companies looking to enhance multiscreen video offerings and grow audiences while generating greater revenues and decreasing capital expenses.

“The rapid growth of online and mobile video along with the dynamic evolution of formats, standards and distribution make cloud computing a natural complement to on-premise processing for large-scale video encoding applications,” said Sam Blackman, CEO and co-founder of Elemental Technologies.

“This gives our customers maximum speed, flexibility and price-performance in creating video content targeted to multiple screens and devices.”

Oracle Rallies PaaS Providers to Float Cloud Interop Spec

Excerpted from The Register Report by Neil McAllister

A consortium of seven technology vendors, including enterprise software heavyweights Oracle and Red Hat, has teamed up to produce an industry standard that they say will make it easier for customers to manage applications deployed in platform-as-a-service (PaaS) environments.

Called Cloud Application Management for Platforms (CAMP), the draft specification defines generic APIs for building, running, administering, monitoring, and patching cloudy applications.

So far, PaaS vendors have all provided their own bespoke interfaces for such management functions, making it difficult for customers to move existing cloudy apps to new platforms, which may offer completely different management interfaces than the ones they currently use.

“CAMP defines a simple API that enables customers to have an interoperable solution across multiple vendors’ offerings, manage application lifecycles easily, and move applications between clouds,” Don Deutsch, Oracle’s Vice President and Chief Standards Officer said.

In a blog post on Thursday, Oracle Principal Cloud Strategist Mark Carlson explained that CAMP is designed specifically for PaaS customers, who typically don’t want to become system administrators for the cloud infrastructure hosting their apps.

To that end, CAMP operates at a high level, defining interfaces that represent applications, their components, and any platform components that they depend on, while leaving low-level infrastructure details to the cloud provider.

“It’s important PaaS Cloud consumers understand that for a PaaS cloud, these are the abstractions that the user would prefer to work with,” Carlson says, “not virtual machines and the various resources, such as compute power, storage, and networking.”

Carlson says CAMP solves the problem of migrating cloudy apps from one PaaS vendor to another by mapping the requirements of applications and their components to the specific capabilities of the underlying platform.

What it does not attempt to do, however, is define any interfaces that don’t concern PaaS management. For example, if a platform provides a message-service bus, CAMP does not define a standard way to post a message to it.

CAMP is also programming-language and platform agnostic. It doesn’t define any interfaces that make it easier to migrate applications from Java EE environments to .Net, for example.
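The core idea Carlson describes, mapping an application's requirements to a platform's capabilities, can be sketched abstractly. The structures and names below are invented for illustration and are not taken from the actual CAMP specification.

```python
# Model an application by the platform capabilities its components require,
# then check whether a target platform can satisfy them before migrating.

APP = {
    "name": "storefront",
    "components": [
        {"name": "web", "requires": {"java-runtime", "http-routing"}},
        {"name": "db", "requires": {"relational-db"}},
    ],
}

PLATFORM_A = {"java-runtime", "http-routing", "relational-db", "message-bus"}
PLATFORM_B = {"java-runtime", "http-routing"}  # no managed database offered


def unmet_requirements(app, platform_capabilities):
    """Return the set of requirements the target platform cannot satisfy."""
    missing = set()
    for component in app["components"]:
        missing |= component["requires"] - platform_capabilities
    return missing


print(unmet_requirements(APP, PLATFORM_A))  # set(): migration is feasible
print(unmet_requirements(APP, PLATFORM_B))  # {'relational-db'}: it is not
```

Note that, as the article says, this stays at the level of platform components: nothing in the model touches virtual machines, storage, or networking details.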

Joining Oracle and Red Hat in developing CAMP are CloudBees, Cloudsoft, Huawei, Rackspace, and Software AG, all of which currently offer management solutions for PaaS clouds. Carlson admits that this is a comparatively small group as industry standards efforts go, but he says that this approach has advantages.

“For example, the way that each of these companies creates their platforms is different enough to ensure that CAMP can cover a wide range of actual deployments,” he writes.

So far, the group has completed a first draft of the CAMP specification (with Oracle being the largest contributor). On Thursday it announced that it has formed a technical committee to continue the work under the auspices of the OASIS standards organization, with the goal of defining interfaces for the most widely available platform services within the next 18 months.

Avaya Builds on “Collaborative Cloud” Platform

Excerpted from Channel Partners Report

Avaya on Tuesday released “Collaboration Pods,” components made up of Avaya, EMC and VMware products, to help Avaya’s Collaborative Cloud platforms deploy faster.

Collaborative Cloud is Avaya’s formal strategy for delivering communications and collaboration services via public, private and hybrid cloud models. Collaboration Pods comprise virtualized storage, computing and networking. Combined, they support complex applications such as virtual desktop infrastructure.

Avaya is packaging its pods in several configurations to meet a range of needs. Each bundle includes the Avaya Virtual Services Platform 7000 Ethernet switch.

All of this means cloud providers of all sizes, even smaller ones doing IaaS and CaaS, will see faster time to revenue, Avaya said.

eBay, Fujitsu, Verizon Establish Big Data Working Group

Excerpted from Fierce Telecom Report by Sean Buckley

The Cloud Security Alliance (CSA), a group centered on developing best practices for ensuring cloud computing security, announced Thursday it teamed with Fujitsu Labs, eBay, and Verizon Business to form a working group that will develop methods to tackle Big Data security issues.

Joining group Chairman Sreeranga Rajan of Fujitsu Laboratories of America are Neel Sundaresan of eBay and Wilco Van Ginkel of Verizon.

Given the complexity of managing big data streams that range from Internet traffic to sensor networks, the group will focus on developing “scalable techniques” to tackle data-centric security and privacy issues.

Rajan said that “Every day 2.5 quintillion bytes of data are being created, resulting in a myriad of data security and cloud-computing security concerns.”

By creating these techniques, the group aims to crystallize best practices for big data security and privacy; establish liaisons with other organizations in order to coordinate the development of big data security and privacy standards; and accelerate the adoption of novel research aimed at addressing security and privacy issues.

Specifically, the group will look to provide research and guides on six specific themes: Big Data-scale Crypto, Cloud Infrastructure, Data Analytics for Security, Framework and Taxonomy, Policy and Governance, and Privacy.

Providing security practices for the cloud, in particular, will benefit both service providers and the vertical markets they serve. Service providers that offer cloud and data center services will have additional security frameworks they can use to address their customers’ concerns as those customers offload various IT functions to them.

BitTorrent to Change How Artists Are Paid for their Work

Excerpted from Business Insider Report by Dylan Love

Matt Mason is the author of The Pirate’s Dilemma and one of BitTorrent’s newest employees. We recently had the opportunity to talk with him about BitTorrent, piracy, and the changing face of the media business. Here’s the full Q&A:

BUSINESS INSIDER: Why does BitTorrent seem like the go-to piracy tool?

MATT MASON: This is an incredible technology for moving large amounts of data across asymmetric networks like the Internet. And that’s the reason it’s been used as a tool for piracy. If you look back through history, the tool that’s always been used for piracy has been the most efficient, ruthlessly disruptive thing.

And so often at the birth of new technologies you’ll find this period of chaos where people don’t really understand what’s going on. So when Edison invented the record player, live musicians looked at this device and said “Oh, my God. This device plays live music in venues…”

BI: We’re going to be put out of business.

MM: Right, and that was the birth of the recording industry. And I feel like we’re at a similar point in time right now. So the way we look at it is that this is a really amazing technology, there are lots of great uses for it. The idea that the audience is the server farm can apply to a lot of different business models and can really help a lot of different people save money and do things differently.

And we should be championing that. And with regard to the content industries, I feel like the way we could be of most service in the world… we can’t stop or block piracy any more than anybody else can, but we can talk to 160 million people who are using BitTorrent.

And we could say to them, “Hey, are you interested in this?” We don’t track our audience, but we know who they are. They talk to us, they engage with us when we ask them to. We know a lot about them. They’re mostly men aged 16-24, either in college or just graduated. The secondary audience is sort of 18-34.

There are definitely differences and nuances around the world, but that’s broadly who they are. And they’re really avid, passionate consumers of content. And if you show them a piece of content that they like, they’re much more likely to go and buy the album. They’re much more likely to go and sign up for an email and get updates from their artists. They’re much more likely to go and actually see the show in a theatre or buy the t-shirt. They’re very, very passionate users and consumers of the things that they like.

BI: BitTorrent’s really gotten slammed with the perception that it’s strictly for piracy. What ways have people been using it more legitimately?

MM: If you look at any organization with a large user base, whether it’s Facebook or Twitter or Wikipedia or Blizzard, all of these guys use BitTorrent to push updates to their servers. And frankly, if their CTOs weren’t using BitTorrent then they shouldn’t be CTOs of companies like that. It’s just a really good technology to use for a lot of different things. Being seen as a piracy tool is a burden to us at this point.

Everybody who works at BitTorrent has this fervent belief in the power of distributed technology and what it can be used for next. If you look at the coming problems in the media industries, the things we’re going to be talking about in the next 10 years, it’s still going to be piracy. That’s not going to change. We’ve got more good ways to find alternatives to piracy than we’ve ever had. And we’re very, very much in the business of trying to create more of those and help people figure out more of those.

We released an API called BitTorrent Torque a couple of weeks ago, which is a way for developers to actually develop on top of BitTorrent inside the web. And the reason we did that is we want to see people actually using the BitTorrent protocol as the back end for websites where people are trying to host content without having massive, massive bandwidth fees.

BI: Your browser treats a torrent download as if it were just a normal file.

MM: We’re not great at building beautiful UIs. We’re not going to build the next social network. But we’re really happy if someone else wants to use it. It’s definitely good if other people are figuring out great new ways to use BitTorrent for really cool stuff. So that’s the idea behind it.

BI: How easy or complicated was it to get Torque off the ground?

MM: It was a dedicated team, a small team. It was really kind of a passion project. At BitTorrent, we’re big believers in letting people run with a good idea. And it was an engineer named Patrick Williams who was just like, “This is a great idea. I’m going to build this.”

The big idea behind BitTorrent is the audience is the server farm and the more of them there are, the more power you have. I think it’s the closest thing we have to a perpetual motion machine. It’s definitely got more uses than people have explored.

BI: Can you talk at all about anything that BitTorrent has down the road?

MM: We’ve got so many things that are coming up. We’ve been experimenting with new ways to get content creators into the BitTorrent ecosystem and new ways for them to talk to our audiences and also make money. So we ran this experiment with DJ Shadow a few weeks ago, where he did the first kind of monetized torrent with us: he didn’t ask people to pay for it, but there was an advertising offer inside the actual torrent.

The idea being every time the torrent was shared there was at least the opportunity for people to engage with him in a way that helped him monetize the piece of content in that torrent.

BI: So how did that work?

MM: There was a promotional read-me inside the torrent. We also offered it in various places: if you’re signing up for BitTorrent, we always offer you a piece of content from an artist that comes in, and we say here’s something for you to get started, as a way to promote artists.

Ways Cloud Computing Will Change by 2020

Excerpted from Cloud Computing Journal by Patrick Burke

Think cloud computing is just the latest IT fad? Think again. Forrester predicts the global cloud computing market will grow from $35 billion in 2011 to around $150 billion by 2020 as it becomes key to many organizations’ IT infrastructures.

According to an article on ZDNet, by 2020 cloud is going to be a major – and permanent – part of the enterprise computing infrastructure.

By 2020, a generational shift will have occurred in organizations. A new generation of CIOs who have grown up using cloud-based tools will be in charge, making them far more willing to adopt cloud on an enterprise scale.

With these developments in mind, here are 10 ways in which the cloud of 2020 will look radically different to the way it does today.

Rackspace has announced the unlimited availability of cloud databases and cloud servers powered by OpenStack, along with a powerful and streamlined new control panel.

These solutions further expand Rackspace’s broad cloud hosting portfolio, used today by more than 180,000 customers worldwide.

These products mark the first time a company has deployed a large-scale, open source public cloud powered by OpenStack. Customers can now select from private, public or hybrid offerings and have the flexibility to deploy their solutions in a Rackspace data center or in another data center of their choice.

Rackspace’s open cloud products also give application developers and IT organizations in businesses large and small the ability to build, test and deploy applications in the cloud for the first time without being locked-in.

The new Cloud Servers powered by OpenStack deliver increased efficiency, scalability and agility to customers, who can launch as many as 200 reliable cloud servers in 20 minutes.

Rackspace recently announced a celebration of the second anniversary of the OpenStack open source cloud computing platform for building public or private clouds, as eWEEK reported.

The effort to utilize cloud computing in the federal government has officially begun. In June, the Federal Risk and Authorization Management Program (FedRAMP) reached its initial operating capabilities to certify businesses that meet federal cloud services standards – moving government closer to using the cloud to reduce costs and more effectively serve citizens.

But, according to an article in Federal Times, more commitment is needed to make the cloud vision a reality. Federal decision-makers need to pick up plans for an overarching federal IT strategy. A comprehensive guide would eliminate redundancy and provide agencies with clear guidance on how to prioritize initiatives. It would go a long way toward reducing the cost of government and improving the delivery of services to citizens.

According to author Michael Hettinger, “Cloud computing is transformational, and the government has put the building blocks in place to effectively transform its IT infrastructure, but some core issues must be addressed for federal IT reform to reach its full potential.”

Two of the buzziest competitors in cloud computing are settling into coexistence – and maybe figuring out ways to take on the giant in the market, Amazon, according to the New York Times.

Like its competitor Dropbox, Box offers a little bit of data storage free, then charges for additional amounts. Both companies make money from a relatively small number of paying customers who need large amounts of storage, according to the NY Times.

Both companies are finding ways to put into their online storage more features and user-friendly services than are found in Amazon’s Simple Storage Service, or S3, one of the first big public cloud computing initiatives.

By putting easy-to-use apps on smartphones, Box and Dropbox appear to be exploiting the dissatisfaction some customers have experienced with Amazon.

According to the NY Times, some Amazon customers still find the service highly technical to use, and complain about a lack of customer service.

Coming Events of Interest

ICOMM 2012 Mobile Expo — September 14th-15th in New Delhi, India. The 7th annual ICOMM International Mobile Show is supported by the Government of India, MSME, DIT, NSIC, CCPIT China and several other domestic and international associations. New technologies, new products, mobile phones, tablets, electronics goods, and business opportunities.

ITU Telecom World 2012 – October 14th-18th in Dubai, UAE. ITUTW is the most influential ICT platform for networking, knowledge exchange, and action. It features a corporate-neutral agenda where the challenges and opportunities of connecting the transformed world are up for debate; where industry experts, political influencers and thought leaders gather in one place.

CLOUD COMPUTING WEST 2012 – November 8th-9th in Santa Monica, CA. CCW:2012 will zero in on the latest advances in applying cloud-based solutions to all aspects of high-value entertainment content production, storage, and delivery; the impact of cloud services on broadband network management and economics; and evaluating and investing in cloud computing services providers.

Third International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms – December 10th in Brussels, Belgium. Researchers, developers, and practitioners from academia, government, and industry will discuss emerging trends in cloud computing technologies, programming models, data mining, knowledge discovery, and software services.

2013 International CES – January 8th-11th in Las Vegas, NV. With more than four decades of success, the International Consumer Electronics Show (CES) reaches across global markets, connects the industry and enables CE innovations to grow and thrive. The International CES is owned and produced by the Consumer Electronics Association (CEA), the preeminent trade association promoting growth in the $195 billion US consumer electronics industry.

CONTENT IN THE CLOUD at CES - January 9th in Las Vegas, NV. Gain a deeper understanding of the impact of cloud-delivered content on specific segments and industries, including consumers, telecom, media, and CE manufacturers.