Utility Computing’s Blind Spot

By Outsourcing Center, Kathleen Goolsby, Senior Writer

Major IT hardware and software providers swept through 2003 with an ever-growing array of models that allow enterprises to purchase IT functionality on an as-needed basis. Suppliers introduced utility computing as a means of achieving flexibility and scalability as well as predictable, lower costs. But those were already among the best returns on outsourcing initiatives in the first place. So how did utility computing affect outsourcing?

Trend 1: Dollar Signs

After a period of being knocked flat like a KO’d boxer, IT budgets are getting up off the canvas. An October 2003 survey of 600 IT decision-makers, conducted by Gartner, Inc., a research and advisory firm headquartered in Stamford, Connecticut, and Soundview Technology Group, Inc., a research firm based in Old Greenwich, Connecticut, indicated that small and mid-size companies’ capital spending budgets will grow by 1.6 percent in 2004. That spending includes outsourcing.

The survey also found that this spending will be accompanied by a strict focus on return on investment and will favor new projects over infrastructure: investment in technology that supports competitive-advantage strategies.

Cal Hackeman, partner in charge of the technology industry practice at business advisory firm Grant Thornton, adds that “middle-market companies have deferred investment in technology for a couple of years at least. But businesses always knew they would need to eventually invest.”

“Executives are no longer opting to outsource solely to cut costs,” states Diane Shelgren, chief operating officer for North America at Accenture HR Services, based in Chicago, Illinois. “Instead, they choose to outsource to gain more control over business outcomes, support strategic planning and predict business results.”

Yet costs and return on investment remain critical components of executives’ agendas and often a primary driver for outsourcing. Hence buyers’ attraction to service providers’ utility computing models: pay-as-you-go offerings that provide access to the newest technologies and robust IT services. Like a telephone dial tone or electricity, users rely on the fact that the service for which they pay a reasonable monthly fee will be available when they want to use it; plus, they know someone else is responsible for the headaches and cost of maintaining and upgrading the infrastructure.

At the core of the motivating factors for adopting a utility computing model is the fact that technology is no longer merely a mechanism to support a business; today, it is integral to staying competitive and driving business transformation. Growing globalization initiatives and heavy use of Web-based business-to-business applications demand scalability, agility and 100 percent availability of infrastructure, a standard that is all but impossible to meet with in-house resources alone.

IBM Global Services first dubbed the concept “on-demand,” Hewlett-Packard refers to it as its “Adaptive Enterprise” initiative, Sun Microsystems has a “virtualization engine,” and Forrester Research is talking about “organic IT.” Whatever its moniker, major IT providers have spent billions of dollars over the last year and a half developing and pooling a smorgasbord of adaptive infrastructure to deliver flexibility, scalability and robust end-to-end capabilities with utility-style pricing and no upfront capital investment for customers. As a customer’s infrastructure demands go up or down, so does the price.
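To make that pricing mechanic concrete, the sketch below shows metered, pay-as-you-go billing in miniature. It is an illustration only; the resources, unit rates and usage figures are hypothetical and do not reflect any provider’s actual rate card.

```python
# Minimal sketch of utility-style ("pay-as-you-go") billing.
# All resources, unit rates, and usage figures are hypothetical
# illustrations, not any provider's actual pricing.

RATES = {
    "cpu_hours": 0.12,     # dollars per CPU-hour consumed
    "storage_gb": 0.05,    # dollars per GB stored for the month
    "bandwidth_gb": 0.08,  # dollars per GB transferred
}

def monthly_bill(usage):
    """Price each metered resource at its unit rate and sum the charges."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# A busy month and a quiet month: the bill tracks demand, with no
# upfront capital investment on the customer's side.
busy = {"cpu_hours": 40_000, "storage_gb": 2_000, "bandwidth_gb": 5_000}
quiet = {"cpu_hours": 8_000, "storage_gb": 2_000, "bandwidth_gb": 900}

print(f"Busy month:  ${monthly_bill(busy):,.2f}")   # demand up, price up
print(f"Quiet month: ${monthly_bill(quiet):,.2f}")  # demand down, price down
```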

Steve Sullivan, vice president of IT sourcing services at Billerica, Massachusetts-based Getronics, notes his company is “seeing an essential shift in what companies want to control. The provisioning of IT services that are not a core competency is becoming acceptable, so it stands to reason that companies are more inclined to accept utility computing as a reasonable solution to some of the challenges they currently face.”

Mike Atwood, president of the manufacturing, distribution, and retail division of advisory firm Everest Group, believes, “Almost every corporation is going to realize that owning and operating their own computing environment is not a good idea. Twenty years from now, there will be almost no one who runs their own computer infrastructure.”

The list of buyers is already quickly growing; so is the list of providers scrambling to carve out a niche in this growing outsourcing market.

Trend 2: Warning Signs

The last half of 2003 saw naysayers warning the early adopters of the utility computing model. They warned that it’s only a new flavor of supplier hype, an unproven concept, a new spin on the old “hosting” experiment, and that the model is not secure enough for critical data and intellectual property. Such warnings are a normal accompaniment to any change that looms large on the radar screen.

Prediction 1: History Will Repeat Itself

In December 1999, I talked with industry leaders about predictions to highlight in the January 2000 trends and forecasts issue of Outsourcing Journal. Talk focused on “universal server farms,” “co-location” models and the nascent Application Service Provider (ASP) model: three different approaches to what later became known as “hosting” environments accessible through an Internet browser. Today’s utility computing model sprouted from those early efforts to meet customer needs.

Although its flexibility and pricing structure attracted quick adoption initially, the ASP model stumbled and took a nosedive. In hindsight, it’s easy to see that there was never a question as to whether ASPs would survive; it was only a matter of how they would evolve. Three years later, ASPs have risen like a phoenix, their services honed into today’s strategic and successful Managed Services Provider (MSP) model, which provides hosting as well as process expertise and management services.

We predict a few stumbles for the utility computing model, which, like the ASP model before it, is initially attracting customers with its flexibility and pricing structure. The described benefits of as-needed access to IT infrastructure read like postcard-perfect scenarios. But this business model is still young and evolving through the ever-growing complexities of the “connected” world. Providers will be racing to figure out best practices and how to address the critical issues, and the evolved model will be even more successful than the current one.

Prediction 2: The Blind Spot

One can almost picture the eventual enormous data center, with one computer running everything and one person cost-effectively managing it for everyone else, who accesses it through a browser, rather like the behind-the-curtain view of the Wizard of Oz. But the reality is that no single outsourcer can be all things to all people, nor is there one model that meets everyone’s needs. And pricing should not be the main selection criterion.

Therein lies the blind spot we predict many buyers will encounter: while enterprises focus on costs, they can miss the value. With commodity pricing, providers have to eliminate something to protect their margins. Avoiding this blind spot will become even more important for buyers as more providers enter the market and pricing grows more competitive.

Value is more than a low price. LION bioscience, for example, partnered with an outsourcer to provide subscription-based informatics functionality, delivered on demand via the Internet, to companies performing drug-related research. The solution resulted in faster drug research and discovery. JPMorgan Chase tapped into IBM’s IT resources on a pay-as-used basis so it could quickly adapt to changing business conditions; the strategy also ties together the bank’s different servers and storage devices without requiring new applications to be written for each separate system. The Mobil Travel Guide uses on-demand IT infrastructure services to meet its seasonal peak requirements. Certainly, each enterprise benefits by paying for only the computing capacity it requires, but their primary goals and provider selections were based on using technology as a key competitive advantage.

Enterprise decision-makers considering the utility computing model need to look beyond pricing for the most appropriate outsourcing solution and the right partner to deliver the desired value. Success starts with enterprises understanding where they want to go strategically and how they want an outsourcing partner to help them move toward those goals.

Forecasts for 2004:

  • IT spending will increase in 2004, but enterprises will demand higher return on investment.
  • As technology shifts from being a mechanism to support a business to becoming a key competitive advantage, the demand for quick, cost-effective access to the newest technologies will grow.
  • The utility computing model is still young and will keep evolving through the ever-growing complexities of the “connected” world. Providers will race to figure out best practices and how to address the critical issues, and the evolved model will be even more successful than the current one.
  • As more providers enter the utility computing market and pricing grows more competitive, some buyers will be blinded by low costs and overlook the other value an outsourcing solution delivers.
