It’s hard to look at what’s happened to enterprise technology over the past few years and not feel a bit of an adrenaline rush. Cloud computing, mobility and data analytics are pretty sexy stuff—full of excitement, packed with benefits and changing the way we do darn near everything. While this triad of technological wonderment has continued to transform IT, storage solutions stood their collective ground—cementing their position as the wallflowers of the data center, doing what they’ve always done, working like they’ve always worked for as long as most of us can remember. Nothing was particularly wrong with traditional storage solutions, but nothing was all that innovative about them, either. Until now.
“Storage has traditionally lagged behind in innovation. SAN (Storage Area Network) works about the same as it did 20 years ago, and until recently, that was enough,” explained Seth Knox, vice president of products for Atlantis Computing. “Companies rely on new applications that have to do more, and do it more quickly, like analyzing data for rapid decision-making, pushing out offers to customers in store and connecting individuals globally in an instant. Everything’s moving faster; companies are processing more data—and all of that requires more storage performance than what most companies have in place.”
Today’s highly virtualized IT environments have added agility and efficiency, but they’ve also placed a significant burden on the storage infrastructure.
“Each virtual server requires a higher number of terabytes of storage. So, storage capacity and the associated storage costs are growing, on average, around 40 percent each year,” Knox said. “For most companies, storage costs already account for almost half of their total IT budgets. No one can afford that number to nearly double every two years—which it will, if they continue to go the traditional route.”
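Knox’s “nearly double every two years” follows directly from compounding 40 percent annual growth, as a quick back-of-the-envelope check shows:

```python
# Compounding 40% annual storage-cost growth over two years.
annual_growth = 0.40
factor_two_years = (1 + annual_growth) ** 2
print(f"Two-year growth factor: {factor_two_years:.2f}x")  # 1.96x, i.e. nearly double
```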
Luckily, storage providers have kicked it into high gear, with innovative approaches that increase performance, reduce costs and provide the ninja-like agility that every enterprise craves.
The two shining stars of storage solutions? In-memory storage and software-defined storage. Although the names aren’t all that exciting, each carries a cool factor that goes far beyond what its moniker implies.
In-memory Storage for “Insanely High” Performance
In truth, the idea of in-memory storage is not 100 percent new, at least not to retro techies.
“Back twenty years ago, communities of computer geeks created RAM disks so their video games would run faster,” Knox said. “In-memory storage has always been fast, but it just hasn’t been ready for mass enterprise use before.”
If you’re not familiar, in-memory storage is located on the server, so it resides in the same place the actual application is running. This close proximity to the application is important because it eliminates latency. Instead of having to traverse the network to reach data on a storage disk, the storage source is right there. Fewer hops equal radically increased performance.
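The “fewer hops” point can be felt even in a toy experiment. The sketch below is illustrative only: the operating system’s page cache will make the local-disk path look far faster than a real networked SAN round-trip, so treat the gap as a floor, not a benchmark.

```python
import os
import tempfile
import time

# A 16 MB payload stands in for application data.
payload = os.urandom(16 * 1024 * 1024)

# "In-memory" path: the data already sits next to the application.
start = time.perf_counter()
copy_from_memory = bytes(payload)
mem_seconds = time.perf_counter() - start

# "Disk" path: a round-trip through the filesystem (a stand-in for
# networked storage; a real SAN adds network hops on top of this).
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    path = f.name
start = time.perf_counter()
with open(path, "rb") as f:
    copy_from_disk = f.read()
disk_seconds = time.perf_counter() - start
os.unlink(path)

print(f"memory copy: {mem_seconds:.4f}s, disk round-trip: {disk_seconds:.4f}s")
```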
“The amount of performance with in-memory storage is insanely high. It’s 100,000 times faster than SAN and 1,000 times faster than flash,” Knox said. “But until recently, few companies chose this type of storage because servers weren’t designed to hold enough storage RAM for the technology to be valuable on a large scale.”
The other big caveat? If a server fails or the power goes out, everything held in memory is lost.
“What we’re seeing now is a hybrid approach that pairs in-memory for speed with flash storage for data protection. Even when you build in data protection by adding flash, this method is still less expensive and far faster than other options,” Knox said.
According to Knox, at least one major hardware provider is re-engineering servers to support as much as 60 TB of in-memory storage on the physical device—a strong indication that in-memory storage is moving from “maybe” to mainstream—and doing it fast.
Software-Defined Storage Solutions Let You Optimize the Storage Capacity You Have
So, we all remember the remarkable impact of virtualization on compute power, right? How instead of adding physical servers for new applications, companies could fully utilize the capacity they had, provision more quickly and reduce costs in the process?
Software-defined storage brings this same agility and scalability to enterprise storage.
In a typical IT environment, an enterprise might have four or five different kinds of storage, including SAN, NAS, DAS, all-flash arrays and other acronym-laden extras.
“Each application is mapped to a different type of storage, which creates a mass of storage silos. If you need more capacity for one or two applications, you have to increase that specific type of storage—even if you have other types of storage arrays that are underutilized,” Knox explained. “It’s a lot like what used to happen with physical servers, before virtualization.”
The only choice before? Over-provisioning for performance on the various storage devices. Wasting money on underutilized capacity. Translation? Big, fat, unnecessary costs. The electronic equivalent of rooms filled with half-empty file cabinets taking up space.
Software-defined storage eliminates these inefficiencies by unifying all storage types into a highly optimized pool of storage resources available to all applications. Policy-based storage volumes optimize capacity, performance and availability based on specific application needs.
“Essentially, this does for storage what server virtualization did for compute power, making it possible to do more with less storage by consolidating and increasing the efficiency of existing SAN and NAS devices—as well as use local disk, flash and server memory to create new storage tiers,” Knox said. “But, the real beauty of software-defined storage is that companies don’t have to change their applications to run in this model. As long as you can run the app on a virtual machine, you can run in a software-defined storage environment on your existing hardware. The application ‘thinks’ it’s using a shared SAN.”
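Under the hood, a policy-based placement decision can be sketched roughly like this. The tier names, capacities and IOPS figures are invented for illustration; a real software-defined storage product makes these decisions dynamically and far more cleverly.

```python
# Hypothetical unified pool: three tiers with made-up capacity and speed.
TIERS = {
    "ram":   {"free_gb": 60,     "iops": 1_000_000},
    "flash": {"free_gb": 2_000,  "iops": 100_000},
    "san":   {"free_gb": 50_000, "iops": 10_000},
}

def provision(size_gb, min_iops):
    """Place a volume on the cheapest tier that satisfies the policy."""
    # Walk from the cheapest/slowest tier up; take the first fit.
    for name in ("san", "flash", "ram"):
        tier = TIERS[name]
        if tier["free_gb"] >= size_gb and tier["iops"] >= min_iops:
            tier["free_gb"] -= size_gb
            return name
    raise RuntimeError("no tier satisfies the policy")

print(provision(100, min_iops=5_000))    # a bulk volume lands on SAN
print(provision(50, min_iops=500_000))   # a hot volume needs the in-memory tier
```

The payoff of pooling is visible even in this toy: one request draws down SAN capacity, another draws down RAM, and neither application had to be mapped to a dedicated silo up front.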
Ultimately, companies reduce costs, simplify storage management and gain the benefits of rapid, on-demand provisioning.
It Just Gets Better and Better
Knox is quick to point out that these storage innovations are not designed to replace SAN or NAS, now or in the future, but to add the agility and performance that today’s companies need. All while reducing costs.
As with any innovation, Knox suggests starting with a proof of concept to validate the benefits. From all indications, nearly every company can see some sort of gain. In fact, one analyst study indicated that by 2016, server-based storage solutions would lower hardware costs by 50 percent or more. While the reality of that number remains to be seen, it’s clear that today’s storage innovations have the potential to become true industry disrupters.
“These technologies are bringing memory to the masses,” Knox said. “They’re helping companies significantly reduce cost, without adding hardware, changing workflows or altering their apps. I don’t know how you get a stronger value proposition than that.”
About the Author: Ben Trowbridge is an accomplished Outsourcing Advisor with extensive experience in outsourcing and managed services. As a former EY Partner and CEO of Alsbridge, he built successful practices in Transformational Outsourcing, BPO, IT Outsourcing, and Cybersecurity Managed Services. Throughout his career, Ben has advised a broad range of clients on outsourcing and global business services strategy and transactions. As the current CEO of the Outsourcing Center, he provides valuable insights and guidance to buyers and managed services executives. Contact him at [email protected].