One of the fast-growing trends in IT today is software-defined everything, in which various components within an IT infrastructure are virtualized and delivered as a service via the cloud.
In this type of environment, management and control of the storage, networking and/or data center infrastructure are automated by intelligent software, rather than by hardware components. Within the overall category of software-defined everything are software-defined storage, software-defined computing, software-defined networking and software-defined data centers.
Gartner identified software-defined anything as one of the top 10 technologies and trends that will be strategic for most organizations this year. The research firm defines a strategic technology as one with the potential to have a significant impact on the enterprise in the next three years. Factors that denote significant impact include a high potential for disruption to IT or the business, the need for a major financial investment or the risk of being late to adopt, according to Gartner.
The software-defined trend is still in its relatively early stages, and offerings in the market will undoubtedly evolve.
“It’s still an emerging space,” says Henry Baltazar, a senior analyst who covers infrastructure and operations at Forrester Research. “There is a tremendous amount of buzz around software-defined, but the market is still developing.”
In many cases, the transition will require a change in mindset. For example, software-only storage “is still a new segment, and customers are comfortable with hardware appliances,” Baltazar says.
“Most storage arrays today are essentially commodity hardware systems with a proprietary storage software stack. Storage decision-makers will need to get over their appliance-only stance to take advantage of the software-only storage and hyper-converged infrastructure pieces that are entering the market.”
The software-defined strategy presents some compelling potential benefits for organizations.
“The first benefit we will see is enhanced provisioning, with automation freeing up administrators from day-to-day provisioning chores,” Baltazar says. “Most people go to cloud because they believe public cloud services will allow them to get their jobs done faster. Software-defined data centers are needed to get resources faster to clients.”
In the second phase of software-defined storage, “we will see the emergence of software-only storage,” he adds. “These products will leverage commodity hardware and will deliver SAN [storage-area network], NAS [network-attached storage] and object storage resources on demand, without the typical forklift upgrade required for conventional storage arrays today.”
Moving to a Software-Defined Data Center
One organization that is taking advantage of the software-defined trend is the Yale New Haven Health System, a nonprofit organization that operates three hospitals providing multidisciplinary, family-focused care in more than a hundred medical specialty areas. The health care system is in the process of moving to a software-defined data center, with the goal of creating a more service-focused IT infrastructure that will help increase efficiencies and enhance IT service delivery.
The strategy is part of a long-term initiative to consolidate data centers and merge with new health system partners, says Matthew Openshaw, infrastructure architect, Information Technology Services at Yale New Haven. He explains that standardizing the underlying hardware platforms in use within the organization, along with the software that runs on them, will enable the medical center’s IT organization to become more agile and efficient, and better able to deliver the best service to patients.
“As a health system, we continue to look to implement IT solutions that improve patient care,” Openshaw says. “We are also focused on improving how we deliver IT services from a business standpoint. Moving toward the software-defined data center will help us better deliver infrastructure as a service.”