
Software-defined backup and restore: hyperconverged archives
Throughout its relatively brief history, virtualization has reshaped enterprise network topology and saved a great deal of cost, both in CAPEX (simply, fewer physical devices needing purchase) and in OPEX (centralized control of multiple virtualized devices).
The empowering factor behind virtualization has always been the increase in power and efficiency of technology. Processors get faster, silicon cheaper, and supporting tech, such as power generation, gets more efficient.
The first instances of virtualization go right back to the mainframes of the 1960s, but the technology came into widespread production use around the turn of the millennium.
VMware filed the first patents for its virtualization algorithms in 1998, and over the next half dozen years, the larger enterprise technology companies began to develop their own solutions, or more often acquired the technology by takeover (EMC’s purchase of VMware in 2003, for example).
In business-oriented production environments, the first instances of virtualization were among servers. There was a definite logic to this: servers were traditionally the machines with the power and resources to run multiple operating systems and platforms.
Virtual machines were not supervised; they were hypervised. That trend in naming has continued, with hyperconvergence now in use to an increasing degree in data centers.
Just as servers can be switched out for new images in a few seconds, network switches, security devices, aggregators and packet-shaping hardware can not only be controlled but, to a high degree, abstracted, so that the whole topology is malleable at will, typically according to the demands of the business.
It’s logical for this technology to propagate, and to bring with it the savings it makes possible. Full desktop environments are now available as a virtualized service, allowing users to interact with virtualized servers across software-defined wide area networks.
Another effect of technology’s falling cost and rising capability is its ubiquity. It’s not just IoT that’s creating the daily petabytes that flow across our data networks. Almost all of our daily interactions are digital – and if they’re not at present, they’re likely to be in the next few years.
Backing up data therefore presents challenges on a scale the enterprise has never really faced before. Not only is data increasingly widespread (and spread widely too), its integrity is paramount; there are no paper copies to fall back on anymore.
But there’s a great deal more to this particular data segment than its name, “secondary data”, suggests. Backup and restore are mission-critical activities, and secondary data is also what development teams duplicate for testing. In many enterprises, this stored data makes up around 80 percent of all the data the organization holds. Managing it, using it, and ensuring its integrity are tasks on which whole businesses depend.
Over time, the technology around backup, restoration and disaster recovery has evolved. Where it was once perfectly acceptable to schedule periods of as-good-as-downtime, the backup windows of old, the nature of the internet today means there are no quieter times left in the enterprise to grant the data replication teams the luxury of a slowdown.
Techniques like compression and inline deduplication have therefore come into their own. With faster processors and storage, plus increased east-west traffic speeds and available bandwidth, virtualized secondary data backup and restore methods can take advantage of several newer optimizations:
Direct-to-target backups remove the media server from the data path entirely, so backup clients write to and read from the target storage device directly.
Unique-data-only replication. Bandwidth requirements can be lowered significantly by examining data independently of fixed block boundaries, so duplicates are detected at a finer, near byte-level granularity rather than on a coarser whole-block basis. This is achieved by variable-block processing (see below).
Variable-block processing. Fixed-block methods can yield data space savings of up to around 20 percent; variable-block methods can push savings in backup media to as much as 80 percent. A minimal sketch of the technique appears after this list.
Client-side processing. Utilizing some of the processing capacity of the application servers being backed up ensures that the server does not flood the network with backup/restore traffic, as only unique data is transmitted.
Additionally, with fewer processing cycles used by backup algorithms on dedicated backup devices, more parallel backups can be taken, more often. This increases the responsiveness of any recovery.
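To make the variable-block idea concrete, here is a minimal, illustrative sketch in Python of content-defined chunking with hash-based deduplication. It is not code from any vendor mentioned here; the function names (chunks, dedup_store) and the parameters (a 48-byte rolling window, a roughly 8 KiB average chunk size) are assumptions chosen purely for demonstration. Because chunk boundaries are derived from the data’s own content, an insertion early in a file shifts only the chunk containing the edit, not every block after it:

```python
import hashlib
import random

# Illustrative parameters: assumptions for this demo,
# not values documented by any vendor in this article.
WINDOW = 48                        # bytes covered by the rolling hash
MASK = 0x1FFF                      # cut when (hash & MASK) == 0 -> ~8 KiB average chunk
MIN_CHUNK, MAX_CHUNK = 2 * 1024, 64 * 1024
PRIME, MOD = 31, 1 << 64
POW = pow(PRIME, WINDOW - 1, MOD)  # weight of the byte leaving the window

def chunks(data: bytes):
    """Split data into variable-size chunks, cutting wherever a
    Rabin-Karp rolling hash of the last WINDOW bytes hits the mask."""
    start, h = 0, 0
    for i, b in enumerate(data):
        if i - start >= WINDOW:
            # Slide the window: drop the outgoing byte, shift in the new one.
            h = ((h - data[i - WINDOW] * POW) * PRIME + b) % MOD
        else:
            h = (h * PRIME + b) % MOD
        size = i - start + 1
        if (size >= MIN_CHUNK and (h & MASK) == 0) or size >= MAX_CHUNK:
            yield data[start:i + 1]
            start, h = i + 1, 0
    if start < len(data):
        yield data[start:]         # final partial chunk

def dedup_store(stream: bytes, store: dict) -> list:
    """Keep only unseen chunks in `store`; return the list of digests
    (the 'recipe') needed to reassemble the stream on restore."""
    recipe = []
    for c in chunks(stream):
        digest = hashlib.sha256(c).hexdigest()
        store.setdefault(digest, c)   # only unique data is stored/sent
        recipe.append(digest)
    return recipe

if __name__ == "__main__":
    rng = random.Random(42)
    base = bytes(rng.randrange(256) for _ in range(256 * 1024))
    edited = base[:1000] + b"X" + base[1000:]  # one byte inserted early on

    store = {}
    dedup_store(base, store)
    before = sum(map(len, store.values()))
    dedup_store(edited, store)
    after = sum(map(len, store.values()))
    # Fixed blocks would shift at every boundary after the insertion;
    # content-defined boundaries re-synchronize, so only the chunk
    # containing the edit is stored again.
    print(f"extra bytes stored for the edited copy: {after - before}")
```

This re-synchronizing behavior is what lets variable-block methods outstrip fixed-block ones, and it also underpins client-side processing: a source that hashes its own chunks need only transmit those the backup target has never seen.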
Here at Tech Wire Asia, we’re showcasing two suppliers of next-generation backup technology. The companies below are at the forefront of software-defined secondary data management. In an always-on, cloud-centric enterprise environment, the latest technology here pays dividends to the data-based organization.
QUEST
Quest’s DR appliance range has a well-established track record in enterprise technology circles as one of the most resilient and dependable replication technologies available.
The company’s QoreStor technology adds software-defined capabilities to secondary data infrastructure and methods, meaning that storage can become platform-agnostic, faster, more efficient, safer and distinctly more economical in terms of storage media.
The installable software solution suits on-premises, cloud and hybrid deployments, and it works with an extensive range of client systems. These can be added to and changed according to the enterprise’s procurement policies, so management’s hard-won supplier discounts are preserved.
QoreStor supports multiple backup software solutions too, so there’s no need to rewrite backup policies for a new platform. The nature of abstraction is that it sits between layers, acting seamlessly as an intelligent arbiter of data.
It’s in this role that QoreStor’s capabilities come into play: replication times shrink, the network isn’t swamped by either replication or restore traffic, and deduplication and compression techniques save up to 80 percent of the required archive space. The offering also complies with FIPS 140-2, though you’d expect that level of security from Quest.
To learn more about Quest in general, and QoreStor in particular, click here.
IBM
IBM is one of the great success stories in computing, having latterly re-engineered its business model, moving from consumer and business-oriented hardware to become a fully enterprise-centric provider of services, software, and hardware.
For the large enterprise, the IBM Spectrum range utilizes hyperconvergence and software abstraction, with the Spectrum Virtualize product providing virtualized storage.
Spectrum Virtualize is an offering that has been available for several years in the IBM SAN Volume Controller, the Storwize family, and FlashSystem range, amongst others. Spectrum Virtualize is now available as a software product, running on x86 servers of any flavor.
The company claims benefits of “up to 63 percent” lower personnel and management costs when used in conjunction with IBM Spectrum Control, plus a doubling of storage utilization.
Archived data is encrypted by the software without impacting availability. There’s full support for VMware environments, as well as for Kubernetes and Docker containers.
As is the norm with software-defined infrastructure, there’s a single controlling dashboard, and the service can be delivered as SaaS from IBM Cloud at no charge if required.
With Spectrum Virtualize, enterprises gain a software-defined data structure that is largely independent of any particular type of infrastructure, providing management across a heterogeneous storage topology.
*Some of the companies featured in this editorial are commercial partners of Tech Wire Asia