Storage solutions driving innovation in APAC enterprises
In the developed world, we rely less and less on physical media.
This is true in most areas of life. In the consumer sphere, we no longer own physical media: DVDs, Blu-ray, vinyl records, books – all are becoming digitized and piped to us over ever-faster internet connections.
In the field of commerce, we don’t write checks to the extent we once did; the financial industry’s increasing computerization may be slow, but it’s inescapable. Most commerce now, if not conducted online, ends up as streams of zeroes and ones. Banks’ ledgers are electronic, not handwritten, and billions can be added to or wiped off balance sheets by the click of a mouse, or touch of a screen.
In our workplaces, almost everything we do is digitized, even in old industries that have traditionally been ‘manual’ – construction workers clock on and off electronically and workers digging holes in the roads use the latest tech to make their jobs easier.
Factories are increasingly mechanized, production digitized, supply chains and logistics are electronically monitored, and all the while, the streams of data being captured multiply.
The latest technologies either rely on masses of data or produce great quantities of new data, seeking storage.
The internet of things (IoT) promises billions of interconnected devices, each feeding back streams of zeroes and ones, ripe for storage, interpretation, and pumping into algorithms that find patterns and predict possible outcomes.
Virtual reality and augmented reality (VR and AR) need gigabytes of data, initially captured and stored, for feeding into headsets, earpieces, and tactile devices, to create a new generation of immersive environments.
Once captured and stored, however, not all data is treated equally. Behind the scenes, today's cloud providers (the most usual endpoint for data) classify and treat data according to its probable use. Some data is archived and held offline, with human intervention required to retrieve it. In this area, magnetic tape still reigns supreme.
More accessible, although not instantaneously so, is data that's held on disk (either HDD or SSD variants) and compressed. Access times are usually good: on-the-fly decompression (and, where needed, decryption) happens quickly enough on today's processors to be barely noticeable.
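The compress-on-write, decompress-on-read pattern described above can be sketched in a few lines of Python, using the standard library's zlib purely as a stand-in for whatever codec a storage provider actually deploys:

```python
import zlib

# Repetitive, telemetry-style data, which compresses well.
payload = b"sensor reading: 23.5C\n" * 4096

stored = zlib.compress(payload, level=6)   # what actually lands on disk
restored = zlib.decompress(stored)         # decompressed on the fly at read time

assert restored == payload                 # lossless round trip
print(f"{len(payload)} bytes stored as {len(stored)} bytes "
      f"({len(stored) / len(payload):.1%} of original)")
```

Both calls complete quickly enough on a modern processor that, as noted above, the round trip is barely noticeable to whoever reads the data back.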
Great swathes of data are held, uncompressed and unarchived, ready to be retrieved. How long this data is held in readiness depends on what it is, local governance and the needs and motives of its owners, among a bewildering variety of other factors.
Whatever the format, however, what is undeniable is our need for increased data storage capability. Research company IDC has predicted that the world will be generating approximately 160 zettabytes of data a year by 2025 – roughly ten times the amount created worldwide in 2016.
Just to clarify those stupendous figures:
| Bytes | Abbreviation | Unit | Roughly equivalent to |
|---|---|---|---|
| 1024¹ | kB | kilobyte | 1 page of plain text |
| 1024² | MB | megabyte | 1 uncompressed image, 256 colors |
| 1024³ | GB | gigabyte | 1 hour of SDTV |
| 1024⁴ | TB | terabyte | 2,000 hours of audio |
| 1024⁵ | PB | petabyte | 2,000 years of mp3 audio |
| 1024⁶ | EB | exabyte | storage required to hold 1/5 of all words ever spoken by humans |
| 1024⁷ | ZB | zettabyte | 8 million years of UHD 8K video |
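As a quick illustration of those powers of 1024, here is a minimal Python sketch (the function name and formatting are our own, not from any particular library) that renders a raw byte count using the units in the table:

```python
def human_size(n_bytes: int) -> str:
    """Render a byte count using the 1024-based units from the table above."""
    units = ["B", "kB", "MB", "GB", "TB", "PB", "EB", "ZB"]
    size = float(n_bytes)
    for unit in units:
        # Stop dividing once the value fits the unit, or we run out of units.
        if size < 1024 or unit == units[-1]:
            return f"{size:.1f} {unit}"
        size /= 1024

print(human_size(1536))       # 1.5 kB
print(human_size(1024 ** 7))  # 1.0 ZB, the scale IDC is forecasting
```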
In the long term, research institutions and organizations with advanced R&D departments are working on the next generations of storage.
Some of the more esoteric options bubbling under include holographic storage (although this option is probably defunct since the 2010 liquidation of InPhase Technologies), DNA storage (perhaps as part of a biologically based computing system), and quantum storage, the last of which was brought a step closer by recent research conducted by a Japanese team.
While the long-term solutions to our demand for data storage remain the stuff of many a science fiction novella (some of them certainly sound the part), the short to medium term will probably see one or more of the following, deployed in combination:
- Deployment of RDMA, Thunderbolt or NVMe protocols.
- Hard disk drives spinning at high speeds but sealed and filled with an inert gas such as helium to reduce air resistance and vibration.
- Shingled magnetic recording (SMR) where layers of data tracks are partially overlaid (shingled) to increase capacity without significant loss of read-write speeds.
- 3D NAND SSD drives, employing advanced manufacturing techniques in which layers of data-holding silicon are stacked vertically, so that the same physical footprint stores data many layers deep, multiplying capacity by up to 32 times.
The history of computing suggests that centralized storage (in today's data centers) will hold the key to satisfying our storage demands.
From modern computing's invention through the 1980s, data was held centrally, usually on mainframes run by large organizations. During the 1990s, with the home computing revolution and the spread of computing into the workplace, data enjoyed a brief sojourn as a local resource, held on in-house rack-based systems such as SAN devices, or on the hard drives of our workstations.
Currently, data has migrated back out to be held remotely, in the so-called cloud – that is, anywhere accessible on the internet that isn't on-premises or at the edge. Therefore, cloud storage providers looking to steal a march on competitors are seeking the latest storage breakthroughs to increase capacity, decrease access times, and lower costs to themselves and thence to their customers.
Here are three suppliers of storage systems to the cloud which have caught our eye, here at Tech Wire Asia:
Seagate has realized that digital technology is having a transformative effect on everyone’s lives, and its products and strategies are playing their part in making transformation possible.
The buzzphrase coined by the company is “global datasphere”, which it defines as “the sum of all data created, captured and replicated on our planet”.
The company’s predictions for the future have noted that as data becomes more of an intrinsic part of life, at least some of that data (and an increasing amount, year on year) will become life-critical.
Life-critical applications are those governing increasing aspects of our healthcare, of course, but life-critical data is already being collated: safety testing, fire prevention automation systems, building management systems – all of these are now silos of data on which our lives already depend.
As autonomous transport (to take a single example) becomes more widely accepted, the amount of life-critical data will increase, both in absolute terms and as a proportion of all data captured.
The company’s product lines currently feature three products of special note: the Nytro SSD range and the Exos X & E ranges. The latter two are ‘traditional’ HDD devices but have been developed with cutting-edge technology that elevates this once humble medium to data-center industry standard. Read their full profile…
California-based Pure Storage aims a sizeable proportion of its offering at businesses and organizations looking for a plug-and-play solution that is simple to deploy and use.
Pure’s support systems are available to all its customers, and its products’ inherent simplicity removes all mention of ‘traditional’ storage nomenclature like tuning, storage pools, caching, tiering, planned downtime, forklift upgrades, and the rest.
Its ‘Data Platform’ purports to be self-managing, through predictions derived from machine learning (ML). The ML capability is drawn from the collation of all the available data from the fleet of Pure products deployed across the globe.
Pure Storage’s Pure1 Meta, for instance, receives telemetry data from each of the thousands of Pure Storage arrays currently deployed. Sensors at multiple levels – from the array itself to external, connected devices – provide in excess of 1 trillion data points per day, which to date has created a data lake of more than 7PB.
Setup of Pure Storage’s solutions can be as simple as connecting a cable or two with one hand, instruction sheet in the other. The self-configuration routines built into the devices can complete deployment with no further involvement. “You’ll be up and running in 30 minutes”, the company’s literature states, with the storage being fully application aware.
Quantum’s storage product lines are designed for specific industry verticals. For instance, it has solutions designed with broadcasters & post-production houses in mind; visual effects (VFX), virtual reality (VR) and gaming concerns; surveillance providers; and storage aimed at the cybersecurity industry – the latter emphasizing the encryption and security credentials of the offering.
Quantum’s Xcellis range is a case in point that illustrates the breadth on offer. It consists of the Xcellis Workflow Director and a pair of fully redundant 1U servers. The system delivers metadata control and client connectivity that scales from just a few users to hundreds of end-nodes, and is suitable for 4K video storage, editing, and manipulation.
The Xcellis solution stores metadata on redundant SSD storage (the so-called “Xcellis MD Array”) and can be configured in combination with online storage or as standalone metadata storage.
Xcellis can use any combination of Fibre Channel and IP clients in mixed SAN and NAS topologies, so can encompass the needs of organizations with a mixture of fast access and write speeds, alongside more traditional office-style NAS storage of documents.
There are StorNext DLC clients for Mac, Windows, and Linux users, providing high-end performance (comparable to Fibre Channel connection speeds, according to Quantum) over lower-cost Ethernet IP.
*Some of the companies featured in this article are commercial partners of Tech Wire Asia