Spoiled for choice? Cloud services get abstraction (and a little help for IT, too)
As enterprises across the APAC progress on their digital transformation journeys, it is becoming apparent that no single cloud provider has all the answers to all the questions. Hence, most organizations operate a hybrid strategy, one that combines on-premises infrastructure, private data centers, and multiple cloud providers.
That way, companies are finding their most effective workflows and are able to explore every possibility, developing a more agile approach to business. But that increasingly complex IT estate is hampered by tightening IT budgets, a shortage of talent capable of supporting nimble, hybrid topologies, and sudden changes in market demand.
Multi-cloud is the new normal
According to IDG, the vast majority of businesses in the APAC are using multiple cloud providers.
That’s a trend recognized by the major cloud providers themselves. Products like Azure Arc and AWS Outposts are confirmation that even the most significant cloud businesses know that their customers simply don’t wish to place all their eggs in one basket. By deploying multiple services from different sources, and adding in on-premise or private data center resources, organizations are finding their own way.
In the APAC, there are some clear regional differences. Only in the US and China do domestically-based companies rely on their own nation's suppliers: in China, the top six cloud providers are all Chinese, while in the US, Tencent and Alibaba are minor players, subject to the vagaries of current US foreign policy.
But in the APAC, companies of all sizes are choosing from a wide range of elements with which to create their hybrid business platforms — from NTT (especially popular in its native Japan) to AWS, to Sinnet. And with tools like Kubernetes and a move in DevOps towards microservices-based infrastructures underneath apps & services, the companies in the APAC that are most successful are those using cutting-edge technologies, open standards, and a platform-agnostic approach to the entire development-to-deployment lifecycle.
Cloud’s risks and cloud choice
The big promise that every cloud provider has made is that the remote XaaS compute, storage & network model helps companies cut overheads and provides elastic, on-demand scalability. Those claims are valid, especially when companies examine their new OPEX vs. CAPEX balance sheets. But the reality of managing identities and access across very complex topologies remains a significant challenge for many organizations, a problem exacerbated by tighter IT budgets.
In some ways, the ease with which cloud solutions can be deployed is the services' Achilles' heel — it's all too easy to misconfigure a system or to lack a coherent user and access management system. Those issues can create security concerns and have negative impacts on overall service performance.
As the big cloud providers jostle for both storage and compute market share, they are also pushing their own control plane solutions as ways for stretched IT departments to simplify complex hybrid networks. But just as Google Cloud is not necessarily the answer to every question, neither, perhaps, is Anthos or its equivalents.
When hype turns real
As technology journalists at Tech Wire Asia, we are as guilty as anyone else of championing new technologies before they reach the maturity needed to be deployed “in anger” in the enterprise. None more so than the combination of 5G and IoT, an area of technology with enormous possibility that is only now beginning to land in organizations in a meaningful way.
But whatever the use cases that enterprises in the APAC will find for these new platforms, one thing is sure: there’s going to be a great deal more data being produced by smart, connected devices soon.
The household name cloud suppliers are keen to provide entry-level treats for companies willing to try new tech like AI, for example. But with petabytes more data, the problems of infrastructure management, data access & user management, deployment procedure management and the like will only multiply.
Therefore at Tech Wire Asia, we’re featuring two vendors that we feel have something specific to offer in this space. These two, below, are shifting the focus from a “remote computing” concept of the cloud to a business-first cloud management approach. That means the types of business agility required by APAC’s leading companies can be realized without the IT department having to run to keep up.
The headaches involved in, for example, deploying production-ready systems across different clouds in different regions mean that, as it stands, IT management has to work as hard as it did ten years ago, when the entire enterprise IT stack lived in private data centers. But the agility and scale of the cloud, combined with the control and data oversight possible with the right platform from one of these two, mean that companies can concentrate on their business, not the detail of their data.
The Snowflake platform does several very clever things for companies with complex hybrid and/or multi-cloud services as part of their overall IT stack, but to focus on those would be to miss the overall point of the solution.
Instead, think of the company’s offering as one that handles data at scale for organizations that want to derive real business value from their most valuable asset. It normalizes structured and semi-structured data from any data lake or repository, from on-premises systems to different public clouds. It then allows that data to be accessed, shared, and managed in the most efficient ways possible, and bills customers based on resources used, on a per-second basis.
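To give a flavor of what normalizing semi-structured data means in practice, here is a minimal sketch in plain Python — not Snowflake's actual engine or API, just an illustration of turning a nested JSON record (the record and field names are invented) into flat, queryable columns:

```python
import json

# A semi-structured record, as it might arrive in a data lake (hypothetical data).
raw = ('{"order_id": 101, '
       '"customer": {"name": "Acme", "region": "APAC"}, '
       '"items": [{"sku": "A1", "qty": 2}]}')

def flatten(record: dict, parent: str = "", sep: str = ".") -> dict:
    """Flatten nested dicts into dotted column names; lists are kept as-is."""
    flat = {}
    for key, value in record.items():
        name = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name, sep))
        else:
            flat[name] = value
    return flat

row = flatten(json.loads(raw))
print(sorted(row))  # dotted column names: customer.name, customer.region, ...
```

In a real data platform this flattening happens at query time across billions of records, but the principle — nested attributes become addressable columns — is the same.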
That means companies can optimize their cloud resources and not have to pay for idle power, storage, and capability. The Snowflake data platform handles transitions from any location (cloud provider and platform agnostic) according to business needs.
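The difference between usage-based, per-second billing and paying for idle, always-on capacity is easy to see in a toy calculation. The rates below are invented for illustration and are not any vendor's real pricing:

```python
# Hypothetical rates -- for illustration only, not real pricing.
COST_PER_SECOND = 0.0008  # per-second, usage-based billing
FLAT_HOURLY = 4.00        # an always-on instance billed by the hour

# A bursty workload: short jobs totalling 25 minutes of compute in a day.
seconds_used = 25 * 60

usage_based = seconds_used * COST_PER_SECOND
always_on = 24 * FLAT_HOURLY  # idle capacity is still paid for

print(f"usage-based: ${usage_based:.2f}, always-on: ${always_on:.2f}")
```

For spiky, unpredictable workloads, the gap between the two figures is exactly the "idle power" the article refers to.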
Sure, that really helps the IT department, which no longer has to labor over mere provisioning; but the real winners are DevOps, operations, and business strategists. Code can change hosts, resources spin up on demand, and provisioning is optimized on the fly: this is an abstraction of data services for businesses in the APAC. You can read more here on the pages of Tech Wire Asia.
As you might expect, Big Blue’s acquisition of Red Hat has accelerated its repositioning as a cloud-first company, with OpenShift rapidly becoming an industry standard.
Also, as you might expect, IBM has a significant number of individual solutions that it can deploy in a modular manner for its big data customers. But it all begins with an open framework and lends itself to the free interchange of data (and its schema) across multiple platforms. IBM Z, for example, can come with the capability to deploy AI, hitting the ground running with high-speed, in-memory data processing, running z/OS, and optimized for Db2.
But you don’t need mainframe money to get mainframe results. Thanks to its reach and capabilities, IBM Big Data Hub can be leveraged by the smallest concern, letting users control their computing costs and cut overheads like server provisioning and management with one turnkey solution.
The headlines might go to Watson, with its interfaces open to every would-be AI tinkerer, but IBM’s big data options are distinctly business-centric. If open-source frameworks and hybrid topologies dominate your IT world, then Big Blue still has the de facto industry standard, ready to roll. Read more about the offerings here.
*Some of the companies featured in this article are commercial partners of Tech Wire Asia