Is edge computing really the best way forward for innovative businesses?
Edge computing is interesting. It allows businesses to process data in real time, at the source, and send only the relevant insights back to the organization for storage and use.
The technology has been seen as something that enables the future of the internet of things (IoT) ecosystems in the digital, data-driven world, especially as the volume of data grows and storage becomes expensive.
However, the vast majority of data captured at the edge is destroyed soon after it is processed, and this side effect of adopting edge computing can have severe consequences for innovation engines in organizations, especially when designing artificial intelligence (AI) and machine learning (ML) solutions.
AI and ML technologies consume data voraciously in order to learn, understand, and make sense of a situation.
While a few experiences can teach a human that an electric iron should be approached with caution because it may be hot, an AI or ML engine will need hundreds of thousands of data iterations to come to the same conclusion.
As we move to deploy autonomous cars, sensor-powered thermostats, and dozens of connected devices per person, the amount of data generated per minute will be enormous and storing it all will definitely not make economic sense.
Unfortunately, deleting all that data to make today’s operations financially feasible may result in shortchanging tomorrow’s innovations.
Data scientists and business leaders don’t know what they don’t know, and hence, cannot ask questions of the data they collect today if they don’t know what answers they will seek tomorrow.
Data from an autonomous car, for example, can help make operational decisions today, but tomorrow, may help data scientists answer questions about efficiency and reliability, which will be difficult to do if the data is processed in the car and lost moments later.
Technology experts understand this problem, of course, which is why even those who advocate 'hoarding' data worry about the sheer network capacity and power that tomorrow's data centers will require.
Innovations such as Cisco Silicon One, recently announced by the San Francisco-born networking giant, can certainly provide relief, but it is up to innovative businesses to accelerate the move to smarter infrastructures and ecosystems so the technology can be deployed in time to support the internet of the future.
The company's new silicon delivers more than twice the network capacity and twice the power efficiency of any comparable chip, which means businesses can not only stop worrying about rising power costs but also put to rest concerns about sustainability and a growing carbon footprint in the digital era.
Facebook, Microsoft, Google, and other businesses that thrive on data are already partnering with Cisco to create solutions that enable them to innovate successfully in today's data-rich marketplace.
Companies that foresee the need to expand their capacity to harness data in the digital age might, therefore, do well to think about how they can better prepare their infrastructures to support innovation needs in the coming months. After all, the hard reality is that edge computing might not be the smartest, best, or brightest solution every time.