
Microsoft CEO Satya Nadella speaking at the keynote for MS Build, Seattle 2018. Source: Microsoft

Microsoft: IoT and mixed reality will be the future of the workplace

DEVELOPER conferences are useful for gauging what technology businesses should capitalize on, but it can be confusing to navigate through all the announcements.

Here are three key points from Microsoft’s conference this year, and what they mean for your business:

Intelligent Cloud and Intelligent Edge

What Microsoft dubs the intelligent cloud and intelligent edge is basically IoT – but smarter. The idea is ubiquitous computing, where the application model is distributed, event-driven, and serverless.

In real-life terms, this means IoT isn’t just a bunch of sensors that feed into a central server or computer. Data is shared across the entire network and accessible at every endpoint.
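To make the “distributed, event-driven, serverless” idea concrete, here is a minimal, purely illustrative Python sketch of a stateless event handler that could run unchanged in the cloud or on an edge gateway. The function and field names are hypothetical and not part of any Azure SDK.

```python
import json


def handle_telemetry_event(event: dict) -> dict:
    """Hypothetical event handler: the same stateless function could be
    hosted by a cloud function runtime or pushed down to an edge device."""
    reading = event.get("temperature_c")
    alert = reading is not None and reading > 75.0
    return {
        "device_id": event.get("device_id"),
        "alert": alert,
        "summary": f"temperature={reading}",
    }


if __name__ == "__main__":
    # Simulate an incoming event; in production this would be triggered by a
    # message broker or IoT hub rather than called directly.
    sample = {"device_id": "sensor-42", "temperature_c": 80.2}
    print(json.dumps(handle_telemetry_event(sample)))
```

The point of the sketch is that the handler carries no assumptions about where it runs, which is what lets the same application model span cloud and edge.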

Microsoft splits this neatly into four offerings: the public cloud platform Azure, the private cloud platform Azure Stack, Azure IoT Edge, and Azure Sphere. Azure and Azure Stack are Microsoft’s cloud offerings, and largely self-explanatory.

The interesting things happen with Azure IoT Edge. As Microsoft explains, as devices generate more data, it becomes more practical to deploy compute and AI directly on the devices themselves.

New this year, Microsoft is finally open sourcing the Azure IoT Edge Runtime. It is the collection of programs installed on a device so that Azure services and custom code can be deployed to and managed on the device remotely, instead of running only in the cloud. Open sourcing it gives customers more transparency and control over their edge applications, and lets them modify and debug the runtime themselves.
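For a sense of what “code deployed and managed on the device” looks like in practice, below is a minimal sketch of a custom edge module in Python, assuming the azure-iot-device SDK (pip install azure-iot-device). The method names reflect that SDK as it stood around v2, and exact details may differ between versions; treat this as an assumption-laden sketch, not an official sample.

```python
# Sketch of a custom IoT Edge module: it receives messages routed to it by the
# Edge runtime, filters them locally, and forwards only what matters upstream.
import time

from azure.iot.device import IoTHubModuleClient, Message


def main():
    # The Edge runtime injects connection details into the module's environment.
    client = IoTHubModuleClient.create_from_edge_environment()

    def on_message(message: Message):
        # Do local processing before forwarding, so raw readings never have to
        # leave the device.
        data = message.data
        payload = data.decode() if isinstance(data, bytes) else str(data)
        if "alert" in payload:
            client.send_message_to_output(message, "upstream")

    client.on_message_received = on_message
    client.connect()
    try:
        while True:
            time.sleep(10)  # keep the module alive; the handler does the work
    finally:
        client.disconnect()


if __name__ == "__main__":
    main()
```

In an actual deployment this code would be packaged as a container and pushed to the device through the Edge runtime, which is the part Microsoft is open sourcing.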

Beyond supporting Windows and Linux, Microsoft has come up with a new OS called Azure Sphere that is tailor-made for IoT devices. It features security and management services that run natively on the various compute nodes.

What does this mean: IoT is less about sensors and more about an interconnected web of data and information. The future of IoT puts analytics capabilities right where you need them, not just in a control center.

Artificial intelligence

Bringing together AI capabilities in vision, speech, and language, the company announced Project Kinect for Azure. It is essentially a package of sensors that can be integrated into various applications, offering spatial understanding, skeletal tracking, object recognition, low depth noise, and an ultra-wide field of view.

Imagine the sensors in self-driving cars that can “see” and map their surroundings; that is the kind of thing we are talking about. AI gives the sensors better accuracy in their “vision”.

Adding to that is conversational AI, for use in chatbots and digital assistants. Microsoft believes each company should be able to brand its own digital assistant, delivering customer service that is more relatable to people.

The company also released frameworks and tooling for publishing and integrating these bots across various platforms, such as Facebook Messenger, Skype, Slack, and more.
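As an illustration, a bare-bones branded assistant built on Microsoft’s Bot Framework SDK for Python (the botbuilder-core package) might look like the sketch below. The channel connections to Messenger, Skype, or Slack are configured in the bot service rather than in this code, the hosting web app is omitted, and the details here are an assumption about that SDK rather than an official sample.

```python
# Minimal echo-style bot sketch using botbuilder-core (pip install botbuilder-core).
from botbuilder.core import ActivityHandler, MessageFactory, TurnContext


class BrandedAssistantBot(ActivityHandler):
    async def on_message_activity(self, turn_context: TurnContext):
        # Echo the user's text back; a real assistant would call a
        # language-understanding service here and reply in the company's voice.
        text = turn_context.activity.text or ""
        await turn_context.send_activity(MessageFactory.text(f"You said: {text}"))

    async def on_members_added_activity(self, members_added, turn_context: TurnContext):
        # Greet new users when they join the conversation.
        for member in members_added:
            if member.id != turn_context.activity.recipient.id:
                await turn_context.send_activity(
                    MessageFactory.text("Hi, I'm your company's assistant.")
                )
```

Because the channel wiring lives outside the bot class, the same logic can surface in whichever messaging platform a customer already uses.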

What does this mean: AI improves accuracy in human-like functions. This can complement your current workflow, reducing the time spent on, and errors made in, mundane jobs such as mapping and measurement.

Multi-sense, multi-device experience

Microsoft’s final set of announcements revolves around Microsoft 365. The company is advocating a fully integrated suite of hardware and software that blurs the line between the real world and virtual reality.

In practical terms, it means users can collaborate in mixed reality using tools like Microsoft Remote Assist. With hands-free video calling, image sharing, and annotations, experts can troubleshoot issues without traveling to the location.

A similar tool, Microsoft Layout, allows 3D models to be superimposed onto the real-world environment. Users can make more accurate designs with the right context, taking out most of the guesswork. This saves time and money.

These capabilities are also being introduced into Microsoft Teams, the company’s workplace communication application that combines chat, meetings, notes, and attachments.

The platform will now support HoloLens to provide context to spatial data in meetings, allowing for cross-platform collaboration.

What does this mean: Tools are only useful if you can use them across various platforms and devices to perform different tasks, which saves both time and money. Mixed reality brings more accuracy to design and helps people collaborate better on projects regardless of physical location.