5 truths about artificial intelligence business leaders need to know
Given all the talk about how much potential artificial intelligence (AI) has and the exciting use cases vendors and companies have created together, it’s easy to start putting it on a pedestal and misunderstanding its power.
Gartner, which speaks with many SME leaders and CXOs, finds that there is a lack of clarity on what AI really is and what it can do, and that this leads them to expect more from the technology than they should.
“With AI technology making its way into the organization, it is crucial that business and IT leaders fully understand how AI can create value for their business and where its limitations lie,” explained Gartner Research VP Alexander Linden.
Yet companies continue to spend more on AI products, projects, and solutions, and experts and consultants fear that, as a result, ROI expectations won’t be met.
According to IDC, global spending on cognitive and AI systems is forecast to continue its trajectory of robust growth as businesses invest in projects that utilize cognitive/AI software capabilities.
IDC’s guide suggests spending on cognitive and AI systems will reach US$77.6 billion in 2022, more than three times the US$24.0 billion forecast for 2018. The compound annual growth rate (CAGR) for the 2017-2022 forecast period will be 37.3 percent.
There is no word on ROI as yet, as most projects are still in their early stages.
“AI technologies can only deliver value if they are part of the organization’s strategy and used in the right way.”
To help business leaders fight misconceptions about AI, Gartner’s Linden debunks five of the most common myths (and sheds light on the truths):
# 1 | AI works in the same way as the human brain does
AI is essentially a software program that lives in a computer and solves problems in the way it has been taught to. It is faster than the human brain, but it lacks the brain’s creativity.
“Some forms of machine learning (ML) – a category of AI – may have been inspired by the human brain, but they are not equivalent.”
“Image recognition technology, for example, is more accurate than most humans, but is of no use when it comes to solving a math problem. The rule with AI today is that it solves one task exceedingly well, but if the conditions of the task change only a bit, it fails.”
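Linden’s point about narrow task performance can be illustrated with a toy sketch (the data and the simple nearest-centroid “model” below are invented purely for illustration): a system trained under one set of conditions performs near-perfectly on matching data, but its accuracy collapses when the inputs shift only slightly.

```python
import random

random.seed(0)

# Toy data: two classes of 2-D points, each clustered around a center.
def sample(center, n):
    return [(center[0] + random.gauss(0, 0.5),
             center[1] + random.gauss(0, 0.5)) for _ in range(n)]

train_a = sample((0, 0), 100)
train_b = sample((3, 3), 100)

# "Training": compute each class's centroid. This is all the model knows.
def centroid(points):
    return (sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points))

ca, cb = centroid(train_a), centroid(train_b)

def predict(p):
    # Assign the point to whichever centroid is closer.
    da = (p[0] - ca[0]) ** 2 + (p[1] - ca[1]) ** 2
    db = (p[0] - cb[0]) ** 2 + (p[1] - cb[1]) ** 2
    return "a" if da < db else "b"

def accuracy(points_a, points_b):
    correct = sum(predict(p) == "a" for p in points_a)
    correct += sum(predict(p) == "b" for p in points_b)
    return correct / (len(points_a) + len(points_b))

# Same conditions as training: the model does exceedingly well.
acc_same = accuracy(sample((0, 0), 100), sample((3, 3), 100))

# Shift every input slightly (by +2 on each axis): accuracy collapses
# toward chance, even though the underlying classes are unchanged.
acc_shifted = accuracy(sample((2, 2), 100), sample((5, 5), 100))

print(acc_same, acc_shifted)
```

The model has not become "worse"; it simply only ever knew the one task it was trained on, which is Linden’s point about conditions changing "only a bit".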
# 2 | Intelligent machines learn on their own
Human intervention is required to develop an AI-based machine or system.
The involvement may come from experienced human data scientists executing tasks such as framing the problem, preparing the data, selecting appropriate datasets, and training and maintaining the system.
In fact, this is one of the reasons some experts argue that AI will create as many jobs as it displaces: they foresee new jobs created specifically to manage and prepare the data used to train AI, professionals hired to maintain particular AI algorithms, and specialists working with AI tools to ensure they learn the right things.
# 3 | AI can be free of bias
“Today, there is no way to completely banish bias, however, we have to try to reduce it to a minimum.”
The question of bias has been raised frequently in the recent past. A blog post recently published by the World Economic Forum (WEF) points out that AI is not sentient but merely a tool, and therefore morally neutral: its use depends on the criteria we humans apply to its development.
Bias creeps in when datasets aren’t clean and free from bias, argues the WEF: algorithms don’t become biased on their own; they learn that from us. So we have to take responsibility for helping to avoid any negative effects spawned by the AI systems we’ve created.
“In addition to technological solutions, such as diverse datasets, it is also crucial to ensure diversity in the teams working with the AI, and have team members review each other’s work. This simple process can significantly reduce selection and confirmation bias.”
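The effect of diverse datasets on selection bias can be sketched with a toy example (the two groups and all the numbers below are invented for illustration): estimating a population quantity from data drawn from only one group gives a badly skewed answer, while drawing evenly from both groups largely removes the skew.

```python
import random

random.seed(1)

# Hypothetical outcome we want a model to learn: it averages 0.3 in
# group X and 0.7 in group Y (all values invented for illustration).
group_x = [random.gauss(0.3, 0.1) for _ in range(1000)]
group_y = [random.gauss(0.7, 0.1) for _ in range(1000)]
true_mean = (sum(group_x) + sum(group_y)) / 2000

def mean(xs):
    return sum(xs) / len(xs)

# Selection bias: a training set drawn only from group X.
biased_sample = random.sample(group_x, 200)

# Diverse data: a training set drawn evenly from both groups.
diverse_sample = random.sample(group_x, 100) + random.sample(group_y, 100)

# The one-group sample lands far from the true population value;
# the diverse sample lands close to it.
biased_error = abs(mean(biased_sample) - true_mean)
diverse_error = abs(mean(diverse_sample) - true_mean)
print(biased_error, diverse_error)
```

This is the data-side half of the quote above; the process-side half (diverse teams reviewing each other’s work) guards against the same effect creeping in through human choices rather than sampling.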
# 4 | AI will only replace repetitive jobs
AI enables businesses to make more accurate decisions via predictions, classifications, and clustering.
These abilities have allowed AI-based solutions to take over mundane tasks and also to augment the more complex tasks that remain. An example is the use of imaging AI in healthcare.
According to the Wall Street Journal, in one experiment an AI system diagnosed the presence of melanoma in images of skin lesions with 76 percent accuracy, higher than the 71 percent average of eight dermatologists.
However, the article points out that AI can only assert the probability, not the certainty, of what an image shows. It may flag things a doctor has missed, but it needs human support to ascertain whether that reading is accurate.
The same is the case with robo-advisors in the financial and insurance industry.
Such capabilities don’t eliminate human involvement in those tasks; rather, they free humans to deal with the unusual cases.
With the advancement of AI in the workplace, business and IT leaders should adjust job profiles and capacity planning as well as offer retraining options for existing staff.
# 5 | Not every business needs an AI strategy
Every organization should consider the potential impact of AI on its strategy and investigate how this technology can be applied to the organization’s business problems.
In many ways, avoiding AI exploitation is the same as giving up the next phase of automation, which ultimately could place organizations at a competitive disadvantage.
“Even if the current strategy is ‘no AI’, this should be a conscious decision based on research and consideration. And, like every other strategy, it should be periodically revisited and changed according to the organization’s needs. AI might be needed sooner than expected.”