Data scientists on steroids or ‘citizen developers’ with AI toolkits?

Among IT professionals, it’s a matter of debate whether IT literacy is on the rise. Certainly, almost everyone now carries an object that only thirty or so years ago would have been described as a supercomputer. The mobile phone, tablet, and modern laptop all represent the kind of advanced technology that was only dreamed of just a few decades ago, and with that power comes acclimatization to technology, inter-connectivity, communication, and sharing.

Putting aside the issue of whether most people wielding a smartphone know how it works, and indeed whether they need to, what’s undeniable is that almost anyone can deploy rafts of technology at the touch of a screen. While the granular complexities of streaming services are beyond most mere mortals, anyone can start watching a YouTube video or Netflix series from just about anywhere in the world.

For businesses, that fluency with technology platforms brings significant advantages: a company’s digitally-conversant staff are happy and accustomed to spinning up a new cloud-based service to get tasks done. Storing files online – simple; requesting a leave of absence on an app – child’s play; getting access to real-time data derived from multiple services and amalgamated on a handy dashboard – just a screen touch away.

This consumer-led acceptance of the power of tech means that there’s now a new breed of software platform coming online with the ‘citizen’ moniker. ‘Citizen developers’ are using low-code and no-code automation platforms to automate and link dozens of legacy solutions in the enterprise. What begins perhaps as a chance encounter with Zapier or IFTTT to automate a social media post or create a triggered Google Sheet entry can be translated relatively easily into the ability to robotize processes; that is, to interlink software systems and remove repetitive drudgery. To string together several industry buzz-phrases, citizen developers are now using low-code platforms as RPA (robotic process automation) enablers.
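For the curious, here is a minimal sketch of the kind of trigger-and-action plumbing such platforms hide behind their drag-and-drop interfaces: a webhook fires, and a row lands in a Google Sheet. The endpoint path, credentials file, sheet name, and payload fields below are all hypothetical placeholders, not a recipe from any particular vendor.

```python
# pip install flask gspread
from flask import Flask, request
import gspread

app = Flask(__name__)

# Service-account credentials file and sheet name are placeholders.
gc = gspread.service_account(filename="service_account.json")
sheet = gc.open("Social media log").sheet1

@app.route("/webhook", methods=["POST"])
def new_post():
    """Triggered by an upstream service (the 'Zap'); appends one row per event."""
    event = request.get_json(force=True)
    sheet.append_row([
        event.get("posted_at", ""),
        event.get("platform", ""),
        event.get("text", ""),
    ])
    return {"status": "logged"}, 200

if __name__ == "__main__":
    app.run(port=5000)
```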


Clearly, even to the least cynical observer, there is an element of marketing hype to the whole ‘low code’ and ‘no code’ proposition: any deployment of such a system will require an above-average grasp of key facets of the platform, like API integration, database fields, Boolean logic, and process charts. But what’s undeniable is that, whether the traditional IT function likes it or not, its role is changing into that of an enabler, rather than a limiter of workforce activities in the business.

But citizen-level development and deployment can only go so far – after all, if we were to extend the paradigm, we could build our own mega-cities with Lego bricks. In IT, the processing of big data – such as the reams of binary data streaming from IoT devices – needs advanced technology to chop, mince and process the information so that it becomes of practical and commercial use. The data scientist is a new job description, one that describes a professional with the ability to invoke the right scripts, configure the machine-learning algorithms and generally negotiate the data lakes of today’s businesses.

Like the older generation of IT technicians, who wore white lab coats and carried soldering irons to literally repair the damage done by bugs (read: insects) in the mainframes of the last century, today’s data scientists are the new uber-geeks, fond of cognitive modeling and waxing lyrical about the image recognition possibilities inherent in next-gen 4K video imagery. Data scientists must be technical in an IT sense, but also highly attuned to what makes a business tick. The clever trick is to achieve marketable results with data without spending thousands of hours and millions of dollars of resources getting there. And it’s to create that ability that AI (and its variants) is now being made available on a variety of platforms to any business willing to invest a little time and energy.

There are numerous conduits that, theoretically, anyone can make use of to process big data: there’s the household name of IBM’s Watson, the lesser-known but still incredibly powerful SAS and DataRobot (see below), and plenty more, either open source or proprietary. Potentially, with enough skill, anyone can feed their audio streams into Watson for instant transcription into text, and intelligent translation into any number of other languages. There are plenty more possibilities on IBM’s platform alone to be found here.
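By way of illustration, the sketch below shows roughly how that Watson pipeline looks from the Python SDK (the ibm-watson package): transcribe an audio file, then translate the transcript. The API keys, service URLs, file name and language pair are placeholders, and the exact result fields should be checked against IBM’s current documentation.

```python
# pip install ibm-watson
from ibm_watson import SpeechToTextV1, LanguageTranslatorV3
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Speech to Text: audio file in, transcript out (key and URL are placeholders).
stt = SpeechToTextV1(authenticator=IAMAuthenticator("STT_API_KEY"))
stt.set_service_url("https://api.us-south.speech-to-text.watson.cloud.ibm.com")

with open("meeting.wav", "rb") as audio:
    result = stt.recognize(audio=audio, content_type="audio/wav").get_result()

transcript = " ".join(r["alternatives"][0]["transcript"] for r in result["results"])

# Language Translator: English transcript to Spanish.
translator = LanguageTranslatorV3(
    version="2018-05-01", authenticator=IAMAuthenticator("LT_API_KEY")
)
translator.set_service_url("https://api.us-south.language-translator.watson.cloud.ibm.com")

translated = translator.translate(text=transcript, model_id="en-es").get_result()
print(translated["translations"][0]["translation"])
```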

DataRobot

Based in Boston, Massachusetts, DataRobot offers not to replace the data scientist, but to make the data professional’s work a great deal easier by automating the development and deployment of machine learning models in commercially-sensitive settings.
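As a rough sketch of what that automation looks like in practice, the snippet below uses DataRobot’s Python client to upload a dataset, run autopilot, and pull back the leaderboard. The dataset, target column, project name and endpoint are placeholders, and the method names reflect this author’s understanding of the client rather than vendor documentation, so they should be verified before use.

```python
# pip install datarobot
import datarobot as dr

# Endpoint and token are placeholders for a real DataRobot account.
dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

# Upload a (hypothetical) loan-book CSV and kick off automated modeling.
project = dr.Project.create(sourcedata="loan_book.csv", project_name="default-risk-demo")
project.set_target(target="defaulted", mode=dr.AUTOPILOT_MODE.FULL_AUTO)

# Autopilot trains and cross-validates a range of candidate models.
project.wait_for_autopilot()

# The leaderboard comes back ranked; the top entry is the current best model.
best_model = project.get_models()[0]
print(best_model.model_type, best_model.metrics)
```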

The emphasis on the business underpinnings of the modern data professional is a significant differentiator, as platforms developed for academic research into cognitive routines and machine learning were never conceived for specifically commercial purposes. It’s this type of focus that’s attracted US$225 million from investors such as New Enterprise Associates, Sapphire Ventures, Meritech and DFJ, and the investment is clearly paying dividends.

In Australasia, for example, DataRobot has worked with disruptive finance house Harmoney (a good representative of the new generation of banks, as 60 percent of its staff are described as ‘technically-focused’) to empower the company’s data scientists to achieve more than they might otherwise have been capable of.

The results are measured not by the extent of the exciting new, experimental machine learning concepts involved, but by the effect that the data scientists’ and DataRobot’s work has had on the bank’s customers: better value for borrowers, lower default risks for lenders, and an increasing market share for Harmoney in a region dominated by the “big six” banks. You can read more about DataRobot here on Tech Wire Asia.

SAS

Using one of those extensive yet descriptive titles, the SAS Visual Data Mining and Machine Learning platform “delivers an integrated platform for managing enterprise data requirements and developing machine learning models.” The platform is designed for team use, supporting collaborative, large-scale data processing with machine learning on tap from the solution.

The platform boasts a visual interface and offers what it describes as ‘no coding’, but it’s difficult to imagine anyone without a decent grasp of Python or TensorFlow being able to hit the ground running and start producing viable results. To that extent, the description of ‘no coding’ is perhaps better rendered as ‘no actual typing of code from scratch.’


The platform’s ‘wizard’ does guide users through training a model in a graphical interface, but then presents the code it creates for further editing. There’s nothing wrong or surprising about that per se (and after editing, the visual model returns), but to say ‘no code’ is disingenuous if viable results are expected from a platform staffed by novices.
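As a generic illustration – not actual SAS output – the code such wizards hand back tends to look something like the model definition below, which a user would still need to be able to read and tune; the architecture, optimizer and training call are placeholder choices.

```python
# pip install tensorflow
import tensorflow as tf

# A small binary classifier of the sort a visual wizard might generate;
# layer sizes and hyperparameters are placeholders a user would tune.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(
    optimizer="adam",
    loss="binary_crossentropy",
    metrics=[tf.keras.metrics.AUC()],
)

# Training call, assuming prepared feature and label arrays:
# model.fit(X_train, y_train, epochs=10, validation_split=0.2)
```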

For categorization, therefore, the expected users of the SAS VDMaML (this author’s acronym) platform are best described as data scientists, R-conversant statisticians, data engineers, and qualified researchers.

*Some of the companies featured in this editorial are commercial partners of Tech Wire Asia