Machine Learning and Artificial Intelligence: The next foundational technology

Tuesday, 2 July 2019

By Shanil Fernando

When the US Library of Congress ranked history’s most important innovations, it gave a foremost place to the printing press. While the mechanics behind the printing press weren’t far more sophisticated than the other machines of its era, the consequences of its invention were world changing; finally, mankind had a means for the mass distribution of information, improving literacy and changing every industry in the world. 

Technologies such as these are known as foundational technologies: inventions that can be applied to solve a multitude of problems across a vast number of industries. More contemporary examples include the internet, which is now used nearly constantly in every industry and in our personal lives, and smartphones, which are so completely integrated into our lives that it seems impossible to live without them. 

As we peer into the near future, we can already see some of the next great potential foundational technologies arising: 3D printing, solar energy, autonomous vehicles and, perhaps most interesting of all, machine learning (ML) and artificial intelligence (AI). 

One reason these technologies are emerging more frequently has to do with how they achieve general market adoption. A foundational technology usually achieves only minor adoption at first, until a catalyst catapults it into complete market adoption. In the case of the internet, initial adoption was low, not only in the number of users but also in the number of companies willing to build their businesses on it. 

But all of that changed with global improvements in telecommunication infrastructure, which brought lower data costs, higher connection speeds and greater affordability and adoption of personal computers. The internet as a foundational technology really took off between 1999 and 2001, and has grown ever since. 

ML is a form of programming in which the developer, instead of writing exactly how the machine should solve a problem, writes how the machine should teach itself to solve the problem. The machine then learns, improving its ability to solve the problem with every attempt. In the past, we gave a computer explicit instructions for performing a particular task; with ML, we help the computer learn for itself. A well-known example is facial recognition software. While developers would have a very difficult time writing out how a machine should identify a face, it is much easier to have the machine teach itself to identify faces through repeated exposure to labelled examples. 
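To make the contrast concrete, here is a minimal sketch in Python using the scikit-learn library. Instead of hand-coding recognition rules, we hand the program labelled examples and let it fit a model. The built-in digits dataset stands in for a real image-recognition problem; an actual face-recognition system would need far more data and a more sophisticated model.

```python
# A minimal sketch of the ML approach: no hand-written rules,
# just labelled examples from which the model learns patterns.
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

digits = load_digits()  # 8x8 grayscale images of handwritten digits
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=42
)

# The classifier "teaches itself" by fitting to the training examples.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Everything the model knows about recognising digits, it learned
# from the data rather than from programmer-written rules.
predictions = model.predict(X_test)
print(f"Accuracy: {accuracy_score(y_test, predictions):.2%}")
```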

While ML has been experimented with since the 1990s, when computers like Deep Blue beat the chess world champion Garry Kasparov, drawing on a vast library of professional chess games, it didn’t receive wide-scale market adoption until the right catalysts entered the market. In the case of ML, the catalysts are low-cost data storage and affordable, high-powered processing for algorithms. 

As is the case with other foundational technologies, now that the catalysts have entered the market, cheap data storage and cloud technologies give algorithms low-cost, powerful computing to consume. Like the internet before it, ML and AI will see rising adoption over the next five years. Algorithms and libraries will mature and become commonly available for general use, and ML and AI will be a standard part of the technology stack of any application. 

We are already seeing ML used in a myriad of industries to solve a diverse array of problems. The self-driving vehicle revolution has the potential to transform not only personal transportation but industrial supply chains as well, while data-entry technology can easily digitize data that was previously readable only by humans. The potential application of AI in health care is significant: we are seeing early use of AI-based doctors for basic illness consultations and remedies, and given AI’s ability to learn continuously, the advancement of AI-based patient consulting seems inevitable.

ML’s applications aren’t limited to such high-end projects, however: the French company Kolibree has used the technology to build a smart toothbrush. Combining ML with the toothbrush’s sensors, Kolibree identifies patterns of good and poor brushing and gives you a score you can compare with others, gamifying oral hygiene. 
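As an illustration of how such a score might be computed, here is a hedged sketch in Python: a small classifier trained on per-session features that a sensor-equipped toothbrush could plausibly report. The features, labels and training data below are invented for illustration; this is not Kolibree’s actual method, which has not been published in this detail.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-session features (invented for illustration):
# [duration_seconds, coverage_fraction, average_pressure]
# Labels: 1 = good brushing, 0 = poor brushing.
X_train = np.array([
    [120, 0.95, 0.4],  # long, thorough, gentle  -> good
    [115, 0.90, 0.5],
    [ 40, 0.30, 0.9],  # short, patchy, harsh    -> poor
    [ 55, 0.45, 0.8],
])
y_train = np.array([1, 1, 0, 0])

model = LogisticRegression()
model.fit(X_train, y_train)

# Score a new session as the model's probability of "good brushing",
# which could then be shown to the user as a comparable score.
session = np.array([[90, 0.75, 0.5]])
score = model.predict_proba(session)[0, 1]
print(f"Brushing score: {score:.0%}")
```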

While these applications are already significant, the truly staggering fact about ML is that we are only just seeing the teenage years of an infinite being. Consider a doctor: even the best doctor in the world has a limited lifespan within which to grow his or her knowledge and improve. Furthermore, the number of patients the doctor can see within their tenure is limited by time and energy, as is the number of colleagues with whom the doctor can share knowledge. 

An ML program can exist forever, constantly learning and improving. It can also run a virtually unlimited number of concurrent instances, and because ML technologies are cloud-based, all of its learning is stored in a single repository that every instance draws from. This is as if a doctor could live forever, see millions of patients at once while offering each the same standard of care, and consolidate the knowledge of all the doctors in the world into a single mind. 



(The writer is the Co-Founder, Managing Director – Sri Lanka and Senior Vice President Engineering at Sysco LABS. Prior to Sysco LABS, Shanil was a founding member of Virtusa and overall lead for its global delivery organisation. He became its CTO at a very young age and was involved in Virtusa’s growth from eight developers all the way through taking it public on Nasdaq. Shanil was named in Echelon’s 40 Under 40 list of most important risk takers and was conferred the title of ICT Leader of the Year 2016 by the Computer Society of Sri Lanka. Shanil has a BSc in Computer Science from the University of Warwick.)
