The hype around Artificial Intelligence (AI) continues to grow, with an ever-widening gap between those who expect an imminent singularity and those who are simply trying to make practical use of AI tools. Since the 1960s, each decade has seen its own resurgence of expectations for AI, followed by a fade into the mainstream, a short period of being forgotten, and then another return. This reflects the technology industry's propensity for cyclical marketing of phantom trends, provoking early adoption and laggard catch-up in response to the promise of the latest innovation that will supposedly hand a company some advantage. Within each of these hype cycles, the promise of AI has served as a signifier for a mechanized future in which human work is displaced by automation, by robotics and devices that perform tasks faster and more accurately than people can. To a Luddite this future is a dystopia, but anyone who reviews the economic performance that has followed the adoption of computation into the workforce has to recognize the gains in productivity. The reach of computing devices is now so saturated that new social phenomena are emerging: people with mobile devices can do what was science fiction just 50 years ago, accompanied by the threat of Orwellian supranational technology companies that accept no moral or ethical responsibility for what they control.

Face and voice recognition are now taken for granted and no longer considered AI. In the next hype cycle, the new AI target is responding to human facial expressions and holding conversational voice dialog, and that technology is already available. What was once a computational trick of AI, the brute force application of statistical processing to a massive dataset to extract patterns, is now being supplanted by neural networks that mimic primitive learning strategies using large training samples. This seems to be working, but it remains a pattern matching approach, well suited to computation yet far from any real notion of Artificial Intelligence. With enough computing power, the tech giants can entertain the imagination with brute force showcases like IBM Watson playing Jeopardy or Google DeepMind generating freakish visual art. These examples are great fun, but they are above all effective promotional tactics by large tech companies, pushing the imagination and the promise of AI further along into the next hype cycle.
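To make the distinction concrete, here is a minimal sketch, using toy data and illustrative names (nothing from any vendor's actual system), that fits both a frequency-based statistical classifier and a small neural network to the same labeled samples. The point is that both are pattern matchers in the end:

```python
# Contrasting the two approaches on the same toy data: brute force
# statistics over token counts versus a small neural network.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.neural_network import MLPClassifier

texts = ["great movie", "terrible movie", "great acting", "terrible plot"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

# Raw token counts: pure pattern statistics extracted from the dataset.
X = CountVectorizer().fit_transform(texts)

stat_model = MultinomialNB().fit(X, labels)  # the "brute force statistics" style
neural_model = MLPClassifier(hidden_layer_sizes=(8,),
                             max_iter=2000).fit(X.toarray(), labels)  # learned weights

# Neither model "understands" the reviews; both only map
# token patterns to labels.
```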

What remains elusive is AI that processes meaning: not only identifying the object in an image as an Egg, but grasping the full significance of the Egg image in the context of its usage, with all its cultural, scientific, and even mistaken associations. The notion of an Egg exists within a context that has many conceptual layers and variations. In the initial attempts at constructing AI systems, the notion of an Egg would have been placed into a Knowledge Representation, a prescriptive attempt at capturing its meaning in every sense. Early AI systems quickly discovered that the world is a messy place and that human language lacks clarity, so ambiguity easily corrupts any attempt at codifying knowledge. For AI to survive it needed a useful purpose, so the talent moved to Information Retrieval instead, which produced the modern dependency on search engines. The challenge of capturing all web content, which was readily available for processing, became so great that only a few companies could survive doing it. Brute force computation dominated, and by the time the tech community realized it had missed the opportunity to impose semantics onto the web, it was too late. The semantic web, as an inference engine for extracting meaning, came too late; much is now understood about how the web should have been constructed, but the messiness of human activity resisted that kind of control.
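To see why those early systems struggled, here is a minimal sketch, with hypothetical relations invented purely for illustration (not taken from any actual early AI system), of what a prescriptive Knowledge Representation for the Egg concept might look like, and where it breaks down:

```python
# A hand-authored Knowledge Representation for "Egg". Every relation must be
# written down explicitly, and ambiguity creeps in immediately.
egg = {
    "is_a": ["food", "animal_product", "biological_cell"],
    "produced_by": "chicken",  # already wrong for fish, reptiles, insects...
    "properties": {"shape": "ovoid", "fragile": True},
    "cultural_associations": ["Easter", "fertility", "breakfast"],
    "common_misconceptions": ["all eggs come from birds"],
}

def query(concept, relation):
    """Look up a hand-coded relation; anything not encoded simply doesn't exist."""
    return concept.get(relation, "unknown")

print(query(egg, "produced_by"))  # -> chicken
print(query(egg, "nutrition"))    # -> unknown: the world is messier than the model
```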

The latest battle slowly being won by the brute force computation approach is language translation. Machine translation has been available for many years, built on a syntactical strategy combined with a graph of associations. Because so much information in multiple languages is available as web content, Google applied its brute force computation to translate through patterns instead. It added a facility for users to correct translations, and with that gain in quality it achieved results close to the older methods. Google then turned to neural networks to train the translation process, improving translations further and demonstrating that, with enough training data, the latest approach to AI may keep expanding where it can be applied.
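For contrast, here is a minimal sketch, using a toy phrase table invented for illustration, of the older lookup-style strategy. Its brittleness is exactly what pattern learning over web-scale text, and later neural training, improved on:

```python
# Old-style translation: match the longest known phrase, fall back word by word.
phrase_table = {
    "good morning": "bonjour",
    "thank you": "merci",
    "good": "bon",
    "morning": "matin",
}

def translate(sentence):
    words = sentence.lower().split()
    out, i = [], 0
    while i < len(words):
        # Greedily try the longest known phrase starting at position i.
        for j in range(len(words), i, -1):
            chunk = " ".join(words[i:j])
            if chunk in phrase_table:
                out.append(phrase_table[chunk])
                i = j
                break
        else:
            out.append(words[i])  # unknown word passes through untranslated
            i += 1
    return " ".join(out)

print(translate("good morning"))  # -> bonjour
print(translate("good evening"))  # -> bon evening: no pattern, no translation
```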

Google's translation is not processing knowledge; it is doing pattern matching, using a neural network to mechanically learn to perform the translation. This is still a computational brute force approach. So where does the marketing of AI go next? IBM is turning to its collection of Watson-branded technologies (mostly companies it acquired) to create solutions for health care. The main difference between Google and IBM is one of focus: Google needs volumes of users for its advertising business, so it concentrated on broad information retrieval problems that can exploit the massive amounts of (low quality) web data its users generate, while IBM focused on solutions for businesses in specific industry domains. The two companies have completely different interpretations of what AI is and where it goes next, even if they share some of the underlying technology for information retrieval problems.

The large tech corporations, along with the tech media, continue to elevate expectations for AI, promoting their various solutions as AI even when the substance is statistical processing, information retrieval, or machine learning. AI has a market definition problem: the label is consistently applied to advances that cannot be fully understood by the public or the buying companies. Once the underlying technology is understood, it is given a different name. That switch is what "Data Science" is about, along with its related marketing monikers of Cloud Computing, BigData, IoT, and the rest of the scrabble pieces that make up the current tech marketing alphabet soup.

Whether someone calls (or confuses) the latest advances AI, Data Science, BigData, or something else is not so relevant, since we can all expect the tech giants to create new marketing lingo for us to chase. Sorting through the marketing, and more importantly through the actual technology components, reveals a level of complexity and sophistication well beyond what preceded it. The tech community went from needing one programming language and a little hardware knowledge to needing several languages and a mix of computing platform stacks just to build a basic website. With the latest round of buzzwords, the stakes on competencies are even higher and surpass what any techie could master in a few months. This has pushed the boundaries toward needing scientists, and well-trained computer scientists, who deeply understand the technology and the research being deployed in the latest advances. BigData and Data Science amount to science done with computation, or a redefinition of the profession of Computer Scientist.

Like all previous technology advances, it starts with innovations that come out of research understood only by well-trained, highly competent scientists. The technology gradually gets commoditized, becoming widely available and accessible for businesses to put to commercial use. AI has always been a specialty, but its current manifestations have an opportunity to become a commodity; first, though, it requires an integrated toolset with an industry focus.

The next wave of AI might not be called Artificial Intelligence, since that term is reserved for whatever comes next that remains inaccessible (the hype of the singularity, for example), so it gets reduced to the monikers of Data Science and BigData. This is the normal market adoption pattern in technology, so the question is how Data Science and BigData can be commoditized. In the web toolset the term "full stack" development is prevalent, so we can expect it to cross over into something like Full Stack AI: an integrated toolset that applies neural networks together with data science and BigData. Many competing components exist for various portions, but only a few well-integrated toolsets that would constitute a Full Stack AI development environment are available today. The second dimension is the commercial industry it is applied to. IBM decided to tackle Health Care, but many other industries with very different needs could benefit, for example Financial Services, Insurance, and Food Science, each of which has domain-specific knowledge that cannot be easily represented. Adding this industry focus to Full Stack AI yields Vertical AI. A toolset that makes such computation possible for an industry whose processes yield insights from massive datasets points to a future where companies that adopt these solutions gain substantial advantages over the competition: they can make business decisions using evidence derived from computation that no human means could achieve.
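As a rough illustration of what one layer of such an integrated stack might contain, here is a minimal sketch with hypothetical column names and a toy model standing in for a real vertical's data; everything below is assumed for the example:

```python
# One slice of a "Full Stack AI" pipeline for a vertical such as insurance:
# data ingestion, data-science preprocessing, and a neural model wired
# together so a domain team can retrain as new records arrive.
import pandas as pd
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier

# Assumed schema: claim records with an outcome label (fraudulent or not).
claims = pd.DataFrame({
    "claim_amount": [1200, 450, 98000, 2300, 76000],
    "customer_tenure_years": [8, 2, 0, 5, 1],
    "prior_claims": [1, 0, 4, 2, 3],
    "fraudulent": [0, 0, 1, 0, 1],
})

features = claims.drop(columns=["fraudulent"])
target = claims["fraudulent"]

model = Pipeline([
    ("scale", StandardScaler()),  # the data-science preprocessing layer
    ("net", MLPClassifier(hidden_layer_sizes=(16,),
                          max_iter=5000, random_state=0)),  # the neural component
])
model.fit(features, target)

# Score a new incoming claim with the same column layout.
print(model.predict(pd.DataFrame({"claim_amount": [85000],
                                  "customer_tenure_years": [0],
                                  "prior_claims": [5]})))
```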

The Full Stack Vertical AI toolset is available, but it may not be enough. A company has to capture its own data, or source it from the web and proprietary data sources. For Data Science, the data plus the Full Stack toolset would suffice, but that would not be AI. Really gaining an advantage requires using neural networks as part of machine learning, and there everything depends on training the system. This remains the hardest part for any company: even with the competencies in place, it may lack the means to train the systems to produce useful results.

At science4data.com we have a Full Stack Vertical AI solution for Financial Services, Insurance, e-commerce, food science, and agriculture. What distinguishes our solutions from others is that we address the main AI gap: how to rapidly train the systems. We quickly seed the system with a Knowledge Representation for the industry domain; then, through usage, the system is trained by experts who get immediate value while their expertise is captured, aggregating human judgment into an accumulated knowledge repository that provides breakthrough insights for complex decision making. We redefine AI as Augmented Intelligence and leave "Artificial Intelligence" for the next hype cycle. Our focus is making it real and commercially useful, so that businesses get results and avoid the trail of unrealized AI promises.
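In the abstract, and purely as an illustration of the general expert-in-the-loop pattern rather than our actual implementation, the shape of such a loop could look like this, with every name and rule below invented for the example:

```python
# Seed the system with domain knowledge, let experts correct it during normal
# use, and accumulate every correction as future training data.
seed_knowledge = {"claim over 50000 with tenure 0": "review"}  # hand-seeded domain rules
corrections = []  # accumulated expert judgments, the hardest training data to obtain

def classify(case):
    # Expert corrections take precedence over the seeded rules.
    for stored_case, label in corrections:
        if stored_case == case:
            return label
    return seed_knowledge.get(case, "approve")

def expert_feedback(case, correct_label):
    """An expert overrides a decision; the judgment is captured, not thrown away."""
    corrections.append((case, correct_label))

print(classify("repeat small claim"))  # -> approve (no knowledge yet)
expert_feedback("repeat small claim", "review")
print(classify("repeat small claim"))  # -> review (expert judgment captured)
```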

