AI Predictions for 2020: Customizable, Edge Computing, Data Transparency, RPA, Chatbots
By AI Trends Staff
We reached out to a range of AI practitioners for their predictions on AI Trends in 2020. Here is a selection of their responses:
Max Versace, PhD, CEO and co-founder, Neurala:
Customizable approaches to deep learning will make or break AI applications. Traditional approaches to deep learning can be tedious and time-consuming because they require massive amounts of data, and models must be retrained over and over again. Moreover, data is often not available online or is confidential to one organization, so it cannot be combined with other data to build large AI systems. In 2020, we’ll see the emergence of new paradigms and approaches to deep learning that solve these challenges.
Manufacturers will move toward the edge. With AI and data becoming centralized, manufacturers are forced to pay massive fees to top cloud providers to access the data that keeps their systems up and running. As a result, new routes to training AI that can be deployed and refined at the edge will become more prevalent. As we move into the new year, more and more manufacturers will turn to the edge to generate data, minimize latency and reduce massive cloud fees. By running AI where it is needed (at the edge), manufacturers can maintain ownership of their data.
Sean Knapp, Founder and CEO of Ascend:
The Fate of AI Depends on AI: The 2010s closed out with an AI frenzy—with marketing hype and spending around the technology at an all-time high. While organizations worldwide are anticipated to spend more than $1.8 trillion annually by 2021 on big data and AI-driven digital transformation efforts, many will struggle to translate those investments into business success. This is due to insufficient resources and expertise to support data initiatives, difficulty accessing siloed data, and an increased urgency for fast analysis and delivery.
We’ve seen this type of technology gold rush before, and unless we address the core issues at hand, we will be doomed to fail. The fate of AI will depend on AI itself, or rather the ability to utilize automation to ensure successful AI and big data projects. I expect to see that advancements in automated data and delivery systems in the coming decade will help businesses increase their success rates in these AI and big data initiatives across industries.
Rana el Kaliouby, PhD, CEO and co-founder of Affectiva and author of the forthcoming book, Girl Decoded:
In 2020, tech companies (AI in particular) will be held to a stricter standard of transparency. Consumers are waking up to the fact that their personal data is being used for corporate gain—this has been aggravated by recent data scandals and data privacy hacks. Tech companies—especially AI companies that require massive amounts of data to fuel their deep learning algorithms—are not transparent about their use of data. Specifically, how they are collecting data, where they are storing it, who has access to it, what it’s being used for and ultimately, what that means for the end-user. For example, today when you download a new app on your phone, you’re presented with a lengthy click-through agreement filled with legal jargon. In that case, it’s not clear what you are allowing. The AI industry needs to rethink its approach to educating consumers, and take a long, hard look at whether consent is really informed consent.
The AI industry must address issues of power asymmetry. People with access to certain types of AI will be able to work more efficiently and will have a leg-up on those who don’t have access. I worry about the impact this can have on communities and populations that are already disadvantaged, as AI could continue to widen that gap. In 2020, we need to create guidelines that ensure that AI is applied in an equitable way. AI has the potential to improve people’s lives and solve societal problems, but if we don’t start thinking about power distribution now, we risk institutionalizing AI in a way that may exacerbate inequalities.
Dr. Madhav Durbha, Group Vice President of Industry Strategy at LLamasoft:
Increased operationalization of AI/ML Yields Business Value. The data explosion is at its peak and becoming more mainstream across all industries—the supply chain is no exception. Next year, AI and ML will move beyond its current hype cycle to offer more tangible use cases that deliver real business value. Here are a few examples of AI applications that will take off in 2020:
Predicting Volatile Order Patterns: Predicting volatile ordering is a significant challenge, and AI models perform far better than traditional forecasting methods in these situations.
Market Sensing: AI can harness the power of external causal data such as weather, GDP, CPI, employment levels and industrial production, bringing better sensing capabilities into the supply chain.
Chargeback Reduction: Retailers charge hefty penalties to brand owners for missed On Time in Full deliveries. Deep learning algorithms allow sifting through key shipment data including order types, times, quantities, locations and transportation modes to identify root causes for chargebacks and predict points of failure.
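The chargeback-prediction idea above can be sketched in a few lines. This is a minimal illustration, not LLamasoft's method: it uses plain logistic regression (a stand-in for the deep learning models the author describes), and the shipment features and toy data are entirely hypothetical.

```python
import math

def train_logistic(rows, labels, lr=0.1, epochs=2000):
    """Fit a tiny logistic-regression model with plain gradient descent."""
    weights = [0.0] * len(rows[0])
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = bias + sum(w * xi for w, xi in zip(weights, x))
            p = 1.0 / (1.0 + math.exp(-z))     # predicted miss probability
            err = p - y
            weights = [w - lr * err * xi for w, xi in zip(weights, x)]
            bias -= lr * err
    return weights, bias

def chargeback_risk(weights, bias, x):
    """Score a shipment: probability of a missed On Time in Full delivery."""
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical shipment records, features scaled to 0..1:
# [short_lead_time, haul_distance, is_rush_order]
shipments = [
    [0.9, 0.8, 1.0],  # short notice, long haul, rush -> missed OTIF
    [0.8, 0.9, 1.0],
    [0.2, 0.3, 0.0],  # ample lead time, short haul -> delivered in full
    [0.1, 0.2, 0.0],
]
missed_otif = [1, 1, 0, 0]

weights, bias = train_logistic(shipments, missed_otif)
risky = chargeback_risk(weights, bias, [0.85, 0.9, 1.0])
safe = chargeback_risk(weights, bias, [0.15, 0.2, 0.0])
print(risky > safe)  # the rush, short-lead shipment scores higher risk
```

In practice the feature set would be the shipment data the author lists (order types, times, quantities, locations, transportation modes), and the model would be far richer; the point is only that labeled OTIF history can be turned into a per-shipment failure-risk score.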
The GDPR and China’s “Great Firewall” have led to the Splinternet of Things; the internet splintering and dividing due to factors such as nationalism, politics, and regional data legislation. It serves as a proxy for similar policies that will be issued from other governing bodies on a global basis as ownership of data becomes critically important in a digital world.
Closing the Skills Gap in the Digital Era. The gap between the skills needed to compete in an increasingly digital world and those available in the organization is widening. The rise of robotics, algorithmic intelligence and cloud computing is making an entire generation of supply chain professionals increasingly obsolete. These technologies are hollowing out the middle of the jobs spectrum, pushing humans to the edges, toward roles requiring extreme physical or cognitive dexterity. Among the positions organizations are struggling to fill are data engineers, modelers and citizen data scientists, and leaders steeped in agile methodologies.
To close the skills gap and turn it into a competitive differentiator, organizations will need to employ new approaches—investing in the upskilling of the workforce through online platforms and continuous learning, embracing intern and co-op programs, promoting inclusiveness, rotating employees through different functions to gain a broader perspective and investing in cognitive automation to relieve employees from routine tasks.
Pat Calhoun, CEO of Espressive:
Chatbots will become relevant across every function. Chatbots for employee self-help will become relevant not just for IT, but across every part of the organization. Some 62% of respondents to a survey we conducted recently were actively considering adding a chatbot for employee self-help with IT questions, because it takes too long to find answers on employee portals. We got similar results for HR-related questions.
“Chatbot soup” is inevitable, creating an even larger problem. An explosion of chatbots across the organization can lead to “chatbot soup.” Unlike portals, where there is typically one per department, departments are now deploying chatbots both for individual applications and to support departmental functions. Think Oracle, Workday, SAP: every one of them has its own embedded chatbot. On top of that, departments are also deploying their own chatbots. Employees are likely to get frustrated, and the effort required to maintain all these chatbots will be high. This is likely to become a true nightmare for most CIOs.
NLP will proliferate more in 2020. Natural language processing (NLP) is a well-defined term: NLP makes a chatbot capable of understanding what someone is expressing and of providing an action or response. To be successful, NLP requires a large amount of data. Alexa is a wonderful example: the reason Alexa has gotten so much better over time is that it learns from millions of consumers using it on a daily basis. A team of people at Amazon is responsible for ingesting that data and using it to tune Alexa, making it better over time. How one does that for an enterprise without access to data on the scale of Amazon is a challenge.
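The mapping from an employee's phrasing to an action that the paragraph above describes can be sketched with a deliberately simple word-overlap matcher. Real enterprise NLP uses trained language models; the intents and example phrases here are hypothetical.

```python
# Hypothetical intents an employee self-help chatbot might handle,
# each with a few example phrasings it was "trained" on.
INTENT_EXAMPLES = {
    "reset_password": ["reset my password", "forgot password", "cannot log in"],
    "vpn_help": ["vpn not connecting", "how do i set up vpn", "vpn access"],
    "pto_balance": ["how much vacation do i have", "check pto balance"],
}

def tokenize(text):
    """Split an utterance into a lowercase set of words."""
    return set(text.lower().split())

def classify(utterance):
    """Return the intent whose examples share the most words with the utterance."""
    words = tokenize(utterance)
    best_intent, best_score = None, 0
    for intent, examples in INTENT_EXAMPLES.items():
        score = max(len(words & tokenize(ex)) for ex in examples)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

print(classify("I forgot my password again"))  # reset_password
```

The point the author makes about data follows directly: with only three example phrasings per intent, this matcher misfires on any unfamiliar wording, which is why large volumes of real user utterances (Amazon's advantage with Alexa) make such systems better over time.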
Prince Kohli, CTO at Automation Anywhere:
RPA will play a pivotal role in global data privacy and governance initiatives. The 2020s are shaping up to be the decade defined by big data—with the advent of 5G and the explosion of connected devices. In this new era, we’ll see even more pressure on companies to be fully transparent about the information they collect and how it’s used, with legislation like GDPR and the upcoming California Consumer Privacy Act (CCPA) representing only the tip of the data governance iceberg. Additionally, as malware increasingly becomes enhanced with artificial intelligence (AI) to identify network vulnerabilities, intelligent, secure bots will be a critical line of defense against data breaches.
Hiring for RPA skills will explode across all industries and job functions by the end of 2020. With more than 5,000 RPA (robotic process automation) jobs open in the U.S., we already see high demand for RPA specialists. Over the next year, we expect to see RPA skills appear across all job roles—developers, business analysts, program and project managers etc.—and in all verticals—IT, BPO, HR, education, insurance, banking, among others. As a result, the number of open roles (and starting salaries) will skyrocket.
Intelligent (AI-driven) automation will replace rules-based automation entirely. While many RPA platforms now offer AI capabilities, today RPA and AI are used as two separate entities—one is rules-based and the other is adaptive and predictive. In the next year, RPA and process analytics will become entirely infused with AI and machine learning (ML), accelerating process mining and discovery, and dramatically simplifying human effort in these areas. Going forward, bots will be able to automatically identify the best processes to automate, act upon this insight and optimize deployments throughout to guarantee the best possible results.
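One building block of the process discovery described above is spotting work that repeats verbatim in captured activity logs. The sketch below is a toy frequency counter, not Automation Anywhere's product logic, and the event names are invented for illustration.

```python
from collections import Counter

# Hypothetical UI event log: each entry is one recorded work session,
# as captured by a task-mining agent watching user actions.
event_log = [
    ["open_invoice", "copy_total", "paste_erp", "save"],
    ["open_invoice", "copy_total", "paste_erp", "save"],
    ["open_email", "download_attachment"],
    ["open_invoice", "copy_total", "paste_erp", "save"],
]

def automation_candidates(log, min_repeats=2):
    """Rank recorded action sequences by how often they repeat verbatim."""
    counts = Counter(tuple(seq) for seq in log)
    return [(seq, n) for seq, n in counts.most_common() if n >= min_repeats]

candidates = automation_candidates(event_log)
print(candidates[0])  # the invoice-entry sequence repeats three times
```

Real process mining goes much further (tolerating variation in the sequences, weighing time saved, acting on the insight), but repetition frequency is the intuition behind "automatically identify the best processes to automate."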
[From the Editor: We invite readers to submit up to 250 words on your top prediction for the impact of AI on business in 2020. Please choose a theme from these categories:
Industry: Healthcare, Finance and so on
Technology: Deep learning, neural nets, machine learning, hardware platforms, network capability, whatever your orientation
People/Organizations: Job Titles, Hiring, working AI into the organization
Ethics: steps to protect society and the world from the power of AI
Please include a short bio of the person responding. Send responses to John Desmond, editor of AI Trends, firstname.lastname@example.org. We will publish a selection in AI Trends. The Editors]