AI adoption has exploded over the past 18 months. Joe McKendrick, who wrote the seminal article on this for Harvard Business Review, is not the only one to say so – professionals working in AI would readily attest to the statement. Google search also appears to be in on the not-so-secret: type in “AI adoption,” and its autocomplete suggests “has skyrocketed in the past 18 months.”
Both the anecdotal evidence and the surveys we are aware of seem to point in the same direction. For example, O’Reilly’s AI Adoption in the Enterprise 2021 survey, conducted in early 2021, received three times as many responses as in 2020, and found that company culture is no longer the main obstacle to adoption.
In other words, more and more people are working with AI, it is taken seriously, and maturity is increasing. That is good news. It means AI is no longer a game that researchers play – it is becoming applied, taking center stage at Microsoft, Amazon, and beyond.
What follows examines the pillars we expect applied AI to build on in 2022.
AI chips
Typically, when discussing AI, people think of models and data – and for good reason. These are the parts over which most practitioners feel they can exercise some control, while the hardware underneath remains mostly invisible and its capabilities are considered fixed. But is this really the case?
AI chips, a new generation of hardware designed to optimally perform AI-related workloads, are experiencing explosive growth and innovation. Cloud mainstays like Google and Amazon are building new AI chips for their data centers – TPU and Trainium, respectively. Nvidia has dominated this market and built an empire around its hardware and software ecosystem.
Intel is looking to catch up, whether through acquisitions or its own R&D. Arm’s status remains somewhat unclear, with the announced acquisition by Nvidia undergoing regulatory review. In addition, there is a slew of new players at various stages in their journey, some of which – like Graphcore and SambaNova – have already achieved unicorn status.
What this means for applied AI is that choosing where to run AI workloads is no longer simply a choice between Intel CPUs and Nvidia GPUs. There are now many parameters to consider, and this development matters not only for machine learning engineers but also for AI practitioners and users: running AI workloads more economically and efficiently frees up resources for use elsewhere and shortens time to market.
MLOps and data centricity
Selecting the hardware on which to run AI workloads can be seen as part of the end-to-end process of developing and deploying AI models known as MLOps – the art and science of bringing machine learning to production. To make the connection with AI chips: standards and projects like ONNX and Apache TVM can help bridge the gap and ease the tedious process of deploying machine learning models to various hardware targets.
In 2021, with lessons learned from operationalizing AI, the focus shifted from shiny new models toward perhaps more mundane but practical aspects such as data quality and data pipeline management, all of which are important parts of MLOps. Like any discipline, MLOps has given rise to many products on the market, each focusing on different facets.
Some products are geared more toward the data science side, others toward data pipelines, and some cover both. Some monitor and observe things such as model inputs and outputs, drift, loss, accuracy, precision, and recall. Others do similar, but different, things around data pipelines.
The data science-centric products serve the needs of data scientists, and perhaps machine learning engineers and data analysts as well. The data pipeline-centric products are geared more toward DataOps engineers.
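As an illustration of the kind of check such monitoring products automate, here is a toy input-drift detector in plain Python. The feature values and alert threshold are invented for the example; real tools tune these per feature and use more robust statistics.

```python
# Toy input-drift check: how far has the live mean of a feature moved,
# measured in standard deviations of the training distribution?
from statistics import mean, stdev

def drift_score(train_values, live_values):
    """Shift of the live mean, in units of training standard deviations."""
    return abs(mean(live_values) - mean(train_values)) / stdev(train_values)

train = [10.0, 11.0, 9.5, 10.5, 10.2, 9.8]   # feature seen at training time
live = [13.9, 14.2, 13.7, 14.5, 14.0, 13.8]  # same feature in production

score = drift_score(train, live)
ALERT_THRESHOLD = 3.0  # arbitrary cutoff for this sketch
if score > ALERT_THRESHOLD:
    print(f"drift alert: score={score:.1f}")
```

When the live distribution shifts this far from the training one, a monitoring product would typically raise an alert so the team can investigate or retrain.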
In 2021, people tried naming various MLOps-related phenomena, slicing and dicing the MLOps domain, applying data version control and continuous machine learning, and performing the equivalent of test-driven development for data, among other things.
What we see as the most profound change, however, is the focus on so-called data-centric AI. Prominent thought leaders and AI practitioners such as Andrew Ng and Chris Ré have discussed this notion, which is surprisingly simple in its essence.
We have now reached a point where machine learning models are sufficiently developed and work well in practice – so much so, in fact, that there is little point in focusing efforts on developing new models from scratch or tuning them to perfection. According to the data-centric view, what AI practitioners should do instead is focus on their data: cleaning, refining, validating, and enriching data can go a long way toward improving the outcomes of AI projects.
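A minimal illustration of that data-centric iteration: instead of touching the model, clean the labels and drop duplicates. The example sentences and the label-normalization map are invented for the sketch.

```python
# Data-centric iteration: improve the dataset, keep the model fixed.
raw = [
    ("great product", "positive"),
    ("great product", "positive"),     # exact duplicate
    ("terrible support", "Negative"),  # inconsistent label casing
    ("works fine", "positve"),         # typo in the label
]

# Map of known label variants to canonical labels (illustrative).
CANONICAL = {"positive": "positive", "negative": "negative", "positve": "positive"}

def clean(examples):
    seen, out = set(), []
    for text, label in examples:
        label = CANONICAL.get(label.lower(), label.lower())  # normalize labels
        if (text, label) not in seen:                        # drop duplicates
            seen.add((text, label))
            out.append((text, label))
    return out

cleaned = clean(raw)  # 3 unique, consistently labeled examples remain
```

On small real-world datasets, fixes of exactly this kind – consistent labels, no duplicates – are often worth more than another round of hyperparameter tuning.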
Large language models, multimodal models, and hybrid AI
Large language models (LLMs) may not be the first thing that comes to mind when discussing applied AI. However, people in the know believe that LLMs can internalize basic forms of language, whether that of biology, chemistry, or human communication, and that we are on the verge of seeing unusual applications of LLMs develop.
To substantiate these claims, note that an ecosystem of sorts is already building around LLMs, primarily around OpenAI’s commercially available GPT-3 API, offered in collaboration with Microsoft. This ecosystem consists mostly of companies offering copywriting services such as marketing copy, emails, and LinkedIn posts. They may not have set the market on fire yet, but this is just the start.
We believe LLMs will see increased adoption and lead to innovative products in 2022 in several ways: through more options for customizing LLMs like GPT-3; through more options for building LLMs, such as Nvidia’s NeMo Megatron; and through LLM-as-a-service offerings, such as SambaNova’s.
As VentureBeat’s Kyle Wiggers noted in a recent article, multimodal models are quickly becoming a reality. This year, OpenAI released DALL-E and CLIP, two multimodal models that, according to the research lab, are “a step towards systems with [a] better understanding of the world.” Since multimodal models build on LLMs, one can reasonably expect to see commercial applications of them in 2022.
Another important direction is hybrid AI, which seeks to infuse knowledge into machine learning. Leaders such as Intel’s Gadi Singer, LinkedIn’s Mike Dillinger, and the Hybrid Intelligence Center’s Frank van Harmelen all emphasize the importance of organizing knowledge in the form of knowledge graphs for the future of AI. Whether hybrid AI will produce applied AI applications in 2022 remains to be seen.
Applied AI in Healthcare and Manufacturing
Let’s end with something more grounded: promising areas for applied AI in 2022. O’Reilly’s AI Adoption in the Enterprise 2021 survey cites the technology and financial services industries as the two leaders in AI adoption. This is hardly surprising, given the tech industry’s propensity to “eat its own dog food” and the financial industry’s drive to gain every possible inch of competitive advantage using its deep pockets.
But what is happening beyond those two industries? O’Reilly’s survey cites healthcare as the third area of AI adoption, which is consistent with our own experience. As State of AI report authors Nathan Benaich and Ian Hogarth noted in 2020, biology and healthcare are having their AI moment. That wave of adoption was already underway, and the advent of COVID-19 accelerated it further.
“The incumbent pharmaceutical industry is very much driven by an a priori hypothesis, saying, for example: ‘I think this gene is responsible for this disease, let’s go after it and find out if that’s true.’ Then there are the more software-oriented people who are in this new era of pharma. They mostly run large-scale experiments and ask many questions at the same time. Unbiased, they let the data draw the map of what they should focus on,” said Benaich, summing up the AI-driven approach.
The only way to validate whether the new-age pharma approach works, Benaich added, is whether it can generate drug candidates that are actually clinically useful and, ultimately, get those drugs approved. Among these “new-age pharma” companies, Recursion Pharmaceuticals went public in April 2021 and Exscientia filed for an IPO in September 2021. Both have assets generated by their machine learning-based approach that are actually used in the clinic.
As for manufacturing, there are several reasons we chose to highlight it among the many areas lagging in AI adoption. First, it suffers from a workforce shortage that AI can help alleviate. According to a study by Deloitte and The Manufacturing Institute, as many as 2.1 million manufacturing jobs may go unfilled by 2030. AI solutions that perform tasks such as automated visual inspection of products fall into this category.
Second, the nature of industrial applications requires combining data with the physical world in a very precise manner. This, some have noted, lends itself well to hybrid AI approaches.
And last but not least, the hard data: according to a 2021 survey by The Manufacturer, 65% of manufacturing leaders are working to pilot AI. Implementation in warehouses alone is expected to grow at a compound annual growth rate of 57.2% over the next five years.
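For a sense of scale, a quick back-of-the-envelope check of what a 57.2% compound annual growth rate implies over that five-year horizon:

```python
# What a 57.2% CAGR compounds to over five years.
cagr = 0.572
years = 5
growth_factor = (1 + cagr) ** years  # roughly 9.6x the starting market size
```

In other words, if the projection holds, warehouse AI implementation would be nearly an order of magnitude larger in five years than it is today.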