One of the major downstream impacts of the COVID-19 pandemic, for better or worse, is the rate at which artificial intelligence (AI) technologies are being adopted within the enterprise. This dramatic increase is due mainly to the acceleration of digital business transformation initiatives.
A recent survey published by Algorithmia, a provider of a platform for managing the deployment of AI models, polled 403 business leaders involved in machine learning initiatives at companies with $100 million or more in revenue. It finds that the percentage of organizations with more than five use cases involving machine learning (ML) algorithms and other forms of AI has increased 74% since a similar survey conducted a year ago.
The survey also reveals that 83% of organizations have increased their AI/ML budgets, while the average number of data scientists employed has increased 76% year over year.
The use cases involving AI/ML technologies have apparently also become more sophisticated. The survey finds that 64% of all organizations take a month or longer to deploy a model. More than a third of respondents (38%) said their data scientists spend more than half of their time on model deployment rather than on training those models.
Also read: Collaboration Platforms Will Evolve into Digital Assistants Thanks to AI
Making AI Models More Viable
The survey makes it clear there’s still much work to be done when it comes to operationalizing AI models at scale in production environments, says Algorithmia CEO Diego Oppenheimer.
At the core of that issue is the rate at which AI models are being deployed and updated. It can take the average data science team six months to train and deploy an AI model. Once deployed, that AI model needs to be updated as new data sources become available. The model will also often need to be replaced as business conditions change, because many of the assumptions made when training it are no longer relevant.
The lifecycle management of AI models is giving rise to a new IT discipline dubbed MLOps. In turn, MLOps processes need to be integrated both with the DataOps processes now required to manage terabytes, sometimes even petabytes, of data, and with the DevOps processes employed to accelerate the building and deployment of the applications in which an AI model will be embedded.
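The lifecycle loop described above, a deployed model being monitored and then retrained or replaced when its training-time assumptions no longer hold, can be sketched in a few lines. This is an illustrative example only; the function names, the drift metric (a simple shift in the feature mean), and the threshold are hypothetical and not drawn from any particular MLOps platform.

```python
# Minimal sketch of an automated MLOps lifecycle check (illustrative only).
# A deployed model is monitored, and retraining is triggered when live data
# drifts away from the data the model was trained on.
import statistics

DRIFT_THRESHOLD = 0.5  # hypothetical tolerance on mean shift


def has_drifted(training_sample, live_sample, threshold=DRIFT_THRESHOLD):
    """Flag drift when the live feature mean moves beyond the threshold."""
    shift = abs(statistics.mean(training_sample) - statistics.mean(live_sample))
    return shift > threshold


def lifecycle_step(training_sample, live_sample):
    """One automated pass: monitor the model, then decide its fate."""
    if has_drifted(training_sample, live_sample):
        return "retrain"  # hand off to the DataOps/DevOps retraining pipeline
    return "serve"        # training-time assumptions still hold


# Live data has shifted well beyond the training distribution.
print(lifecycle_step([1.0, 1.1, 0.9], [2.0, 2.2, 1.9]))  # retrain
```

In practice the drift check would compare full distributions rather than means, but the control flow is the part that matters here: the monitor-decide-retrain loop is what MLOps platforms aim to automate.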
After those tasks are completed, organizations will also need to focus on governance. More than half of survey respondents (56%) rank governance, security, and auditability issues as a concern, with 67% reporting that their AI initiatives will need to comply with multiple regulations.
Handing AI Management Over to IT
Today, MLOps is typically managed by a team of data scientists, but Oppenheimer says it's only a matter of time before IT teams take over responsibility for AI lifecycle management.
“IT is going to take over MLOps,” says Oppenheimer. “We’re approaching the next level of maturity.”
Fortunately, the next year will see many of these tasks becoming increasingly automated as IT teams rely more on platforms to manage AI models. Platforms for building AI models will employ ML algorithms to automate as much of the building and deployment of AI models as possible.
It's apparent going into 2021 that most enterprise applications will soon be infused with AI. By the end of the year, the fact that an application has been infused with ML algorithms may no longer even be remarkable, but simply expected.
Also read: AI Priorities Shift in Wake of COVID-19 Pandemic