Table of Contents
The evolution of MLOps
The Fundamentals of MLOps: A Moving Target
Understanding the MLOps landscape
Shifting Organizational Culture to MLOps

Enterprises are leveraging machine learning operations to gain business benefits

Apr 10, 2023

When companies first begin deploying AI and launching machine learning projects, the focus is often theoretical: Is there a model that can deliver the necessary results? How should that model be built? How should it be trained?


But the tools data scientists use to develop these proofs of concept often do not translate well to production systems. As a result, it takes more than nine months on average to deploy an AI or machine learning solution, according to IDC.

IDC analyst Sriram Subramanian said: "We call it 'model speed,' the time it takes for a model to go from start to finish."

This is where MLOps comes into play. MLOps (Machine Learning Operations) is a set of best practices, frameworks, and tools that help enterprises manage data, models, deployment, monitoring, and the other work involved in taking a theoretical proof of concept and putting an AI system into production.

"MLOps can reduce model speed to weeks, sometimes days," Subramanian said. "In the same way that using DevOps reduces the average time it takes to develop an application, you need to use MLOps."

By using MLOps, enterprises can build more models, innovate faster, and address more use cases. "The value proposition is very clear," he said.

IDC predicts that by 2024, 60% of enterprises will use MLOps to implement their machine learning workflows. Subramanian said that when companies are surveyed about the challenges they face in adopting AI and machine learning, a lack of MLOps emerges as a major barrier, second only to cost.

Here, we'll look at what MLOps is, how it's evolving, and what organizations need to understand and keep in mind to make the most of this emerging approach to implementing AI technologies.

The evolution of MLOps

A few years ago, when Eugenio Zuccarelli first started working on machine learning projects, MLOps was just a set of best practices. Since then, Zuccarelli has worked on AI projects at several companies, including some in healthcare and financial services, and over time has seen MLOps evolve to include a variety of tools and platforms.

Today, MLOps provides a fairly powerful framework for implementing AI technology, said Zuccarelli, now an innovation data scientist at CVS Health. As an example, Zuccarelli pointed to a previous project he worked on to develop an app that could predict adverse outcomes, such as hospital readmission or disease progression.

"We were looking at data sets and models and talking to doctors to figure out the characteristics of the best models," he said. "But to make these models truly useful, we need to put them in front of actual users."

That meant developing a mobile app that was reliable, fast, and stable, with a machine learning system connected to the back end via an API. "Without MLOps, we wouldn't be able to guarantee that," he said.

His team created a health dashboard for the model using the H2O MLOps platform and other tools. "You don't want the model to change significantly," he said. "And you don't want to introduce bias. The health dashboard lets us see whether changes have occurred in the system."
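
As an illustration of the kind of check such a health dashboard might run, here is a minimal sketch that computes a population stability index (PSI) between a training baseline and recent production values for one feature. It is a generic example, not the mechanism of Zuccarelli's system or of the H2O MLOps platform; the feature, sample sizes, and alert threshold are hypothetical.

```python
import numpy as np

def population_stability_index(baseline, recent, bins=10):
    """Compare two 1-D feature distributions; a higher PSI means more drift."""
    # Bin edges come from the training baseline so both samples are bucketed identically.
    edges = np.histogram_bin_edges(baseline, bins=bins)
    base_pct = np.histogram(baseline, bins=edges)[0] / len(baseline)
    recent_pct = np.histogram(recent, bins=edges)[0] / len(recent)
    # Guard against log(0) for empty buckets.
    base_pct = np.clip(base_pct, 1e-6, None)
    recent_pct = np.clip(recent_pct, 1e-6, None)
    return float(np.sum((recent_pct - base_pct) * np.log(recent_pct / base_pct)))

# Hypothetical feature values: training baseline vs. the last week of production traffic.
baseline_ages = np.random.normal(55, 10, 5000)
recent_ages = np.random.normal(61, 12, 1200)

psi = population_stability_index(baseline_ages, recent_ages)
if psi > 0.2:  # 0.2 is a common rule-of-thumb threshold for a significant shift
    print(f"Feature drift detected (PSI={psi:.2f}); review the model before it degrades further.")
```

A dashboard would track a value like this per feature over time, flagging when production data no longer resembles what the model was trained on.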

Using the MLOps platform also allowed the team to push updates to production systems. "It's very difficult to replace a file without stopping the application from running," Zuccarelli said. "Even when the system is in production, MLOps tools can swap in a replacement with minimal interference to the system itself."
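
One common way to achieve that kind of low-interference replacement is to have the serving process hold a reference to the currently approved model and swap it atomically when a new version passes validation. The sketch below illustrates that generic pattern; it is not the specific mechanism of any particular MLOps platform, and the loader function and version names are hypothetical.

```python
import threading

class ModelHolder:
    """Serves predictions from whichever model version is currently active."""

    def __init__(self, model):
        self._model = model
        self._lock = threading.Lock()

    def predict(self, features):
        # Callers always see a complete model, never a half-replaced artifact.
        with self._lock:
            return self._model.predict(features)

    def swap(self, new_model):
        # Called by the deployment pipeline once the new version is approved.
        with self._lock:
            self._model = new_model

# Hypothetical usage: load_model() would fetch an approved artifact from a registry.
# holder = ModelHolder(load_model("readmission-risk:v3"))
# ... serve traffic ...
# holder.swap(load_model("readmission-risk:v4"))  # no application restart required
```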

He said that as MLOps platforms mature, they will speed up the entire model development process, because companies will not have to repeat the same work for every project. Data pipeline management capabilities are also critical to implementing AI.

"MLOps comes into play when we have multiple data sources that need to communicate with each other," he said. "You want all the data that flows into your machine learning models to be consistent and of high quality. As they say, garbage in, garbage out. If the information fed to the model is of poor quality, its predictions will be poor as well."
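
A minimal way to guard against "garbage in" is to validate each incoming batch against expectations captured at training time before it ever reaches the model. The sketch below uses pandas with hypothetical column names and ranges; production pipelines typically rely on dedicated validation tooling, but the idea is the same.

```python
import pandas as pd

# Hypothetical expectations recorded when the model was trained.
EXPECTED_DTYPES = {"patient_age": "int64", "num_prior_admissions": "int64", "avg_glucose": "float64"}
VALUE_RANGES = {"patient_age": (0, 120), "num_prior_admissions": (0, 200), "avg_glucose": (20.0, 600.0)}

def validate_batch(df: pd.DataFrame) -> list:
    """Return a list of problems; an empty list means the batch is safe to score."""
    problems = []
    for col, dtype in EXPECTED_DTYPES.items():
        if col not in df.columns:
            problems.append(f"missing column: {col}")
            continue
        if str(df[col].dtype) != dtype:
            problems.append(f"{col}: expected {dtype}, got {df[col].dtype}")
        if df[col].isna().any():
            problems.append(f"{col}: contains nulls")
        lo, hi = VALUE_RANGES[col]
        if not df[col].dropna().between(lo, hi).all():
            problems.append(f"{col}: values outside [{lo}, {hi}]")
    return problems

batch = pd.DataFrame({"patient_age": [67, 54], "num_prior_admissions": [2, 0], "avg_glucose": [180.5, 95.0]})
issues = validate_batch(batch)
if issues:
    raise ValueError(f"Rejecting batch: {issues}")
```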

The Fundamentals of MLOps: A Moving Target

But don't assume that, just because platforms and tools are now available, you can ignore the core principles of MLOps. Businesses just starting out in this space should remember that, at its core, MLOps is about creating a strong connection between data science and data engineering.

“To ensure that MLOps projects are successful, you need to have both data engineers and data scientists on the same team,” Zuccarelli said.

In addition, the platform needs to be equipped with tools to prevent bias, ensure transparency, provide explainability, and support ethics, and those tools are still under development, he said. "It definitely still requires a lot of work because it's such a new area."

So, with no complete turnkey solution available, businesses have to become proficient in all the aspects that make MLOps so effective for implementing AI technology. That means building up expertise on the job, said Meagan Gentry, national practice manager for the AI team at Insight, a technology consulting firm based in Tempe.

MLOps covers everything from data collection, validation, and analysis to managing machine resources and tracking model performance. The tools that support this work can be deployed on-premises, in the cloud, or at the edge, and can be open source or proprietary.

But technical knowledge is only part of the solution. MLOps also draws on agile methods from development operations (DevOps) and iterative development principles, Gentry said. Additionally, as in areas related to agile development, communication is crucial.

"Communication between every role is critical," she said. "Communication between data scientists and data engineers, communication with DevOps, and communication with the larger IT team."

For companies just starting out, MLOps can be confusing. There are some general principles, dozens of vendors, and even more open source toolsets.

"There are some pitfalls," said Helen Ristov, senior manager of enterprise architecture at Capgemini Americas. "A lot of these pitfalls are in the development process. There's not a formal set of guidelines like you see with DevOps. It's an emerging technology and some of the guidelines and strategies will take some time to develop. ”

Ristov recommends that companies start their MLOps journey with their data platform. "Maybe they have multiple data sets, but they're in different places and not in a very cohesive environment," she said.

She said enterprises do not need to move all of their data onto one platform, but they do need a way to bring in data from different sources, and that approach may vary by application. For example, a data lake is a good fit for companies that require low-cost storage and frequently run large amounts of analysis.

She said that MLOps platforms usually provide tools to build and manage data pipelines and to keep track of different versions of training data, but this is not a one-and-done solution.

These platforms also provide tools for model creation, version management, logging, measuring feature sets, and other aspects of managing the model itself.
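
One simple form of that version management is to write, alongside each trained model, a lineage record that ties the model artifact to the exact training data and code revision that produced it. The sketch below uses only the Python standard library; the file paths, field names, and metrics are hypothetical, not the schema of any particular MLOps platform.

```python
import hashlib
import json
import subprocess
from datetime import datetime, timezone

def file_sha256(path: str) -> str:
    """Fingerprint a training-data file so later runs can detect that it changed."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_lineage(model_path: str, data_path: str, metrics: dict, out_path: str = "lineage.json"):
    """Record which data, code, and metrics belong to a given model artifact."""
    record = {
        "model_artifact": model_path,
        "data_sha256": file_sha256(data_path),
        "git_commit": subprocess.run(["git", "rev-parse", "HEAD"],
                                     capture_output=True, text=True).stdout.strip(),
        "metrics": metrics,
        "trained_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(out_path, "w") as f:
        json.dump(record, f, indent=2)

# Hypothetical usage after a training run:
# write_lineage("models/readmission_v4.pkl", "data/train_2023_03.parquet", {"auc": 0.81})
```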

"It involves a lot of coding," Ristov said, adding that setting up an MLOps platform can take months, and when it comes to integration While working, platform providers still have a lot of work to do.

"There's a lot of growth going in different directions," she said. "There are a lot of tools being developed, the ecosystem is very large, and people are just picking and choosing what they need. MLOps is in its immature stage. Most organizations are still figuring out the best configuration."

Understanding the MLOps landscape

According to IDC's Subramanian, the MLOps market is expected to grow from approximately $185 million in 2020 to about $700 million by 2025. But he said that could be a serious underestimate, because MLOps products are often bundled into larger platforms. The true size of the market could exceed $2 billion by 2025, he said.

Subramanian said MLOps vendors generally fall into three categories, starting with the large cloud providers, including Amazon Web Services (AWS), Azure, and Google Cloud, which offer MLOps capabilities as a service.

Then there are the machine learning platform vendors, such as DataRobot, Dataiku, and Iguazio.

"The third category is what they used to call data management vendors," he said. "Companies like Cloudera, SAS, and DataBricks. Their strength is in data management capabilities and data operations, and then they expand to have machine learning capabilities, and ultimately to MLOps capabilities."

Subramanian said all three areas are seeing explosive growth. What sets MLOps vendors apart, he added, is whether they can support deploying models both on-premises and in the cloud, whether they can implement trustworthy and responsible AI, whether they offer plug-and-play solutions, and whether their solutions scale easily. "That's where the differences between vendors come in," he said.

According to a recent IDC survey, a lack of methods for implementing responsible AI is one of the top three barriers to using AI and machine learning technologies, tied for second place with the lack of MLOps itself.

Sumit Agarwal, an AI and machine learning research analyst at Gartner, said this is largely because there is no alternative to using MLOps.

"Every other method is manual," he said. "So, there's really no other choice. If you want to scale, you need automation. You need traceability of your code, data and models."

According to a recent Gartner survey, the average time it takes for a model to go from proof of concept to production has dropped from nine months to 7.3 months. "But 7.3 months is still a long cycle," Agarwal said. "There are many opportunities for organizations to leverage MLOps."

Shifting Organizational Culture to MLOps

Adopting MLOps also requires a change in organizational culture on a company's AI team, said Amaresh Tripathy, global analytics business leader at Genpact.

“The common image people have of a data scientist is that of a mad scientist trying to find a needle in a haystack,” he said. "A data scientist is a discoverer and explorer, not a factory floor churning out widgets. But that's what you need to do when you really want to scale."

He said companies tend to underestimate the effort they need to expend.

"People have a better understanding of software engineering," he said. “There are a lot of rules about user experience and requirements. But somehow people don’t think they have to go through the same process when they deploy a model. There’s also a misconception that all data scientists who are good at working in a test environment People will naturally deploy and be able to deploy a certain model, or they can send a few IT colleagues to do it. There is a lack of understanding of what they need to do."

Businesses have yet to realize that MLOps can have a knock-on effect on other parts of the company, often resulting in dramatic changes.

"You can deploy MLOps in a customer service center, and the average response time may actually go up, because the simple tasks are handled by machines and AI, while the work handed over to humans takes longer because it is more complex," he said. "So you need to rethink what the work will be, what kind of people you need, and what skills they should have."

He said that today, less than 5% of decisions in an organization are driven by algorithms, but this is changing rapidly. "We predict that over the next five years, 20 to 25 percent of decisions will be driven by algorithms. Every statistic we examine shows that we are at an inflection point for the rapid expansion of AI."

And MLOps is a key element of that shift, he said.

"One hundred percent," he said. “You can’t sustainably use AI without MLOps. MLOps are a catalyst for expanding the use of AI in the enterprise.”
