8 MLOps predictions for enterprise machine learning in 2023
The MLOps landscape is growing rapidly, in a global market estimated at $612 million in 2021 and expected to exceed $6 billion by 2028. It is also highly fragmented, however, with hundreds of MLOps vendors competing for a place in end users’ artificial intelligence (AI) ecosystems.
MLOps emerged as a set of best practices less than a decade ago, aimed at addressing one of the key barriers preventing enterprises from putting AI to work: the transition from development and training to production environments. This matters because nearly one in two AI pilots never makes it into production.
So what trends will emerge in the MLOps space in 2023? Many AI and ML experts shared their predictions with VentureBeat:
1. MLOps will go beyond the hype
“MLOps will no longer be just a hyped topic, but a source of empowerment for data scientists to put machine learning models into production. Its main purpose is to streamline the process of developing machine learning solutions.
“As organizations adopt best practices for production AI, applying MLOps to bridge the gap between machine learning and data engineering will seamlessly unify these functions. It will be crucial in meeting the evolving challenges of scaling AI systems. The companies that embrace it next year and accelerate this transition will be the ones reaping the benefits.”
— Steve Harris, CEO of Mindtech
2. Data scientists will favor pre-built industry-specific and domain-specific ML models
“In 2023, we will see a growing number of pre-built machine learning [ML] models available to data scientists. These encapsulate domain expertise in an initial ML model, which accelerates time to value and time to market for data scientists and their organizations. Pre-built ML models, for example, eliminate or reduce the time data scientists must spend retraining and refining models. Look at the work the Hugging Face AI community has been doing to drive the market for ready-to-use ML models.
“What I expect to see over the next year and beyond is a rise in industry- and domain-specific pre-built ML models, allowing data scientists to tackle a wide variety of targeted problems using a well-defined underlying dataset, without having to spend time becoming a subject-matter expert in an area that is not core to their organization.”
— Torsten Grabs, Director of Product Management, Snowflake
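For a sense of how a pre-built model shortcuts this work, here is a minimal sketch using the Hugging Face transformers pipeline API. The model name is only an illustrative, publicly available sentiment model, not one Grabs refers to.

```python
# Minimal sketch: reusing a pre-built (pre-trained) model instead of training one from scratch.
# Assumes the `transformers` library is installed; the model name is purely illustrative.
from transformers import pipeline

# Downloads a ready-to-use sentiment-analysis model from the Hugging Face Hub.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Immediately usable: no data collection, labeling or training step is needed.
print(classifier("The new MLOps pipeline cut our deployment time in half."))
# Example output: [{'label': 'POSITIVE', 'score': 0.99...}]
```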
3. AI and ML workloads running in Kubernetes will outpace non-Kubernetes deployments
“AI and ML workloads are accelerating, but the majority of projects are not yet running on Kubernetes. We expect that to change in 2023.
“There has been a lot of focus on adapting Kubernetes over the last year, with new projects that make it even more appealing to developers. These efforts also include tailoring Kubernetes services so that the compute-intensive needs of AI and ML can run on GPUs while maintaining quality of service when hosted on Kubernetes.”
— Patrick McFadin, Vice President of Developer Relations, DataStax
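To make the GPU point concrete, here is a minimal sketch that requests a single GPU for an ML container using the official Kubernetes Python client. It assumes a reachable cluster with the NVIDIA device plugin installed; the pod name and container image are placeholders.

```python
# Minimal sketch: scheduling a GPU-backed ML workload on Kubernetes.
# Assumes the `kubernetes` Python client, a reachable cluster, and the NVIDIA
# device plugin, which exposes the "nvidia.com/gpu" resource on GPU nodes.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig for cluster access

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="ml-inference-gpu"),  # placeholder name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="model-server",
                image="pytorch/pytorch:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"}  # reserve one GPU for this pod
                ),
            )
        ],
    ),
)

# Kubernetes will only schedule the pod onto a node with a free GPU.
client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```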
4. Efficiency will be a line item in 2023 ML budgets
“Efficiency-focused investments have been underway for several years, but this will be the focus in 2023, especially as macroeconomic pressures mount and the talent pool remains limited. Those who drive their organizations with machine learning (ML) and cutting-edge technologies are having the most success designing workflows that keep humans in the loop. This approach provides much-needed protection when the technology struggles or needs additional oversight, while allowing both sides to work together effectively.
“Some initial hesitation and pushback is expected when educating the broader workforce about the ML quality-assurance process, largely due to a lack of understanding of how the learning system works and how accurate its results are. One aspect that remains in question, but is the core difference between ML and the traditional, static technology we’ve come to know, is ML’s ability to learn and adapt over time. If we can better educate leaders on how to harness the full value of ML, and its path to efficiency gains, we’ll see a lot of progress over the next few years.”
— Tony Lee, CTO at Hyperscience
5. ML project prioritization will focus on revenue and business value
“Looking at ML projects in progress, teams will have to be far more productive because of recent layoffs, and will move toward automation to help push projects forward. Other teams will need to develop more structure and define deadlines to ensure projects are completed efficiently. Different business units will have to start communicating more, improving collaboration and sharing knowledge so that today’s smaller teams can function as one cohesive unit.
“Additionally, teams will have to prioritize which kinds of projects to work on to make the biggest impact in a short amount of time. I see machine learning projects falling into two categories: sellable features that management believes will increase sales and win over the competition, and optimization projects that have a direct impact on revenue. Sellable-feature projects will likely be shelved, as they are unlikely to launch quickly; instead, today’s smaller ML teams will focus more on revenue optimization, since it can drive real revenue. At this point, efficiency is essential for all business units, and ML is no exception.”
— Gideon Mendels, CEO and co-founder of MLOps platform Comet
6. Enterprise ML teams will be data-centric rather than model-centric
“Enterprise ML teams are becoming data-centric rather than model-centric. If the input data is not good, and if the labels are not good, then the model itself will not be good — leading to a higher rate of false-positive or false-negative predictions. That means more focus on ensuring clear and well-labeled data is used for training.
“For example, if Spanish words are accidentally used to train a model that expects English words, the results will be unpredictable. This makes MLOps even more important. Data quality and ML observability are emerging as key trends as teams try to manage data before training and track model performance once it is in production.
— Ashish Kakran, Principal, Thomvest Ventures
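As a small illustration of what a data-centric check looks like in practice, the sketch below validates labels and inputs before training. The column names and the allowed label set are hypothetical.

```python
# Minimal sketch: validating training data before it ever reaches a model.
# The column names ("text", "label") and the allowed label set are hypothetical.
import pandas as pd

ALLOWED_LABELS = {"positive", "negative", "neutral"}

def validate_training_data(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality problems found in the training set."""
    problems = []
    if df["text"].isna().any():
        problems.append("some rows are missing input text")
    if df["label"].isna().any():
        problems.append("some rows are missing labels")
    unexpected = set(df["label"].dropna()) - ALLOWED_LABELS
    if unexpected:
        problems.append(f"unexpected label values: {sorted(unexpected)}")
    if df.duplicated(subset=["text"]).any():
        problems.append("duplicate training examples found")
    return problems

# Toy example with one missing input and one mislabeled row.
df = pd.DataFrame({
    "text": ["great product", "terrible support", None],
    "label": ["positive", "negativ", "neutral"],
})
print(validate_training_data(df))
```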
7. Edge ML will grow as MLOps teams expand to focus on the end-to-end process
“While the cloud continues to provide unmatched flexibility and resources, more and more businesses are seeing the real value of running ML at the edge, near the data source where decisions are made. This happens for a variety of reasons, such as the need to reduce latency for autonomous equipment, to cut storage and cloud-access costs, or because of a lack of connectivity in remote locations or high-security systems that cannot connect to the open internet.
“Because edge ML implementations are more than just gluing some code onto a device, edge ML will see tremendous growth as MLOps teams expand their focus to the entire end-to-end process.”
— Vid Jain, Founder and CEO of Wallaroo AI
8. Feature engineering will be automated and simplified
“Feature engineering, the process by which input data is understood, classified, and prepared in a way that is usable for machine learning models, is a particularly intriguing area.
“While data warehouses and streaming capabilities have simplified data ingestion, and AutoML platforms have democratized model development, the feature engineering required in the middle of the process is still a challenge that is mostly handled manually. It requires domain knowledge to extract context and meaning, data science to transform the data, and data engineering to put ‘features’ into production models. We expect to see significant strides in automating and simplifying this process.”
— Rudina Seseri, Founder and Managing Partner of Glasswing Ventures
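To show what that manual middle step often looks like today, here is a minimal scikit-learn sketch that turns raw columns into model-ready features. The column names and toy data are hypothetical; this hand-written transformation is the kind of work the prediction expects to be increasingly automated.

```python
# Minimal sketch: hand-written feature engineering with scikit-learn.
# The column names and toy data are hypothetical.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

raw = pd.DataFrame({
    "plan": ["free", "pro", "pro", "enterprise"],      # categorical feature
    "monthly_usage_hours": [2.0, 40.5, 31.0, 120.0],   # numeric feature
    "churned": [1, 0, 0, 0],                           # target label
})

# Encode the categorical column and scale the numeric one into a single feature matrix.
features = ColumnTransformer([
    ("plan", OneHotEncoder(handle_unknown="ignore"), ["plan"]),
    ("usage", StandardScaler(), ["monthly_usage_hours"]),
])

model = Pipeline([
    ("features", features),
    ("clf", LogisticRegression()),
])

model.fit(raw[["plan", "monthly_usage_hours"]], raw["churned"])
print(model.predict(raw[["plan", "monthly_usage_hours"]]))
```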