
Andreea Munteanu
on 5 January 2024

AI in 2024 – What does the future hold?


2023 was an epic year for artificial intelligence. Industry outpaced academia in machine learning research (source), state-of-the-art models were trained on ever-larger volumes of data, and marshalling enough computing resources to support new use cases remained a challenge for many organisations.

With the rise of AI, concerns were not far behind. According to an article published by Stanford, BLOOM’s training run emitted 25 times more carbon than a single air traveller on a one-way trip from New York to San Francisco. 

In light of these trends and challenges, what do we foresee in the AI space this year and where is the AI community focusing its energy? Let’s first look back at 2023 and then explore expectations for AI in 2024.

Rewind of AI in 2023

In 2022, we said that it was the year of AI, but… guess what? 2023 was also the year of AI, and it’s a safe bet that 2024 will follow suit. In the last 12 months, the adoption of AI has grown tremendously. WEKA’s 2023 report found that AI pioneers and explorers alike primarily used the public cloud for both training and inference. Organisations started moving projects into production, leading to new challenges and prompting companies to look more closely at the options for scaling their infrastructure.

Following the announcement from NVIDIA and Microsoft, the arrival of DGX Cloud on the marketplace expanded the options that enterprises have to quickly get started with AI. At the same time, it highlighted the need for hardware and software that are optimised to work together – and certifications such as DGX-Ready Software Solutions emerged to address this need. Charmed Kubeflow is one of the MLOps tools that have been validated on NVIDIA hardware.

Read more about AI at scale with Canonical and NVIDIA

Machine learning security is still a concern 

According to a report published by the AI Infrastructure Alliance, more than 40% of organisations have all the resources needed to create value with AI and drive AI transformation. However, companies are also reporting challenges related to security and compliance, performance, cost, and governance.

Securing the tooling that is used for machine learning projects is crucial. The security breach of PyTorch raised even more awareness about the topic and the possible risks. Data science tools often have access to highly sensitive data, so professionals need to ensure that both their environments and their artifacts are secure. Read more about securing your MLOps platform.
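One basic safeguard against compromised artifacts like the one seen in the PyTorch incident is to verify the checksum of every downloaded package or model file against a hash published through a trusted channel. The sketch below is a minimal illustration in Python; the function names are our own, not from any specific tool.

```python
import hashlib
from pathlib import Path


def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_artifact(path: Path, expected_sha256: str) -> bool:
    """Return True only if the file's digest matches the published checksum."""
    return sha256_of(path) == expected_sha256.lower()
```

In practice you would pin the expected hashes in version control (or use a tool that does this for you, such as pip’s `--require-hashes` mode) and refuse to load any model weights or dependencies that fail the check.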

During KubeCon EU 2023, Maciej Mazur and I also addressed this topic, talking about secure MLOps on highly sensitive data. We covered options for securing the environment at different layers of the stack during our keynote.

Kubeflow in 2023

In 2023, MLOps was an important topic for AI practitioners. Canonical offers one of the official distributions of Kubeflow, so naturally, we kept a close eye on the project. Kubeflow had two new releases, 1.7 and 1.8. Daniela Plascencia, part of the Canonical engineering team, was the release lead for Kubeflow 1.8. 
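For readers curious what getting started with Charmed Kubeflow looks like in practice, the rough sketch below shows a single-node deployment with MicroK8s and Juju. The snap channels, add-on list and bundle channel are assumptions based on the general deployment flow; verify them against the current official guide before running.

```shell
# Install MicroK8s and Juju (snap channels are assumptions; check current docs)
sudo snap install microk8s --classic
sudo snap install juju --classic

# Enable the add-ons Charmed Kubeflow typically relies on
# (the MetalLB address range here is illustrative)
sudo microk8s enable dns hostpath-storage ingress metallb:10.64.140.43-10.64.140.49

# Bootstrap a Juju controller on MicroK8s and deploy the Kubeflow bundle
juju bootstrap microk8s
juju add-model kubeflow
juju deploy kubeflow --trust --channel=1.8/stable
```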

Towards the end of the year, the Kubeflow Summit took place. With great sessions and working groups, use cases from companies such as Roblox, and challenges brought to the table, the event energised the community. Next year, Kubeflow Summit will be co-located with KubeCon EU. Buy your ticket now and meet us there.

Canonical MLOps in 2023

2023 was, without a doubt, a busy year for us. Our activity this year went beyond Charmed Kubeflow, and in September 2023 we released Charmed MLflow. We also kept working on our documentation, publishing several new guides.

While many companies are rethinking their AI strategies, we are aware of how important it is to share our knowledge and help our audience make informed decisions. In 2023, we published more than 50 blogs on our website and on our Medium publication, hosted 10 webinars, launched the Ubuntu AI podcast, released 5 whitepapers and went on a world tour to give talks and workshops on a range of topics.

Canonical AI Roadshow

In September 2023 we launched the Canonical AI Roadshow, a series of events and presentations that highlighted how enterprises can make better use of their own data and make AI use cases a reality. With more than 10 stops across 4 continents over the course of 3 months, Canonical experts talked about artificial intelligence, open source MLOps and how to run AI initiatives in production. We had a joint workshop with NVIDIA at the World AI Summit and a joint event with Microsoft in São Paulo.

Read the highlights of Canonical AI Roadshow

What’s in store for AI in 2024?

MLOps is here to stay, and 2024 is likely to be another extraordinary year for those who are active in the industry. There is no doubt that the percentage of companies moving their AI projects into production will continue to grow. For all players in the sector, the pressure will be on to improve security standards, produce better documentation and continue integrating existing hardware and software to offer a seamless experience.

At the same time, 2024 will not only be about AI, but also about open source. This shift surprised few people, but it is of great importance. On the one hand, open source gives everyone access to get started quickly and at a lower cost; on the other, it enables collaboration between people, communities and organisations. Given the latest concerns about sustainability, open source also reduces the need to train models, especially large language models (LLMs), from scratch, lowering both the computing power required and the carbon footprint. Leading open source projects such as Kubeflow and MLflow have already started adding features to enable better results for genAI and LLM-related projects.

It’s a wrap for now…

With 12 months ahead of us, MLOps has plenty of time to surprise everyone in 2024. It is, at the end of the day, a collaborative function that brings together data scientists, DevOps engineers and IT. As the market evolves, new roles will be added to the list. However, everyone should stay focused on one ongoing goal: running secure AI at every stage, from experimentation to production.

New solutions will likely appear on the market, similar to the ones that we mentioned above, and it is likely that a growing number of open source projects will be integrated in cohesive solutions. Enterprises will probably have higher expectations from machine learning projects, and thus from the tooling behind them. Cost efficiency and time-effectiveness will become more and more important discussions, influencing business decisions related to MLOps. 

Canonical’s promise to deliver secure open source software will continue to include MLOps tooling. Our goal is to help organisations not only by providing the right technology, but also guidance to make the best decisions, depending on the use case, where they are in their AI journey and constraints. At the same time, open source is in our DNA, so we will continue to enable AI enthusiasts to use our tools, educate early adopters on how to get started and encourage everyone to contribute to them.

Learn about the MLOps toolkit
