
What is an AI Factory?

There are an abundance of AI-specific terms in enterprise communication, and one of the newest is AI Factory. The term does not refer to the use of AI in manufacturing facilities, nor does it refer to a factory in which AI products go rolling off a conveyor belt. What does it mean? Read on for a brief overview.

 

What is an AI Factory?

An AI factory can refer to a data center – the physical infrastructure used to produce artificial intelligence itself, including the data and the models. Per No Jitter’s conversation with Omdia’s Bradley Shimmin, the term has undergone a bit of a context shift. About 18 months ago, the term “AI factory” would have referred to the infrastructure necessary to build and train the big frontier models – e.g., OpenAI’s GPT-4o, Anthropic’s Claude, Google’s Gemini, etc. Today, however, “AI factory” better describes the product of inferencing – the predictions or conclusions produced by running data through a live, already-trained AI model.
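To make the training/inferencing distinction concrete, here is a minimal, purely illustrative Python sketch (not any vendor’s API or a real AI workload): a tiny model is “trained” once on historical data, after which inferencing is simply running new data through the already-fitted model.

```python
# Illustrative only: a tiny least-squares model showing the difference
# between training (fitting a model) and inferencing (using it).

def train(xs, ys):
    """'Training': fit a slope w for y ~ w * x by least squares."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def infer(model_w, x):
    """'Inferencing': run new data through the already-trained model."""
    return model_w * x

# Training happens once, up front, on historical data...
w = train([1, 2, 3, 4], [2, 4, 6, 8])  # learns w = 2.0

# ...while inferencing runs continuously on live data.
prediction = infer(w, 10)  # -> 20.0
```

In the “AI factory” framing, the value produced at scale is the stream of `prediction`-style outputs, not the one-time fitting step.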

 

Why am I seeing the phrase "AI Factory" so much?

The phrase has been used sporadically over the past several years, but it generated headlines in June 2024 when, during his Computex keynote, Nvidia founder and CEO Jensen Huang announced that multiple companies would use Nvidia networking and infrastructure to build AI factories and data centers that would drive generative AI breakthroughs. During that speech, Huang said:

The next industrial revolution has begun. Companies and countries are partnering with NVIDIA to shift the trillion-dollar traditional data centers to accelerated computing and build a new type of data center — AI factories — to produce a new commodity: artificial intelligence. From server, networking and infrastructure manufacturers to software developers, the whole industry is gearing up for Blackwell to accelerate AI-powered innovation for every field.

(Emphasis ours.)

The phrase was also used by Huang during his talk at Dell Technologies World in May 2024, when he compared the “AI factory” concept to factories that, during the industrial revolution, used water to produce electricity. He said that today’s data centers transform data and electricity to produce intelligence that is “formulated [as] tokens that can then be expressed in any information modality that we’d like it to be.”

Huang had previously used the term in March 2022, when he said that AI data centers process massive amounts of data to train and refine AI models: “Raw data comes in, is refined, and intelligence goes out — companies are manufacturing intelligence and operating giant AI factories.” At the time, Huang was pitching Nvidia’s H100, which was built to speed up the training of large language models (LLMs). In this context, the AI factory concept centered on data centers designed to produce and train AI models.

The term “AI Factory” itself seems to have originated with Harvard professors Marco Iansiti and Karim Lakhani, who in 2020 published the article “Competing in the Age of AI,” which later led to a book by the same name.

 

How might AI Factories soon affect the capabilities of AI products?

According to Omdia’s Shimmin, when an enterprise (via its employees) uses generative AI to do anything – create a meeting summary, a presentation, a brand pitch – that AI-generated information becomes a living part of that company’s enterprise data landscape. So in that sense, the AI factory is producing real data based on events in the real world.

An AI factory can also manufacture synthetic data to help a model produce better outputs. Synthetic data mimics the statistical properties – the structure, features and characteristics – of the original, real-world dataset. If the enterprise generates that synthetic data from its own real data, it offers additional benefits: reduced costs associated with data management and analysis, greater control over the quality and format of the dataset, reduced bias, improved data security and more.
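As a rough illustration of the idea (not a production synthetic-data pipeline, and using made-up example numbers), the sketch below “learns” each column’s mean and standard deviation from a small real dataset, then samples new rows with the same per-column statistics – one simple way to manufacture data that mimics the original without exposing it:

```python
import random
import statistics

# Made-up "real" rows for illustration, e.g. (monthly_spend, support_tickets).
real_rows = [
    (120.0, 3.0), (95.5, 1.0), (143.2, 4.0), (88.0, 2.0), (110.7, 2.0),
]

def fit_column_stats(rows):
    """'Learn' each column's mean and standard deviation from real data."""
    cols = list(zip(*rows))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def sample_synthetic(stats, n, seed=0):
    """Draw n synthetic rows from independent per-column normal distributions."""
    rng = random.Random(seed)
    return [tuple(rng.gauss(mu, sd) for mu, sd in stats) for _ in range(n)]

stats = fit_column_stats(real_rows)
synthetic_rows = sample_synthetic(stats, n=1000)
```

Note that this toy version treats columns independently; real synthetic-data tools model the joint distribution so that correlations between features are preserved as well.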

That synthetic data could help enterprises use smaller models (e.g., Microsoft’s Phi, Google’s Gemma, Meta’s Llama and others) which, if trained on good data (real or synthetic), can be extremely competitive with the larger foundation models in terms of speed and efficiency at inference time. And the smaller models, particularly the open-source ones, can be run privately, avoiding potential security issues.

 

The TL;DR?

While the term AI factory can refer to the physical infrastructure used to build and train big AI models, its definition is shifting: an “AI factory” now also refers to the results generated by an AI model. Those results can include synthetic data, which can in turn be used to train new models. In other words, an AI factory can produce real and synthetic data as well as future AI models.