Artificial intelligence (AI) can distinguish a dog from a cat, but the billions of calculations needed to do so consume a substantial amount of energy. The human brain performs the same task using only a tiny fraction of that energy. Could this gap inspire us to develop more energy-efficient AI systems?
Our computational power has risen exponentially, enabling the widespread use of artificial intelligence, but this progress comes with a growing energy footprint.
To better understand the total footprint, we should take two factors into account: training and inference. First, an AI model must be trained on a labeled dataset. The trend toward ever-larger training datasets is driving explosive growth in energy consumption. Researchers from the University of Massachusetts calculated that training a single natural language processing model emits 284 metric tons of carbon dioxide, roughly equivalent to the lifetime emissions of five cars, manufacturing included. Some AI models developed by tech giants, whose figures are not reported in the scientific literature, may emit on an even larger scale.
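To make such estimates concrete, a training run's footprint can be approximated from hardware power draw, training time, data-center overhead, and the carbon intensity of the local grid, the same basic ingredients used in studies of this kind. The sketch below is a back-of-the-envelope illustration, not the University of Massachusetts methodology; every numeric value is an assumed placeholder.

```python
# Back-of-the-envelope training footprint:
# energy used (kWh) x data-center overhead (PUE) x grid carbon intensity.
# All numbers below are illustrative assumptions, not measured figures.

def training_co2_kg(gpu_count: int,
                    gpu_power_watts: float,
                    hours: float,
                    pue: float = 1.5,              # assumed data-center overhead
                    grid_kg_per_kwh: float = 0.45  # assumed grid carbon intensity
                    ) -> float:
    """Estimate kilograms of CO2 emitted by one training run."""
    energy_kwh = gpu_count * gpu_power_watts * hours / 1000.0
    return energy_kwh * pue * grid_kg_per_kwh

# Hypothetical example: 512 accelerators at 300 W each for three weeks.
print(f"{training_co2_kg(512, 300.0, 21 * 24):,.0f} kg CO2")  # ~52,255 kg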
The training phase is only the beginning of an AI model's life cycle. Once trained, the model is ready for the real world: finding meaningful patterns in new data. This process, called inference, ultimately consumes even more energy. Unlike training, inference is not a one-off event; it happens continuously. Every time a voice assistant is asked a question and generates an answer, for example, additional carbon dioxide is released. After roughly a million inference events, the cumulative impact surpasses that of the training phase. At scale, this is unsustainable.
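The break-even point follows from simple division: inference overtakes training once cumulative per-query emissions exceed the one-off training cost. A minimal sketch, where the per-query figure is a hypothetical value back-solved from the article's numbers rather than a measurement:

```python
# Break-even between one-off training emissions and cumulative
# inference emissions. The per-query cost is an assumed placeholder,
# chosen so the result matches the article's "about a million" estimate.

TRAINING_CO2_KG = 284_000.0    # 284 metric tons, from the study cited above
CO2_PER_QUERY_KG = 0.284       # hypothetical emissions per inference

break_even_queries = TRAINING_CO2_KG / CO2_PER_QUERY_KG
print(f"Inference overtakes training after {break_even_queries:,.0f} queries")
# -> 1,000,000 queries
```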
Source: Forbes