Category: ELECTRICITY ENERGY NEWS, ENERGY AGENDA NEWS, ENERGY EFFICIENCY NEWS - Date: 08 February 2021
Artificial intelligence (AI) can distinguish a dog from a cat, but the billions of calculations needed to do so demand quite a lot of energy. The human brain can do the same thing while using only a small fraction of this energy. Could this phenomenon inspire us to develop more energy-efficient AI systems?
Our computational power has risen exponentially, enabling the widespread use of artificial intelligence, a technology that relies on processing huge amounts of data to recognize patterns. When we use the recommendation algorithm of our favorite streaming service, we rarely realize the enormous energy consumption behind it. The billions of operations needed to process the data are typically carried out in data centers, and all these computations consume a tremendous amount of electric power. Although data centers invest heavily in renewable energy, a significant share of their power still comes from fossil fuels. The popularity of AI applications clearly has a downside: its ecological cost.
To get a better understanding of the total footprint, we should take two factors into account: training and inference. First, an AI model needs to be trained on a labeled dataset. The ever-growing trend toward bigger datasets for this training phase drives explosive growth in energy consumption. Researchers from the University of Massachusetts calculated that training a single model for natural language processing emits 284 metric tons of carbon dioxide, equivalent to the lifetime emissions of five cars, including their manufacture. Some AI models developed by tech giants, which are not reported in the scientific literature, may emit far more.
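As a rough sanity check on that comparison, here is a minimal back-of-envelope calculation. The figure of about 57 metric tons of lifetime CO2 per average car (including manufacturing) is an assumption used for illustration, not a number from the article:

```python
# Back-of-envelope check of the "five cars" comparison.
# CAR_LIFETIME_EMISSIONS_T is an assumed figure, not from the article.
TRAINING_EMISSIONS_T = 284       # metric tons CO2 for one NLP training run (UMass figure)
CAR_LIFETIME_EMISSIONS_T = 57    # assumed metric tons CO2 per car, incl. manufacture

car_equivalents = TRAINING_EMISSIONS_T / CAR_LIFETIME_EMISSIONS_T
print(f"One training run ~ {car_equivalents:.1f} car lifetimes")  # -> ~5.0
```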
The training phase is just the beginning of the AI model's life cycle. Once the model is trained, it is ready for the real world: finding meaningful patterns in new data. This process, called inference, consumes even more energy over time. Unlike training, inference is not a one-off event; it happens continuously, and each query adds a small emission of its own. For example, every time a voice assistant is asked a question and generates an answer, extra carbon dioxide is released. After roughly a million inference events, these small per-query emissions add up to more than the training phase itself. At this scale, the process is unsustainable.
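A minimal sketch of that break-even logic follows. The per-query figure is hypothetical, chosen only so that the crossover lands near the roughly one million queries the article mentions; real values depend heavily on the model, hardware, and data-center efficiency:

```python
# Break-even point between one training run and cumulative inference emissions.
# PER_INFERENCE_KG is hypothetical, picked so the crossover matches the
# ~1 million queries cited in the article.
TRAINING_EMISSIONS_KG = 284_000   # one training run: 284 metric tons CO2
PER_INFERENCE_KG = 0.284          # hypothetical CO2 cost of a single query

break_even_queries = TRAINING_EMISSIONS_KG / PER_INFERENCE_KG
print(f"Inference overtakes training after ~{break_even_queries:,.0f} queries")
# -> Inference overtakes training after ~1,000,000 queries
```

The point of the sketch is only that a fixed one-time training cost is eventually dwarfed by a small but ever-accumulating per-query cost.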
Source: Forbes