According to The Guardian, new AI models such as ChatGPT are raising important questions about how much energy and water they consume, questions that the chatbots themselves, apparently, cannot answer (the information is tightly guarded). But science does offer some educated guesses.
A non-peer-reviewed research paper calculated the energy consumed in training Hugging Face’s language model, Bloom, on a supercomputer over a 118-day period (roughly 1.08 million GPU hours), as well as across the rest of its life cycle. Included in the calculation were “the energy used to manufacture the supercomputer’s hardware and maintain its infrastructure; and the electricity used to run the program once it launched.” The answer: around 50 metric tons of carbon-dioxide-equivalent (CO2e) emissions, or “the equivalent of an individual taking about sixty flights between London and New York.”
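That flight comparison is easy to sanity-check. Here is a minimal back-of-the-envelope sketch in Python, assuming a per-passenger figure of roughly 0.85 metric tons of CO2e for a one-way London-New York flight (an assumption drawn from common carbon calculators, not from the paper):

    # Rough sanity check of the "sixty flights" comparison.
    # ASSUMPTION: ~0.85 t CO2e per passenger for one London-New York flight,
    # in line with typical carbon-calculator estimates.
    BLOOM_LIFECYCLE_TONS = 50.0   # the paper's life-cycle estimate for Bloom
    TONS_PER_FLIGHT = 0.85        # assumed per-passenger, one-way

    flights = BLOOM_LIFECYCLE_TONS / TONS_PER_FLIGHT
    print(f"~{flights:.0f} flights")  # ~59, close to the quoted "about sixty"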
The researchers estimate that Bloom’s final training run emitted approximately 24.7 metric tons of CO2e if “only the dynamic power consumption” (the electricity used to run the program itself) is included.
According to the researchers, the limited data available suggests that training GPT-3, the model behind ChatGPT, produced perhaps 500 metric tons of CO2e, the equivalent of more than a million miles driven by “average gasoline-powered cars.”
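That mileage equivalence can be checked against the EPA’s widely cited average of roughly 400 grams of CO2 per mile for a gasoline-powered passenger car; the sketch below applies that factor as an assumption:

    # Back-of-the-envelope check of the "million miles" comparison.
    # ASSUMPTION: ~400 g CO2 per mile, the EPA's published average for
    # gasoline-powered passenger cars.
    GPT3_TRAINING_TONS = 500.0  # estimated CO2e from training GPT-3
    GRAMS_PER_MILE = 400.0      # assumed average emission factor

    miles = GPT3_TRAINING_TONS * 1_000_000 / GRAMS_PER_MILE
    print(f"~{miles:,.0f} miles")  # ~1,250,000 miles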
Another non-peer-reviewed study estimates that training GPT-3 in Microsoft’s US data centers may have consumed 700,000 liters (about 185,000 US gallons) of freshwater.
That is enough, say the researchers, “for producing 370 BMW cars or 320 Tesla electric vehicles.”
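The unit conversion and the implied per-car water footprint are straightforward to reproduce; this sketch simply divides the study’s total by its own vehicle counts:

    # Convert the study's freshwater estimate and break it down per vehicle.
    LITERS = 700_000.0
    LITERS_PER_US_GALLON = 3.78541

    gallons = LITERS / LITERS_PER_US_GALLON
    print(f"~{gallons:,.0f} US gallons")    # ~184,920 gallons

    # Implied water footprint per car, using the study's own comparison.
    print(f"~{LITERS / 370:,.0f} L per BMW")    # ~1,892 liters
    print(f"~{LITERS / 320:,.0f} L per Tesla")  # ~2,188 liters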
For now, educated guesses will have to suffice. When asked about its energy consumption, Google’s Bard answered, “My carbon footprint is zero.”