Greenhouse gas emissions tracking project


Is generative AI bad for the environment? A computer scientist explains the carbon footprint of ChatGPT and its cousins

By Kate Saenko, Boston University

The image above is from “The Generative AI Race Has a Dirty Secret.” Credit: WIRED

AI chatbots and image generators run on thousands of computers housed in data centers like this Google facility in Oregon.
Tony Webster/Wikimedia, CC BY-SA

Generative AI is the hot new technology behind chatbots and image generators. But how hot is it making the planet?

As an AI researcher, I often worry about the energy costs of building artificial intelligence models. The more powerful the AI, the more energy it takes. What does the emergence of increasingly powerful generative AI models mean for society’s future carbon footprint?

“Generative” refers to the ability of an AI algorithm to produce complex data. The alternative is “discriminative” AI, which chooses between a fixed number of options and produces just a single number. An example of a discriminative output is choosing whether to approve a loan application.

Generative AI can create much more complex outputs, such as a sentence, a paragraph, an image or even a short video. It has long been used in applications like smart speakers to generate audio responses, or in autocomplete to suggest a search query. However, it only recently gained the ability to generate humanlike language and realistic photos.

Using more power than ever

The exact energy cost of a single AI model is difficult to estimate, and includes the energy used to manufacture the computing equipment, create the model and use the model in production. In 2019, researchers found that creating a generative AI model called BERT with 110 million parameters consumed the energy of a round-trip transcontinental flight for one person. The number of parameters refers to the size of the model, with larger models generally being more skilled. Researchers estimated that creating the much larger GPT-3, which has 175 billion parameters, consumed 1,287 megawatt hours of electricity and generated 552 tons of carbon dioxide equivalent, the equivalent of 123 gasoline-powered passenger vehicles driven for one year. And that’s just for getting the model ready to launch, before any consumers start using it.

Size is not the only predictor of carbon emissions. The open-access BLOOM model, developed by the BigScience project in France, is similar in size to GPT-3 but has a much lower carbon footprint, consuming 433 MWh of electricity and generating 30 tons of CO2eq. A study by Google found that, for the same size, using a more efficient model architecture and processor and a greener data center can reduce the carbon footprint by 100 to 1,000 times.
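The figures quoted above imply very different carbon intensities for the electricity behind each training run. A quick back-of-envelope check, using only the numbers in this article (illustrative arithmetic, not data from the underlying studies):

```python
# Training carbon intensity implied by the article's figures:
# energy used (MWh) and emissions produced (tons CO2-equivalent).
models = {
    "GPT-3": {"energy_mwh": 1287, "co2eq_tons": 552},
    "BLOOM": {"energy_mwh": 433, "co2eq_tons": 30},
}

for name, m in models.items():
    intensity = m["co2eq_tons"] / m["energy_mwh"]  # tons CO2eq per MWh
    print(f"{name}: {intensity:.2f} t CO2eq per MWh")

# How many times more carbon-intensive was GPT-3's training per unit of energy?
ratio = (552 / 1287) / (30 / 433)
print(f"GPT-3's training was roughly {ratio:.0f}x more carbon-intensive per MWh")
```

The roughly sixfold gap per megawatt-hour is consistent with the article’s point: where and how a model is trained matters as much as its size.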

Larger models do use more energy during their deployment. There is limited data on the carbon footprint of a single generative AI query, but some industry figures estimate it to be four to five times higher than that of a search engine query. As chatbots and image generators become more popular, and as Google and Microsoft incorporate AI language models into their search engines, the number of queries they receive each day could grow exponentially.

AI chatbots, search engines and image generators are rapidly going mainstream, adding to AI’s carbon footprint.
AP Photo/Steve Helber

AI bots for search

A few years ago, not many people outside of research labs were using models like BERT or GPT. That changed on Nov. 30, 2022, when OpenAI released ChatGPT. According to the latest available data, ChatGPT had over 1.5 billion visits in March 2023. Microsoft incorporated ChatGPT into its search engine, Bing, and made it available to everyone on May 4, 2023. If chatbots become as popular as search engines, the energy costs of deploying the AIs could really add up. But AI assistants have many more uses than just search, such as writing documents, solving math problems and creating marketing campaigns.

Another problem is that AI models need to be continually updated. For example, ChatGPT was only trained on data from up to 2021, so it does not know about anything that happened since then. The carbon footprint of creating ChatGPT isn’t public information, but it is likely much higher than that of GPT-3. If it had to be recreated on a regular basis to update its knowledge, the energy costs would grow even larger.

One upside is that asking a chatbot can be a more direct way to get information than using a search engine. Instead of getting a page full of links, you get a direct answer as you would from a human, assuming issues of accuracy are mitigated. Getting to the information quicker could potentially offset the increased energy use compared to a search engine.

Ways forward

The future is hard to predict, but large generative AI models are here to stay, and people will probably increasingly turn to them for information. For example, if a student needs help solving a math problem now, they ask a tutor or a friend, or consult a textbook. In the future, they will probably ask a chatbot. The same goes for other expert knowledge such as legal advice or medical expertise.

While a single large AI model is not going to ruin the environment, if a thousand companies develop slightly different AI bots for different purposes, each used by millions of customers, the energy use could become an issue. More research is needed to make generative AI more efficient. The good news is that AI can run on renewable energy. By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40, compared to using a grid dominated by fossil fuels.

Finally, societal pressure may be helpful to encourage companies and research labs to publish the carbon footprints of their AI models, as some already do. In the future, perhaps consumers could even use this information to choose a “greener” chatbot.

Kate Saenko, Associate Professor of Computer Science, Boston University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 


Limiting global warming to 1.5°C would save billions from a dangerously hot climate. It could also improve quality of life, with cleaner air and better water and soil quality.


The image above shows some of the many effects of dangerously high temperatures, including reduced crop yields. Credit: Pablo Tosco / Oxfam International

Limiting global warming to 1.5°C would save billions from a dangerously hot climate

 

Current climate policies will leave more than a fifth of humanity exposed to dangerously hot temperatures by 2100, new research suggests.

Despite the Paris Agreement pledge to keep global warming well below 2°C (compared to pre-industrial levels), current policies are projected to result in 2.7°C warming by the end of the century.

The new study, led by researchers at the Global Systems Institute, University of Exeter, associated with the Earth Commission, and Nanjing University, assessed what this would mean for the number of people living outside the “climate niche” in which our species has thrived.

It says about 60 million people are already exposed to dangerous heat (average temperature of 29°C or higher).

And two billion – 22% of the projected end-of-century population – would be exposed to this at 2.7°C of global warming.

The paper highlights the “huge potential” for decisive climate policy to limit the human costs and inequities of climate change.

Limiting warming to 1.5°C would leave 5% exposed – saving a sixth of humanity from dangerous heat compared to 2.7°C of warming.

The study also finds that the lifetime emissions of 3.5 average global citizens today – or just 1.2 US citizens – expose one future person to dangerous heat. This highlights the inequity of climate crisis, as these future heat-exposed people will live in places where emissions today are around half of the global average.
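The 3.5-versus-1.2 figures above implicitly encode how much higher US per-capita emissions are than the global average. A quick check of that implied ratio (simple arithmetic on the study’s quoted numbers):

```python
# The study's figures: the lifetime emissions of 3.5 average global
# citizens, or just 1.2 US citizens, expose one future person to
# dangerous heat. The ratio between them implies how US per-capita
# emissions compare with the global average.
global_citizens_per_exposure = 3.5
us_citizens_per_exposure = 1.2

us_to_global_ratio = global_citizens_per_exposure / us_citizens_per_exposure
print(f"Implied US per-capita emissions: ~{us_to_global_ratio:.1f}x the global average")
```

The implied factor of roughly three underscores the inequity the study highlights: a small number of high emitters account for each future person pushed into dangerous heat.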

In “worst-case scenarios” of 3.6°C or even 4.4°C global warming, half of the world’s population could be left outside the climate niche, posing what the researchers call an “existential risk”.

“The costs of global warming are often expressed in financial terms, but our study highlights the phenomenal human cost of failing to tackle the climate emergency,” said Professor Tim Lenton, director of the Global Systems Institute at the University of Exeter.

“For every 0.1°C of warming above present levels, about 140 million more people will be exposed to dangerous heat.

“This reveals both the scale of the problem and the importance of decisive action to reduce carbon emissions.

“Limiting global warming to 1.5°C rather than 2.7°C would mean five times fewer people in 2100 being exposed to dangerous heat.”

Defining the niche

Human population density has historically peaked in places with an average temperature of about 13°C, with a secondary peak at about 27°C (monsoon climates, especially in South Asia).

Density of crops and livestock follow similar patterns, and wealth (measured by GDP) peaks at about 13°C.

Mortality increases at both higher and lower temperatures, supporting the idea of a human “niche”.

Although less than 1% of humanity currently live in places of dangerous heat exposure, the study shows climate change has already put 9% of the global population (more than 600 million people) outside the niche.

“Most of these people lived near the cooler 13°C peak of the niche and are now in the ‘middle ground’ between the two peaks. While not dangerously hot, these conditions tend to be much drier and have not historically supported dense human populations,” said Professor Chi Xu, of Nanjing University.

“Meanwhile, the vast majority of people set to be left outside the niche due to future warming will be exposed to dangerous heat.

“Such high temperatures have been linked to issues including increased mortality, decreased labour productivity, decreased cognitive performance, impaired learning, adverse pregnancy outcomes, decreased crop yield, increased conflict and infectious disease spread.”

While some cooler places may become more habitable due to climate change, population growth is projected to be highest in places at risk of dangerous heat, especially India and Nigeria.

The study also found:

  • Exposure to dangerous heat starts to increase dramatically at 1.2°C (just above current global warming) and increases by about 140 million for every 0.1°C of further warming.
  • Assuming a future population of 9.5 billion people, India would have the greatest population exposed at 2.7°C global warming – more than 600 million. At 1.5°C, this figure would be far lower, at about 90 million.
  • Nigeria would have the second-largest heat-exposed population at 2.7°C global warming, more than 300 million. At 1.5°C warming this would be less than 40 million.
  • India and Nigeria already show “hotspots” of dangerous temperatures.
  • At 2.7°C, almost 100% of some countries including Burkina Faso and Mali will be dangerously hot for humans. Brazil would have the largest land area exposed to dangerous heat, despite almost no area being exposed at 1.5°C. Australia and India would also experience massive increases in area exposed.
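The headline relationship in these findings — exposure rising sharply above about 1.2°C of warming, by roughly 140 million people per additional 0.1°C — can be sketched as a crude linear model. This is illustrative only; the paper’s actual exposure curve is not linear, and the threshold and slope here are taken directly from the figures quoted above:

```python
# Crude linear sketch of the study's headline figures: exposure to
# dangerous heat starts rising at ~1.2C of warming and grows by about
# 140 million people per additional 0.1C. Illustrative only.
def exposed_millions(warming_c, threshold_c=1.2, per_tenth_degree=140):
    """Rough estimate of people (millions) exposed to dangerous heat."""
    if warming_c <= threshold_c:
        return 0.0
    return (warming_c - threshold_c) / 0.1 * per_tenth_degree

for warming in (1.5, 2.7, 3.6):
    print(f"{warming} C of warming -> ~{exposed_millions(warming):,.0f} million exposed")
```

Reassuringly, the sketch reproduces the article’s numbers: about 2.1 billion exposed at 2.7°C, versus roughly 420 million at 1.5°C — a five-fold difference, matching Professor Lenton’s quote.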

The research team – which included the Potsdam Institute for Climate Impact Research, the International Institute for Applied Systems Analysis, and the Universities of Washington, North Carolina State, Aarhus and Wageningen – stress that the worst of these impacts can be avoided by rapid action to cut greenhouse gas emissions.

Speaking about the conception of their idea, Professor Marten Scheffer, of Wageningen University, said: “We were triggered by the fact that the economic costs of carbon emissions hardly reflect the impact on human wellbeing.

“Our calculations now help bridging this gap and should stimulate asking new, unorthodox questions about justice.”

Ashish Ghadiali, of Exeter’s Global Systems Institute, said: “These new findings from the leading edge of Earth systems science underline the profoundly racialised nature of projected climate impacts and should inspire a policy sea-change in thinking around the urgency of decarbonisation efforts as well as in the value of massively up-shifting global investment into the frontlines of climate vulnerability.”

The research was funded by the Open Society Foundations and the paper is also an output of the Earth Commission – convened by Future Earth, the Earth Commission is the scientific cornerstone of the Global Commons Alliance.

Wendy Broadgate, Executive Director of the Earth Commission at Future Earth, said: “We are already seeing effects of dangerous heat levels on people in different parts of the world today. This will only accelerate unless we take immediate and decisive action to reduce greenhouse gas emissions.”

Work on climate solutions by the Global Systems Institute at the University of Exeter has identified “positive tipping points” to accelerate action, including a recent report that highlighted three “super-leverage points” that could trigger a cascade of decarbonisation.

The paper, published in the journal Nature Sustainability, is entitled: “Quantifying the Human Cost of Global Warming.”

Read more on University of Exeter

 



We are about to break the 1.5°C limit, but could we get energy from this untapped source?

18 May 2023

Energy-hungry data centres already match the aviation industry in terms of their contribution to global warming. Could they be adapted to heat other buildings as standard, wonders Kunle Barker

An article in the Economist last year entitled, ‘Say goodbye to 1.5°C’ made for depressing reading. It claimed we had already lost one of the critical battles in the climate war. The article suggested that we stood little chance of restricting the world’s post-industrial temperature rise to 1.5°C. The only way I could process this news was to ignore it. I convinced myself we were still on target and that the messaging was helpful as it would chivvy us all into focusing on hitting the 1.5°C target.

Sadly, the UN Climate Report released in March confirmed the Economist’s conclusion. And this week, scientists have said the 1.5°C threshold is likely to be broken over the next few years.

Even for an eternal optimist like myself, this is worrying and disappointing. After COP26, as I drove back to London in my EV, I felt hopeful and sure that the world would do what was needed to save the planet. Little did I know, as I triumphantly plugged my EV into a supercharger at Rugby services, it was already too late to save the 1.5-degree target.

I’ve written many columns about the critical role the built environment sector could play in averting a climate disaster. To a large extent, as an industry, our intent is clear: we question, campaign, and push each other to do better. However, something sinister may lurk underneath the surface of our hubris.

A recent BBC story about a swimming pool in Exeter that used a data centre as its heat source grabbed my attention. The story reminded me of a train journey I shared with fellow Manser Medal judge Joe Jack Williams, in which he described using data centres as heat sources in heritage assets. The use of waste heat fascinated me, but the wider application struck me only while reading the BBC article. Could data centres be used as heat sources for homes, schools, even entire developments?

Using excess heat is by no means a new idea. The Churchill Gardens estate, which started construction in 1946, used excess heat from Battersea Power Station. However, my research into the topic revealed a surprising fact about the impact on our environment of data centres: They are ‘sleeping giants’ when it comes to CO2 emissions.

Today, data centres account for 2 per cent of the world’s carbon footprint, similar to the aeronautical industry and only 1.6 percentage points less than the petrochemical industry. This is worrying enough, but there are predictions that by 2030 data centres will be responsible for more carbon emissions than both those industries combined.

When this sleeping giant awakens, our industry will shoulder the blame because we will have designed and built the structures these carbon goliaths inhabit.

An obvious solution would be to argue for restricting the growth of the data centre industry, but I believe it’s too late for this. In many ways, it has already happened. Our reliance on online payments, AI, cloud storage and so on is already integrated into our society’s fabric, and it is too late to go back.

But there could be a solution. Around 70 per cent of data centres’ energy is used for cooling, and this is set to climb to 80 per cent as machines used for AI and Blockchain operate more efficiently at lower temperatures. Data centres are usually designed on a large scale but perhaps they could be used to heat individual buildings if they were made smaller and supply and demand of this heat were efficiently balanced.

Designing smaller data centres would allow their integration into large-scale developments. Imagine a mini data centre located in each plot of a development, using the excess energy to heat space and water. This would represent a significant carbon saving for all involved.

This concept is not without its challenges. Although data centres produce heat constantly, it’s not very high quality, and even with the best form of heat exchange, you will struggle to get 30 degrees out of the system. But we as an industry must try, must ask ‘what if?’ and must push for rapid innovation. Unlike the adage, ‘the diet begins tomorrow’, it seems we may have already run out of tomorrows.
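To get a feel for the scale of the opportunity, here is a rough sizing sketch for the mini-data-centre idea. All of the figures — the assumed IT load, the fraction of electrical input recoverable as heat, and a typical home’s annual heat demand — are hypothetical assumptions for illustration, not numbers from the column:

```python
# Rough sizing sketch: how much low-grade heat could a small data centre
# supply to a development? All figures below are hypothetical assumptions
# for illustration only.
it_load_kw = 100          # assumed IT load of a "mini" data centre
heat_fraction = 0.95      # nearly all electrical input ends up as heat
hours_per_year = 8760     # continuous operation

heat_mwh_per_year = it_load_kw * heat_fraction * hours_per_year / 1000

# Assume a typical home needs roughly 12 MWh of heat per year.
homes_served = heat_mwh_per_year / 12
print(f"~{heat_mwh_per_year:,.0f} MWh/yr of low-grade heat, "
      f"enough for roughly {homes_served:,.0f} homes (before losses)")
```

Even under these cautious assumptions, a single modest facility running year-round yields hundreds of megawatt-hours of heat — though, as noted above, its low temperature means heat pumps or careful system design would be needed to make it useful.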

Kunle Barker is a property expert, journalist and broadcaster



Built Environment takes a major leap in Race to Zero with new joiners and sector progress

4 May 2023

The built environment sector is responsible for almost 40 per cent of global energy-related carbon emissions and 50 per cent of all extracted materials, making it critical for climate action. The long lifespan of built assets means the sector must act now to avoid ‘locking in’ emissions and climate risk long into the future.

The role of the Built Environment extends beyond emissions reduction. As the ‘stage’ on which our lives are played out, the Built Environment is the platform through which a resilient, equitable and nature-positive future is delivered.

In recognition of this, the Climate Champions have been supporting the sector to reach net zero emissions by 2050. As part of this work, the Built Environment team has been tracking the progress of ‘major’ businesses in the Race to Zero campaign across four sectoral stakeholder groups, which include architects and engineers, construction companies, real estate investment companies, and real estate asset managers.

The team found that 49% of major architects and engineers by revenue have joined the campaign, while only 16% of major construction companies by revenue have joined.

Furthermore, 19% of major real estate investment companies by revenue and 29% of major real estate asset managers by revenue have joined the campaign, indicating that the sector is making progress towards decarbonization.

In April alone, six new companies joined the Race to Zero, including Kerry Properties Limited, a Hong Kong-based real estate company, and Daito Trust Construction Co., Ltd., a Japanese real estate company. Both of these companies are significant joiners and will contribute to the sector’s efforts to achieve net-zero emissions.

The Built Environment sector has also seen progress in terms of policy, with Dubai announcing its Climate Action Plan to reach net zero and reduce emissions. The WorldGBC has launched its Global Policy Principles, which are driving action in the sector towards achieving net-zero emissions.

In finance, UNEP FI’s Finance Sector Briefing has shown that over 50 major banks and investors have a developed understanding of the physical and transitional risks of real estate. This report paves the way for the finance sector to price the cost of non-resilient and inefficient buildings into their funding decisions.

The sector has several strategically important events coming up, including the World Circular Economy Forum in Helsinki, Finland, and the EmiratesGBC Annual Congress, which will discuss the road to COP28.

Notwithstanding these positive signals of change, the Built Environment sector is not currently on track to achieve decarbonization by 2050. UNEP’s 2022 Buildings Global Status Report shows that while decarbonisation efforts have increased since 2015, they are being outpaced by the sector’s global growth.

Addressing this call-to-action will require accelerating ‘radical collaboration’ across the value chain, to drive market transformation. The upcoming ‘Buildings Breakthrough’, due for launch ahead of COP28, will provide a forum for driving international collaboration to unlock climate action on buildings.

The Built Environment 2030 Breakthrough Outcome

Our dedicated Built Environment 2030 Breakthrough Outcome page provides information and resources for anyone interested in tracking the sector’s efforts to achieve net zero.

The page highlights the importance of the sector’s transition to a sustainable, low carbon economy and provides updates on the progress being made by key stakeholders, such as major architects/engineers, construction companies, real estate investment companies, and asset managers.

The page also features a list of new members who have joined the Race to Zero, along with relevant events, policy developments, case studies and partners, such as the Building to COP initiative.
