32 min

Can you make AI sustainable? Technology Untangled

    • Technology

In this episode, we look at the challenges AI technology faces in becoming, and then remaining, sustainable. The benefits of AI are unquestionable: from improved medical assistance and increased efficiency in the workplace to autonomous transportation and next-level gaming experiences. But the more expansive AI's abilities become, the more data storage is required.
That data storage uses a lot of energy. In fact, it has been predicted that AI servers could be using more energy than a country the size of the Netherlands by 2030.
For HPE Chief Technologist Matt Armstrong-Barnes, the rate at which AI has grown in recent years has had a clear environmental impact, and he believes that's down to people rushing to train large language models without thinking about longevity or the need for future change. That, in turn, has led to data being stored that is no longer needed.
Sustainability is also a main focus for Arti Garg, Lead Sustainability & Edge Architect in the office of the CTO at Hewlett Packard Enterprise. Like Matt, Arti has kept a keen eye on the exponential growth of AI data storage and its effect on the environment, and agrees that the key to a more sustainable future lies in how we train models.
However, whilst training models well is important, the technology itself is a key component of more efficient AI. Shar Narasimhan is the director of product marketing for NVIDIA's data center GPU portfolio. He believes that openly available model optimisations, combined with chipsets, CPUs, GPUs and intelligent data centers optimised for AI, are a key piece of the puzzle in avoiding energy wastage and making AI more sustainable all round.
Sources and statistics cited in this episode:
Global AI market prediction - https://www.statista.com/statistics/1365145/artificial-intelligence-market-size/#:~:text=Global%20artificial%20intelligence%20market%20size%202021%2D2030&text=According%20to%20Next%20Move%20Strategy,nearly%20two%20trillion%20U.S.%20dollars.
AI could use as much energy as a small country report - https://www.cell.com/joule/fulltext/S2542-4351(23)00365-3?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS2542435123003653%3Fshowall%3Dtrue
Industry responsible for 14% of earth’s emissions - https://www.emerald.com/insight/content/doi/10.1108/JICES-11-2021-0106/full/html
Number of AI startups - https://tracxn.com/d/explore/artificial-intelligence-startups-in-united-states/__8hhT66RA16YeZhW3QByF6cGkAjrM6ertfKJuKbQIiJg/companies
AI model energy use increase - https://openai.com/research/ai-and-compute
European Parliament report into AI energy usage - https://www.europarl.europa.eu/RegData/etudes/STUD/2021/662906/IPOL_STU(2021)662906_EN.pdf

