saveasteading Posted September 27

The attached is from 'Building' Magazine this week. It ties in with some recent discussion about getting power where and when it is needed, and seems knowledgeable and well written to me. So the solution is obvious... no, I'll let you read it first.
SteamyTea Posted September 28

Not all of the image will load up. But no matter; as it is an opinion piece, I shall spout opinions in return.

Generally, a new technology is very inefficient, then gets better over time. My first proper PC used a constant 300W when in use; my much better laptop that I use now, about 10W. Now I know that some of the work is done by server farms elsewhere, and I am using services that did not exist in 1994, but as the 'grid' has got cleaner, even allowing for much greater usage, the total energy usage, and the associated CO2e emissions, are probably down (for my personal usage). The big problem is access to, and usage of, the energy. Environmentally it may be better to run desalination plants, heat pumps, electrify transport etc, but if we never gave new ideas a go, then we would still be banging rocks together in caves.

Energy-hungry AI is already harming health – and it's getting worse

The electricity required to support AI could contribute to approximately 600,000 asthma cases and create a $20 billion public health burden by 2030

By Jeremy Hsu, 10 December 2024

[Image: Servers fill a data centre in Texas. Paul Moseley/Fort Worth Star-Telegram/Tribune News Service via Getty Images]

As data centres consume ever more energy to serve the intensive computing needs of artificial intelligence, they increase the emissions of air pollutants. This could already be affecting public health and, by 2030, it could contribute to an estimated 600,000 asthma cases and 1300 premature deaths per year in the US – the latter accounting for more than a third of asthma deaths annually in the country. “Public health impacts are direct and tangible impacts on people, and these impacts are substantial and not limited to a small radius of where data centres operate,” says Shaolei Ren at the University of California, Riverside.
Because airborne pollution can travel long distances, increasing levels of pollutants can affect the health of people across the country, he says. Ren and his colleagues developed those estimates based on the projected electricity demand from data centres. In the US, some of that demand is being met by burning fossil fuels, which produce air pollutants known to cause health problems, such as fine particulate matter. For instance, the electricity usage required for training one of today’s large AI models could produce air pollutants equivalent to driving a passenger car for more than 10,000 roundtrips between Los Angeles and New York City, according to the researchers.

To model these air pollution and emissions impacts in the US, the researchers used a tool provided by the US Environmental Protection Agency. They calculated that, nationally, data centres will have an overall public health cost that may exceed $20 billion by 2030. That is approximately double the public health cost of the US steelmaking industry and possibly rivals the health impact of pollutants emitted from the tens of millions of vehicles in the largest US states, such as California, according to the researchers.

Energy-hungry data centres are already affecting public health. The researchers estimated that the gas-powered generators used as backups in Virginia’s Data Center Alley could already be causing 14,000 cases of asthma symptoms and imposing public health costs of $220 million to $300 million per year. Those are the costs if the generators only emit 10 per cent of the pollutants that state authorities permit to be emitted each year. If they are used frequently enough to emit the maximum permitted level, the total public health cost could be up to $3 billion per year. Such problems affect not only local residents, but also people in distant states such as Florida.
Some of the tech companies racing to build data centres are supporting low-emission energy sources, financing the construction of renewable energy projects and investing in both conventional nuclear power plants and new nuclear reactor technologies. But for now, many data centres still rely heavily on fossil fuel power such as natural gas – with previous research suggesting that data centres could boost US demand for gas by roughly the equivalent of another New York State or California by 2030.

“The question around the health impacts of artificial intelligence and data centre computing is an important one,” says Benjamin Lee at the University of Pennsylvania. He described the paper as “the first to estimate these costs and quantify them in dollar terms”, but cautioned that the underlying approximations and assumptions behind the specific numbers need to be validated by additional research.

Journal reference: arXiv DOI: 10.48550/arXiv.2412.06288

The power of AI for environmental stewardship and optimised industry

Behind the scenes, industrial artificial intelligence is transforming the efficiency and performance of companies. But the bigger picture is the implication for global sustainability

29 May 2024

[Image: Swedish water supplier VA SYD used industrial AI techniques to reduce water leakage rates from 10 per cent to less than 8 per cent. Kentaroo Tryman]

“We’ve never had such powerful tools to solve the challenges we face in sustainability,” says James Cole, Chief Innovation Officer at the Cambridge Institute for Sustainability Leadership (CISL) in the UK. “Artificial Intelligence (AI) systems have the potential to help us understand the world in all its complexity to optimise industrial processes for holistic business, social and environmental outcomes.” The arrival of AI across the industrial sector has been heralded as the next great industrial revolution.
But unlike past revolutions, which typically accelerated the consumption of resources, AI offers the opportunity to slash waste while boosting efficiency to levels only previously dreamed of. Within this context, global technology company Siemens is working to strengthen the emerging link between AI-optimised industry and environmental stewardship as society pushes towards net-zero carbon emissions.

At heart, AI is software that performs tasks traditionally requiring human intellect, such as understanding text, identifying complex patterns, modelling processes and making predictions. But in an industrial context, AI systems must be engineered for reliability and security. That allows them to be built into the industrial backbone of economies, optimising and improving processes in everything from healthcare and mobility to power generation and infrastructure.

Take water management. One way to improve water sustainability is to lose less of it through leaks. Yet ageing pipes and ground movements make leaks inevitable. “About 30 per cent of the drinking water the world produces is wasted – a shockingly high figure,” says Adam Cartwright, Siemens’ Industry Strategy Director for Software in Water and Waste Water. “Every time you avoid losses, you’re not only saving money but better managing precious water resources.”

VA SYD is one of Sweden’s largest water companies, supplying drinking water to over half a million customers. Previously, it was losing 10 per cent of its water but had no means of detecting small leaks.

Water Intelligence

By training an AI model on historical data from VA SYD’s water network, the Siemens Leak Finder application learned to identify and locate leaks, even small ones losing just 0.25 litres per second. That allowed VA SYD to reduce leaks to a world-leading level of less than 8 per cent. Meanwhile, Siemens is working with Yorkshire Water and the University of Sheffield, using AI to protect the environment at the other end of the water cycle.
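The article does not describe Leak Finder's internals, but a common pattern in leak detection is to compare observed minimum night flow (when legitimate demand is lowest) against a baseline learned from historical data, and flag districts whose excess exceeds a small threshold such as the 0.25 litres per second mentioned above. A minimal, hypothetical sketch in Python – the function names and baseline figures are illustrative, not VA SYD's:

```python
from statistics import mean

def flag_leak(night_flows_lps, baseline_lps, threshold_lps=0.25):
    """Flag a district metered area if its average night-time flow exceeds
    the historically learned baseline by more than threshold_lps (L/s)."""
    excess = mean(night_flows_lps) - baseline_lps
    return excess > threshold_lps, excess

# Three nights of minimum-flow readings vs a learned baseline of 1.6 L/s
leaking, excess = flag_leak([1.9, 2.0, 2.1], baseline_lps=1.6)
print(leaking, round(excess, 2))  # True 0.4
```

In practice the "baseline" would itself come from a model trained on the network's history, per district and per season; the point is simply that a small, persistent excess over expected flow localises a leak worth investigating.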
In combined sewage systems, stormwater runoff and household sewage flow together to water treatment plants. And in times of intense rainfall, combined sewer outlets are designed to release excess water and sewage into rivers to prevent flooding in public areas. One challenge is that blockages in pipes can lead to unnecessary releases, but these obstructions are hard to detect (see diagram).

To address this, Siemens developed a blockage-predicting AI trained on data from thousands of sensors on Yorkshire Water’s sewer outlets, in combination with rainfall data. It learned what a properly functioning sewage network looks like under different weather conditions. When its monitoring system detects flows in the network behaving unexpectedly, it simulates potential blockages in different locations to work out where a real blockage may be developing. The predictor finds 90 per cent of potential issues – three times more than traditional statistical approaches – and provides as much as two weeks’ warning of impending blockages, while halving the previous rate of false alarms.

Looking ahead, Cartwright considers the potential for AI and digital technologies to reduce carbon and environmental impact by more carefully managing water and energy use. “Pumping water accounts for 2-3 per cent of a country’s power use,” he says. By ensuring that this pumping is done when energy is at its cheapest and greenest, industrial AI can reduce the cost of water and increase resilience for both sectors.

AI is already helping manage the environmental impact of another key part of modern society’s infrastructure: data centres. Their energy usage is significant, much of it used to keep their thousands of servers cool. Consider Greenergy Data Centers’ facility in Estonia. Already powered solely by renewable energy, the company further reduced its environmental footprint and energy costs using AI.
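The Yorkshire Water blockage predictor described above is proprietary, but its core idea – compare observed flow with model-expected flow for the current rainfall, and treat a persistent shortfall as a suspected blockage – can be sketched. Everything here is hypothetical: the stand-in linear model and the thresholds are illustrative, not Siemens'.

```python
def expected_flow(rain_mm, base=5.0, gain=2.0):
    # Stand-in for the learned model: sewer flow rises with rainfall.
    return base + gain * rain_mm

def blockage_suspected(rain_series, flow_series, tolerance=0.7, run=3):
    """Flag a sensor if observed flow stays below tolerance x expected flow
    for `run` consecutive readings - a persistent shortfall suggests a
    developing obstruction upstream of the sensor."""
    low_streak = 0
    for rain, flow in zip(rain_series, flow_series):
        if flow < tolerance * expected_flow(rain):
            low_streak += 1
            if low_streak >= run:
                return True
        else:
            low_streak = 0
    return False

# Flow fails to track heavy rain for three readings in a row -> flagged
print(blockage_suspected([0, 2, 4, 4, 4], [5.2, 8.8, 6.0, 5.5, 5.0]))  # True
```

Requiring a run of low readings, rather than a single one, is what cuts false alarms: one odd reading resets the streak, while a sustained mismatch between rainfall and flow triggers the flag.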
Servers produce heat depending on their workload, but this can change more quickly than conventional cooling systems can react. To combat this, Siemens developed AI-supported software that uses real-time temperature and airflow data collected by sensors all over the data centre, in addition to information on server workload. The system can then anticipate cooling needs to maintain optimal temperatures throughout the facility. “When we first launched the system, it improved our efficiency by approximately 30 per cent at the push of a button,” says Kert Evert, Chief Development Officer of Greenergy Data Centers.

It’s an example of AI being part of the solution to one of its own challenges, given concerns over the amount of energy AI requires. Here Pina Schlombs, Sustainability Lead at Siemens Digital Industries Software, notes that the outlook for AI computing efficiency is improving drastically. Consider also an AI model in the industrial space that accelerates product design for optimal environmental lifetime impact, or increases resource and energy efficiency, she says. “With a holistic perspective, we can gauge whether the sustainability benefits of AI outweigh the resources to train and run it.”

To realise AI’s benefits within any industry, continues Cartwright, the first step is making full use of existing infrastructure, data and sensor networks. If additional hardware is required, it should be easily linked to existing hardware and asset-management software through secure, standard protocols. “This interoperability is key to supporting the diverse, evolving needs of industrial applications,” says Cartwright. Organisations can then benefit from the convergence of AI with other technologies – its ability to help master complex problems at speed and scale.

CISL’s accelerator programmes reflect this potential, having supported over 350 startups in the past three years, including those showing the power of AI to solve complex challenges.
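Returning to the Greenergy cooling example above: the gain comes from anticipating heat load rather than reacting to temperature after the fact. A toy illustration of that "lead, don't lag" idea, with an assumed coefficient of performance and a deliberately naive forecaster – none of this reflects Siemens' actual software:

```python
def forecast_heat_kw(heat_history_kw):
    # Naive stand-in forecaster: linearly extrapolate the last two readings.
    prev, latest = heat_history_kw[-2], heat_history_kw[-1]
    return max(latest + (latest - prev), 0.0)

def cooling_power_kw(heat_history_kw, cop=4.0):
    """Electrical cooling power to schedule now for the predicted heat load,
    assuming a cooling coefficient of performance (COP) of `cop`."""
    return forecast_heat_kw(heat_history_kw) / cop

# Server heat rose 100 -> 110 kW; pre-position cooling for a predicted 120 kW
print(cooling_power_kw([100.0, 110.0]))  # 30.0
```

A real system would forecast from workload schedules plus temperature and airflow sensors across the hall, but the structure is the same: spend cooling energy against the predicted load, not the one already measured.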
Monumo, for example, is revolutionising electric motor design, enabling far more design simulations to be run than conventional approaches allow – finding faster routes to energy-efficient vehicle design. All this makes AI an ally in humanity’s effort to achieve a smarter, environmentally friendly future. Cole agrees: “AI has the potential to help us make better decisions through greater understanding. This facilitates better collaboration across industries, to foster alignment on how we realise a sustainable tomorrow.”

Tech firms claim nuclear will solve AI's power needs – they're wrong

Some AI firms think nuclear power can help meet the electricity demand from Silicon Valley’s data centres, but building new nuclear power stations takes too long to plug the gap in the short term

By Jeremy Hsu, 16 May 2024

[Image: The Three Mile Island Nuclear Generating Station in Pennsylvania, which closed in 2019. Michael Ventura/Alamy]

Silicon Valley wants to use nuclear power to support the energy-hungry data centres that help train and deploy its artificial intelligence models. But realistic timelines show that any US nuclear renaissance will have at best a limited impact during a period of fast-rising electricity demand.

Global electricity usage from data centres is already on track to double by 2026. In the US, data centres represent the fastest-growing source of energy demand at a time when the country’s peak electricity demand is rising sharply. Meanwhile, the US government estimates that nuclear power’s share of overall US power generation will remain flat at best in the near future, partly because advanced nuclear reactor technologies remain years or decades away from commercial readiness. “The share of nuclear energy in the power mix has held steady at about 20 per cent for decades, and future declines seem likely,” says Amanda Levin at the Natural Resources Defense Council, a non-profit based in New York.
“While our models show that [growing demand for electricity] could result in the extension of operating licenses of many older reactors, there’s no sign that new reactors will be built anytime soon.”

Prospects are even more uncertain for the construction of advanced nuclear technologies, such as the small modular reactors that could act like mini nuclear power stations, or experimental fusion reactors. Several start-ups developing such technologies are backed by tech billionaires such as OpenAI co-founder Sam Altman and Microsoft co-founder Bill Gates. But none are likely to start operating commercially before 2030. By then, electricity demand from US data centres is expected to have doubled or even tripled relative to 2022, potentially using as much electricity as 40 million US houses.

“New technologies like small modular reactors will not be able to contribute at all to data centres by 2030, and it’s really doubtful that they’ll be able to contribute much by even 2040,” says Allison Macfarlane at the University of British Columbia in Canada, a former chairperson of the US Nuclear Regulatory Commission, an independent agency of the US government. She also questions whether nuclear power can prove cost-competitive with cheaper renewable power and improved energy storage technologies.

Tech companies are also looking to renewables and batteries to reduce their carbon emissions. But major US utilities are already planning to build more natural gas plants and delay the retirement of coal plants to meet electricity demand. Conventional nuclear plants can still play a role in the short term, says Adam Stein at the Breakthrough Institute, a research centre in California. Stein says that companies such as Amazon and Microsoft are buying data centres located next to existing nuclear power plants, or seeking to purchase a stable electricity supply from nuclear plants at fixed prices.
Developers are also proposing to build massive data centres co-located with nuclear plants.

As a proponent of both nuclear power and renewable power, Stein argues that advanced nuclear technologies will be important for meeting electricity demand growth beyond 2030 – especially in providing more jobs than comparable renewable technologies, to soften the economic blow of job losses as coal plants are shut down. But he acknowledges the uncertainties surrounding the timeline for more experimental bets such as fusion power. “Fusion still needs to achieve one or more breakthroughs in the engineering, and you can’t schedule a breakthrough,” says Stein.

China’s first underwater data centre is being installed

To hold and cool computer servers, China has installed a 1300-tonne watertight cabin on the shallow seafloor – it is the first of 100 planned for an underwater data centre

By Jeremy Hsu, 4 December 2023

[Image: Off the coast of Hainan, China, an underwater data centre is being built. Tang Wai Chung/Alamy]

An underwater data centre that harnesses the ocean’s natural cooling capability is taking shape near China’s Hainan Island in the South China Sea. Keeping computers cool can slash power usage and carbon emissions, and this project could pave the way for putting supercomputers and data farms underwater.

The first phase of construction is nearing completion, with a 1300-tonne watertight cabin installed on the shallow seafloor, according to the Chinese state broadcaster CGTN. The report described the project by Beijing Highlander Digital Technology Company as “a step toward the world’s first commercial data centre under the sea”, with 100 such cabins planned for deployment.
“Big data centres are warehouses that have racks and racks of computers that all have to be cooled, and there are really expensive ways of cooling them using huge air conditioners,” says John Abraham at the University of St. Thomas in Minnesota. “The nice thing about seawater is that water is much better at transferring heat than air.”

The Highlander underwater data centre combines traditional cooling systems for the computer servers within each cabin – which still use electricity – with a passive seawater cooling system located on top of each cabin. That design could be 40 to 60 times more energy efficient than typical land-based data centres, according to Highlander. “With AI and the number of graphics processing units that are being put in [data centres] and cloud-based storage, the demand for cooling is going to rise tremendously into the future,” says Abraham. “This could be a breakthrough technology that allows us to navigate the rollout of even bigger data farms.”

However, water pressure is a concern for submerged data centres. Highlander’s approach to handling this is primarily suited to shallow-water deployments, says Maxie Reynolds, founder of Subsea Cloud, a start-up focused on underwater data centres. But she described it as “still effective” and a “significant step change” toward sustainable data management. By comparison, Subsea Cloud’s design relies entirely on passive cooling, immersing computer servers in a special liquid that transfers heat through the pod walls into the surrounding ocean. The start-up has also tested pods designed to survive in the deep, reaching 3000 metres – Highlander’s shallow-water facility near Hainan Island is installed at a depth of 35 metres.

Microsoft previously tested an underwater data centre prototype with its Project Natick deployments between 2015 and 2020, but has not followed up with commercial deployments since then.
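A rough back-of-envelope behind Abraham's point about seawater: per cubic metre and per kelvin of temperature rise, water absorbs thousands of times more heat than air, so far less coolant has to be moved past the servers. The figures below are standard textbook property values for fresh water and roughly sea-level air; seawater differs slightly, but the conclusion is unchanged:

```python
# Volumetric heat capacity = specific heat (J/(kg*K)) x density (kg/m^3)
water_j_per_m3K = 4186 * 1000   # liquid water: ~4186 J/(kg*K), ~1000 kg/m^3
air_j_per_m3K = 1005 * 1.2      # air: ~1005 J/(kg*K), ~1.2 kg/m^3

ratio = water_j_per_m3K / air_j_per_m3K
print(f"Water absorbs ~{ratio:.0f}x more heat per m^3 per K than air")
```

This only captures heat capacity, not convective heat-transfer rates at the cabin wall, which also favour water, but it shows why immersion and seawater jackets beat blowing air through racks.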
Subsea Cloud has established US partnerships and is giving potential clients a preview of its first European Union development that could be operational in 2024.