This programming language could halve the land used by data centers
Usually, the choice of programming language for a given project matters only to the software developers writing it and the customers using the end product. But the programming language Rust offers something for all of us: a way to reduce data centers' massive carbon footprint.
The problem of data center carbon emissions is well documented. According to a 2021 paper in the journal Environmental Research Letters, running cloud computing systems, and storing the unfathomable amounts of data they produce and collect in massive data center sites, accounts for as much as 1.8% of all the electricity used in the United States in a given year.
Part of the reason is a reliance on programming languages like JavaScript, Python, and Go, whose runtimes perform expensive background work, such as garbage collection, to keep programs stable and secure enough for cloud use. As companies lean ever harder on these less efficient languages, their resource needs, in terms of both the physical land data centers occupy and the energy they consume, grow every year.
Some languages, like C and C++, are far more efficient but are notorious for security vulnerabilities and instability. Faced with this nasty tradeoff, many large companies are turning to Rust, a niche programming language that aims to let developers have their computational cake and eat it too.
What's the issue with Java (and JavaScript, and Python, and Go, and...)?

To understand why data centers use so much energy, we need to look at a fundamental part of how programs work: memory access.
When you run a program, a certain amount of system memory is allocated for its use. Within this allocation, the program runs its instructions and processes data, constantly swapping values in and out of specific locations in the system's memory. In older languages like C and C++, these memory addresses can be accessed directly from within the program, without asking for permission each time (permission was already granted when the memory was allocated).
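To make "direct access" concrete, here's a minimal sketch written in Rust (the language this article is about), using its `unsafe` escape hatch to mimic what C and C++ allow by default:

```rust
fn main() {
    let mut values: [u8; 3] = [10, 20, 30];
    // A raw pointer is just an address into the program's memory.
    // C and C++ expose this style by default; Rust fences it behind
    // an explicit `unsafe` block.
    let ptr: *mut u8 = values.as_mut_ptr();
    unsafe {
        // Read and write at byte offsets from that address, with no
        // bounds or lifetime checks from the compiler.
        *ptr.add(1) = 99;
        println!("second value is now {}", *ptr.add(1));
    }
}
```

This is fast precisely because nothing stands between the program and the memory, which is also why it's dangerous.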
The problem is that even if the memory itself behaves with ironclad electrical logic, the programmers manipulating it are human, and humans make mistakes: forgetting to hand memory back to the operating system when they're done with it (a memory leak), or trying to access parts of system memory they don't actually have permission to touch.
Older languages will compile these errors into the program and run it without safeguards, leading to crashes when the operating system denies an attempt to access forbidden memory. Or, even worse, sloppy code can leave sensitive data behind in memory after the program is done with it, making it accessible to malware and viruses.
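Here's a minimal sketch of that second failure mode, a so-called use-after-free, again using Rust's `unsafe` escape hatch to stand in for C-style code:

```rust
fn main() {
    let ptr: *const i32;
    {
        let value = Box::new(42); // allocate an integer on the heap
        ptr = &*value;            // keep a raw pointer to that heap data
    } // `value` goes out of scope here; its memory is handed back
    unsafe {
        // Use-after-free: `ptr` now dangles. This is undefined behavior,
        // shown only to illustrate the bug class. A C or C++ compiler
        // accepts the equivalent code without complaint, and running it
        // may print garbage, crash, or expose stale data left in memory.
        println!("{}", *ptr);
    }
}
```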
To prevent these issues, newer languages put guardrails in place: a garbage collector periodically clears out memory that is no longer being used, and the language runtime puts a wall between programs and the physical hardware, forcing code to ask permission each time from a kind of digital office manager who is very strict about the proper forms being filed whenever you need a new piece of equipment.
Computers are now generally so fast that the delays these guardrails create aren't noticeable to the end user. At the system level, though, the overhead is substantial, and when programs are scaled up to cloud data centers like those run by Amazon, Google, and Microsoft, the seemingly insignificant inefficiencies of newer languages scale up along with them.
Which is what has so many companies interested in the Rust programming language.
What is Rust?

Rust is a programming language first created back in 2006 by a programmer frustrated by older languages' propensity to crash unpredictably. A new piece in the MIT Technology Review lays out the origin story and development of Rust in detail and is well worth the read, but what really stands out is the long process of engineering a close-to-the-machine language that can manipulate memory freely while still imposing the rigid bureaucracy of a language like Java.
When older languages were created, things like security vulnerabilities weren't well understood, so these blind spots are effectively baked into the languages' foundations, never to be fully extricated or fixed. Rust, meanwhile, recreates the efficiency of those older languages in a memory-safe way: the compiler itself verifies that a program's memory use is sound before it ever runs, giving Rust the stability and security of a more modern language without the runtime overhead that slows one down.
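To see what "memory-safe" means in practice, here's the same use-after-free pattern from the sketch above, rewritten in ordinary, safe Rust. This program intentionally does not compile: the borrow checker catches the dangling reference at compile time, so the bug never ships and no runtime garbage collector is needed to guard against it.

```rust
fn main() {
    let reference;
    {
        let value = String::from("sensitive data");
        reference = &value; // borrow `value`
    } // `value` is dropped here, but `reference` would still point at it
    // rustc rejects the program: error[E0597]: `value` does not live
    // long enough. The bug is caught before the program ever runs.
    println!("{}", reference);
}
```

Because these checks happen entirely at compile time, the finished program pays none of the runtime costs described above: memory is freed deterministically the moment it goes out of scope, with no collector pausing the program to clean up.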
At the scale of a data center, shifting from these less efficient languages to Rust could produce massive energy savings and a corresponding cut in the carbon emissions these facilities produce. And we're not talking about a 5% or 10% reduction; we're talking as much as 50% or more.
Now, that's not to say we would actually bank those savings, even if everything in the cloud were suddenly rewritten in Rust tomorrow. Like adding a couple of new lanes to a congested highway, reducing the energy demands of existing cloud computing services tends to free up that energy for...more cloud computing services.
But that doesn't mean the efficiency would be wasted. Smaller data centers could be distributed more widely, near or even within cities, shortening the distance data has to travel across networks, which in itself saves a great deal of energy.
Still, there is a very clear problem with how data centers run right now, and with the costs of climate change mounting as we put more carbon into the atmosphere, any energy savings are worth pursuing, especially when they come bundled with faster, more responsive software that is more stable, more secure, and less prone to crashes.