I've been absolutely chuffed to be part of the Green Software Foundation, as one of the many members and contributors to the Software Carbon Intensity specification.
From my perspective, the overall goal of the specification is to drive action to reduce the carbon impact of the software in the world around us. Many of us took a module on software profiling at university - using tools like gprof to track the performance of our software, and identify areas for rewriting and refactoring to improve its speed.
Until now though, how many of us have actually gone and done this? The glut of cheaply available compute has made us somewhat spoilt - and I'd boldly say (or whisper) lazy at times.
The carbon emissions of our software change this imperative. It isn't just important to refactor software for speed and efficiency, but also to reduce the electricity it consumes, and thus potentially the carbon emissions generated by running it.
Standing on the shoulders of many others in the foundation, the specification itself describes three actions that should be taken to reduce the carbon emissions of software:
- Energy Efficiency: Actions taken to make software use less electricity to perform the same function.
- Hardware Efficiency: Actions taken to make software use less physical resources to perform the same function.
- Carbon Awareness: Actions taken to time or region-shift software computation to take advantage of clean, renewable or low carbon sources of electricity.
(Original source: Green Software Foundation (2021). Software Carbon Intensity Standard, Version 1.0.0)
These actions can come in many forms.
For cloud aficionados, energy efficiency can be approached both by improving the efficiency of the underlying software (or by using techniques like transfer learning when training a machine learning model), and by rightsizing the cloud infrastructure being used. For example, Sara Bergman has written an excellent article on calculating the CO2eq of a virtual machine; using a similar formula, we can roughly estimate the CO2eq of a container, or an Azure Function.
Whilst the figures are likely some way off from reality (presumably only your cloud provider knows the true numbers), we can be confident enough to say that containers and serverless computing are likely significantly more energy- and carbon-efficient than running an application in a whole virtual machine.
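To make that kind of comparison concrete, here is a minimal back-of-envelope sketch in the spirit of the formula described above: energy used, scaled by datacentre overhead (PUE) and the grid's carbon intensity. Every number in it - the power draws, the PUE, the grid intensity - is an illustrative assumption, not a figure published by any provider.

```python
# Back-of-envelope CO2eq estimate for a running workload.
# All input figures below are illustrative assumptions.

def co2eq_grams(avg_power_watts: float, hours: float,
                pue: float, grid_gco2_per_kwh: float) -> float:
    """Estimate grams of CO2eq: electricity consumed (kWh), scaled by
    datacentre overhead (PUE) and grid carbon intensity (gCO2eq/kWh)."""
    energy_kwh = (avg_power_watts / 1000.0) * hours
    return energy_kwh * pue * grid_gco2_per_kwh

# Hypothetical comparison: a whole VM vs a rightsized container slice,
# both running for a day on the same grid.
vm = co2eq_grams(avg_power_watts=40.0, hours=24, pue=1.2, grid_gco2_per_kwh=200.0)
container = co2eq_grams(avg_power_watts=5.0, hours=24, pue=1.2, grid_gco2_per_kwh=200.0)
print(f"VM: {vm:.0f} g CO2eq/day, container: {container:.0f} g CO2eq/day")
```

Under these assumed figures the rightsized workload comes out roughly eight times lighter - the exact numbers matter far less than the habit of estimating them at all.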
Hardware efficiency, in turn, could mean moving to the cloud, with its economies of scale and a cloud provider's ability to be carbon neutral. Microsoft, for example, has committed to being carbon negative by 2030, and to removing, by 2050, all the carbon it has emitted since it was founded in 1975.
However, hardware efficiency could also include extending the life of your existing hardware. For example, the Open Compute Project (OCP) has been looking at how to apply circular economic principles to your IT infrastructure.
The final action that the specification describes is Carbon Awareness. That is, knowing the current fuel mix of your energy grid, and running applications when the energy has been generated from renewables, as opposed to fossil fuels.
Some datacentres have private energy supplies, but others are connected to the public national grid of whatever country they happen to be located in.
For the UK, you can see the energy fuel mix in real time at the electricity info website. At the time of writing, 22.7% of the energy in the UK's grid right now is from renewable sources; over the last 24 hours, 43% of the energy came from renewables.
If you can embed this awareness into your software, you can enable your software to lower its own emissions.
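The simplest form of that awareness is time-shifting: given a forecast of grid carbon intensity over the day, defer a flexible job to the cleanest slot. Here is a minimal sketch - the forecast values are invented for illustration; in practice they might come from a live source such as National Grid ESO's Carbon Intensity API for Great Britain.

```python
# Minimal carbon-aware scheduling sketch: given an hourly forecast of grid
# carbon intensity (gCO2eq/kWh), pick the cleanest hour to run a deferrable
# batch job. The forecast values are invented for illustration.

def cleanest_hour(forecast: dict[int, float]) -> int:
    """Return the hour (0-23) with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

# Illustrative forecast: midday and mid-afternoon are cleaner (more solar).
forecast = {9: 310.0, 12: 180.0, 15: 140.0, 21: 260.0}
print(f"Schedule the batch job for {cleanest_hour(forecast)}:00")
```

Region-shifting follows the same pattern - compare intensities across grids rather than across hours, and run the workload where the electricity is cleanest.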
It's exciting. This specification enables everyone to take action to reduce the carbon emissions of the software we use day-to-day.
Are you ready for a return to software profiling? Be prepared to dust off those skills.