How to stop data retention from killing the planet

The age of data has a hidden environmental cost that few of us think about – and it’s time for a change.

While the average content management system may be full of business intelligence that could revolutionize how an enterprise operates, it accounts for only a fraction of the petabytes of data stored every day.

According to an IDC report cited by The Conversation, by 2025 society will be storing 175ZB (zettabytes) of data – a huge leap from the 59ZB total in 2020. To put that into perspective, it's enough to fill about 1.5 trillion mobile phones.
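As a rough sanity check on those figures, the arithmetic below converts zettabytes into handsets. The article doesn't state a phone capacity, so a ~128GB average is an assumption, using decimal (SI) units throughout.

```python
# Rough sanity check of the IDC projection: how many phones would
# 175ZB fill? Assumes a ~128GB handset (not a figure from the article).
ZB = 10**21          # bytes in a zettabyte (SI)
GB = 10**9           # bytes in a gigabyte (SI)

projected_2025 = 175 * ZB
total_2020 = 59 * ZB
phone_capacity = 128 * GB      # assumed average handset storage

phones = projected_2025 / phone_capacity
growth = projected_2025 / total_2020

print(f"{phones / 1e12:.2f} trillion phones")   # ≈ 1.37 trillion
print(f"{growth:.1f}x growth from 2020")        # ≈ 3.0x
```

At ~128GB per handset that works out to roughly 1.4 trillion phones, in the same ballpark as the article's 1.5 trillion (which implies a slightly smaller phone).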

And the truth is that the way data is stored has turned businesses and consumers alike into a race of hoarders, logging everything just in case it's needed again one day – much of it never to be accessed again.

But for better or worse, it all has to be stored somewhere, and regardless of your stakeholders' green credentials, that comes at a cost. A single plain-text email generates about 4g of CO2; add pictures and it's more like 50g. That's not trivial – and yet it's an environmental impact that's rarely talked about.
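To see how those per-message figures add up, here is a back-of-the-envelope estimate. The 4g and 50g values come from the article; the global volume of ~300 billion emails per day and the 20% image share are assumptions for illustration only.

```python
# Back-of-the-envelope CO2 estimate for email, using the per-message
# figures quoted above. The ~300 billion emails/day volume and 20%
# image share are assumptions, not figures from the article.
PLAIN_TEXT_G = 4      # grams of CO2 per plain-text email (article figure)
WITH_IMAGES_G = 50    # grams of CO2 per email with pictures (article figure)

emails_per_day = 300e9          # assumed global daily volume
image_share = 0.2               # assumed fraction carrying pictures

daily_g = emails_per_day * ((1 - image_share) * PLAIN_TEXT_G
                            + image_share * WITH_IMAGES_G)
daily_tonnes = daily_g / 1e6    # grams -> tonnes

print(f"~{daily_tonnes / 1e6:.1f} million tonnes of CO2 per day")
```

Under those assumptions email alone would account for millions of tonnes of CO2 a day – a crude figure, but it shows why per-message footprints matter at scale.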

A heated debate

Computer hardware, like most electronic equipment, generates heat – plenty of it. One of the great ironies is that heat is extremely damaging to chips and diodes, so the hardware must be cooled. Even the smallest server room needs cooling equipment to bring the temperature below that of an empty room. In effect, we use twice the electricity, twice the energy and twice the carbon just to maintain the status quo.


So what can be done about it? It's a question that has plagued the IT industry for years, and the lack of a definitive answer makes it easy to switch on another air-conditioning unit and look the other way. But that only compounds the damage. So what are the alternatives?

Storing less data seems like an obvious answer, but it's almost impossible to implement, because who decides what is worth recording and what isn't? The BBC learned this the hard way when it trashed much of its TV archive in the 1970s and 1980s, assuming it was of no further use. Then came VCRs, DVD players and, of course, streaming. Ask any Doctor Who fan, and they will lament the number of early episodes of the long-running sci-fi series that were lost – perhaps forever – through lack of foresight.

And so the digital hoarding is justified. But it all has to be stored somewhere, and those facilities have to be environmentally controlled.

Deep freeze data

Not all information needs to be immediately accessible. Offline storage still has a legitimate place in the online world.

Take CERN, for example, home of the Large Hadron Collider. Much of the data generated by its countless experiments over the past 50 years is still kept on spools of tape, available only on request, typically from a university. It can take anywhere from 30 minutes to two hours for cold-storage data to become available – but it's there. Of course, in another great irony, this scientific treasure would arguably be better served if all that data were online and cross-referenced – but as it stands, it has a much lower carbon footprint.

There is an even colder type of storage – the GitHub Arctic Code Vault, 250 meters deep in the permafrost of Svalbard, Norway, inside the Arctic Circle. On 2 February 2020, a snapshot of GitHub's data – every active public repository and codebase – was taken and buried in this secluded fortress near the North Pole. It is not designed to be accessed; rather, it serves as a disaster recovery plan for the end of the world. Which raises the question – does our data really need to be online at all?

Another increasingly popular way to keep computer equipment cool is with water – gallons of recycled fluid are chilled and pumped through pipes that pass over heat-generating hardware, reducing the need to cool the entire environment. The downside is that such systems are extremely difficult to retrofit – they usually require tearing up walls and floors to install an entire plumbing system for the sole purpose of cooling.

Enterprises that don't want to tear down their buildings can instead move their on-premises systems to a colocation facility. Many offer water cooling as standard, but more importantly, by sharing the physical location of their data, tenants also share cooling costs and there is less waste – after all, it costs much the same to cool a datacenter whether it's full to bursting or three-quarters empty.

Many companies are already investigating more radical solutions that are greener and ultimately cheaper. Microsoft launched Project Natick in 2018 as a test, submerging a datacenter 117 feet below the sea off Scotland's Orkney Islands. The sealed unit held a dry nitrogen atmosphere, while the surrounding water acted as a natural coolant. Future iterations of Project Natick could add offshore wind farms to create fully self-sufficient, carbon-neutral data storage.

Speaking after the initial pilot, Ben Cutler, a member of Microsoft's Special Projects research group who leads Project Natick, explained that underwater datacenters also had a much lower failure rate than land-based facilities. “Our failure rate in water is one-eighth of what we see on land,” Cutler said.

Acres of land

NASA has just launched the first uncrewed test flight of Project Artemis, which will ultimately see the first crewed Moon mission in more than half a century. Temperatures on the far side of the Moon can dip to a frigid -230°C, and Nokia has already signed a contract to provide a lunar 4G service – which raises the question of whether datacenters on the Moon could be a future possibility.

The equipment would have to be housed underground, perhaps in craters or lava tubes like those near the poles, which have held water frozen as huge ice deposits for thousands of years – otherwise it would face the Moon's other extreme: a blistering 120°C daytime temperature.


In transit, spacecraft control their temperature by rotating like a rotisserie, evenly exposing their surface to the Sun. For a stable environment on the Moon, overcoming these extremes remains a challenge that Artemis will seek to explore during its missions.

The downside to storing data in space is latency – the almost imperceptible delay in voice communication between Earth and the Moon would be much more noticeable in data transfer. As such, cold storage in space needs to be just that – an archive for all the information that might come in handy one day but may never be accessed again.
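That latency has a hard physical floor: even at the speed of light, a signal takes over a second to cover the average Earth-Moon distance. A quick calculation makes the point:

```python
# Light-speed latency between Earth and the Moon: the physical floor
# for any lunar-datacenter round trip (ignores switching/processing).
C_KM_S = 299_792.458          # speed of light in vacuum, km/s
MOON_DISTANCE_KM = 384_400    # average Earth-Moon distance, km

one_way_s = MOON_DISTANCE_KM / C_KM_S
round_trip_s = 2 * one_way_s

print(f"one-way: {one_way_s:.2f} s, round trip: {round_trip_s:.2f} s")
```

A request/response cycle therefore takes at least ~2.6 seconds before any real-world overhead – tolerable for a rarely touched archive, hopeless for live traffic.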

There is another problem that storage at sea and in space share – maintenance. An engineer can't simply pop down to the seabed to swap out a faulty drive. As such, the infrastructure must include enough redundancy to keep everything running between scheduled visits, which might happen only once or twice a decade.

And so, for many of these innovations, an immediate revolution is unlikely. Project Natick has only just begun its second phase of experiments, while any lunar infrastructure is likely decades away. So, back on Earth, what else can we do to reduce the environmental impact of our digital hoarding?

One concept involves carbon offsetting, and datacenter operators could find themselves growing ever greener fingers. Those huge cathedrals of endless hardware racks contain vast empty spaces that could be filled with nature's own solution – plants. Datacenters need a dry environment, but then so do cacti, which, like most plants, are voracious consumers of carbon dioxide, absorbing it from the air and storing it in the soil. Cacti evolved to survive the dry conditions of the desert, so they would feel right at home surrounded by computer equipment. Who knows – maybe one day the best tequila will be distilled from agave grown among a billion data points.

However, it is important to note that carbon offsetting only works if it funds a carbon reduction that would not otherwise have happened. Paying someone not to cut down trees that could never be cut down anyway, as many offset schemes do, simply doesn't add up – it's greenwashing, pure and simple.

Solutions in the pipeline

It's also worth considering that, in the right circumstances, all this heat can be a good thing. If the excess can be captured rather than wasted, it becomes an inexpensive source of heating for homes and businesses.

At a disused London Underground station in Islington, the Bunhill 2 Energy Centre captures the waste heat produced by the Northern line and uses it to warm local office buildings and an entire housing estate. Plans for similar schemes are in the pipeline, and if it can be done with waste transport heat, there's no reason it can't be done with waste datacenter heat – all it takes is the will.

Speaking about the project, Andy Lord, Managing Director of London Underground, said: “Capturing waste heat from tube tunnels and using it to provide heating and hot water to thousands of local homes has never been done before in the world so this ground-breaking partnership with Islington Council is a really important step forward. Heat from the London Underground has the potential to be a significant low-carbon energy source and we are continuing further research to identify opportunities for similar projects across the Tube network, as part of our Energy and Carbon Strategy.”

As wonderful as it would be to conclude that we have solved all the problems of waste and environmental impact caused by our obsession with data, each of these solutions has its flaws. Some are practical, such as how to swap out a hard drive under the ocean. Some are financial, as installing water cooling is not something any company will invest in lightly.

What matters is that there are possibilities – many of them – that could make our dirty data a little cleaner, and even have a positive impact on the environment. It will take the collective will of the industry to take these ideas forward and make the data revolution part of the solution rather than part of the problem.
