Your use of AI in marketing might be destroying the planet

31st October 2023

Generative AI is great for customer targeting, personalisation, and campaign optimisation. But it’s not a magic marketing fix-all. From copywriting to research panels, there are many things that solutions like ChatGPT aren’t quite ready for — including helping you meet your net zero targets. Steven Millman, global head of research and data science at first-party data platform Dynata, is here to tell us why. 

‘It’s not necessarily commonly understood, but these language models are enormously computationally intensive, and the computers necessary to produce them require excessive cooling and a lot of electricity,’ says Steven. 

Let’s take GPT-3 as an example. To generate responses that blend what it learned in training with new input from users, it first has to be trained on a huge amount of information, tuning 175 billion parameters in the process. Each parameter is a weight that governs the strength of a connection between units in the neural network. As you can imagine, at that scale, this requires a lot of power. 

Steven estimates that merely training GPT-3 required 1,287 megawatt hours of electricity and generated 552 tonnes of carbon dioxide: the equivalent of running 123 petrol-powered cars at average US usage levels for an entire year. That’s not to mention the roughly 700,000 litres of clean freshwater used to keep the data centres cool. 
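If you want to sanity-check that car comparison yourself, it’s a one-line back-of-envelope calculation. A minimal sketch, assuming the commonly cited US average of roughly 4.6 tonnes of CO2 per petrol car per year (an assumption on our part, not a figure Steven gives):

```python
# Rough check on the GPT-3 training figures quoted above.
# Assumption: an average US petrol car emits ~4.6 tonnes of CO2 per year
# (a commonly cited estimate, not a figure from this article).
training_emissions_tonnes = 552
car_tonnes_per_year = 4.6

car_years = training_emissions_tonnes / car_tonnes_per_year
print(f"Equivalent to roughly {car_years:.0f} cars driven for a year")
# Lands at about 120, in the same ballpark as the 123 quoted above.
```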

It sounds bad, but remember: these figures are all estimates. The truth might be even worse. 

It’s difficult to assess how much energy generative AI solutions use because the companies running them, like OpenAI and Google, are reluctant to discuss just how environmentally damaging they can be. And when the use of AI technology is said to produce similar levels of carbon emissions to the aviation industry, you can see why. 

But does that mean you should stop using generative AI? 

Unless you can’t sleep at night knowing your conversation with ChatGPT earlier required 500ml of water, no. Even sending emails or Googling produces carbon dioxide somewhere down the line, and if it makes your marketing that bit more efficient and effective, it’s probably worth it. Put it this way: will your competitors be worrying about the environmental impact of using generative AI? 

It’s worth remembering that embedding AI in your operations doesn’t have to mean sticking to the energy-guzzling mainstream offerings. Steven suggests a move to smaller solutions could make a difference. 

‘If you take a small language model like BERT, which is trained on 110,000,000 parameters, they estimate that it consumes about the same amount of energy just to train that model as it would if you took a transcontinental flight.’ Still not ideal, but far better than the 626,000 lbs of carbon dioxide equivalent estimated for training larger-scale models. 

And there are many more benefits to developing your own generative AI solutions. Known as ‘walled gardens’, many businesses are creating their own language models limited to in-house and client use to mitigate the downsides of public-facing tools.

‘You might have hundreds of thousands of pages of technical documentation for certain kinds of products or services. And if you train the language model very tightly on those, it is much less likely that your chatbot is going to stray off or, as we talked about last time, [reproduce the bias] these very large models have in them,’ says Steven. ‘So if you exclude all of that and only really allow it to focus on the documents you care about, it’s going to be much less [risky] to have that be client facing.’ 

Walled gardens not only keep your sensitive PII, client data, and intellectual property out of the underlying language model, but they also insulate you from changes in the way the public solutions work. And the best part is, you don’t necessarily need to build them yourself. 

‘We get the benefit of it through applications that other people are building. So we don’t actually need to build our own data science or AI department because the applications that we’re using every day will just have it built in, like Salesforce or Office already does,’ says Steven. ‘You absolutely have to have staff on hand who understand in great detail how they work. But I do think that as a service, that’s definitely the model where most companies are going to go.’ 

You heard it here first, folks: walled gardens and small solutions are the way forward for both planet and profit. And if you’re still feeling bad about it, get yourself a reusable water bottle and start taking the bus. 

From driver tag-powered recommendation engines to how to build your own walled garden, discover more expert data insight from Steven Millman in the full podcast.



Written By
Ornella Weston is the managing director of boutique agency Duckman Copy Ltd. Throughout her career, she’s written everything from white papers to websites, from billboards to board game instructions. Now it’s time to add op-eds to the list. The marketing world is full of flops and failures, often at great...