IN THE ever-evolving landscape of artificial intelligence (AI), the trends point towards an insatiable appetite for larger, more powerful models. Large language models (LLMs) have become the torch-bearers of this trend and epitomise the relentless quest for more data, more parameters, and inevitably, more computational power.
But this progress comes at a cost, one not adequately accounted for by Silicon Valley or its patrons – a carbon cost.
The equation is straightforward yet alarming: larger models mean more parameters, and more parameters demand more computation. That computation, in turn, translates into higher energy consumption and a more substantial carbon footprint.
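That chain of reasoning can be made concrete with a back-of-envelope calculation. The sketch below uses the widely cited rule of thumb that training costs roughly 6 × parameters × tokens floating-point operations; every other figure (GPU throughput, power draw, datacentre PUE, grid carbon intensity) is an illustrative assumption chosen for round numbers, not a measurement of any real model or facility.

```python
# Back-of-envelope training footprint. All hardware and grid figures below
# are illustrative assumptions, not measurements.

def training_footprint(params, tokens,
                       flops_per_sec=3e14,       # assumed sustained throughput per GPU
                       gpu_power_kw=0.7,         # assumed average draw per GPU, in kW
                       pue=1.2,                  # assumed datacentre overhead (PUE)
                       grid_kgco2_per_kwh=0.4):  # assumed grid carbon intensity
    """Estimate (energy in kWh, emissions in kg CO2) for one training run,
    using the common ~6 * params * tokens FLOPs rule of thumb."""
    total_flops = 6 * params * tokens
    gpu_seconds = total_flops / flops_per_sec
    energy_kwh = gpu_seconds / 3600 * gpu_power_kw * pue
    co2_kg = energy_kwh * grid_kgco2_per_kwh
    return energy_kwh, co2_kg

small = training_footprint(params=1e9, tokens=2e10)    # a 1B-parameter model
large = training_footprint(params=1e11, tokens=2e12)   # a 100B-parameter model

# Footprint scales with params * tokens: 100x the parameters trained on
# 100x the tokens means 10,000x the energy and emissions.
print(f"1B-param model:   {small[0]:,.0f} kWh, {small[1]:,.0f} kg CO2")
print(f"100B-param model: {large[0]:,.0f} kWh, {large[1]:,.0f} kg CO2")
```

The point of the sketch is not the absolute numbers, which swing by an order of magnitude with hardware and grid mix, but the multiplicative scaling: growing parameters and training data together compounds the footprint.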
While the benefits of...