Addressing AI’s Hidden Energy Costs: NeoLogic’s Efficient Microprocessor Solution

Let’s talk about the elephant in the room. No, not the potential destruction of humankind by AI (though that’s a fascinating topic); I’m talking about the hidden costs of AI, specifically energy and resource strain. It’s a pressing issue that has been flying under the radar amid all the breakthroughs. But here’s the thing: our AI-powered devices are becoming voracious energy consumers, and it’s only going to get worse.

Take it from Dr. Avi Messica, NeoLogic’s co-founder and CEO, who says the widespread use of AI is already straining data center workloads. NeoLogic was founded to address the growing load of video, artificial intelligence, machine learning, and data analytics in the cloud and at the edge, all of which run on CMOS processors. In other words, the company is at the forefront of the fight to make AI more energy-efficient.

Beyond Moore’s Law

For years, semiconductor development followed Gordon Moore’s observation that the number of transistors in a dense integrated circuit doubles about every two years. That two-year doubling cycle has lost steam, however, and chip densities are no longer keeping pace. Even Nvidia CEO Jensen Huang recently commented that “the ability for Moore’s Law to deliver twice the performance at the same cost, or the same performance [for] half the cost in every year and a half, it’s over.”
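To see why that observation mattered so much, it helps to make the compounding explicit. The sketch below is illustrative only; the 2003 baseline of roughly 410 million transistors (an Itanium-class server chip) is an assumption for the example, not a figure from the article.

```python
# Illustrative only: project transistor counts under an idealized Moore's law
# (one doubling every two years). The 2003 baseline is an assumption.
def moores_law(start_count: float, start_year: int, year: int,
               doubling_period: float = 2.0) -> float:
    """Transistor count after compounding one doubling per period."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Assumed baseline: ~410 million transistors on a 2003 server chip.
for year in (2003, 2013, 2023):
    print(year, f"{moores_law(410e6, 2003, year):,.0f}")
```

Twenty years of uninterrupted doubling would take that baseline to over 400 billion transistors, which is exactly the exponential trajectory the industry can no longer sustain.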

But don’t fret! NeoLogic’s founders have invented a new mode of microprocessor operation that goes beyond Moore’s Law. Their approach to digital circuit design reduces the complexity of circuits by modifying their topology, yielding processors that are more power-efficient and compact than those achievable with state-of-the-art CMOS technology.
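NeoLogic has not published the details of its technique, so the following is only a generic illustration of the underlying principle: two logically equivalent circuits can have different topologies and gate counts, and fewer gates generally means less area and less switching power. The gate counts in the comments apply to this toy example only.

```python
from itertools import product

# Generic illustration (not NeoLogic's proprietary method): restructuring a
# circuit's topology can compute the same function with fewer gates.
def original(a: bool, b: bool, c: bool) -> bool:
    return (a and b) or (a and c)   # 2 AND gates + 1 OR gate

def restructured(a: bool, b: bool, c: bool) -> bool:
    return a and (b or c)           # 1 AND gate + 1 OR gate

# Exhaustive check that the restructured topology is logically equivalent.
assert all(original(a, b, c) == restructured(a, b, c)
           for a, b, c in product([False, True], repeat=3))
print("Equivalent function, 3 gates reduced to 2")
```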

The Heavy Burden on Data Centers

AI and machine learning, particularly large language models like ChatGPT, are becoming a heavy burden on data centers. Every transfer of data between memory and processors consumes energy, and as computation grows more extensive and data-intensive, so does the memory traffic and the energy it burns.
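To put a rough number on that, here is a back-of-envelope sketch. The per-byte energy figure is an assumption drawn from often-cited circuit-level estimates (on the order of hundreds of picojoules per 32-bit off-chip DRAM access); real systems vary widely.

```python
# Back-of-envelope sketch; the energy figure is an assumed, often-cited
# circuit-level estimate (~640 pJ per 32-bit DRAM access, ~160 pJ/byte).
# Real numbers vary widely with process node and memory system.
DRAM_PJ_PER_BYTE = 160.0

def dram_transfer_energy_joules(num_bytes: float) -> float:
    """Energy to stream num_bytes from off-chip DRAM once."""
    return num_bytes * DRAM_PJ_PER_BYTE * 1e-12

# Hypothetical example: reading 10 GB of model weights from DRAM once.
print(f"{dram_transfer_energy_joules(10e9):.1f} J per pass over the weights")
```

A couple of joules per pass sounds small, but an inference server repeating that millions of times a day, multiplied across a fleet, is exactly how data-movement energy becomes a dominant cost.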

According to the International Energy Agency, data centers and transmission networks account for almost 3% of global electricity use, with a corresponding contribution to CO2 emissions. Rising power consumption also drives up operational costs. In Ireland, for example, data center power consumption has more than tripled since 2015.

Enterprises are looking for ways to cut these operational costs and slow this expansion. Google, Facebook, and Amazon, for example, have purchased sites in Sweden for their future data centers, since cooler regions require fewer resources for cooling.

A Solution at the Processor Level

Dr. Messica and his co-founder Ziv Leshem are confident that their new technology can provide a solution at the processor level. Data center server providers, they claim, will get immediate value from it, especially in the GPT-4 era. GPT-4 is reported (though not confirmed by OpenAI) to have on the order of 100 trillion parameters, and models of that scale require massive, lengthy training runs whose cost can easily run into the tens or hundreds of millions of dollars.
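For a sense of where such costs come from, a common rule of thumb puts training compute at roughly 6 FLOPs per parameter per training token. The sketch below uses that heuristic with entirely hypothetical inputs: a 1-trillion-parameter model, 10 trillion tokens, and an assumed effective rate of about 2e17 FLOPs per dollar on cloud accelerators. None of these figures come from the article.

```python
# Hypothetical training-cost sketch using the common ~6 * params * tokens
# FLOPs heuristic. Every input here is an assumption, not a reported figure.
def training_cost_usd(params: float, tokens: float,
                      flops_per_dollar: float = 2e17) -> float:
    """Estimate cost from total training FLOPs at an assumed $/FLOP rate."""
    total_flops = 6 * params * tokens
    return total_flops / flops_per_dollar

# e.g. a hypothetical 1-trillion-parameter model on 10 trillion tokens:
print(f"${training_cost_usd(1e12, 10e12):,.0f}")  # -> $300,000,000
```

Even under these rough assumptions the estimate lands in the hundreds of millions of dollars, which is why per-operation energy efficiency at the processor level matters so much.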

NeoLogic currently provides manufacturing-ready chip designs; at a later stage, the company plans to introduce its own data center processors.

As for the future, Messica believes we’ll see a growing trend toward application-specific AI processors, such as chips that specialize in large language models only, or in computer vision only, and so on. Such specialization could optimize performance for particular AI applications.

Innovation is a blessing in many ways, but we must stay ahead of its costs as well. Addressing the energy and resource strain of AI is a crucial step toward ensuring a more sustainable future for all.

Source: www.forbes.com