How neuromorphic ‘brain chips’ will begin the next era in computing

IBM recently released new details about the efficiency of its TrueNorth processors, which sport a fundamentally novel design that cribs from the structure of the brain. Rather than arranging billions of digital transistors to work in sequence, TrueNorth chips have a million computer ‘neurons’ that work in parallel across 256 million inter-neuron connections (‘synapses’). According to these reports, the approach is paying incredible dividends in terms of performance and, more importantly, power efficiency. Make no mistake: neuromorphic computing is going to change the world, and it’s going to do it more quickly than you might imagine.

The development of neuromorphic computers is thematically pretty similar to the development of digital computers: first figure out the utility of an operation (say, computing firing trajectories during wartime), then develop a crude way of doing it with the tools already available (say, rooms full of people doing manual arithmetic), then invent a machine to automate the process in a much more efficient way. Part of the reason a digital computer is more efficient than a human being is that its transistors can fire with incredible speed, but so can our neurons. The bigger issue is that a digital computer is designed from the ground up to do those sorts of mathematical operations; from a certain perspective, it’s a bit crazy we ever tried to do efficient mathematical work on a computer like the human brain.

Similarly, we will eventually look back at the attempt to do learning operations with digital chips, including GPUs, as inherently unwise or even silly. The much more reasonable approach is to design a thinking machine suited to such operations from the most basic hardware level, as naturally predisposed to machine learning as a Celeron chip is to multiplication. This could not only greatly increase the speed of the processor for these tasks, but also dramatically reduce the energy consumed to complete each one. That’s what IBM has in the works, and it’s much further along than many expect.

When tasked with classifying images (a well-understood machine learning task), a TrueNorth chip can churn through between 1,200 and 2,600 frames every second while using between 25 and 275 mW. That works out to an effective efficiency of more than 6,000 frames per second per Watt. There’s no listed frames-per-second-per-Watt figure for conventional GPUs running the same classification algorithm and dataset, but considering that modern graphics cards can draw 200 or even 250 watts on their own, it’s hard not to imagine a host of low-power, high-performance applications.
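To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch of the frames-per-second-per-Watt figures. The TrueNorth numbers are the ones quoted above; the GPU throughput and the 250 W board power are purely illustrative assumptions, since no comparable GPU measurement is listed.

```python
# Rough efficiency math for the figures quoted above.
# TrueNorth numbers come from the cited reports; the GPU throughput and
# board power below are hypothetical placeholders for illustration only.

def frames_per_watt(frames_per_second: float, power_watts: float) -> float:
    """Efficiency metric: frames processed per second, per Watt of power."""
    return frames_per_second / power_watts

# TrueNorth: 1,200-2,600 frames/s at 25-275 mW (0.025-0.275 W).
# Even at the full 275 mW draw, efficiency spans roughly 4,400-9,500 frames/s/W,
# consistent with the "more than 6,000 frames/s/W" headline figure.
truenorth_low = frames_per_watt(1200, 0.275)
truenorth_high = frames_per_watt(2600, 0.275)

# Hypothetical GPU: assume the same ~2,600 frames/s at a 250 W board power.
gpu_estimate = frames_per_watt(2600, 250)

print(f"TrueNorth: {truenorth_low:,.0f} to {truenorth_high:,.0f} frames/s/W")
print(f"Hypothetical 250 W GPU: {gpu_estimate:,.1f} frames/s/W")
```

Under those assumptions the gap is around three orders of magnitude, which is the scale of power savings the rest of this piece is pointing at.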

Most obviously, there is the incredible expense of modern machine learning. Companies like Apple, Facebook, and Google can only deliver their advanced services by running expensive arrays of supercomputers designed to execute machine learning algorithms as efficiently as possible, and that specialization comes at a crushing cost. Even leaving that aside, electricity alone becomes a major expense when you’re running that many computers at or near capacity, 24 hours a day. Just ask bitcoin miners.

So, early, expensive neuromorphic hardware will likely be a major boon to service providers, and we can only hope the benefits will be passed along to consumers in the form of improved performance and wide-ranging savings. But the speed and efficiency offered by neuromorphic chips won’t stop there: reducing power draw by several orders of magnitude will allow such tasks to move out of the cloud entirely.

 
