EnCharge AI (@enchargeai)'s Twitter Profile
EnCharge AI

@enchargeai

ID: 1864387683117588480

Joined: 04-12-2024 19:14:08

1 Tweet

9 Followers

45 Following

EnCharge AI (@enchargeai):

ChatGPT uses ~10x more energy than a Google search, and AI demand is surging. EnCharge is tackling the energy crisis with analog in-memory computing, cutting power use by up to 20x vs. today’s chips. No more data shuttling. Just fast, efficient AI.

EnCharge AI (@enchargeai):

AI could consume 500 TWh/year by 2027. That's almost like adding a new France to the grid!

At #WebSummitVancouver on May 29, our CEO Naveen Verma explores how rethinking chip architecture can make AI more accessible and sustainable. Join us: vancouver.websummit.com
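The "new France" comparison can be sanity-checked with quick arithmetic. This is a rough sketch, not from the tweet itself: the ~450 TWh figure for France's annual electricity consumption is an assumption based on commonly cited estimates.

```python
# Sanity check of the "adding a new France to the grid" comparison.
# Assumption: France's annual electricity consumption is roughly 450 TWh,
# a commonly cited ballpark figure (not stated in the tweet).
projected_ai_twh = 500  # projected AI electricity demand by 2027 (TWh/year)
france_twh = 450        # approximate French annual electricity use (TWh/year)

ratio = projected_ai_twh / france_twh
print(f"Projected AI demand is about {ratio:.2f}x France's annual consumption")
```

Under that assumption the projection works out to slightly more than one France's worth of electricity, consistent with the tweet's framing.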
EnCharge AI (@enchargeai):

At this year’s Web Summit in Vancouver, our CEO Naveen Verma & @CerebrasSystems' Naor Penso explored why current chip architectures can’t sustainably handle AI’s computational demands and the critical need for specialized hardware, in a session moderated by The Guardian's Dara Kerr.

EnCharge AI (@enchargeai):

Traditional chips quickly hit a wall when trying to deliver AI within the confines of a laptop. But, as highlighted by EE Times | Electronic Engineering Times, we’ve cracked the code with our flagship EN100 AI accelerator. eetimes.com/encharge-picks…

EnCharge AI (@enchargeai):

With AI increasingly taking over critical workflows and vital activities, security is a fundamental component of AI’s utility. At Web Summit Vancouver, our CEO Naveen Verma and @CerebrasSystems' Naor Penso discussed the impact of security in all aspects of #AI architecture.

Alisa Cohn (@alisacohn):

💡 Empathy doesn’t mean going easy on people. Naveen Verma (EnCharge AI), Princeton professor turned startup CEO, has learned that being an empathetic leader sometimes means making hard calls and making them quickly. In this clip, he shares why compassion in leadership still…

EnCharge AI (@enchargeai):

Proud to see our co-founder & CEO Naveen Verma take the stage as a featured innovator at #CPI2025. Thank you Princeton University for spotlighting our mission to enable a new era of advanced AI at the edge.

EnCharge AI (@enchargeai):

Today’s GPUs waste 95% of inference energy moving data between memory and compute. We're eliminating that commute with analog in-memory compute that processes data where it lives, for 20x efficiency gains and a dramatic drop in data center power demand. bain.com/insights/can-t…
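The 20x figure in this tweet follows directly from the 95% claim. A minimal arithmetic sketch, assuming (as the tweet does) that data movement accounts for 95% of inference energy and that in-memory compute eliminates it entirely:

```python
# If 95% of inference energy goes to moving data between memory and compute,
# eliminating that movement leaves only the 5% spent on computation itself,
# implying an upper-bound efficiency gain of 1 / 0.05 = 20x.
data_movement_fraction = 0.95  # claimed share of energy spent moving data
compute_fraction = 1.0 - data_movement_fraction

efficiency_gain = 1.0 / compute_fraction
print(f"Implied upper-bound efficiency gain: {efficiency_gain:.0f}x")
```

This is an idealized upper bound: any residual data movement or analog-conversion overhead would reduce the realized gain below 20x.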

EnCharge AI (@enchargeai):

The U.S. is losing ground on the AI-defined battlefield, not for lack of models, but for lack of ultra-efficient chips to run them at the tactical edge. The tech is here. Our analog IMC makes this real. What’s missing isn’t innovation, it’s deployment. csis.org/blogs/strategi…

EnCharge AI (@enchargeai):

The Sept 27th All-In Podcast covered a paper in Nature Computational Science claiming 7,000× speedups and 70,000× energy efficiency gains. 

Caveats? It’s not that simple, and it’s not that new.

Read more: open.substack.com/pub/aiafterhou…