Cornell’s Tiny “Microwave Brain” Chip Could Transform Computing and AI

by Tarun Khanna
October 14, 2025
in Artificial Intelligence, Technology
Reading Time: 2 mins read

Photo Credit: https://scitechdaily.com/


Cornell engineers have created the world’s first “microwave brain,” a revolutionary microchip that computes with microwaves instead of traditional digital circuits.

This tiny, low-power processor performs real-time tasks such as signal decoding, radar tracking, and data analysis while consuming less than 200 milliwatts.

Cornell’s “Microwave Brain” Breakthrough

Cornell University researchers have created a new kind of low-power microchip, referred to as a “microwave brain,” capable of processing both ultrafast data and wireless communication signals by exploiting the unique physical properties of microwaves.

Recently described in the journal Nature Electronics, the processor is the first fully functional microwave neural network built directly on a silicon chip. It performs real-time computations in the frequency domain for demanding tasks such as radio signal decoding, radar tracking, and digital data processing, all while consuming under 200 milliwatts of power.

A Chip That Rewrites Signal Processing

“Because it’s able to distort in a programmable way across a wide band of frequencies instantaneously, it can be repurposed for numerous computing tasks,” said lead author Bal Govind, a doctoral student who conducted the research with Maxwell Anderson, also a doctoral student. “It bypasses a large number of signal-processing steps that digital computers normally have to do.”
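
The point about skipping signal-processing steps is easiest to see with a toy software comparison (a sketch for intuition only, not the chip’s mechanism): a digital filter applied in the time domain costs one multiply-accumulate per tap per sample, while in the frequency domain the same filter collapses to a single point-wise multiplication. The chip performs comparable frequency-domain operations directly in analog hardware, with no sampling or clocked arithmetic.

```python
import numpy as np

# Toy comparison (NOT the Cornell chip's method): filtering a sampled signal
# digitally in the time domain versus in the frequency domain. The frequency-
# domain route replaces many sequential multiply-accumulates with one
# transform, one point-wise product, and one inverse transform.

rng = np.random.default_rng(0)
n = 4096                              # samples in a toy received signal
signal = rng.standard_normal(n)
taps = rng.standard_normal(64)        # an arbitrary 64-tap FIR filter

# Digital route: explicit convolution, O(n * len(taps)) multiply-accumulates.
time_domain = np.convolve(signal, taps, mode="full")[:n]

# Frequency-domain route: the filter becomes a single element-wise product.
freq_domain = np.fft.ifft(np.fft.fft(signal, 2 * n) * np.fft.fft(taps, 2 * n)).real[:n]

print(np.allclose(time_domain, freq_domain))   # True: same result, fewer sequential steps
```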

The chip’s performance comes from its structure, which functions as a neural network, a computing system inspired by the human brain. It uses coupled electromagnetic modes within tunable waveguides to detect patterns and adapt to incoming information. Unlike standard neural networks that rely on digital operations and clock-timed instructions, this system operates in the analog microwave range, allowing it to process data streams at tens of gigahertz, far faster than most digital processors.
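
As a rough software analogy of that architecture (a sketch under stated assumptions, not the design published in Nature Electronics), the coupled waveguide modes can be pictured as a fixed, programmable mixing of the input’s frequency content followed by a mild nonlinearity; nothing in the mixing stage is clocked or trained.

```python
import numpy as np

# Minimal software analogy of the "coupled modes" idea, NOT the published
# design: a fixed random matrix stands in for the tunable electromagnetic
# couplings, and tanh stands in for a mild analog nonlinearity.

rng = np.random.default_rng(0)

N_SAMPLES = 256   # samples per input burst (a stand-in for a GHz-rate waveform)
N_MODES = 64      # number of simulated "modes" that mix the spectrum

# Fixed coupling between spectral bins and modes (hypothetical values; on the
# chip this role is played by programmable waveguide couplings).
COUPLING = rng.standard_normal((N_MODES, N_SAMPLES // 2 + 1)) / np.sqrt(N_SAMPLES)

def mode_response(waveform: np.ndarray) -> np.ndarray:
    """Map a time-domain waveform to a compact vector of 'mode' amplitudes."""
    spectrum = np.abs(np.fft.rfft(waveform))   # frequency-domain view of the input
    return np.tanh(COUPLING @ spectrum)        # coupled, mildly nonlinear response

burst = np.sin(2 * np.pi * 0.1 * np.arange(N_SAMPLES)) + 0.1 * rng.standard_normal(N_SAMPLES)
print(mode_response(burst).shape)              # (64,): a frequency-domain "fingerprint"
```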

Throwing Out the Digital Playbook

“Bal threw away a lot of conventional circuit design to achieve this,” said Alyssa Apsel, professor of engineering and co-senior author with Peter McMahon, associate professor of applied and engineering physics. “Instead of trying to mimic the structure of digital neural networks exactly, he created something that looks more like a controlled mush of frequency behaviors that can ultimately give you high-performance computation.”

The result is a chip that can handle both simple logic operations and more advanced tasks, such as recognizing binary sequences or identifying patterns in high-speed data. It achieved accuracy rates of 88% or higher on several wireless signal classification tasks, matching the performance of digital neural networks while using a fraction of their energy and space.
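
A toy end-to-end version of that kind of wireless-signal classification task, in the same reservoir-like spirit, is sketched below; the carrier frequencies, noise level, and random spectral features are all made-up stand-ins, and the printed accuracy says nothing about the chip’s reported 88%.

```python
import numpy as np

# Toy wireless-signal classification task (illustrative only): two classes of
# noisy bursts at different carrier frequencies, random spectral features
# standing in for the chip's mode responses, and a least-squares linear
# readout as the only trained component.

rng = np.random.default_rng(0)
N, N_MODES, N_TRAIN, N_TEST = 256, 64, 400, 200
t = np.arange(N)
coupling = rng.standard_normal((N_MODES, N // 2 + 1)) / np.sqrt(N)  # fixed "mixing"

def make_burst(label: int) -> np.ndarray:
    carrier = 0.05 if label == 0 else 0.12       # two arbitrary carrier frequencies
    phase = rng.uniform(0, 2 * np.pi)
    return np.sin(2 * np.pi * carrier * t + phase) + 0.3 * rng.standard_normal(N)

def features(x: np.ndarray) -> np.ndarray:
    return np.tanh(coupling @ np.abs(np.fft.rfft(x)))

labels = rng.integers(0, 2, N_TRAIN + N_TEST)
X = np.stack([features(make_burst(y)) for y in labels])

# Train the linear readout on the first N_TRAIN bursts, evaluate on the rest.
w, *_ = np.linalg.lstsq(X[:N_TRAIN], 2.0 * labels[:N_TRAIN] - 1.0, rcond=None)
pred = (X[N_TRAIN:] @ w > 0).astype(int)
print("test accuracy:", (pred == labels[N_TRAIN:]).mean())
```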

Smarter Computing With Less Power

“In traditional digital systems, as tasks get more complex, you need more circuitry, more power, and more error correction to maintain accuracy,” Govind said. “But with our probabilistic approach, we’re able to maintain high accuracy on both simple and complex computations, without that added overhead.”

The chip’s extreme sensitivity to inputs makes it well suited for hardware security applications, such as detecting anomalies in wireless communications across multiple bands of microwave frequencies, according to the researchers.
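
A minimal sketch of that anomaly-sensing idea, assuming a simple per-band energy comparison against a known baseline (the band count, frequencies, and threshold below are arbitrary, not the researchers’ method):

```python
import numpy as np

# Toy multiband anomaly check (illustrative only): flag frequency bands whose
# energy deviates sharply from a known baseline profile.

rng = np.random.default_rng(0)
N = 1024
t = np.arange(N)

baseline = np.sin(2 * np.pi * (64 / N) * t) + 0.05 * rng.standard_normal(N)  # expected carrier
suspect = baseline + 0.8 * np.sin(2 * np.pi * (256 / N) * t)                 # rogue tone injected

def band_energy(x: np.ndarray, n_bands: int = 16) -> np.ndarray:
    """Total spectral power in each of n_bands equal slices of the spectrum."""
    power = np.abs(np.fft.rfft(x)) ** 2
    return np.array([chunk.sum() for chunk in np.array_split(power, n_bands)])

ratio = band_energy(suspect) / (band_energy(baseline) + 1e-9)
print("anomalous bands:", np.where(ratio > 10.0)[0])   # flags only the band holding the rogue tone
```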

Toward On-Device AI and Edge Computing

“We also think that if we reduce the power consumption further, we can deploy it to applications like edge computing,” Apsel said. “You could deploy it on a smartwatch or a cellphone and build native models on your smart device instead of having to rely on a cloud server for everything.”

Though the chip is still experimental, the researchers are optimistic about its scalability. They are exploring ways to improve its accuracy and integrate it into existing microwave and digital processing platforms.

Tarun Khanna
Founder, DeepTech Bytes - Data Scientist | Author | IT Consultant
Tarun Khanna is a versatile and accomplished data scientist with expertise in IT consultancy, software development, and digital marketing solutions.
