Cornell’s Tiny “Microwave Brain” Chip Could Transform Computing and AI

by Tarun Khanna
October 14, 2025
in Artificial Intelligence, Technology
Reading Time: 2 mins read

Photo Credit: https://scitechdaily.com/


Cornell engineers have created the world’s first “microwave brain,” a revolutionary microchip that computes with microwaves instead of traditional digital circuits.

This tiny, low-power processor performs real-time tasks such as signal decoding, radar tracking, and data analysis while consuming less than 200 milliwatts.

Cornell’s “Microwave Brain” Breakthrough

Cornell University scientists have developed a new kind of low-power microchip, dubbed a “microwave brain,” capable of processing both ultrafast data and wireless communication signals by exploiting the physical properties of microwaves.

Described in the journal Nature Electronics, the processor is the first fully functional microwave neural network built directly on a silicon chip. It performs real-time computations in the frequency domain for demanding tasks such as radio signal decoding, radar tracking, and digital data processing, all while consuming under 200 milliwatts of power.

A Chip That Rewrites Signal Processing

“Because it can distort signals in a programmable way across a wide band of frequencies instantaneously, it can be repurposed for several computing tasks,” said lead author Bal Govind, a doctoral student who conducted the research with Maxwell Anderson, also a doctoral student. “It bypasses a large number of signal-processing steps that digital computers normally have to do.”

The chip’s performance comes from its structure, which functions as a neural network, a system inspired by the human brain. It uses coupled electromagnetic modes within tunable waveguides to detect patterns and adapt to incoming information. Unlike conventional neural networks that rely on digital operations and clock-timed instructions, the system operates in the analog microwave range, allowing it to process data streams in the tens of gigahertz, far beyond the speed of most digital processors.
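
The article does not give the chip’s internal design, but the behavior it describes, a fixed web of coupled, tunable modes that nonlinearly mixes an incoming signal in the frequency domain while only a lightweight readout makes the final decision, is close in spirit to reservoir computing. The Python sketch below is a software-only analogy under that assumption; the mode count, random coupling matrix, tanh nonlinearity, and ridge-regression readout are illustrative choices, not details of the Cornell hardware.

```python
# Conceptual sketch only: a software analogue of "computing by programmable
# distortion" in the frequency domain, in the spirit of reservoir computing.
# Nothing here reflects the actual chip design; the coupling matrix,
# nonlinearity, and readout are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N_MODES = 64       # stand-in for coupled electromagnetic modes in waveguides
N_SAMPLES = 200    # toy dataset size
N_FEATURES = 32    # length of each input "signal"

# A fixed random coupling plays the role of the chip's tunable mode interactions.
coupling = rng.normal(scale=1.0 / np.sqrt(N_FEATURES), size=(N_FEATURES, N_MODES))

def frequency_domain_features(x):
    """Project a signal onto the 'modes' and apply a mild nonlinearity,
    mimicking programmable distortion across a band of frequencies."""
    spectrum = np.abs(np.fft.rfft(x, n=N_FEATURES))       # frequency-domain view
    mixed = spectrum @ coupling[: spectrum.size, :]        # mode coupling
    return np.tanh(mixed)                                  # nonlinear distortion

# Toy task: distinguish two classes of signals (e.g., two modulation patterns).
X = rng.normal(size=(N_SAMPLES, N_FEATURES))
y = rng.integers(0, 2, size=N_SAMPLES)
X[y == 1] += np.sin(np.linspace(0, 8 * np.pi, N_FEATURES))  # add a tone to class 1

H = np.array([frequency_domain_features(x) for x in X])

# Only the linear readout is "trained": ridge regression, as in reservoir computing.
ridge = 1e-2
W = np.linalg.solve(H.T @ H + ridge * np.eye(N_MODES), H.T @ (2 * y - 1))
accuracy = np.mean((H @ W > 0) == (y == 1))
print(f"toy readout accuracy: {accuracy:.2f}")
```

In this toy setup only the linear readout is fitted; the fixed “mush” of couplings does the heavy mixing, which loosely mirrors why so many conventional signal-processing steps can be skipped.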

Throwing Out the Digital Playbook

“Bal threw away a lot of conventional circuit design to achieve this,” said Alyssa Apsel, professor of engineering, who was co-senior author with Peter McMahon, associate professor of applied and engineering physics. “Instead of trying to mimic the structure of digital neural networks exactly, he created something that looks more like a controlled mush of frequency behaviors that can ultimately give you high-performance computation.”

The result is a chip that can handle both simple logic operations and more advanced tasks, such as recognizing binary sequences or identifying patterns in high-speed data. It achieved accuracy rates of 88% or higher across several wireless signal classification tasks, matching the performance of digital neural networks while using a fraction of their energy and space.

Smarter Computing With Less Power

“In traditional digital systems, as tasks get more complex, you need more circuitry, more power, and more error correction to maintain accuracy,” Govind said. “But with our probabilistic approach, we are able to maintain high accuracy on both simple and complex computations, without that added overhead.”

The chip’s extreme sensitivity to inputs also makes it well suited for hardware security applications, such as detecting anomalies in wireless communications across multiple microwave frequency bands, according to the researchers.
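
As a rough software analogy for that use case (not a description of how the chip itself does it), one could track per-band spectral energy and flag windows that drift far from a learned baseline. The band edges, window length, and 4-sigma rule below are hypothetical choices for illustration only.

```python
# Conceptual sketch only: software stand-in for "anomaly detection across
# frequency bands". Band edges, threshold rule, and baseline statistics are
# illustrative assumptions, not details of the Cornell chip.
import numpy as np

rng = np.random.default_rng(1)
SAMPLE_RATE = 1.0                                     # normalized units
BANDS = [(0.05, 0.15), (0.15, 0.30), (0.30, 0.45)]    # hypothetical frequency bands

def band_energies(signal):
    """Energy per frequency band, from a simple FFT of the input window."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / SAMPLE_RATE)
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in BANDS])

# Baseline: learn typical per-band energy from "normal" traffic windows.
normal_windows = [rng.normal(size=256) for _ in range(100)]
baseline = np.array([band_energies(w) for w in normal_windows])
mean, std = baseline.mean(axis=0), baseline.std(axis=0)

def is_anomalous(signal, n_sigma=4.0):
    """Flag a window whose band energies drift far from the baseline."""
    z = np.abs((band_energies(signal) - mean) / std)
    return bool(np.any(z > n_sigma))

# A window with an unexpected tone in the middle band should trip the detector.
t = np.arange(256)
suspicious = rng.normal(size=256) + 3.0 * np.sin(2 * np.pi * 0.2 * t)
print(is_anomalous(rng.normal(size=256)), is_anomalous(suspicious))
```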

Toward On-Device AI and Edge Computing

“We also think that if we reduce the power consumption further, we can deploy it to applications like edge computing,” Apsel said. “You could deploy it on a smartwatch or a mobile phone and build local models on your smart device instead of having to rely on a cloud server for everything.”

Though the chip is still experimental, the researchers are optimistic about its scalability. They are exploring ways to improve its accuracy and integrate it into existing microwave and digital processing platforms.


Tarun Khanna

Founder DeepTech Bytes - Data Scientist | Author | IT Consultant
Tarun Khanna is a versatile and accomplished data scientist with expertise in IT consultancy, specializing in software development and digital marketing solutions.


