AI Breakthrough: Scientists Transform Everyday Transistor Into an Artificial Neuron

By Tarun Khanna
April 16, 2025
in Artificial Intelligence, Technology

Photo Credit: https://scitechdaily.com/


NUS researchers have demonstrated that a single transistor can mimic both neural and synaptic behaviors, marking a significant step toward brain-inspired computing.

Researchers at the National University of Singapore (NUS) have demonstrated that a single, standard silicon transistor, the core component of the microchips found in computers, smartphones, and nearly all modern electronics, can mimic the functions of both a biological neuron and a synapse when operated in a non-conventional manner.

The research, led by Associate Professor Mario Lanza from the Department of Materials Science and Engineering at NUS’s College of Design and Engineering, offers a promising path toward scalable, energy-efficient hardware for artificial neural networks (ANNs). This development marks a significant step forward in neuromorphic computing, a discipline that aims to replicate the brain’s efficiency in processing information. The research was published in Nature on March 26, 2025.

Putting the brains in silicon

The world’s most sophisticated computers already exist inside our heads. Studies show that the human brain is, by and large, far more energy-efficient than electronic processors, thanks to almost 90 billion neurons that form some 100 trillion connections with one another, and to synapses that tune their strength over time, a process called synaptic plasticity that underpins learning and memory.

For decades, scientists have sought to replicate this efficiency using artificial neural networks (ANNs). ANNs have recently driven remarkable advances in artificial intelligence (AI), and they are loosely inspired by how the brain processes information. But while they borrow biological terminology, the resemblance is only skin deep: software-based ANNs, including those powering large language models like ChatGPT, have a voracious appetite for computational resources, and therefore for electricity. This makes them impractical for many applications.
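The "skin-deep" resemblance can be seen in the standard software formulation of an artificial neuron, which reduces a biological cell to a weighted sum followed by a nonlinearity. The sketch below is a generic textbook illustration with made-up weight values, not anything from the NUS work:

```python
import math

def artificial_neuron(inputs, weights, bias):
    """A software ANN 'neuron': weighted sum followed by a sigmoid.

    Unlike a biological neuron, every evaluation shuttles all weights
    and inputs between memory and processor, which is a major source of
    the energy cost of large software ANNs.
    """
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))  # sigmoid nonlinearity

# Illustrative values only
out = artificial_neuron([0.5, -1.0, 2.0], [0.4, 0.3, 0.9], bias=0.1)
```

A full network is simply many layers of such units, which is why model size translates so directly into memory traffic and power draw.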

Neuromorphic computing aims to mimic the computing power and energy efficiency of the brain. This requires not only re-designing system architectures to carry out memory and computation in the same place, the so-called in-memory computing (IMC), but also developing electronic devices that exploit physical phenomena capable of replicating more faithfully how neurons and synapses work. However, current neuromorphic computing systems are stymied by the need for complex multi-transistor circuits, or for emerging materials that have yet to be proven suitable for large-scale manufacturing.
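In-memory computing is often pictured as a crossbar array, where synaptic weights are stored as device conductances and a matrix-vector product happens in place via Ohm's and Kirchhoff's laws. The following is a toy numerical analogy with hypothetical conductance values, not a model of the NUS device:

```python
def crossbar_mvm(conductances, voltages):
    """Toy model of an in-memory crossbar: each output current is the
    sum of input voltages times stored conductances (Ohm's law summed
    by Kirchhoff's current law), so the multiply-accumulate happens
    where the weights are stored, with no separate memory fetch."""
    return [sum(g * v for g, v in zip(row, voltages))
            for row in conductances]

G = [[0.2, 0.5],   # hypothetical stored weights (conductances)
     [0.1, 0.4]]
V = [1.0, 0.5]     # input voltages applied to the rows
I = crossbar_mvm(G, V)  # output currents, one per column
```

The point of the analogy is that the entire matrix-vector product costs one analog read, which is where the energy savings over shuttling weights through a von Neumann bottleneck come from.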

“To enable real neuromorphic computing, where microchips behave like biological neurons and synapses, we need hardware that is both scalable and power-efficient,” said Professor Lanza.

A Breakthrough Using Standard Silicon

The NUS research team has now demonstrated that a single, standard silicon transistor, when arranged and operated in a specific way, can replicate both neural firing and synaptic weight changes, the essential mechanisms of biological neurons and synapses. This was achieved by setting the resistance of the bulk terminal to specific values, which makes it possible to control two physical phenomena occurring inside the transistor: impact ionization and charge trapping. Moreover, the team built a two-transistor cell capable of operating in either a neuron or a synapse regime, which the researchers have named “Neuro-Synaptic Random Access Memory”, or NS-RAM.
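The two behaviors the NS-RAM cell reproduces in hardware, spike firing and an adjustable synaptic weight, can be illustrated in software with a leaky integrate-and-fire model. This is a generic textbook model with invented parameters, not the NUS device physics:

```python
def leaky_integrate_and_fire(drive, steps, leak=0.9, threshold=1.0):
    """Generic leaky integrate-and-fire neuron: accumulate input each
    step, lose a fraction to leakage, and emit a spike (then reset)
    whenever the membrane potential crosses the threshold."""
    potential = 0.0
    spike_times = []
    for t in range(steps):
        potential = potential * leak + drive
        if potential >= threshold:
            spike_times.append(t)  # neuron "fires"
            potential = 0.0        # reset after firing
    return spike_times

# A synapse is modeled as a weight scaling the input drive;
# plasticity means activity changes this weight over time.
weak, strong = 0.08, 0.35
quiet = leaky_integrate_and_fire(weak, 50)    # never reaches threshold
active = leaky_integrate_and_fire(strong, 50) # fires periodically
```

Strengthening the synaptic weight turns a silent neuron into a regularly firing one, which is the neuron-plus-synapse interplay the single-transistor cell captures electrically.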

“Other approaches require complex transistor arrays or novel materials with uncertain manufacturability, but our method uses commercial CMOS (complementary metal-oxide-semiconductor) technology, the same platform found in modern computer processors and memory microchips,” explained Professor Lanza. “This means it is scalable, reliable, and compatible with existing semiconductor fabrication processes.”

In experiments, the NS-RAM cell demonstrated low power consumption, maintained stable performance over many cycles of operation, and exhibited consistent, predictable behavior across different devices, all desirable characteristics for building reliable ANN hardware suited to real-world applications. The team’s breakthrough marks a step change in the development of compact, power-efficient AI processors that could enable faster, more responsive computing.
