Scientists Just Made AI at the Speed of Light a Reality

by Tarun Khanna
December 1, 2025
in Artificial Intelligence
Reading Time: 2 mins read

Photo Credit: https://scitechdaily.com/


Researchers have demonstrated single-shot tensor computing at the speed of light, marking a significant step toward next-generation AGI hardware powered by optical rather than electronic computation.

Tensor operations are a form of mathematical processing that underpins many modern technologies, particularly artificial intelligence, but they go well beyond the basic arithmetic most people encounter. A useful analogy is the set of complex moves involved in rotating, slicing, or reorganizing a Rubik’s cube in several dimensions at once. Humans and conventional computers have to break those steps into a sequence, whereas light can perform them all simultaneously.
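
To make the term concrete, here is a minimal NumPy sketch (purely illustrative, not taken from the paper) of one such tensor operation, written first as the step-by-step sequence a conventional processor works through and then as a single bulk contraction, which is the kind of operation the optical scheme performs in one pass.

    import numpy as np

    A = np.random.rand(4, 5, 6)   # an order-3 tensor
    B = np.random.rand(6, 7)      # a matrix

    # Step by step: contract the last axis of A against the first axis of B.
    C_loop = np.zeros((4, 5, 7))
    for i in range(4):
        for j in range(5):
            for k in range(7):
                C_loop[i, j, k] = np.dot(A[i, j, :], B[:, k])

    # The same work expressed as one tensor operation.
    C = np.einsum('ijm,mk->ijk', A, B)

    assert np.allclose(C, C_loop)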

In AI, tasks ranging from image recognition to language understanding rely heavily on tensor operations. As data volumes continue to grow, however, standard computing hardware such as GPUs is being pushed to its limits in speed, scalability, and energy use.

How Light Becomes a Calculator

Driven by the need for faster and more efficient computing, an international research team led by Dr. Yufeng Zhang of the Photonics Group at Aalto University’s Department of Electronics and Nanoengineering has created a new way to perform complex tensor calculations using a single pass of light. The technique enables single-shot tensor computing at the speed of light.

“Our approach performs the same kinds of operations that today’s GPUs handle, such as convolutions and attention layers, but does them all at the speed of light,” said Dr. Zhang. “Instead of relying on electronic circuits, we use the physical properties of light to perform many computations simultaneously.”
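
Why do convolutions and attention layers fall under the same umbrella? Both reduce to matrix products, so an engine that multiplies matrices at light speed covers them. The sketch below uses standard deep-learning identities (it is not the team’s code): attention scores are a query-key product, and a 1-D convolution can be rewritten as a product over sliding patches.

    import numpy as np

    # Attention scores are simply a matrix product of queries and keys.
    Q = np.random.rand(8, 16)       # 8 tokens, 16-dimensional queries
    K = np.random.rand(8, 16)       # 8 tokens, 16-dimensional keys
    scores = Q @ K.T                # (8, 8) matrix of attention scores

    # A 1-D convolution becomes a matrix product over sliding patches
    # (the classic "im2col" rewriting).
    x = np.random.rand(10)          # input signal
    w = np.random.rand(3)           # convolution kernel
    patches = np.stack([x[i:i + 3] for i in range(8)])  # (8, 3) windows
    conv = patches @ w              # equals np.convolve(x, w[::-1], 'valid')

    assert np.allclose(conv, np.convolve(x, w[::-1], mode='valid'))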

The team achieved this by encoding digital data into the amplitude and phase of light waves, turning numerical values into measurable properties of an optical field. As these structured light fields propagate, interact, and merge, they effectively carry out mathematical operations such as matrix and tensor multiplications, which are central to deep learning. Introducing multiple wavelengths allowed the researchers to extend the method so that it can support even more advanced, higher-order tensor operations.
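
A conceptual way to picture the encoding, as a numerical toy based only on the description above and not on the team’s actual optical setup: each number becomes the complex amplitude of a field, with its magnitude carried by the light’s intensity and its sign by the phase, and the coherent superposition of weighted fields at a detector is mathematically a matrix-vector product.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(4)        # input values to encode
    W = rng.standard_normal((3, 4))   # weights of one layer

    # Encode each value in a field: magnitude holds |value|, phase holds the sign.
    field_in = np.abs(x) * np.exp(1j * np.pi * (x < 0))

    # A mask imprints the weights the same way; the fields it produces
    # superpose coherently at each output detector.
    mask = np.abs(W) * np.exp(1j * np.pi * (W < 0))
    field_out = mask @ field_in       # interference sum at the detectors

    # What the detectors read off matches the electronic result W @ x.
    assert np.allclose(field_out.real, W @ x)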

“Imagine you’re a customs officer who has to inspect every parcel with several machines that perform different functions and then sort them into the right bins,” Zhang explains. “Normally, you’d process each parcel one by one. Our optical computing approach merges all the parcels and all the machines together: we generate multiple ‘optical hooks’ that connect each input to its correct output. With just one operation, one pass of light, all the inspections and sorting happen instantly and in parallel.”
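
In code, the analogy loosely maps onto batching (a hypothetical illustration, not the published scheme): the parcels are a batch of inputs, each notionally riding its own channel, and the machine is a weight matrix; instead of looping over parcels, the whole batch is handled as one combined operation.

    import numpy as np

    rng = np.random.default_rng(1)
    batch = rng.standard_normal((32, 4))   # 32 "parcels", one per channel
    W = rng.standard_normal((3, 4))        # one inspection "machine"

    # The sequential customs office: one parcel at a time.
    outputs_seq = np.array([W @ parcel for parcel in batch])

    # The single-shot picture: every parcel processed in one combined operation.
    outputs_once = np.einsum('ok,nk->no', W, batch)

    assert np.allclose(outputs_seq, outputs_once)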

Passive, Efficient, and Ready for Integration

Another major benefit of the method is its simplicity. The optical operations occur passively as the light propagates, so no active control or electronic switching is required during computation.

“This approach can be implemented on almost any optical platform,” says Professor Zhipei Sun, head of Aalto University’s Photonics Group. “In the future, we plan to integrate this computational framework directly onto photonic chips, allowing light-based processors to perform complex AI tasks with extremely low power consumption.”

Ultimately, the goal is to deploy the technique on existing hardware and platforms established by major companies, says Zhang, who conservatively estimates that the approach could be integrated into such platforms within three to five years.

“This will create a new generation of optical computing systems, significantly accelerating complex AI tasks across a myriad of fields,” he concludes.

Tarun Khanna
Founder, DeepTech Bytes - Data Scientist | Author | IT Consultant
Tarun Khanna is a versatile and accomplished data scientist with expertise in IT consultancy, specializing in software development and digital marketing solutions.
