
Unlocking the Future of AI Supercomputing with Light

As artificial intelligence (AI) continues to advance, many experts believe the next big leap in the field will depend on building supercomputers at an unprecedented scale. One startup, Lightmatter, has proposed a novel answer to this challenge: connecting GPUs, the chips crucial to AI training, with light instead of electrical signals.

Case in point: Lightmatter’s technology, called Passage, uses optical (photonic) interconnects built in silicon, letting its hardware interface directly with the transistors of a silicon chip such as a GPU. This could allow data to move between chips at much higher speeds than today’s electrical links permit, potentially enabling distributed AI supercomputers with more than a million GPUs running in parallel.
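To make the scale argument concrete, here is a rough, self-contained Python sketch of a bandwidth-bound ring all-reduce, the collective operation commonly used to synchronize gradients in distributed training. The model size, numeric precision, and link speeds below are illustrative assumptions chosen for the arithmetic, not figures from Lightmatter or WIRED:

```python
# Back-of-envelope model of gradient synchronization time in
# distributed training. All constants are illustrative assumptions.

def ring_allreduce_seconds(model_params: float, bytes_per_param: int,
                           num_gpus: int, link_gbps: float) -> float:
    """Bandwidth-bound time for one ring all-reduce (latency ignored).

    In a ring all-reduce, each GPU sends and receives roughly
    2 * (N - 1) / N of the total gradient bytes over its link.
    """
    grad_bytes = model_params * bytes_per_param
    traffic_per_gpu = 2 * (num_gpus - 1) / num_gpus * grad_bytes
    link_bytes_per_sec = link_gbps * 1e9 / 8  # Gb/s -> bytes/s
    return traffic_per_gpu / link_bytes_per_sec

if __name__ == "__main__":
    params = 1e12  # hypothetical trillion-parameter model, 2 bytes/param
    for label, gbps in [("assumed 800 Gb/s electrical link", 800),
                        ("assumed 8 Tb/s optical link", 8_000)]:
        t = ring_allreduce_seconds(params, 2, 1_000_000, gbps)
        print(f"{label}: {t:.0f} s per gradient sync")
```

The takeaway from the toy model: per-GPU traffic is nearly independent of the GPU count, so per-link bandwidth sets the floor on how often a million GPUs can synchronize, which is why faster interconnects matter more than raw chip count at this scale.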

Go deeper: OpenAI CEO Sam Altman, who has sought up to $7 trillion in funding to develop vast quantities of chips for AI, was in attendance at the Sequoia event where Lightmatter pitched its technology. The company claims Passage should allow for more than a million GPUs to run in parallel on the same AI training run, a significant leap from the 20,000 GPUs rumored to have powered OpenAI’s GPT-4.

Why it matters: Upgrading the hardware behind AI advances like ChatGPT could be crucial to future progress in the field, including the elusive goal of artificial general intelligence (AGI). By reducing the bottleneck of converting between electrical and optical signals, Lightmatter’s approach aims to simplify the engineering challenges of maintaining massive AI training runs across thousands of interconnected systems.
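One rough way to picture that conversion bottleneck is as an energy budget per bit moved between chips. The Python sketch below compares a conventional link, where signals pass through separate pluggable optical modules at each end, against in-package photonics with fewer conversion stages; every pJ/bit figure is a placeholder invented for illustration, not a measurement of Passage or any real product:

```python
# Hypothetical per-bit energy budget for a chip-to-chip link.
# Stage costs in pJ/bit are invented placeholders for illustration.

PLUGGABLE_OPTICS = {
    "gpu_serdes_tx": 2.0,  # drive the signal off the GPU electrically
    "e_to_o": 3.0,         # electrical -> optical in a pluggable module
    "fiber": 0.1,          # propagation over fiber
    "o_to_e": 3.0,         # optical -> electrical at the far end
    "gpu_serdes_rx": 2.0,  # receive the signal electrically on the GPU
}

IN_PACKAGE_PHOTONICS = {
    "e_to_o": 1.0,  # conversion next to the transistors, short electrical path
    "fiber": 0.1,
    "o_to_e": 1.0,
}

for name, stages in [("pluggable optics", PLUGGABLE_OPTICS),
                     ("in-package photonics", IN_PACKAGE_PHOTONICS)]:
    print(f"{name}: {sum(stages.values()):.1f} pJ/bit "
          f"across {len(stages)} stages")
```

Under these assumed numbers, cutting conversion stages roughly halves the energy spent per bit, which hints at why moving the optics into the chip package is attractive for links that must carry enormous training traffic continuously.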

The big picture: The chip industry is exploring various ways to increase computing power, with Nvidia’s latest “superchip” design bucking the trend of shrinking chip size. This suggests that innovations in key components like the high-speed interconnects proposed by Lightmatter could become increasingly important for building the next generation of AI supercomputers.

The bottom line: As AI continues to advance, the race is on to develop the hardware capable of powering the most ambitious algorithms. Lightmatter’s approach to using light to connect GPUs could be a significant step towards unlocking the future of AI supercomputing, with potentially far-reaching implications for the field’s progress.

Source: "To Build a Better AI Supercomputer, Let There Be Light" (WIRED)
