News On Japan

Former PlayStation Chip Engineers Take on Nvidia

TOKYO, Jan 02 (News On Japan) - A new chapter is emerging in the race for AI semiconductors, as a Japanese startup founded by former PlayStation chip engineers sets its sights on challenging industry giant Nvidia with a radically different approach to processor design.

The company, LENZO, is developing a next-generation AI chip known as CGLA, short for Coarse-Grained Logic Architecture. The key selling point is power efficiency. Compared with Nvidia’s GPUs, the chip is designed to run AI workloads using up to 90 percent less electricity, a claim that could have far-reaching implications for data centers and AI infrastructure worldwide.

LENZO’s core team includes engineers who previously worked on the PlayStation 2 and PlayStation 3 processors, as well as specialists who helped develop supercomputer chips during their time at Fujitsu. The company aims to bring its first commercial chip to market in the spring, with manufacturing handled by Taiwan Semiconductor Manufacturing Co. The finished chip is expected to measure roughly five millimeters square.

At a time when Nvidia dominates roughly 90 percent of the global AI chip market and boasts a market capitalization exceeding 600 trillion yen, LENZO’s challenge may seem audacious. Yet its founders argue that Nvidia’s dominance is built on an architecture that is approaching its physical limits.

At the heart of the issue is power consumption. Conventional CPUs and GPUs are based on what is known as the von Neumann architecture, in which memory and computation units are separated. This structure requires constant data movement between memory and processors, consuming vast amounts of energy in the process. In fact, studies show that moving data just one millimeter inside a chip can consume more power than performing an arithmetic operation itself.
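The imbalance the article describes can be sketched with a toy energy model. The per-operation figures below are assumed, order-of-magnitude numbers (in picojoules) loosely based on widely cited estimates for older silicon processes; they are illustrative only and are not LENZO's or Nvidia's measurements.

```python
# Illustrative energy model of the von Neumann bottleneck.
# All energy figures are assumed, order-of-magnitude values (pJ);
# they illustrate the ratio, not any specific chip.
E_ADD_PJ = 0.1          # one 32-bit integer add
E_WIRE_PJ_PER_MM = 0.2  # moving 32 bits 1 mm across on-chip wires
E_DRAM_PJ = 640.0       # fetching 32 bits from off-chip DRAM

def workload_energy_pj(n_ops, dram_fetches_per_op, wire_mm_per_op):
    """Split total energy between arithmetic and data movement."""
    compute = n_ops * E_ADD_PJ
    movement = n_ops * (dram_fetches_per_op * E_DRAM_PJ
                        + wire_mm_per_op * E_WIRE_PJ_PER_MM)
    return compute, movement

# Even with only one DRAM fetch per ten operations, data movement
# dwarfs the energy spent on the arithmetic itself.
compute, movement = workload_energy_pj(
    n_ops=1_000_000, dram_fetches_per_op=0.1, wire_mm_per_op=2.0)
print(f"compute: {compute:.0f} pJ, movement: {movement:.0f} pJ")
print(f"movement share: {movement / (compute + movement):.1%}")
```

Under these assumed figures, data movement accounts for well over 90 percent of the energy budget, which is the pattern the article attributes to today's AI servers.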

While GPUs improved on this by processing data in large batches, they still suffer from heavy energy loss caused by frequent memory access. Today, more than half of the electricity used by AI servers is consumed not by computation, but by data transfer between memory and processing units.

Google’s Tensor Processing Unit, or TPU, addressed this issue by adopting a dataflow architecture optimized for matrix calculations used in AI. By streaming data in a fixed sequence, TPUs reduce memory access and improve efficiency. However, they are designed almost exclusively for matrix-based AI workloads, limiting their flexibility.
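The fixed-sequence streaming described here can be pictured in software. The sketch below mimics how a systolic array advances one wavefront of operands per step so each value is fetched once and reused as it flows through the grid; it is a conceptual model, not Google's implementation.

```python
# Minimal software sketch of a TPU-style systolic matrix multiply:
# operands stream through a fixed grid of multiply-accumulate cells
# in a predetermined order, so each value is read from memory once
# and then reused as it flows through the array.
def systolic_matmul(A, B):
    n, k = len(A), len(A[0])
    m = len(B[0])
    # acc[i][j] plays the role of the accumulator inside cell (i, j).
    acc = [[0.0] * m for _ in range(n)]
    # One "wavefront" per step: at step t, column t of A and row t
    # of B enter the array and update every cell they pass.
    for t in range(k):
        for i in range(n):
            for j in range(m):
                acc[i][j] += A[i][t] * B[t][j]
    return acc

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(systolic_matmul(A, B))  # [[19.0, 22.0], [43.0, 50.0]]
```

The dataflow is hard-wired into the loop structure, which is exactly why such a design is efficient for matrix workloads but inflexible for anything else.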

LENZO’s CGLA takes a different approach. Rather than fixing the data flow in advance, it allows the flow of data between processing elements to be reconfigured freely. This enables the chip to handle not only current AI models such as transformers, but also future algorithms that may rely on entirely different computational structures.
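The reconfigurability described can be modeled as a graph of processing elements whose wiring is set by software. The sketch below is hypothetical, in the general spirit of coarse-grained reconfigurable arrays; it does not reflect LENZO's actual design, and all names in it are invented for illustration.

```python
# Hypothetical sketch of software-reconfigurable dataflow. Each
# processing element (PE) applies one operation; the "configuration"
# is just the wiring between PEs, so the same hardware can be
# rewired for a new algorithm without a silicon redesign.
def run_dataflow(config, inputs):
    """config: topologically ordered list of (pe_name, op, input_names).
    Values flow between PEs by name."""
    values = dict(inputs)
    for name, op, srcs in config:
        values[name] = op(*(values[s] for s in srcs))
    return values

# Configuration 1: a multiply-accumulate pipeline, a*b + c.
mac = [("p", lambda x, y: x * y, ["a", "b"]),
       ("out", lambda x, y: x + y, ["p", "c"])]

# Configuration 2: the same PEs rewired into (a+b) * c.
rewired = [("s", lambda x, y: x + y, ["a", "b"]),
           ("out", lambda x, y: x * y, ["s", "c"])]

inputs = {"a": 2, "b": 3, "c": 4}
print(run_dataflow(mac, inputs)["out"])      # 10
print(run_dataflow(rewired, inputs)["out"])  # 20
```

The point of the sketch is that switching algorithms is a change of configuration data, not of hardware, which is the adaptability claim LENZO makes for CGLA.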

According to the company, this flexibility allows CGLA to combine high power efficiency with broad applicability, something neither GPUs nor TPUs can fully achieve. While GPUs offer versatility at the cost of power efficiency, and TPUs offer efficiency at the cost of flexibility, CGLA is designed to deliver both.

Another advantage lies in cost. Modern AI chips rely heavily on high-bandwidth memory, which has become increasingly expensive. By reducing the need for constant data movement, CGLA can operate with less memory, lowering both energy use and production costs.

Yet perhaps the biggest obstacle facing any new AI chip is not hardware, but software. Nvidia’s CUDA platform has become the de facto standard for AI development, deeply embedded in research and commercial applications alike. Many developers write their code specifically for CUDA, making it difficult for alternative hardware to gain traction.

LENZO acknowledges this challenge but sees opportunity in shifting industry trends. As cloud providers and AI developers seek alternatives to Nvidia’s ecosystem, interest in non-GPU solutions is growing. The company believes that demand for energy-efficient and flexible chips will increase as AI workloads expand and power costs rise.

The company also sees long-term value in adaptability. Today’s AI systems rely heavily on transformer models, but new approaches are already emerging. If the dominant algorithms change, hardware designed for a single method could quickly become obsolete. CGLA, by contrast, is designed to adapt through software rather than hardware redesign.

In this sense, LENZO is not simply trying to build a faster chip, but to redefine how AI processors are structured. Whether the company can overcome Nvidia’s entrenched ecosystem remains to be seen, but its technology highlights a growing recognition that the future of AI will depend not only on performance, but on efficiency, flexibility, and sustainability.

Source: テレ東BIZ
