At its annual re:Invent conference in Las Vegas this week, Amazon's AWS cloud computing unit disclosed the third generation of its Trainium chip for training large language models (LLMs) and other forms of artificial intelligence (AI), a year after the debut of the second version of the chip.
The new Trainium3 chip, which will become available next year, will be up to twice as fast as the existing Trainium2 while being 40% more energy-efficient, said AWS CEO Matt Garman during his keynote on Tuesday.
Trainium3 is the first AWS chip built with a three-nanometer semiconductor manufacturing process.
Meanwhile, the Trainium2 chips unveiled a year ago are now generally available, said Garman. They are four times faster than the previous generation and are geared toward LLM training; Garman emphasized performance on Meta Platforms' …