Google may use Intel Foundry’s EMIB packaging for TPUv8e, boosting Intel’s AI chip ambitions and reshaping advanced ...
Most of the companies that have fully committed to building AI models are gobbling up every Nvidia AI accelerator they can ...
Google is packing large amounts of static random-access memory (SRAM) into a dedicated chip for running artificial intelligence ...
Google Cloud has unveiled its eighth-generation Tensor Processing Units, splitting the line into TPU 8t for large-scale training and TPU 8i for inference, with major gains in performance, memory, and ...
Google is for the first time splitting its AI chips into two lines, a sign that a new AI battleground is emerging.
Google Cloud launches TPU 8t/8i to power next-gen AI
Google Cloud has introduced its eighth-generation Tensor Processing Units, the TPU 8t and 8i, aimed at supporting trillion-parameter AI models and large-scale inference. The chips promise major gains ...
At Google Cloud Next ‘26, the company unveiled two AI chips, one tailored for training and the other for inference.
Google released not one but two eighth-generation tensor processing units, or TPUs, at the Google Cloud Next 2026 event in ...
Google is redesigning its AI hardware and software playbook, introducing separate chips for training and inference while ...
Google unveiled new-generation tensor processing units (TPUs) for training artificial intelligence and powering digital ...