Meta’s new AI model has the tech industry revving its engines. With Llama 3 freshly released, chipmakers are already flexing their muscles to show they’re ready to juice this large language monster to the max.
Leading the pack is Intel, which wasted no time putting Llama 3 through its performance paces across a buffet of hardware – Xeons, integrated graphics, discrete GPUs, you name it. The chip giant is practically frothing at the mouth to tout its “robust” validation of the 8B and 70B parameter models on its AI portfolio.
“We are excited to support the launch of Meta Llama 3,” Intel gushed, talking up spin-off projects like Purple Llama, which it says “advances the development of open software to build generative AI trust and safety.” Sounds promising – if a bit hand-wavy – but the main event here is those sweet, sweet performance gains Intel is promising across the board.
Not to be outdone, Qualcomm is also rushing to get a piece of that Llama 3 action. It says dev tools “will be available” for optimized on-device execution on upcoming Snapdragon chips. The company’s on-device AI smarts with a “mix of NPU, CPU and GPU technology” could unlock new AI awesomeness like real-time personalization and improved privacy, it claims.
AMD, meanwhile, has been a bit more coy about showing off its Llama 3 credentials so far beyond just namechecking its partnership with Meta. No performance figures or product specifics as yet from Team Red – though you can bet they’re cooking up some AI benchmarks to flaunt before long.
Llama 3 Readiness Is The New Obligatory Chip Flex
The flurry of Llama 3 prep from these semiconductor juggernauts underscores just how quickly generative AI has become the new must-win battleground for the tech titans.
Now the AI arms race has every major player desperately trying to position itself as the premier platform for this potentially terrain-shifting tech. Whether it’s client devices, cloud servers, or beefy data center silicon, companies want to become the default engine powering these large language behemoths as they grow more capable by the month.
After all, as AI keeps proving itself to be an incredible productivity multiplier that can rewrite entire industries, the firms that stitch themselves into the AI fabric now could be the ones selling the chips of the future. So you can expect plenty more Llama 3 benchmark bragging and optimization cheerleading in the coming days.
Will Llama 3 truly prove to be a game-changing leap over previous models? One thing’s for sure: the race to establish AI supremacy is only accelerating from here.