It seems very likely that most AI training will be run on Nvidia GPUs. The software environments for training are too numerous and too rapidly changing, which favors the soft, warm familiarity of CUDA over the speculative gains offered by new entrants.
Margin Stacking and the Cost of AI – How much of the value of “AI” will accrue to hardware makers, who have to add significant silicon content to make their products stand out?
Heterogeneous Compute – The competitive dynamics of the processor markets were static for years, but demand for “AI” and the emergence of new customers and competitors mean the market for all processors is likely to shift considerably in the coming years.
Amazon AI Assemble – Amazon launches its ML accelerator, demonstrating how hard it is to supply the data center industry with chips.
AI accelerators are special-purpose chips whose appeal is economic rather than technical. And it now seems likely that China has more stand-alone AI accelerator companies than the US has, or perhaps ever will.
In our corner of the tech world there is an endless drumbeat of 5G ‘excitement’. This is largely manufactured because, as we wrote last March, 5G brings no immediate, […]