China has won the AI Accelerator Race

The Financial Times and Nikkei published a great article (paywall) looking at China’s Artificial Intelligence (AI) market. The headline says it all – over the past four years, Chinese AI companies have raised a combined $30 billion in venture funding. And while the article does not specify how many companies this encompasses, it mentions that China currently has over 100 ‘Unicorns’ (companies valued at over $1 billion), and notes that “most of these” are AI start-ups. This is, to put it mildly, a lot of money.

Now to be clear, this covers “AI” companies broadly, including many that are not chip companies. Most of the specific names mentioned in the article are hardware companies of one sort or another. From our research, we believe that there are at least 200 chip companies in China tackling AI in some form or another. Even if two-thirds of that $30 billion went to software and data companies, it should be clear by now (and here) that China is spending a lot of money to ‘leapfrog’ ahead in the AI chip market. And it has probably already accomplished this.

We think this is an important topic and want to dig a bit deeper into it, because this is one area where China is emerging as a global player in semis.

First, what are “AI semis”? These chips go by a lot of different names: AI accelerators, TPUs, NPUs, NN engines and more. Set aside the acronym soup; the point of these chips is to perform specialized math in very high volumes. Much of the world seems to think that AI is magic, but people who actually build these systems recognize that AI is just a particular set of algorithms that allow computers to weigh multiple choices. (You could say decide, but that implies a level of awareness that is totally misleading.) More specifically, AI usually refers to Neural Networks, which at heart are just linear algebra – adding, subtracting, multiplying and dividing arrays of numbers – matrix math. The most advanced AI algorithms do an incredible number of these operations, and we do not want to diminish the immense amount of work behind neural networks, but the reality is that we got a (very generous) B- in college linear algebra and we can still manage these calculations.
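
To make that concrete, here is a minimal sketch of a single neural network layer in Python; the array sizes are arbitrary and chosen purely for illustration:

```python
# A minimal sketch of why neural networks are "just" linear algebra: one
# fully connected layer is a matrix multiply plus a vector add. The sizes
# here are arbitrary, chosen only for illustration.
import numpy as np

inputs = np.random.rand(128)        # 128 features coming into the layer
weights = np.random.rand(64, 128)   # 64 outputs, each a weighted sum of the 128 inputs
bias = np.random.rand(64)

# The whole layer: multiply, then add. Activations, convolutions and the rest
# reduce to more of the same kind of array arithmetic.
outputs = weights @ inputs + bias
print(outputs.shape)                # (64,)
```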

So AI accelerators are specialized linear algebra chips. Semiconductors perform best when given a specific task and then made to do that task a lot, really quickly. To give a sense of scale, AI accelerators measure performance in ‘TOPS’ or trillions of operations per second. So yeah, a lot of math, but basically the same type of calculation over and over again. String enough of these operations together and you get systems that can recognize faces or someday drive cars.
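
For a rough sense of what ‘trillions’ buys, here is a back-of-envelope calculation; the matrix sizes and the 100 TOPS rating below are our own assumptions, not figures from the article:

```python
# Back-of-envelope scale check with made-up but plausible numbers: multiplying
# an (m x k) matrix by a (k x n) matrix takes roughly 2 * m * k * n operations
# (one multiply plus one add per term).
m = k = n = 1024
ops_per_matmul = 2 * m * k * n           # ~2.1 billion operations

accelerator_tops = 100                   # hypothetical chip rated at 100 TOPS
ops_per_second = accelerator_tops * 1e12

print(ops_per_second / ops_per_matmul)   # roughly 46,000 such matrix multiplies per second
```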

In this sense, AI accelerators are not terribly complicated. We can do these calculations on other chips – CPUs and GPUs – that we find in standard PCs, but these accelerators can do the calculations orders of magnitude more efficiently. So the appeal of AI accelerators is not technical, it is economic. To put this in context, Google, which really invented the whole category, said that its accelerator (which it calls a TPU) would halve the number of data centers it would need to build. And since each data center costs around half a billion dollars, that is a lot of money. This is the magic of semiconductors, which we discussed here in the context of autonomous vehicles.
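
A quick, purely hypothetical calculation shows the scale involved; the per-data-center cost comes from the figure above, while the number of avoided data centers is our own placeholder:

```python
# Hypothetical savings arithmetic. The ~$500M per data center comes from the
# figure above; the number of data centers avoided is purely an assumption.
cost_per_data_center = 0.5e9       # roughly $500 million
data_centers_avoided = 10          # placeholder, not a reported figure

savings = data_centers_avoided * cost_per_data_center
print(f"${savings / 1e9:.1f}B")    # $5.0B of avoided construction
```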

That being said, there is a very important wrinkle in this market. Little differences matter a lot when performed trillions of times a second. If one AI formula performs its math as divide, multiply, divide, but it is run on a chip that is built for divide, divide, multiply, most of the gains are lost, and a GPU will probably work just as well. (We are oversimplifying a bit, but not that much.) In practice, this means the software that runs on these chips matters a lot (i.e. Software Eating the World). For an AI accelerator chip to succeed economically it needs to be designed in a very specific way, tied closely to the application running on it. This turns out to be a major constraint on the AI chip segment. It likely means that there will not be a general purpose AI accelerator the way there are general purpose processors for compute (aka a CPU) or graphics (GPU). Instead there is likely to be a plethora of solutions, tied closely to specific customers.
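
The toy model below tries to capture that constraint. It is not how real accelerators are programmed; it simply imagines a chip whose hardware can only execute one fused sequence of operations, and counts how many passes through the hardware different algorithms would need:

```python
# Toy model of a fixed-function accelerator pipeline (purely illustrative,
# not how real AI accelerators are programmed). The hardware below can only
# execute the fused sequence [divide, multiply] in a single pass; any other
# ordering falls back to one pass per operation.

FUSED_PIPELINE = ("divide", "multiply")  # hypothetical hardwired op order

def passes_needed(algorithm_ops):
    """Count how many trips through the hardware an op sequence needs."""
    passes = 0
    i = 0
    while i < len(algorithm_ops):
        # If the next ops match the fused pipeline exactly, one pass covers them.
        if tuple(algorithm_ops[i:i + len(FUSED_PIPELINE)]) == FUSED_PIPELINE:
            passes += 1
            i += len(FUSED_PIPELINE)
        else:
            # Mismatch: fall back to one pass for this single operation.
            passes += 1
            i += 1
    return passes

# An algorithm whose math matches the chip's dataflow...
print(passes_needed(["divide", "multiply", "divide", "multiply"]))  # 2 passes
# ...versus one that orders the same operations differently.
print(passes_needed(["divide", "divide", "multiply", "multiply"]))  # 3 passes
```

The second algorithm does the same arithmetic, but because its operations arrive in the ‘wrong’ order for the hardwired pipeline, it needs more passes through the hardware. That is the kind of mismatch that erodes an accelerator’s advantage.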

Turning back to China, with hundreds of AI chip companies, it is not clear how many of them can grow to scale. And to be clear, there are probably 50 or 60 AI chip companies in the US facing a similar problem. True, we have seen some high profile exits, with Intel notably paying nine figures for several companies, but it is important to note that many of the acquired companies had won specific customer designs.

So China has a lot of companies chasing an amorphous market. Without doubt, many of these will find customers – in fields like facial recognition, automotive, IoT, etc. – but it is equally true that many more will not. Thinking through how that market will consolidate is a topic for a different post, but eventually consolidation will come.

However, in the interim there will be a lot of interesting AI chip companies coming out of China. And it is likely that China will soon have more stand-alone AI chip companies than the US. In fact, as we write this, we recognize that China probably already has more AI chip companies capable of standing on their own than the US does. Even though the US already has a lot of start-ups in the space, we think far fewer of them will ever be able to stand up as independent companies. And given the state of the world today, we think it is important to note that this outcome is not the result of unfair trade practices, or IP theft, or any of the other accusations the US likes to throw at China. China has a burgeoning success story in this field, largely the result of the US not investing in chip design companies for a decade.

Photo by Rob Wingate on Unsplash

Responses to “China has won the AI Accelerator Race”

  1. If the calculations performed by the AI chips are not complex, is there a high entry barrier? Will the chips become commodified?

    • Yes and no.
      The barriers to entry are pretty low, which is why there are 300 start-ups working on this.
      Getting it right, however, is proving very challenging. The real barrier, and likely the value, is in the software layer.
      That being said, I know people on the chip side who say that almost no one has really cracked the problem. Yet. So there are improvements to be made.
