Billion Dollar Idea: Solve AI for power
There are 2 types of specialised AI chipsets: training chips (e.g., AWS Trainium) and inference chips (e.g., AWS Inferentia).
As AI gains momentum and becomes widespread, demand for inference chips is going to surge.
To win the race, let's find the biggest bottleneck.
Building a data center is a 1-2 year challenge. Current network capacity is good enough for text generation, so don't expect a spike there. A power plant to supply a data center with energy is a 3-6 year venture, depending on the type of plant.
Therefore, if there is a chipset that solves for power, it will be the next winner in the race.
Amazon and X are moving in the right direction.
The simplest way to outperform the current AWS Inferentia 2 is to implement a neural model with its weights fixed in metal. The downside is that you cannot retrain it without physical rewiring.
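To picture what "weights in metal" means, here is a minimal Python sketch (illustrative only; real hardwiring happens in silicon, not software). Baking the weights in turns them into constants wired into the circuit, instead of values fetched from memory on every pass:

```python
# Sketch of the "weights in metal" idea (illustrative only, not real hardware).
# Conventional accelerator: weights are data loaded from memory at run time.
def layer_runtime(x, weights):
    """Weights fetched from memory: flexible, but every access costs energy."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

# Hardwired chip: weights are constants baked into the circuit.
# The Python analogue is folding them into the function itself.
def layer_hardwired(x):
    """Weights are literals: no memory traffic, but changing them means
    rewriting the function -- i.e., physically re-fabricating the chip."""
    return [0.8 * x[0] - 0.3 * x[1],
            0.1 * x[0] + 0.9 * x[1]]

print(layer_runtime([1.0, 2.0], [[0.8, -0.3], [0.1, 0.9]]))  # [0.2, 1.9]
print(layer_hardwired([1.0, 2.0]))  # same result, weights fixed at "tape-out"
```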
The upside: a model implemented in bare metal should deliver roughly 100x the performance at the same TDP (thermal design power).
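Performance-per-watt is simply throughput divided by power draw. A back-of-envelope sketch, where the 100x factor is the claim above and the baseline throughput and TDP figures are purely hypothetical:

```python
# Back-of-envelope performance-per-watt comparison.
# All numbers are illustrative assumptions, not measurements.
baseline_tdp_w = 200.0            # hypothetical TDP of a conventional inference chip
baseline_tokens_per_s = 10_000.0  # hypothetical throughput

hardwired_tdp_w = 200.0                               # same TDP by assumption
hardwired_tokens_per_s = 100 * baseline_tokens_per_s  # the claimed 100x speedup

ppw_baseline = baseline_tokens_per_s / baseline_tdp_w
ppw_hardwired = hardwired_tokens_per_s / hardwired_tdp_w

print(f"baseline:  {ppw_baseline:.0f} tokens/s per watt")   # 50
print(f"hardwired: {ppw_hardwired:.0f} tokens/s per watt")  # 5000, 100x at equal power
```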
FPGAs (Field-Programmable Gate Arrays) are currently being extensively developed; they are a middle ground between the bare-metal approach and existing AI-specialised chipsets. Not the simplest solution, but it is the most promising direction if we focus on solving for performance-per-watt.
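A toy model of the FPGA trade-off (again a software analogy under assumed behaviour, not real RTL): weights live in a reloadable configuration, so retraining means reloading a "bitstream" rather than re-fabricating a chip, at some efficiency cost versus full hardwiring:

```python
# Toy model of the FPGA middle ground (illustrative only).
class FPGALikeLayer:
    """Weights live in reloadable configuration, like an FPGA bitstream:
    reprogrammable in the field, less efficient than fixed wiring."""
    def __init__(self, weights):
        self.weights = weights  # "bitstream": set once per deployment

    def reconfigure(self, new_weights):
        # Retraining -> reload the configuration; no physical re-fabrication.
        self.weights = new_weights

    def forward(self, x):
        return [sum(w * xi for w, xi in zip(row, x)) for row in self.weights]

layer = FPGALikeLayer([[0.8, -0.3], [0.1, 0.9]])
print(layer.forward([1.0, 2.0]))                  # [0.2, 1.9]
layer.reconfigure([[0.5, 0.5], [0.2, -0.2]])      # new model, same silicon
print(layer.forward([1.0, 2.0]))                  # [1.5, -0.2]
```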
P.S. the version of this post for fellow colleagues on the board is here: https://blog.azolotarev.com/billion-dollar-idea-ai-board-version