All the big guns have such impressive economies of scale in purchasing high-end hardware. How do small companies compete? What’s 4 servers versus 40,000? I did not realise even LinkedIn had such big demand.
Intel understands that the market is changing. Four years ago, the chip maker told us it sells more server processors to Google than it sells to all but four other companies—so it sees firsthand how Google and its ilk can shift the chip market. As a result, it’s now placing bets everywhere. Beyond snapping up Altera and Movidius, it has agreed to buy a third AI chip company called Nervana.
That makes sense, because the market is only starting to develop. “We’re now at the precipice of the next big wave of growth,” Intel vice president Jason Waxman recently told me, “and that’s going to be driven by artificial intelligence.” The question is where the wave will take us.
LinkedIn is designing and building nearly all of the software and hardware it needs for its data centers, poaching key people from Facebook and Juniper to do it.
Microsoft today open sourced its next-gen hyperscale cloud hardware design and contributed it to the Open Compute Project (OCP). Microsoft joined the OCP, which also includes Facebook, Google, Intel, IBM, Rackspace and many other cloud vendors, back in 2014. Over the last two years, it has already contributed a number of server, networking and data center designs.