Alibaba Wants Their Own Chip

There are two undeniable trends these days: major tech players want their own silicon and the new class of uber-rich want to own classy publications. 🗞️

I’m more interested in the first trend (though the second could have interesting implications in this era of doom for publishers). Alibaba is the newest tech giant to announce they are making their own AI chips. 🏭

The chip will reportedly power the company’s cloud technology and IoT devices, and could be used for things like autonomous cars, smart cities and logistics.

This time it’s bigger than just wanting full control and integration. This move is also to reduce Alibaba’s reliance on the West and the impact of the shiny new trade war between China and the US. It also plays nicely into China’s national plan to be the AI superpower of the world. 💪 🇨🇳

Src: CNet

Facebook Spreads the Chips Around 💫

It appears that Facebook is forgoing developing their own chips in favor of partnering with multiple top chip makers to support their Glow machine learning compiler. 🏎

It’ll be interesting to compare this approach to the other tech players looking to make their own, but the play makes sense. It follows the company’s existing approach to software (open source and partnerships), it doesn’t split their focus, and they don’t have public-facing hardware offerings the way Google (cloud) or Apple (duh) do. ⚖️

Also, Facebook showed off a new AI tool that uncovers bugs in code. That could be pretty nifty. 🐞

Src: TechCrunch

Chips Are Special Again 🤗

Remember the last time you agonized over choosing which CPU to get in your new computer? Yeah, I thought so. Maybe you gave some thought to which number came after the “i”, but that was about it. We haven’t had to think much about computing chips lately, but that’s changing. 🔀

The chip continuum

Deep learning and crypto have caused a chip gold rush. Nvidia has blown up thanks to their GPUs being the tool of choice for machine learning. Crypto has already moved beyond the GPU stage and is firmly in ASIC territory. Bitmain, a maker of mining ASICs, is primed to have one of the richest tech IPOs ever. 💰

The chip market is going to be really interesting as startups try to carve out a slice of the market and new chips and architectures are created for all kinds of specialized purposes. Buckle up, this is gonna be fun. 🎢

Src: VentureBeat

You Get A Custom Chip! And You Get A Custom Chip! 🎁

As expected, Tesla is one of the newest companies to announce they’re getting into the AI chip game. Wait, what? 🤯

Yup, Elon is doing Elon things and building custom chips designed to spec for the neural network architectures that drive Tesla’s vehicles. 🏎

It’s really not that crazy when you think about it. They get more control over the car’s brain and it’s not like they’ll be selling them. As tweets in the article point out, it’s basically what Apple did when they started making their own A series chips. 🍏

Really the coolest part is that these new chips/computers are supposed to be able to replace all existing Tesla computers. So instead of having to buy a new, super expensive car to get the shiny bells and whistles (like all other tech devices and cars), you just get your hands on the new brain and swap the old one out. This might be the most modular a car has ever been. 🔀

Src: TechCrunch

DARPA Gets Chippy 🛡️

Looks like tech companies aren’t the only ones getting in on the chip game. DARPA is adding its muscle to the sector in an effort to make sure hardware development and innovation doesn’t get overlooked or minimized. 💪

What are they focusing on? 🤔

  1. Using ML to automate and expedite development time, and making the tools accessible to n00bs. 🏭

     “We’re trying to engineer the craft brewing revolution in electronics.”

  2. Getting funky with materials and integration ideas. 👽

     The ultimate goal is to effectively embed computing power in memory, which could lead to dramatic increases in performance.

  3. Creating new architectures to make chips modular and changeable in real-time. Basically, they want one chip to be able to do more and different things. 📐

     Today, multiple chips are needed, driving up complexity and cost.

Src: MIT Tech Review

Chips Meet Edge 🤝

The edge is gonna be big, so it’s no surprise that the tech giants are racing to combine AI and the edge. Google is the first non-chip maker I’m aware of to bring their new hardware to edge devices. So you’ll soon be able to build IoT and other small devices that harness the power of TensorFlow without needing a constant connection. 📟

The beauty of the edge is that you can load pre-trained models onto all kinds of smaller devices, which then run the models on the data they collect without needing to constantly shuttle data back and forth to the cloud. ☁️
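That cloud-vs-edge split is easy to sketch in plain Python. This is a toy stand-in (the hard-coded weights, readings, and threshold are made up for illustration, not a real TensorFlow model), but it shows the shape of the idea: train once somewhere else, ship the weights to the device, then infer locally with no network calls:

```python
# Toy sketch of edge inference: a model trained in the cloud is shipped
# to the device once, then runs locally on freshly collected data.

# Pretend these weights came out of a training run in the cloud.
PRETRAINED_WEIGHTS = [0.8, -0.3, 0.5]
PRETRAINED_BIAS = 0.1

def predict(sensor_readings):
    """Run the pre-trained model on-device -- no network round trip."""
    score = PRETRAINED_BIAS + sum(
        w * x for w, x in zip(PRETRAINED_WEIGHTS, sensor_readings)
    )
    return score > 0  # e.g. "did the device see something interesting?"

# The device collects data and classifies it on the spot.
readings = [0.2, 0.9, 0.4]
print(predict(readings))  # prints True
```

On a real device the weights would come from a compact model file (e.g. a quantized TensorFlow Lite model) and the math would run on dedicated hardware, but the data flow is the same: sensor in, prediction out, nothing shuttled to the cloud.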

Google also recently announced that TensorFlow officially works on the Raspberry Pi after teaming up with the Raspberry Pi Foundation to make it easier to install. 🍓

Src: TechCrunch

More at CNET

Brain Chips 🧠🍪

IBM is in on the AI chip game, but they’re doing it a bit differently. They’re recreating the brain in silicon, literally. 🧠

They’ve essentially recreated synapses in microelectronic form, something that has historically been very difficult. Why does this matter if GPUs and software are doing such a good job? 👇

They found the system to be as accurate as a software-based deep neural network even though it consumed only 1 percent as much energy.

Saving energy is always a good thing, but this would also open the door for AI on small devices. Edge FTW! 🙌

It’s still early days, so only time will tell if this chip is the real deal. ⌛️

Src: MIT Tech Review

Nvidia Takes it to the Cloud

Cloud and software providers are moving in on the chip game, so why shouldn’t a chip maker move in on the cloud game?

Nvidia has announced that they’re combining 16 of their super AI-friendly GPUs into high-performance computing clusters for cloud use. The new tech race is well underway: becoming THE cloud platform for AI. Another race is to become THE hardware provider, but that is likely to stay a more diversified field than cloud computing could be.

Src: MIT Tech Review

Facebook Chips in on Live Video

I’m starting to get obsessed with watching who is making a chip for what AI purpose these days.

Facebook has announced they’re designing a chip to help with analysis and filtering of live-streaming video. Why? Two reasons:

  1. To minimize reliance on outside suppliers (here’s looking at you, Intel)
  2. To avoid the embarrassing mistake of letting someone livestream their suicide or orgy (or something else equally against their TOS or that generally harshes their chill)

Microsoft made the tech world all about that software back in the day with Windows. Now it seems like the pendulum is swinging back towards hardware. And the cycle continues.

Src: Bloomberg