
Mythic Intelligence at the Edge

My most recent investment and board seat came out of stealth today. Mythic is taking a radically novel approach to building low-power, low-cost neural nets that can make any product more intelligent. Imagine any device having a voice interface like the Amazon Echo, but with local processing. It would be more responsive and more secure, with roughly 100x savings in power and cost, and no internet connection needed. Consider a Roomba or microwave oven: a rich voice interface could cost less than the clumsy buttons it replaces. Imagine a Dropcam with local intelligence to avoid the daily false alarms, or drones that could better track their targets.

 

Mythic implements the basic computation of machine intelligence — matrix multiply and add — by using a standard flash memory array and modifying the peripheral circuitry so that the memory cells store analog values for the activation level (a) of each neuron and the weight (w) of each synapse. In the digital domain, multipliers and adders take many transistors, each consuming power and time. A Mythic multi-bit multiply and accumulate can be done with a single transistor! Computation and memory are unified, as in the brain. And flash is a non-volatile memory that consumes no power in standby mode.
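To make the arithmetic concrete, here is a minimal Python sketch of the idea, not Mythic's actual circuit: the stored weights behave like conductances, the activations arrive as voltages, each cell contributes one multiply, and the accumulation is simply the currents summing on a shared readout wire.

```python
# Minimal sketch (not Mythic's actual design): model one flash-array column as
# an analog multiply-accumulate. Each cell's stored weight acts like a
# conductance, the activation is applied as a voltage on the cell's input line,
# the cell's output current is weight * activation, and all currents sum on the
# shared readout wire (Kirchhoff's current law) -- the "add" costs only a wire.
import numpy as np

def analog_mac_column(weights, activations):
    """Return the accumulated readout current for one column of cells."""
    cell_currents = weights * activations      # one "multiply" per cell
    return cell_currents.sum()                 # summation happens on the wire

# A whole weight matrix is just many columns sharing the same activation inputs.
def analog_matmul(weight_matrix, activations):
    return np.array([analog_mac_column(col, activations)
                     for col in weight_matrix.T])

weights = np.random.uniform(0.0, 1.0, size=(4, 3))   # 4 inputs -> 3 neurons
activations = np.random.uniform(0.0, 1.0, size=4)
print(analog_matmul(weights, activations))            # equals weights.T @ activations
```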

 

Couple some local intelligence to each sensor and “The internet of things is becoming the sensory cortex of the planet” — TechCrunch

 

“'When you start to see these hints that we re-create evolutionary biology in a computer, you get the same basic building blocks, the same developmental milestones. It’s kind of spooky!' When discussing Mythic, Jurvetson compared the startup to Minecraft…” — VentureBeat

 

I was drawing a loose analogy to the redstone wire in Minecraft implementing a digital OR. Mythic uses a common memory readout wire for current accumulation. Addition in the analog domain is instantaneous and just takes a wire; it needs none of the transistors or time consumed by a digital carry adder. Using only inverters and OR logic as building blocks, here is an 8-bit CPU that my son built back in grade school. (A toy sketch of that contrast follows below.)
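Here is a toy Python sketch of the contrast, assuming nothing about Mythic's silicon: an 8-bit ripple-carry adder built only from inverter and OR primitives (the same building blocks as redstone logic), with a counter showing how many gate evaluations a single digital add consumes; the analog equivalent of the accumulation is just a wire junction.

```python
# Toy sketch: digital addition needs many gate delays and transistors, while
# analog accumulation is currents meeting on a wire. Every operation below is
# built from only inverters and ORs, and a counter tallies gate evaluations.
GATE_COUNT = 0

def NOT(a):
    global GATE_COUNT; GATE_COUNT += 1
    return 1 - a

def OR(a, b):
    global GATE_COUNT; GATE_COUNT += 1
    return 1 if (a or b) else 0

def AND(a, b):          # derived: AND(a, b) = NOT(OR(NOT a, NOT b))
    return NOT(OR(NOT(a), NOT(b)))

def XOR(a, b):          # derived: XOR(a, b) = AND(OR(a, b), NOT(AND(a, b)))
    return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, cin):
    s = XOR(XOR(a, b), cin)
    cout = OR(AND(a, b), AND(cin, XOR(a, b)))
    return s, cout

def add_8bit(x, y):
    """Ripple-carry add two 8-bit ints, bit by bit, from inverters and ORs."""
    carry, result = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_8bit(100, 55), "after", GATE_COUNT, "gate evaluations")  # 155
```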

 

From the NVIDIA CEO: “We’ll soon see the power of computing increase way more than any other period. There will be trillions of products with tiny neural networks inside.”

 

The march to specialized silicon, from CPU to GPU to FPGA to ASIC, is now going further, to analog and quantum processing. At a high level, we are recapitulating our evolutionary computational march in silicon, and an ever-growing percentage of our compute will be massively parallel, in-memory processing, just like our cortex. I wrote a blog post about this when Intel acquired our deep-learning chip company Nervana.

 

Now it’s time to get Mythic.
