
Moore's Law over 120 Years

I updated the Kurzweil version of Moore’s Law to include the latest data points. A further update, post Tesla AI Day, is here.


Of all the variations of Moore’s Law, this is the one I find most useful, as it captures what customers actually value: computation per $ spent. Humanity’s capacity to compute has compounded for as long as we can measure it, starting long before Intel co-founder Gordon Moore noticed a refraction of the longer-term trend in the belly of the then-fledgling semiconductor industry.
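To make the metric concrete, here is a minimal sketch of the arithmetic behind the compounding claim: given two points on a computations-per-second-per-constant-dollar curve, it derives the implied doubling time and compound annual growth rate. The years and performance figures below are hypothetical, chosen for illustration rather than read off the chart.

```python
from math import log2

# Hypothetical data points for the chart's metric:
# computations per second per constant dollar, at two different years.
# These numbers are illustrative, not actual values from the plot.
year_a, perf_a = 1996, 1e9    # ~1 billion ops/sec per dollar (hypothetical)
year_b, perf_b = 2016, 1e13   # ~10 trillion ops/sec per dollar (hypothetical)

# Assume exponential growth: perf(t) = perf_a * 2 ** ((t - year_a) / doubling_time)
doubling_time = (year_b - year_a) / log2(perf_b / perf_a)
annual_growth = (perf_b / perf_a) ** (1 / (year_b - year_a)) - 1

print(f"Doubling time: {doubling_time:.2f} years")        # ~1.5 years for these inputs
print(f"Compound annual growth: {annual_growth:.0%}")      # ~58% per year for these inputs
```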


But Intel has ceded leadership of Moore’s Law. The 7 most recent data points are all NVIDIA GPUs, with CPU architectures dominating the prior 30 years. The fine-grained parallel compute architecture of a GPU maps better to the needs of deep learning than a CPU does. There is a poetic beauty to the similarity between a processor optimized for graphics processing and the computational needs of a sensory cortex, as commonly seen in neural networks today.
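As a rough illustration of why the workload favors fine-grained parallelism: the core operation of a neural-network layer is a dense matrix multiply, which decomposes into a very large number of independent multiply-accumulates. A minimal NumPy sketch, with shapes chosen arbitrarily for the example:

```python
import numpy as np

# A single neural-network layer is essentially one large matrix multiply:
# every output activation is an independent dot product, so the work splits
# naturally across thousands of GPU cores. Shapes here are arbitrary examples.
batch, n_in, n_out = 256, 4096, 4096
x = np.random.randn(batch, n_in).astype(np.float32)   # input activations
w = np.random.randn(n_in, n_out).astype(np.float32)   # layer weights

# On a CPU this runs on a handful of wide cores; on a GPU the same operation
# is tiled into thousands of small, independent multiply-accumulate tasks.
y = x @ w                     # ~4.3 billion multiply-adds for these shapes
print(y.shape)                # (256, 4096)
```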


Given the succession of substrates for computation over time, we would not expect the GPU to hold the torch forever. Over the past decade, I have been investing in various new architectures in molecular electronics and quantum computing. One of these, Nantero, just closed their latest round of financing to ramp up production of their carbon nanotube memory chips (WSJ), and the magnetic chip company Everspin went public earlier this year. Since the vast majority of transistors manufactured are memory, not logic, we have bet on a bifurcation of Moore’s Law, with a focus on memory advances.
