Two breakthroughs made this possible.

First, computers’ energy efficiency is increasing exponentially. A corollary to Moore’s law holds that the energy needed to move information falls as chips shrink, because signals have to travel shorter distances. Since 1946, the number of computations possible per unit of energy has roughly doubled every 1.5 years. That makes it possible to do useful computing with tiny bursts of energy harvested from the vibrations, heat, or wireless signals around us.
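The scale of that trend is easy to underestimate. As a rough, back-of-the-envelope illustration (a sketch only; the 1.5-year doubling period is the figure cited above, while the end year used below is an arbitrary example, not a number from this article):

```python
# Sketch: cumulative gain in computations per joule if efficiency doubles
# every 1.5 years, per the trend cited above.
# The end year is an illustrative assumption, not a figure from the article.

def efficiency_gain(start_year: int = 1946, end_year: int = 2015,
                    doubling_period_years: float = 1.5) -> float:
    """Multiplicative improvement in computations per unit of energy."""
    doublings = (end_year - start_year) / doubling_period_years
    return 2.0 ** doublings

if __name__ == "__main__":
    print(f"~{efficiency_gain():.1e}x more computations per joule than in 1946")
```

Under those assumptions the improvement compounds to tens of trillions of times, which helps explain why the tiny amounts of scavenged ambient energy described above can now do useful work.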

Second, new non-volatile memory chips known as ferroelectric RAM (FRAM) can be reliably reprogrammed even with weak, intermittent power.

Such developments open up new possibilities for autonomous, reprogrammable computers that stay permanently embedded in objects without onboard power or maintenance. The first applications are likely to be pacemakers and other energy-constrained biomedical implants. Eventually, vast, cheap sensor networks are expected to blanket roads and farms, with millions of individual computers embedded in concrete or in trees.

Parks and his team presented their work, released as open-source hardware and software designs, at a communications technology conference in San Francisco this April.
