You’ve probably never heard of Percy Spencer, but you’ve certainly used one of his most famous inventions—probably to make popcorn.
The microwave was invented in 1947 and soon became an American household staple. The devices sped up heating and reheating in the kitchen, and flew off store shelves and onto countertops by the millions in the 1970s, ’80s, and ’90s. But in recent years, sales have stagnated and declined, in part because the convenience microwaves offer comes at the cost of uneven cooking. The microwave needs a technological makeover to improve the way it generates heat, while remaining smaller and faster than a traditional oven.
Spencer, a US engineer during both World Wars, stumbled upon the physics that powers the microwave. He worked at the Raytheon Manufacturing Company, which to this day produces missiles and other warfare technology. After World War II, he began a project working with magnetrons, which produce pulses of electromagnetic waves that can be used for radar. George “Rod” Spencer, Percy’s grandson, told Popular Mechanics that his grandfather always kept peanut clusters—like a nutty snack bar—in his pockets to feed animals outside the plant in Cambridge, Massachusetts. While testing a magnetron designed to improve airplane radar, Percy reached into his pocket and discovered that the peanut clusters had melted.
Spencer had discovered that magnetrons can be used to create microwaves, a form of electromagnetic radiation with wavelengths shorter than radio waves but longer than those of visible light. When electricity runs through a metal filament, negatively charged electrons rush toward the positive end. Magnetrons use a magnet to keep these electrons circling in a loop, and as the electrons whiz around, they emit microwaves that radiate outward.
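To put rough numbers on where these waves sit on the spectrum: consumer ovens conventionally operate around 2.45 GHz (a standard figure, not one stated in this article), and a quick back-of-the-envelope calculation gives the wavelength:

```python
# Back-of-the-envelope: wavelength of a typical magnetron's output.
# The 2.45 GHz operating frequency is the conventional consumer-oven
# standard, assumed here rather than taken from the article.
C = 299_792_458          # speed of light, m/s
FREQ_HZ = 2.45e9         # typical magnetron frequency, Hz

wavelength_m = C / FREQ_HZ
print(f"wavelength: {wavelength_m * 100:.1f} cm")  # ~12.2 cm
```

At roughly 12 cm, these waves are far longer than visible light (hundreds of nanometers) but shorter than broadcast radio waves (meters to kilometers).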
These waves hijack the water molecules in food. When exposed to microwaves, water molecules start wiggling around rapidly. All microwave ovens have a metal mesh lining that reflects these waves, so they bounce around the closed cavity while their electric field flips direction about 2.45 billion times per second. Water molecules get tugged back and forth so fast that the resulting friction heats food from within.
The first consumer microwave, from 1947, was the size of a coat closet and had to be hooked up to a plumbing system to be cooled down with water. Over the next 20 years, the technology got smaller and cheaper. In 1967, Raytheon introduced a smaller model that fit neatly on a countertop. It used 100 watts of electricity and cost about $500 ($3,660 in today’s dollars). From there, the microwave took off.
Seventy years later, microwaves are long overdue for an upgrade. They warm up food faster than ovens, but it’s basically impossible to cook anything thoroughly because the heating is always uneven. “If you look at the dial on the microwave oven at home, you can set it from 250 watts to 1,000 watts, but that is more or less a trick,” says Klaus Werner, the executive director of the RF Energy Alliance. Even though you can control how much microwave energy is produced over a period of time, once the waves are released into the metal cavity they bounce around uncontrollably, hitting some spots more than others. There’s no way to control them. The turntable was supposed to correct for that by exposing food to the waves from multiple angles, but it’s a shoddy fix.
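The “trick” Werner describes is typically duty cycling: a magnetron is essentially binary, either running at full blast or switched off, so a lower wattage setting just alternates the two. A minimal sketch of the idea, with illustrative numbers that don’t correspond to any particular oven:

```python
# Illustrative sketch of duty-cycle power control. The magnetron is
# binary (full power or off), so a "500 W" setting on a 1,000 W oven
# really means running at full power half the time.
MAGNETRON_WATTS = 1000   # assumed full output of this hypothetical oven

def duty_cycle_schedule(setting_watts, cycle_s=30):
    """Seconds on vs. off per cycle so the average matches the dial setting."""
    on_s = cycle_s * setting_watts / MAGNETRON_WATTS
    return on_s, cycle_s - on_s

on, off = duty_cycle_schedule(500)
print(f"on for {on:.0f} s, off for {off:.0f} s each cycle")  # 15 s on, 15 s off
```

This is why food heated on a "medium" setting still develops hot spots: during the on-phase, the waves are at full strength and just as uncontrolled as ever.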
In a 2015 paper (paywall), Werner outlines a better solution: a solid-state semiconductor, paired with signal amplifiers and receivers. Semiconductors are made from materials such as silicon, which on their own conduct electricity poorly but are doped with chemical impurities that let engineers control how electrons flow through them. Meanwhile, the amplifier and receiver create a power feedback loop that lets the semiconductor adjust its output, producing the right amount of microwaves, at the right power level, for the right length of time, to heat food evenly.
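Werner’s paper describes this architecture only at a high level; the control loop it implies might look something like the following sketch. Everything here, including the sensor model and the tuning rule, is a hypothetical illustration rather than the paper’s actual design:

```python
# Hypothetical sketch of a solid-state RF feedback loop: the receiver
# measures how much energy reflects back from the food (energy that was
# not absorbed), and the controller nudges the output frequency toward
# whatever the food absorbs best. The absorption model and all numbers
# below are invented for illustration.

def reflected_fraction(freq_ghz, best_ghz=2.45):
    """Toy model: absorption peaks at some food-dependent frequency."""
    return min(1.0, abs(freq_ghz - best_ghz) * 2)

def tune(freq_ghz=2.40, step=0.005, iters=40):
    for _ in range(iters):
        here = reflected_fraction(freq_ghz)
        up = reflected_fraction(freq_ghz + step)
        # Step toward whichever frequency reflects less (absorbs more).
        freq_ghz += step if up < here else -step
    return freq_ghz

print(f"settled near {tune():.2f} GHz")
```

A magnetron offers no equivalent knob: its frequency and power are fixed by its geometry, which is why this kind of closed-loop adjustment requires solid-state sources.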
In theory, this improvement would mean microwaves could be used to cook everything from eggs (which explode in current microwaves because the water in them heats up too quickly) to meat or fillets of fish, which require even heating throughout to kill foodborne bacteria.
As with the magnetron microwave, it will undoubtedly take a few years before semiconductor microwaves become a universal cooking device. At the moment, the US military uses semiconductor microwaves, and a couple of devices are close to commercial availability. There’s the Adventurer, created by the UK-based company Wayv as a camping microwave, which runs on batteries and will go on sale in the US later this year for $199. There’s also NXP, a Dutch company that has developed the concept for a smart oven using semiconductors—no word on pricing or when it will hit the market.
For those of us who aren’t as comfortable with our cooking skills or simply lack the time, semiconductor microwaves would mean faster meals that taste much closer to something cooked through traditional means. Yet no matter how good microwaves become, they’re never going to give you a perfect sear on a steak.
Correction: An earlier version of this post said that microwaves can be set from 50 watts to 250 watts. They can actually be set from 250 watts to 1,000 watts, in 250-watt steps. Additionally, an earlier version of this article referred to the NXP oven as a “Sage” oven; we have since learned that NXP received a cease and desist request for that specific product, and that it makes only the hardware for the product.