By Sir Harold Evans

The Age of Intelligent Things

In 1984 my wife and I bought a weekend house on the ocean in Quogue on Long Island. Every Friday round about Exit 60 on the drive out, I’d face the Goldilocks question: In summer, would it be too hot inside the house? In winter, too cold? Too humid, encouraging those fat, blood-sucking mosquitoes? Could we get a friendly neighbor to stop by and check all round? (No, they’re away, same as us.) Did the cold snap freeze the plumbing again and turn the downstairs toilet into Niagara Falls? These days I have an icon on the screen of my iPhone. It says “Honeywell.” After we rebuilt the house post-Hurricane Sandy in 2012 we installed Total Connect Comfort, which means the smart thermostats in my new HVAC system are linked to the Internet. Now, through my desktop in the city, I set the ranges of temperature for the zones at the beach house.

The Honeywell system promises to send me an email alert if anywhere gets very hot or very cold while we are back at work in Manhattan. As we set out to Quogue on a wet day, I thumb the icon on my mobile. Up comes a panel. Zone by zone, I zip through the house, reading the temperature on my screen in big numerals and the percentage of indoor humidity—all as I had set them previously. Except the unoccupied spare room where our drop-in guests are due: it is 77, seven degrees higher than I set it. My wife takes the mobile while I navigate the traffic.

The installer is a click away on our Honeywell mobile screen, so he gets a message. When we arrive, the 77 is not a misreading. The speedy serviceman has found a mechanical fault in a vent. He soon fixes it. I walk round the house and find the bedroom too cold at 65. The intelligent thermostat is also showing a five-day weather forecast that’s none too cheerful. Feeling like the captain on the bridge of an ocean liner checking his instruments, I command the invisible engine room, pressing arrows up or down to indicate how long the new temperature should be maintained in each room. The system is really swift to respond. An hour later, the drop-in guests phone. They have caught an earlier train. I have their room at 72 in a flash. I am a master of the universe!

My Honeywell system is smart. But is it intelligent yet? It still requires a lot of my attention and decision-making. It draws on the resources of the Internet. But in the words of BK Yoon, the CEO of Samsung who rebelled, if only briefly, during his keynote at the 2015 International CES, the giant technology trade show in Las Vegas, “We can’t just talk about the Internet of Things, because it’s so impersonal, like a bedtime story for robots.” To truly find meaning, we must look beyond the current, novel phase for connected devices to the transformative experience of the near future, what pioneers call the Intelligence of Things. In this new age, smart connected devices don’t just hand over data to their human users, who then must engage more software or analysis to make decisions. Intelligent things direct human action or interaction.

Mike Dennison, president of the Consumer Technologies Group at global design, engineering, and manufacturing company Flex, believes this age reflects a more intimate relationship than the wild cornucopia of the Internet: “A lot of people are connected to things, things that are connected to the cloud, the Internet of Things, if you will. But they’re not adding any value to that connection. Intelligence of Things goes a step further and really becomes the way you interact, what changes you make as a person because it is intelligent. It is giving you real feedback, real information about what you should be doing or not doing.”

The promise of this intelligent age is driving significant investments in the technology market. Last year, Google acquired the smart home company Nest for $3.2 billion. Nest followed up by paying $550 million for Dropcam, a Wi-Fi system of video security cameras with night vision and zoom lenses. For followers of technology or investors in technology stocks, the idea that one computer chip could tell another what to do can feel new and exciting. But this important moment in innovation is a milestone on a timeline that began almost a century ago.

Color photos were cutting edge in the 1930s and ’40s, about the time Alan Turing was pioneering computer intelligence. These 70-year-old U.S. Library of Congress photos are now one of Flickr’s most popular sets.

In the Oscar-bound World War II movie The Imitation Game, the hero, Alan Turing, asks the relevant question. Britain’s brightest brains at the secret Bletchley Park cipher campus 50 miles northwest of London struggle to break the code that German admirals use to communicate with U-boat commanders in the Atlantic, so the convoys and their Royal Navy escorts haven’t a clue where danger lies. The British can intercept messages, but they translate only as gibberish because every one is encrypted by Enigma, a machine the size of a typewriter with rotating cipher rotors and electrical switching. In 1939, Enigma operators were changing the cipher rotor settings every 18 hours, and that meant, Turing told Bletchley’s frantic cryptanalysts, they had to find the needle in 159 million million million haystacks: one in 159,000,000,000,000,000,000 [18 zeros]. Just checking every setting would have taken 10 analysts 20 million years. In reality, to stop an oncoming U-boat attack on a convoy, they had no more than 20 minutes. “Enigma is a very well-designed machine,” said Turing. “Our problem is that we’re trying to beat it with men. What if only a machine can defeat another machine?”
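For readers who like to check the arithmetic, that 159 million million million has a commonly cited breakdown, assuming the standard three-rotor Enigma drawing on five available rotors and a plugboard with ten cables (the naval machines were tougher still). A few lines of Python, purely as a sketch, reproduce the figure:

```python
from math import factorial, perm

# Rotor choice: 3 rotors selected from 5, in order
rotor_orders = perm(5, 3)                # 60
# Starting position of each of the 3 rotors: 26 letters each
rotor_positions = 26 ** 3                # 17,576
# Plugboard: 10 cables pairing up 20 of the 26 letters
plugboard_pairings = factorial(26) // (factorial(6) * factorial(10) * 2 ** 10)

settings = rotor_orders * rotor_positions * plugboard_pairings
print(f"{settings:,}")                   # 158,962,555,217,826,360,000 -- about 1.59 x 10^20
```

However the analyst-years are estimated, the conclusion is the same: no team of humans could search a space of that size by hand.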

In The Imitation Game, the idea of building a machine to read another machine seemed crazy to the admirals in charge of Turing’s code-breaking quest. At one point they stop it. Watching the movie in 2015, we all but snicker at the fools in uniform who get in Turing’s way. Give them a break. They had never heard of the concept we know as the Intelligence of Things, or the Internet, still less the Internet of Things, now subsumed in the acronym IoT. A cloud was a cloud, a dark mass on the horizon endangering shipping, not a host with a voracious appetite for a multiplicity of signals from things called smartphones, nor a haunt for legions of thieving hackers and other nasty intruders on privacy.

The electro-mechanical machine Turing built, a six-foot-tall, wall-length unit whose decrypts were circulated under the code name Ultra, broke enough of the “impossible” code to give the Allies a chance of winning the war at sea. But the British machine that beat the German machine was a product of human intelligence. The collaboration of mathematician and rotors was proving a point Clive Thompson would articulate 70 years later in his book Smarter Than You Think: we are not being dumbed down when we call on a computer’s memory system. We are actually enhancing our own native cognition. We need not feel inferior. Turing’s innovative conviction, that human intelligence could outwit itself by devising a machine capable of what an unaided human was not, was brilliantly sustained by Thomas “Tommy” Flowers at Bletchley. In 1943, Flowers completed Colossus, the world’s first programmable electronic computer, speeded by the electrons in 1,500 vacuum tubes. They were the electrons that the greatest of innovators, Edison, just missed detecting in 1889.

The progress from Turing and Flowers, from the room-size computer to the personal computer and the smartphone, is one of those successive discoveries and inventions—and failures—by many, translated into periods of innovation by a few. On May 16, 1960, Theodore Maiman demonstrated the first working practical laser. It was, he said later, “a solution looking for a problem.” Few had any use for it. Maiman had to create his own company to exploit its manifold uses, now so richly realized in today’s connected world. As an inventor who became an innovator, he was a classic example of the important distinction: The scientist seeks understanding, an inventor a solution. An innovator seeks a universal application of the solution by new or creative means. To understand the delineation of invention and innovation, Peggy Smedley, a commentator on the connected world, insists, “We have to look back before we can look forward.”

Alexander Graham Bell discovered that sound waves could be converted into undulating electric current, but his phone of 1876 was calculated to develop the American voice and lungs rather than encourage conversation. Thomas Edison made more than 3,000 experiments to invent the incandescent light bulb, only now being phased out, and has been the icon of inventiveness ever since. But others were close behind. Edison’s achievement was innovative, the conception and creation of a whole electrical industry, capped by the brilliance of his one-time assistant Samuel Insull, who evolved the peak/off-peak pricing system that powered American industry. An Edison is exactly what the Intelligence of Things movement requires, a leader who can make a universally accessible application—a system of hubs, perhaps—for the convergence of ideas and things.

WE HAVE TO LOOK BACK BEFORE WE CAN LOOK FORWARD.

But Edison missed the electron. He saw carbon spots on the glass of one of his experimental lamps but never worked out that they were deposited by electrons. Gifted others followed up years later and got the electrons into a vacuum tube. There they could detect and transform not only an alternating current but also an incoming radio wave to actuate a faint sound in a headphone. So we achieved radio.

But the tubes could not exploit the 1940 insight of the father of the digital age, Claude Shannon (1916-2001). He was a mathematician who amused himself riding a unicycle of his own design while he juggled beanbags and steel balls. His groundbreaking thesis was that any kind of message—words, images, music—could be communicated by binary digits, ones and zeroes, one representing an electrical switch turned on and zero a switch turned off. Then we had to wait for the chip, advancing from the transistor and the semiconductor, before we could have CDs, satellite communications, cellular telephones, personal computers, and the Internet.

The microprocessor industry has since excelled. It has made Gordon Moore’s Law look timid. Everyone marveled when, in a casual remark, he said the number of transistors in a dense integrated circuit doubles approximately every two years. Moore said later, “I had no idea that anybody would expect us to keep doubling [capacity] for 10 more years. If you extrapolated out to 1975, that would mean we’d have 65,000 transistors on a single integrated circuit. It just seemed ridiculous.” It wasn’t. It was the acme of prudence.
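The 65,000 in that recollection is simple doubling arithmetic. Moore’s original 1965 projection was a doubling every year, and ten annual doublings from the roughly 64 components on the most complex chips of 1965, the starting point usually quoted from his paper, land on his figure. A one-line check, purely illustrative:

```python
# Ten annual doublings from the ~64 components of 1965 (the commonly quoted
# starting point of Moore's 1965 projection) give Moore's "65,000" for 1975.
components_1965 = 64
components_1975 = components_1965 * 2 ** 10
print(f"{components_1975:,}")   # 65,536
```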

Today the industry can manufacture a billion transistors on a chip: all building blocks for our daily lives. Just as vital a key to our connected world was the liberation of software from hardware. The too-often overlooked Gary Kildall, who died at the age of 52 in 1994, wrote the first programming language for a microprocessor and the first floppy disk operating system before there was even a microcomputer, months before there was an Apple, years before IBM launched the personal computer. Kildall did it, moreover, in such a manner that programmers were no longer restricted by compatibility with the computer’s hardware. Anybody’s application could run on anybody else’s computer. It was the genesis of the whole third-party software industry. His partner, Tom Rolander, puts it well: “Think about how horrible it was for the software vendors before that time. They would have to have different copies of their program configured to different pieces of hardware.” Bill Gates adapted Kildall’s operating system and brilliantly developed from scratch a standard-setting mass-market company that within a decade achieved his mission to put a computer on every desk running Microsoft software. And finally, thanks to a collaboration of government, universities, and some farsighted individuals, we got the web as a decentralized network with a series of protocols to enable computers to communicate with one another and unique addresses for every webpage.

The Intelligence of Things is today a Christmas tree grown from a seed fertilized by genius, wreathed now with colorful lights, gifts, and novelties, like “Dorothy,” a device with a built-in Bluetooth chip from a tech company called iStrategyLabs. It slips into a shoe, even a ruby-red slipper. And when the wearer clicks her heels, it can send a message from a connected smartphone or summon an Uber car.

THERE WILL BE MILLIONS MORE LIKE HENRY ALFORD.

Some of the gifts under this tree are more thoughtful, like the ICEdot Crash Sensor, which goes on your cycling or climbing helmet and detects motion, changes in forces, and impacts. If you fall, ICEdot triggers a smartphone app over low-energy Bluetooth to sound an alarm, share your GPS coordinates, and notify emergency contacts.

The perennial gift under this tree, of course, is the latest iPhone, which is also a camera, a screen, and a sensor. It can send images across the world in a blink; it can play Bach or Beyoncé through an attached earphone, quietly so no one else can hear; and it can also, by the way, mail messages across the globe; search a library; pinpoint precisely where you are waiting for an Uber car to pick you up; show you its position in a traffic jam; and offer Scrabble while you wait, or a forgiving chess game that allows you to take back a bad move and suggests a better one. That’s a good idea for life itself.

Steve Jobs made bad decisions far outweighed by the great ones. He looked back and forward. He was not equipped to make an innovative contribution in the sense of circuits and voltage, but he had the vision of a seer to put together the inventions of others before him and do that with sublime taste. His axiom might be the motto of the Intelligence of Things: “People don’t know what they want until you show it to them.” He admired Edwin Land, who also regarded conventional wisdom as an oxymoron. Nobody except Land had even thought of how nice it would be to have a print of a photograph without waiting. It would have been nuts to think it possible, within the confines of a handheld camera, to develop the negative, rinse it, fix it, dry it, expose the positive, develop it, rinse it, fix it, wash it, dry it again and have the print ejected in 60 seconds. Steve Jobs did it digitally, thanks to Claude Shannon. Jobs’s unique contribution may be his application of design and humanistic thinking to technology, as evidenced in the first Apple Macintosh computer, then in Pixar films, the iPod, and the iPad. His “insanely great” products and marketing represent an imaginative apogee of form and function.

The beauty and intuitive usefulness of Apple products (but not the company’s insane passion for passwords) seem emblematic of the Intelligence of Things. To realize the full promise of the age of intelligence, there are things to be done cooperatively in the industry, including privacy, security, and interoperability standards. Without one common platform, working the way the Internet itself works, the full potential of the IoT will not be realized. Samsung writes code and designs products for Android, Apple for Apple. (I have lost track of how many times Apple has changed chargers.)

Neither Apple with iOS nor Google with Android is in any hurry to break down the walls between their app stores. It would be nice to integrate our apps so we can use just one to cause the garage door to open, activate the security camera, check on a sleeper, lower the air-conditioning. The good news is that the cost of making apps is falling. According to Goldman Sachs, some significant connection costs are also tumbling. In 10 years sensor costs have fallen from an average of $1.30 to 60 cents, bandwidth costs by a factor of nearly 40, processing costs by a factor of nearly 60, and GPS chips will become so cheap they are likely to be inside anything of value, including our bodies. The enablers of this new smart world are very busy creating still more appealing apps. As of June 2014, 1.3 million apps were offered for Android devices, 1.2 million from Apple.

New York Times writer Henry Alford started 2015 by confessing, “I am afraid of the Machine.” He asked with exasperation, “What Isn’t There an App For?” But being a conscientious guy, he was persuaded to test-consign his health, home, and wardrobe to three weeks of living off his smartphone. The Henry people used to know can now do the hip-hop Watergate Dance, and the springs in his dining room chairs won’t collapse anymore, because they’ve been fixed by “Andy,” sent over by the house concierge app TaskRabbit. There will be millions more like Henry Alford.

When it comes to the app I now use to regulate the climate in my home, I feel good about the marriage of my smartphone and smart thermostat. What more do I want in this brave new connected world? Well, I checked Nest, fed in my zip code and the square footage of the house, and got an immediate claim that they’d save us up to $532 a year. I am also going to ask Atlanta’s Soneter for a quote on its ultrasonic sensors so I don’t ever get a busted pipe in the winter and can count the dollars wasted in the hot summer. And now that I am thinking smart, I rebel against any more bending over in wind and rain to read the levels in the propane tank under the garden. I want the tank to talk to the thermostat. I want the tank to say that at the present rate it will be down to 5% in a week and that it intends to keep interrupting the thermostat until gas arrives. I want the intelligent thermostat to send three emails, one to me, one to the gas supplier, and one to my accountant to compare our costs with last month’s.
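None of that exists in my house yet, but the rule I am wishing for is simple enough to sketch. What follows is purely illustrative: the readings, thresholds, and addresses are invented, and a real system would use whatever interface the tank monitor and thermostat actually expose.

```python
# Purely illustrative sketch of the propane rule described above.
# All names, readings, and addresses here are invented for the example.
from dataclasses import dataclass


@dataclass
class TankReading:
    level_percent: float        # current fill level of the tank
    daily_burn_percent: float   # average percentage consumed per day


def days_until_floor(reading: TankReading, floor: float = 5.0) -> float:
    """Days until the tank drops to the floor level at the present burn rate."""
    if reading.daily_burn_percent <= 0:
        return float("inf")
    return max(reading.level_percent - floor, 0.0) / reading.daily_burn_percent


def recipients_to_alert(reading: TankReading) -> list[str]:
    """If the floor is a week away or less, email me, the supplier, and the accountant."""
    if days_until_floor(reading) <= 7:
        return ["me@example.com", "orders@gas-supplier.example", "accountant@example.com"]
    return []


if __name__ == "__main__":
    reading = TankReading(level_percent=23.0, daily_burn_percent=2.8)
    print(f"{days_until_floor(reading):.1f} days to 5%; alert {recipients_to_alert(reading)}")
```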

The accountant is only human, so I don’t expect to have a message from him yet telling me how many energy dollars I saved by heeding the alert about the overheated room. When I brag about all this to my friend Peter Wachter, a lighting and energy evangelist, he rebukes me for not already having installed the intelligent Philips Hue programmable, color-changing, Wi-Fi-controlled LED lights. “LEDs are at the vanguard of the smart home of the future, with their own on-board microprocessors,” he says. “Using third-party apps, each bulb is independently controllable for color and brightness. Simulate sunrise, sunset, invent random light shows or sync to music.”
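For the curious, “independently controllable for color and brightness” is less exotic than it sounds. The sketch below shows roughly what a third-party app sends to the Hue bridge’s local REST interface; the bridge address and the authorized username are placeholders, and details vary by bridge model and firmware.

```python
# Rough sketch of driving one Hue bulb via the bridge's local REST interface.
# BRIDGE_IP and USERNAME are placeholders; a real app first discovers the bridge
# on the LAN and registers an authorized username with it.
import requests

BRIDGE_IP = "192.168.1.2"            # placeholder LAN address of the Hue bridge
USERNAME = "your-authorized-user"    # placeholder token issued by the bridge
LIGHT_ID = 1                         # which bulb to control


def set_light(on: bool, brightness: int, hue: int, saturation: int) -> None:
    """Set one bulb's state (brightness/saturation roughly 1-254, hue 0-65535)."""
    url = f"http://{BRIDGE_IP}/api/{USERNAME}/lights/{LIGHT_ID}/state"
    requests.put(url, json={"on": on, "bri": brightness, "hue": hue, "sat": saturation},
                 timeout=5)


# A warm, dim, sunset-ish setting for a single bulb
set_light(on=True, brightness=80, hue=6000, saturation=200)
```

Sunrise simulations and light shows are, at bottom, schedules of calls like this one.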

I have a sentimental affection for the light bulb partly because of Edison’s 3,000 attempts to find a filament that would stay incandescent. Peter maintains the cooler LEDs, with their printed circuit boards, are better candidates for piggybacking ever more enhanced capacities. “And saving energy will drive LED adoption to gargantuan numbers.” It’s not the first time I have been astonished by the Intelligence of Things. I was a young science reporter in Manchester, England, where Alan Turing worked after the war. He had designed a big electronic stored-program computer, the Automatic Computing Engine (ACE), built at the National Physical Laboratory. I was asked to watch it work and give it a test question. On May 10, 1950, it had registered a clock speed of one megahertz, which made it then the world’s fastest computer. My question had none of the drama of speeding U-boats and convoys. I demanded an immediate answer to the question: What day will it be on my 55th birthday? In a flash it spat out the correct answer, “Tuesday.” Now I’m waiting for an app to make me look that young again.

Sir Harold Evans is Editor-at-large at Thomson Reuters and author of They Made America: From the Steam Engine to the Search Engine — Two Centuries of Innovators.