For the past few years, Marc Andreessen, a co-founder of Netscape and prominent Silicon Valley venture capitalist, has been arguing that software is going to eat the world.

Andreessen has made a lot of good predictions over his career, but we don’t just have to trust his track record on this one. We can see software breaking into new industries daily. Right now, education and medicine seem to be the prime targets, but it’s hard to imagine any job that hasn’t been changed by software in the past ten years. There is more software affecting jobs every day, and the trend seems to be accelerating.

On the other side of this growing ubiquity is the push from the likes of Golden Krishna, a designer best known for arguing that the best interface is no interface, who begs companies not to “slap an interface on it.” His pushback targets the trend of exposing users to software everywhere: there needs to be less interaction between users and software, not more.

Put those two trends together: more software is coming, and we need to interact with it less. That means computers are going to disappear.

The history of computers is all about augmentation. Vannevar Bush, an analog computing pioneer who headed the wartime research office that launched the Manhattan Project, first proposed the Memex in the 1930s as a way to extend his memory by recording and connecting the articles he encountered. When computers became a reality, it was their ability to help their human counterparts quickly work through complex calculations that made them valuable.

With the internet, computers began to augment the connections between people, and the computer became a source of knowledge instead of a sink that held it. The amount of content published quickly became overwhelming, and curation became critical. Companies like Google and Yahoo helped filter out the noise and made the internet more personally relevant. Recently, the social web has extended that personalization as virtual identities have become increasingly important.

Augmentation implies that the computer is adding something to our innate capabilities. A part of the interaction, though, has been our investment in the computer. A virtual identity is not an add-on to an offline identity, but an extension of it. As a result, the computer is becoming an integrated part of what we are.

Unfortunately, using a computer sucks. The workflow for interacting with a computer goes something like this:

  1. I want to know something
  2. I decide to ask the computer
  3. I get the computer ready to accept my question
  4. I phrase my question so that the computer can understand it, and I enter it
  5. The computer returns information relevant to my question
  6. I search that information for an answer
  7. I know the thing I wanted to know

I interact with the computer for steps two through six, but I only care about steps one and seven. I want to get rid of every step that involves a computer, and, luckily for me, computers are getting smart enough to do those steps for me. 
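
To make the complaint concrete, here is a toy sketch of those middle steps as code. Every name in it, along with the tiny stand-in “knowledge base,” is invented for illustration; the point is just that each explicit call is work a smarter computer could absorb.

    # A toy mapping of steps 2-6 onto code; every name here is invented,
    # and the tiny "knowledge base" stands in for the whole web.

    KNOWLEDGE = {"capital of france": "Paris"}

    def formulate(question: str) -> str:
        # Step 4: rephrase the question into something the machine accepts.
        return question.lower().rstrip("?")

    def search(query: str) -> list:
        # Step 5: the machine returns information relevant to the question.
        return [answer for key, answer in KNOWLEDGE.items() if key == query]

    def sift(results: list) -> str:
        # Step 6: the human digs through the results for the actual answer.
        return results[0] if results else "no answer"

    # Steps 2 and 3 are me deciding to ask and getting the machine ready.
    # The thesis: every one of these calls should move inside the machine.
    print(sift(search(formulate("Capital of France?"))))  # Paris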

That’s why computers are going to disappear. They’re learning how to do everything I use them for. Three examples show how software is already leading the way.

Nest, the smart thermostat developed by Tony Fadell, who led development of the iPod and iPhone at Apple, has received a ton of well-deserved praise from the industry for being simple to set up, easy to use, and a general pleasure to have in the house. Most of that praise focuses on the beauty of the device and the clarity of the control (singular “control,” notably, rather than “controls”).

What is truly amazing about the device, though, is how it disappears. At first, you adjust it whenever you want the temperature to move up or down, but as it learns your preferences and habits, it sets itself correctly more and more often. The idea behind Nest is that you don’t want to use your thermostat. You want your house to be a comfortable temperature, and you want to conserve energy when you can. Nest doesn’t want you to use it. The ideal version of Nest would have no controls and no display, but the house would always be the right temperature.
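
That learning loop is easy to sketch. The snippet below is a minimal schedule learner in the same spirit, not Nest’s actual algorithm; the class, its defaults, and the sample temperatures are all invented. It records each manual adjustment by hour of day and, once it has history for an hour, replays the average as the setpoint.

    from collections import defaultdict

    class LearningThermostat:
        """Toy schedule learner, not Nest's real algorithm: it remembers
        manual adjustments by hour of day and replays their average."""

        def __init__(self, default_f: float = 68.0):
            self.default_f = default_f
            self.history = defaultdict(list)  # hour of day -> manual setpoints

        def manual_adjust(self, hour: int, setpoint_f: float) -> None:
            # The user turns the dial; remember what they asked for and when.
            self.history[hour].append(setpoint_f)

        def predicted_setpoint(self, hour: int) -> float:
            # With enough history for this hour, the dial turns itself.
            samples = self.history[hour]
            return sum(samples) / len(samples) if samples else self.default_f

    thermostat = LearningThermostat()
    thermostat.manual_adjust(hour=7, setpoint_f=70.0)  # warmed the house Monday 7am
    thermostat.manual_adjust(hour=7, setpoint_f=72.0)  # and again Tuesday
    print(thermostat.predicted_setpoint(hour=7))       # 71.0: no dial needed Wednesday

The more mornings it sees, the fewer mornings you touch it, which is exactly the disappearing act described above.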

Waze is a navigation app for your smartphone. Its standout feature is that it pays attention to how fast you’re going and uses that information to crowdsource on-the-ground traffic conditions for everyone else using the app. If drivers using Waze are on the freeway going 30mph, Waze knows there’s traffic, and it directs other users around the jam.

Every driver using Waze passively contributes data that lets other users know more about the road they’re driving on. As the Waze community grows, the data collected becomes more accurate, and its directions gain the power to shape traffic itself. If every driver on the road were using Waze for directions, it would be possible to coordinate traffic, relieving overcrowded routes and speeding up everyone’s commute. When I’m driving, I want to get from A to B as fast as possible, and it’s the computer’s job to get me there. Though the turn-by-turn navigation is still visible, the data gathering and decision-making that make Waze amazing are all about making the computer disappear.
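
The core of that crowdsourcing is simple enough to sketch. The toy detector below is not Waze’s real pipeline; the segment name, free-flow speed, and thresholds are all invented. Each phone passively reports its speed for a road segment, and the segment is flagged once the average report falls well below free flow.

    from collections import defaultdict
    from statistics import mean

    FREE_FLOW_MPH = {"US-101 north, mile 420": 65.0}
    CONGESTION_RATIO = 0.6  # congested if average speed < 60% of free flow
    MIN_REPORTS = 3         # need a few drivers before trusting the signal

    reports = defaultdict(list)  # road segment -> speeds phoned in by drivers

    def report_speed(segment: str, speed_mph: float) -> None:
        # Each phone running the app contributes its speed passively.
        reports[segment].append(speed_mph)

    def is_congested(segment: str) -> bool:
        samples = reports[segment]
        if len(samples) < MIN_REPORTS:
            return False
        return mean(samples) < CONGESTION_RATIO * FREE_FLOW_MPH[segment]

    for speed in (31.0, 28.0, 35.0):  # three drivers crawling on the freeway
        report_speed("US-101 north, mile 420", speed)

    print(is_congested("US-101 north, mile 420"))  # True: route others around it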

The last example is Google Now. Google Now is a new Android feature that surfaces relevant information before you ask for it. If you follow a sports team, Google Now gives you the score while the game is on. If you’re at a bus stop, it tells you when the next bus is coming. If you drive, it figures out where home and work are and lets you know if there’s unusual traffic on your route when you start your day.

Google Now is the broadest promise of “less” computing. It figures out what you want before you ask and delivers the answer automatically. This means you spend less time looking at your phone, and the hardware, which provides the answers, becomes less important. Looking ahead to something like Google Glass, where Google has packed an immensely smart computer into a pair of glasses, the hardware can become even less intrusive and provide ambient knowledge without any interaction or inconvenience. The computer can fade away so only the information you need to know is left visible.
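
One way to picture this behavior is as a set of context rules that volunteer “cards” before you ask. The sketch below is hypothetical: Google has never described Now as a simple rule engine, and every field and threshold in it is invented for illustration.

    from dataclasses import dataclass

    @dataclass
    class Context:
        place: str
        team_playing: bool
        commute_minutes: int
        usual_commute_minutes: int

    def cards_for(ctx: Context) -> list:
        # Each rule watches the context and volunteers a card, unprompted.
        cards = []
        if ctx.team_playing:
            cards.append("Your team is playing: live score")
        if ctx.place == "bus stop":
            cards.append("Next buses from this stop")
        if ctx.commute_minutes > 1.5 * ctx.usual_commute_minutes:
            cards.append("Unusual traffic on your route: leave early")
        return cards

    # Standing at a bus stop during a game, with a bad commute ahead:
    print(cards_for(Context("bus stop", True, 50, 30)))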

This year the Human Microbiome Project released a huge amount of research about the bacteria that live in humans. Incredibly, less than 10% of the cells in our body are human, and we depend on the organisms inside us to survive. A human is really a wrapper around a diverse group of other species living together. This is, more and more, the relationship we have with computers. A recent Droid commercial advertises the device not as an upgrade to your phone, but as an upgrade to yourself. It might be a stretch today, but it won’t be for long. We are integrating computers with ourselves more deeply every day, and it’s only a matter of time before they disappear inside us completely.