
The year was 1970. Forty-five years ago. I was 15 and learning to program my first computer, a 1K, room-sized monster that ate punchcards and paper tape. I had to take a bus to a high school in another part of town to visit it. I wrote programs in BASIC, COBOL and Fortran, languages that, when it comes to consumer electronics, are now almost as dead as Latin (which was still taught in school back then). Older boys turned stacks of punchcards into dot-matrix printouts of Playboy centrefolds, which, at a distance, were pretty compelling. What sparse, secret Internet there was flowed between a few university computers in the U.S., far from the awareness of the general public.

I remember this, sometimes, when I see a 15-year-old deep into a smartphone or a tablet computer. When they are my age, their youth will seem as unfathomably primitive as mine would seem to them now: “Our corneal implants couldn’t process 3D graphic overlays in real time? How did we live like such savages?” they will ask. 

This is important to keep in mind, because it’s easy to imagine we live in a world of wonders now. I can now ask my watch to tell my phone to play me almost any music in the world, and it will comply. I can ask my phone what planes are overhead right now, and it will tell me. I can have a video conference in the palm of my hand. And yes, that is miraculous. But it is also the kind of fumbling magic a young boy performs for his parents. Wifi stutters. Screens dull in sunlight. Batteries have the stamina of overweight pugs. We have to carry cables with us. We are, despite the rocketing progress, still in the Steam Era of technology.

As researchers learn more about material science, chip fabrication and battery chemistry; as biologists discover more about how we see and hear; as systems engineers, cognitive psychologists and machine-learning experts pool and grow their knowledge, we will bring our communications tools closer to us. We will wear them like clothes, then like skin and then like new organs that boost our senses or give us ones beyond the usual five. Our understanding of time and space, of here and there, of what we know and what we can find out, will blur.

We will no longer have smartphones, or smartwatches. Smart implants will replace them, magnifying our senses, our sense of self and our memories.

I believe this because I see no reason to believe that the speed and processing power of chips will do anything but climb dramatically year by year. Meanwhile, the cost of memory will plummet to near zero for a petabyte (1,024 terabytes) of storage. Far-fetched? In 1970, a megabyte of memory could be had from IBM for $734,000. Today it costs 0.0056 cents.
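The implied rate of decline is easy to sketch. Taking the article's own figures ($734,000 per megabyte in 1970, 0.0056 cents per megabyte today) and assuming "today" means 2015, a back-of-envelope calculation shows the price of a megabyte has halved roughly every 16 months:

```python
import math

# Back-of-envelope: how fast did memory prices fall between the two
# data points the article cites? (Assumes "today" = 2015.)
price_1970 = 734_000.0        # dollars per megabyte, IBM, 1970
price_now = 0.0056 / 100      # "0.0056 cents" converted to dollars
years = 2015 - 1970

factor = price_1970 / price_now            # total drop: ~13-billion-fold
halving_time = years / math.log2(factor)   # years for the price to halve

print(f"price fell {factor:,.0f}x; halving every {halving_time:.2f} years")
```

Roughly 33 halvings in 45 years — a steeper curve than the classic 18-to-24-month Moore's law cadence for transistor counts.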

On top of this, breakthroughs in wonder materials like graphene will mean circuitry thinner and faster than anything now in production.

When I was 15 I couldn’t imagine owning a computer. Teens today have no idea what’s about to happen to them.

Wayne MacPhail has been a print and online journalist for 25 years, and is a long-time writer on technology and the Internet.


