Are we evolving tech, or is tech evolving us?

Watching the Apple Maps fiasco slowly unfold across Twitter over the past week, a stray thought struck us like a lightning bolt: humans don’t just use technology — we adapt to it.

In the dark days before the first iPhone — say, 2006 — people who wanted to catch a train wandered down to the station, flipped open a brochure or stared at a route map, and figured out how to get where they needed to be the old-fashioned way. A mere six years later, we’ve become so used to Google Maps’ excellent transit directions that their omission from Apple’s homebrewed Maps has caused a universal conniption fit. Some people literally don’t know how to find the right bus without Google’s help.

It’s a fascinating concept: If people can rewire themselves so thoroughly in just six years, imagine what our lives — and bodies — will be like sixty years from now.

What might be coming down the pike as we increasingly rely on technology? Because let’s face it, we ain’t getting any less of it going forward.

Damned dirty apes

Physically evolving to match the technology of the times isn’t anything new. In fact, humanity wouldn’t be humanity as we know it if the earliest cave squatters failed to adapt to their surroundings.

According to Richard Wrangham, author of Catching Fire, humanity would still be swinging in the trees if it weren’t for a primitive form of technology: cooking. Cooked food is softer than uncooked food. Wrangham claims that eating cooked food led to a mutation that reduced the size of our ancestors’ jaws, which gave them the capacity for speech and freed up space in their skulls for bigger, more advanced brains.

Chew on that as we journey into a speculative look into humanity going forward.

The future, today

Actually, physiological changes are already occurring in humans who drink deeply from the technological well.

As computers and the Internet become more and more ubiquitous, our brains are changing to compensate for the change in reading and memory habits. In fact, the sheer amount and availability of information that’s out there is changing the way we remember.

In a study titled “The Google Effect,” Columbia University researchers discovered that humans are actually starting to use the gigantic knowledge repository that is the Internet as a personal memory bank, rather than a simple information resource. In a nutshell, we’re getting better at remembering where and how to find online information — such as through a simple Google search — rather than the actual information itself.

UCLA neuroscientist Gary Small conducted a study of 24 people aged 55-plus, half of whom were “Net-savvy.” He had all of them conduct basic Web searches for information; MRI testing showed that the Netizens had twice as much neural activity as non-Netizens.

We process brief bytes of information in bursts now, and the world moves to compensate. Witness the Twitter-esque feeds on news and sports shows, or the rise of the tl;dr meme.

Is this rewiring of our base structure a bad thing? Only if you’re not one of the Connected Ones. Most of the information in the world is literally at our fingertips, and our brains need to compensate for the deluge. Human brains can’t store as much as near-infinite server hard drives, so adjusting our physiology to remember how to find data is evolutionary efficiency at its finest — as long as you don’t stumble into a rural area with poor broadband connectivity, that is.

How else will we change?

There’s no reason to assume our minds will be the only aspects of humanity altered in the coming years. Our cultural behavior is already changing, as well; NPR reports that high school reunions are in serious decline now that everyone can keep up with their best buds through Facebook.

In general, online connections aren’t as strong as face-to-face connections, but I’d argue that keeping tabs on your buddies year ’round rather than once every five or ten years is a vast improvement.

Consider further the way we see, smell and hear. Historically, hearing aids and glasses have been used to help people with below-average senses reach normal levels. Now we’re starting to break that paradigm.

Augmented reality headsets capable of displaying info at a glance have long been the purview of pioneers like Steve Mann. No longer. Nokia’s Lumia 920 features robust augmented reality support. The awe-inspiring Google Glasses could be combined with Jelly Bean’s Google Now to engulf us in an enhanced, Matrix-like world where the information you need is there when you need it. Then, our brains wouldn’t even need to remember how to find information. The engine handles it all.

Speaking of the Matrix, current technology is limited by modern-day battery life. That barrier should fall in the future; scientists are working on ways to turn the human body into a battery, tapping into kinetic energy generated by specialized clothing — or even heartbeats. The technology is still in its infancy, but once your heart powers your Google Glasses, there will never be any reason to take them off.

Our very bodies could change shape and form with the help of technology; “body architect” Lucy McRae has already developed a pill that makes you smell like perfume when you sweat, while geneticists have discussed engineering children to be smaller in order to reduce their ecological footprint in an increasingly crowded world. Surgeon Anthony Atala has demonstrated a proof-of-concept 3D printed kidney. Got a bum liver? Just pass the bottle and print out a new one.

Advances in hassle-removing technologies could have a profound effect on human society, but it’s still too early to tell how the changes will play out. How will our bodies — and our society — respond when all labor is handled by robotic workers, 3D printers provide our food (and body parts), and our cars drive themselves?

The Singularity

As man and machine become more and more co-dependent and Moore’s Law makes technology more and more powerful, futurists anticipate the arrival of an event they call “The Singularity”: the day that superhuman intelligence breaks free of our flesh-and-blood confines, whether through brain augmentation or through advanced AI and brain uploads.

Different theorists predict different dates, but noted futurist Ray Kurzweil expects the Singularity to occur as early as 2045.

Will the Singularity ever occur? We won’t know for a while yet. One thing is for certain, however: the more we use technology, the more it shapes humanity’s very core — and the more powerful we become.

And before any Luddites bemoan the prospective loss of our very selves in a sea of ones and zeros, consider this: Socrates thought the simple written word would obliterate individual knowledge and self-identity, too. From Plato’s Phaedrus:

This discovery of yours will create forgetfulness in the learners’ souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves.

Haters gonna hate, but you can’t stop progress… unless, apparently, you ditch Google Maps.

[Image credits: Brain Circuit – takito/Shutterstock; Neurons – Lightspring/Shutterstock; Project Glass – Google]

Brad Chacos
Former Digital Trends Contributor
When Brad's not busy tinkering with PCs or playing with gadgets, he can often be found loudly braying his (usually)…