The future of A.I.: 4 big things to watch for in the next few years

A.I. isn’t going to put humanity on the scrap heap any time soon. Nor are we one Google DeepMind publication away from superintelligence. But make no mistake about it: Artificial intelligence is making enormous strides.

As noted in the Artificial Intelligence Index Report 2021, the number of journal publications in the field grew by 34.5% last year, up sharply from the 19.6% growth seen the year before. A.I. is going to transform everything from medicine to transportation, and few would argue otherwise.

Here in 2021, we’re well into the deep learning revolution that supercharged A.I. in the twenty-first century. But “deep learning” is a broad term that, by now, most people are familiar with. So where are the next big advances in A.I. coming from? Where should you be looking to see the future unfold in front of you? Here are some of the technologies to keep an eye on.

Transformers: More than meets the eye

“Robots in disguise // Autobots wage their battle // To destroy the evil forces // Of the Decepticons.” Wait, that’s something else!

In fact, far from a franchise that enjoyed its heyday last century, Transformers — the A.I. model — represent one of the field’s most promising present-day advances, particularly in natural language processing research.

Language understanding has been a key interest in A.I. since before it was even called A.I., dating back all the way to Alan Turing’s proposed test for machine intelligence. Transformer models, first described by Google researchers in 2017, have been shown to be vastly superior to previous language models. One reason is the almost unfathomably large datasets they can be trained on. They can be used for machine translation, summarizing documents, answering questions, understanding the content of video, and much, much more. While large language models certainly pose problems, their success is not to be denied.
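
To get a feel for how accessible these models have become, here’s a minimal sketch of Transformer-powered document summarization using the open-source Hugging Face transformers library. The library choice and the parameter values are our illustrative assumptions, not anything the researchers above prescribe:

```python
# A minimal sketch: summarizing text with a pre-trained Transformer
# via the open-source Hugging Face "transformers" library (an
# illustrative choice; many toolkits would work just as well).
from transformers import pipeline

# Load a pre-trained summarization model (weights download on first run).
summarizer = pipeline("summarization")

article = (
    "Transformer models, first described by Google researchers in 2017, "
    "have been shown to be vastly superior to previous language models. "
    "They can be used for machine translation, summarizing documents, "
    "answering questions, and much more."
)

# min_length and max_length bound the summary's length in tokens.
summary = summarizer(article, max_length=40, min_length=10)
print(summary[0]["summary_text"])
```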

The advent of Transformers led to the development of GPT-3 (Generative Pre-trained Transformer 3), which boasts 175 billion parameters, was trained on 45 TB of text data, and cost upward of $12 million to build. At the start of this year, Google took back its crown by debuting the Switch Transformer, a new language model with some 1.6 trillion parameters, roughly nine times the size of GPT-3. The Transformer revolution is just beginning.

Generative adversarial networks

Conflict doesn’t usually make the world a better place. But it certainly makes A.I. better.

Over the past several years, there have been considerable advances in image generation: the use of A.I. to dream up pictures that are indistinguishable from real-world photographs. This isn’t just about social media-fueled conspiracy theories fooling people into thinking that President Biden has been caught partying with the Illuminati, either. Image generation can be used for everything from improving search capabilities to helping designers create variations on a theme to generating artwork that sells for millions at auction.

So where does the conflict come into play? One of the principal technologies for image generation is called a generative adversarial network (GAN). This class of machine learning framework pits two algorithms, a “generator” and a “discriminator,” against each other in a tug-of-war, passing images and feedback back and forth and improving incrementally until the discriminator can no longer tell real from fake. GANs have also been used to generate synthetic genetic code that researchers can put to use.
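
To make that tug-of-war concrete, here’s a toy sketch of a GAN training loop in PyTorch. It learns to imitate simple one-dimensional data rather than images, and every detail (network sizes, learning rates, step counts) is an illustrative assumption rather than a production recipe:

```python
# A toy GAN in PyTorch: the generator learns to imitate samples from a
# Gaussian while the discriminator learns to tell real from fake.
import torch
import torch.nn as nn

# Generator maps random noise to fake samples; discriminator scores
# how "real" a sample looks (1 = real, 0 = fake).
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

for step in range(2000):
    # "Real" data: the distribution the generator must learn to imitate.
    real = torch.randn(64, 1) * 0.5 + 3.0
    fake = G(torch.randn(64, 8))

    # Discriminator step: reward correct real/fake judgments.
    opt_d.zero_grad()
    d_loss = loss_fn(D(real), torch.ones(64, 1)) + \
             loss_fn(D(fake.detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Generator step: produce samples the discriminator calls real.
    opt_g.zero_grad()
    g_loss = loss_fn(D(G(torch.randn(64, 8))), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()

# After training, generated samples should cluster near the target mean of 3.0.
print("fake sample mean:", G(torch.randn(1000, 8)).mean().item())
```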

Look for plenty more innovative applications in the near future.

Neuro-symbolic A.I.

In a December 2020 publication, researchers Artur d’Avila Garcez and Luis Lamb described neuro-symbolic A.I. as the “third wave” of artificial intelligence. Neuro-symbolic A.I. is not, strictly speaking, totally new. It’s more like getting two of the world’s greatest rock stars, who once battled at the top of the charts, together to create a supergroup. In this case, the supergroup consists of self-learning neural networks and rule-based symbolic A.I.

“Neural networks and symbolic ideas are really wonderfully complementary to each other,” David Cox, director of the MIT-IBM Watson A.I. Lab in Cambridge, Massachusetts, previously told Digital Trends. “Because neural networks give you the answers for getting from the messiness of the real world to a symbolic representation of the world, finding all the correlations within images. Once you’ve got that symbolic representation, you can do some pretty magical things in terms of reasoning.”
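
To illustrate the division of labor Cox describes, here’s a deliberately simplified sketch in Python. The neural_perception function is a hypothetical stand-in for a trained network; the point is that once the messy input has been reduced to symbols, the reasoning step becomes exact, inspectable logic:

```python
# A toy sketch of the neuro-symbolic split: a (hypothetical) neural
# model turns raw pixels into symbols, then hand-written rules reason
# over those symbols.
def neural_perception(image):
    # Stand-in for a trained network; in practice this would be an
    # object detector returning the objects it found in the image.
    return [{"shape": "cube", "color": "red", "size": "large"},
            {"shape": "sphere", "color": "blue", "size": "small"}]

def symbolic_reasoner(symbols, query_shape):
    # Rule-based step: exact, explainable logic over the symbols.
    return sum(1 for obj in symbols if obj["shape"] == query_shape)

symbols = neural_perception(image=None)  # placeholder input
print("cubes on the tray:", symbolic_reasoner(symbols, "cube"))
```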

The results could give us A.I. that is better at carrying out this reasoning process, as well as more explainable A.I. that can, well, explain why it made the decision that it did. Look for this to be a promising avenue of A.I. research in the years to come.

Machine learning meets molecular synthesis

Along with GPT-3, last year’s most significant A.I. advance was DeepMind’s astonishing AlphaFold, which applied deep learning to the decades-old biology challenge of protein folding. Solving that problem could help cure diseases, accelerate drug discovery, and deepen our understanding of life at the cellular level. This last entry on the list is less a specific A.I. technology and more an example of how A.I. is making a big difference in a single domain.

Machine learning techniques are proving transformative for healthcare and biology in fields like molecular synthesis, where ML can help scientists work out which candidate drugs are worth evaluating and how to most effectively synthesize them in the lab. There is perhaps no more life-changing arena in which A.I. will be put to work over the next decade and beyond.
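
As a hedged illustration of what that screening step can look like in code, here’s a minimal sketch using the open-source RDKit and scikit-learn libraries. The molecules and activity labels below are made up for the example; a real screening campaign would train on thousands of measured compounds:

```python
# A minimal virtual-screening sketch: featurize candidate molecules
# and rank them with a trained activity model. The training data here
# is invented purely for illustration.
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem
from sklearn.ensemble import RandomForestClassifier

def featurize(smiles):
    mol = Chem.MolFromSmiles(smiles)
    # Morgan (circular) fingerprint: a standard bit-vector descriptor.
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=1024)
    return np.array(fp)

# Tiny made-up training set: molecules labeled active (1) / inactive (0).
train = [("CCO", 0), ("c1ccccc1", 1), ("CC(=O)O", 0), ("c1ccncc1", 1)]
X = np.array([featurize(s) for s, _ in train])
y = np.array([label for _, label in train])

model = RandomForestClassifier(n_estimators=100).fit(X, y)

# Rank new candidates by predicted probability of activity.
candidates = ["c1ccc2ccccc2c1", "CCCC"]
scores = model.predict_proba([featurize(s) for s in candidates])[:, 1]
for smi, p in sorted(zip(candidates, scores), key=lambda t: -t[1]):
    print(f"{smi}: predicted activity {p:.2f}")
```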
