A.I. can spot galaxy clusters millions of light-years away

The galaxy cluster Abell 1689. The deep learning tool Deep-CEE has been developed to speed up the process of finding galaxy clusters such as this one, and it takes inspiration from the pioneer of galaxy cluster finding, George Abell, who manually searched thousands of photographic plates in the 1950s. NASA/ESA

Galaxy clusters are enormous structures of hundreds or even thousands of galaxies that move together, and for many years they were among the largest known structures in the universe (until superclusters were discovered). But despite their massive size, they can be hard to identify because they are so far away from us.

Now, a Ph.D. student has created a deep learning artificial intelligence tool that could help tackle this problem. The tool is called “Deep-CEE” (Deep Learning for Galaxy Cluster Extraction and Evaluation), and it can help to pick out galaxy clusters even when they are dim and far away.


The A.I. looks at color images and picks out potential galaxy clusters using neural networks, which mimic the way that a human brain would learn to recognize objects. It was trained using images of known galaxy clusters, until it was able to identify new clusters in images even when other objects were present as well.
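The article doesn’t spell out Deep-CEE’s internal architecture, but the training approach it describes — showing a neural network labeled examples until it can recognize new ones — looks roughly like the sketch below. This is a minimal illustration, assuming a small convolutional classifier in PyTorch, with random tensors standing in for real survey cutouts; it is not the actual Deep-CEE model.

```python
# A minimal sketch of the supervised training loop described above.
# Assumptions: a toy convolutional classifier and random "images" in
# place of real, labeled survey cutouts of galaxy clusters.
import torch
import torch.nn as nn

class ClusterClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to one value per channel
        )
        self.head = nn.Linear(32, 2)  # two classes: cluster / no cluster

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = ClusterClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in "training set": 32 random 64x64 RGB cutouts with binary labels.
images = torch.randn(32, 3, 64, 64)
labels = torch.randint(0, 2, (32,))

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)  # compare predictions to labels
    loss.backward()                        # learn from the mistakes
    optimizer.step()
```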

“We have successfully applied Deep-CEE to the Sloan Digital Sky Survey,” Matthew Chan, the Ph.D. student at Lancaster University who is responsible for this work, said in a statement. “Ultimately, we will run our model on revolutionary surveys such as the Large Synoptic Survey Telescope (LSST) that will probe wider and deeper into regions of the Universe never before explored.”

This work will be valuable for future projects that require mining large amounts of data, such as upcoming telescope surveys. When a telescope produces a very large dataset, Deep-CEE could quickly scan through the images and predict where galaxy clusters might be found. Projects like the LSST, which comes online in 2021 and will image the entire sky of the southern hemisphere, will generate a whopping 15TB of data every night, so A.I. will be needed to run through the images and flag items of interest for humans to then check out.
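To make that scan-and-flag workflow concrete, here is a hedged sketch: run a trained model over batches of survey images and keep only high-confidence candidates for human follow-up. The model, confidence threshold, and batching are illustrative assumptions, not details from the paper.

```python
# A sketch of batch-scanning survey images for likely galaxy clusters.
# Assumptions: `model` is a trained two-class classifier (e.g. the toy
# ClusterClassifier above); the 0.9 threshold is purely illustrative.
import torch

THRESHOLD = 0.9  # only surface confident detections to human reviewers

@torch.no_grad()  # inference only, no gradients needed
def flag_candidates(model, image_batches):
    model.eval()
    candidates = []
    for batch_idx, batch in enumerate(image_batches):
        # Probability that each image contains a cluster (class 1).
        probs = torch.softmax(model(batch), dim=1)[:, 1]
        for i, p in enumerate(probs):
            if p >= THRESHOLD:
                candidates.append((batch_idx, i, float(p)))
    return candidates

# Usage with the sketch above: feed batches of cutouts, get back a short
# list of (batch, index, confidence) tuples for humans to inspect.
# candidates = flag_candidates(model, [images])
```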

“Data mining techniques such as deep learning will help us to analyze the enormous outputs of modern telescopes,” Dr. John Stott, Chan’s Ph.D. supervisor, said in the same statement. “We expect our method to find thousands of clusters never seen before by science.”

The work was presented at the Royal Astronomical Society’s National Astronomy Meeting this week, and the paper is available on the preprint archive arXiv.
