
Twitter is using A.I. to ditch those awful auto-cropped photos

Image credit: Twindesign/123RF
Twitter's auto crop feature works much like a tweet's character limit, keeping image previews on the microblogging platform consistent with the rest of the feed. Now those crops are getting better, thanks to artificial intelligence: Twitter is rolling out a smarter auto crop based on neural networks, the company announced in a blog post on January 24.

The previous auto crop feature relied on face detection to keep faces in the frame. When no faces were detected, the software simply cropped the preview at the center, and a click on the image allowed users to see the entire shot. Twitter says the center crop often produced awkward previews, and the software sometimes failed to identify faces correctly in the first place.
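Conceptually, that old behavior boils down to something like the sketch below. This is an illustrative, hypothetical helper (not Twitter's actual code): center the preview on a detected face if one exists, otherwise fall back to the middle of the image.

```python
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # (x, y, width, height)

def legacy_crop(image_size: Tuple[int, int],
                preview_size: Tuple[int, int],
                faces: List[Box]) -> Box:
    """Pick a preview crop the way the old system is described:
    centered on a detected face if there is one, otherwise on the image center."""
    img_w, img_h = image_size
    crop_w, crop_h = preview_size

    if faces:
        # Center the crop on the largest detected face.
        x, y, w, h = max(faces, key=lambda b: b[2] * b[3])
        cx, cy = x + w // 2, y + h // 2
    else:
        # No faces detected: plain center crop.
        cx, cy = img_w // 2, img_h // 2

    # Clamp so the crop window stays inside the image.
    left = min(max(cx - crop_w // 2, 0), img_w - crop_w)
    top = min(max(cy - crop_h // 2, 0), img_h - crop_h)
    return (left, top, crop_w, crop_h)
```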


To fix those awkwardly cropped previews, Twitter engineers trained a neural network on what are called saliency maps. Saliency maps are built from eye-tracking studies and show which areas of an image most catch the viewer's eye. Earlier research in the area showed that viewers tend to focus on faces, text, animals, other prominent objects, and areas of high contrast.

Twitter used that earlier data to teach the program which areas of an image are the most important. With that training, the software can recognize those features and place the auto crop so that the most visually interesting areas stay inside the preview.
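The cropping step itself is simple once a saliency map exists. The sketch below is a minimal illustration, not Twitter's actual implementation: it assumes a saliency score has already been predicted for every pixel and just centers the preview window on the highest-scoring point.

```python
import numpy as np

def saliency_crop(saliency: np.ndarray, crop_h: int, crop_w: int):
    """Center the preview crop on the most salient point of an image.
    `saliency` is an (H, W) array of predicted attention scores."""
    img_h, img_w = saliency.shape
    # Location where viewers are predicted to look first.
    cy, cx = np.unravel_index(np.argmax(saliency), saliency.shape)
    # Clamp the crop window so it stays within the image bounds.
    top = int(min(max(cy - crop_h // 2, 0), img_h - crop_h))
    left = int(min(max(cx - crop_w // 2, 0), img_w - crop_w))
    return top, left, crop_h, crop_w
```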

But Twitter wasn't done. While the saliency software works well, it is also slow, which would have kept tweets from posting in real time. To solve the awkward-crop problem without a slowdown, Twitter refined the program using two techniques that together made it roughly ten times faster. The first trained a smaller, faster network to reproduce the output of the accurate but slow original. The second reduced the number of visual points mapped on each image, effectively discarding the smaller, less important cues while keeping the largest areas intact.
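The first of those speed-ups is commonly known as knowledge distillation. The snippet below is a rough, hedged sketch of the idea rather than Twitter's code: a small "student" network is trained to reproduce the saliency maps of the large, slow "teacher" network, and both models are assumed to map images to saliency maps of the same shape.

```python
import torch
import torch.nn as nn

def distill_step(teacher: nn.Module, student: nn.Module,
                 images: torch.Tensor,
                 optimizer: torch.optim.Optimizer) -> float:
    """One training step of the distillation idea described above:
    the small, fast student learns to mimic the slow teacher's saliency maps."""
    with torch.no_grad():
        target = teacher(images)      # accurate but slow saliency prediction
    prediction = student(images)      # fast approximation being trained
    loss = nn.functional.mse_loss(prediction, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```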

Twitter Auto Crop, before and after. Images used with permission by copyright holder.

The resulting software lets images post in real time, but with better crops. In a set of before-and-after pictures, Twitter shows images with faces that the earlier system failed to detect now cropped to the face rather than the feet. Other examples show objects that the first program cut out because they didn't sit in the middle of the image, but that the updated algorithm crops more appropriately. Another example shows the program recognizing text and adjusting the crop to include a sign.

The updated cropping algorithm is already rolling out globally on both iOS and Android apps as well as Twitter.com.

Hillary K. Grigonis