
Chinese scientists built a ‘robot goddess,’ then made it subservient and insecure

Interactive robot goddess. Image: www.news.cn

An ultra-realistic robot was unveiled last week by researchers from the University of Science and Technology of China (USTC). Jia Jia, as the female robot has been named, is apparently capable of basic communication, interaction with nearby people, and natural facial expressions. Unfortunately, many of her pre-programmed interactions appear to be highly stereotypical.


For example, if Jia Jia detects that someone is taking a photo of her, she’ll warn the photographer to stand back or else the picture will make her face “look fat.” Jia Jia can’t do much beyond that, though. Essential human emotional responses like laughing and crying are not in the robot’s repertoire, and her hands have been left lifeless. She does, however, speak in a strikingly subservient manner. The prompt “Hello” elicits the reply, “Yes my lord. What can I do for you?”


We’ve seen a few other ultra-realistic female robots recently. A few weeks ago, a 42-year-old product and graphic designer from Hong Kong revealed “Mark 1,” a $50,000 female robot he built himself to resemble Scarlett Johansson. The project, which took a year and a half to complete, was supposedly the fulfillment of a childhood dream. Like Jia Jia, Mark 1 is capable of basic human-like interaction, command responses, and movement. Mark 1 actually outperforms Jia Jia in that it can move its limbs, turn its head, bow, smirk, and wink.

And that’s just the most recent example. Last year, researchers at the Intelligent Robotics Laboratory at Osaka University in Japan and Shanghai Shenqing Industry in China revealed Yangyang, a dynamic robot with an uncanny resemblance to Sarah Palin. Yangyang also appears to do more than Jia Jia, with the ability to hug and shake hands.


The USTC researchers spent three years developing Jia Jia, an apparent labor of love. And they aren’t done yet. Team director Chen Xiaoping says he hopes to develop and refine their creation, equipping it with artificial intelligence through deep learning and the ability to recognize people’s facial expressions, according to Xinhua News. Chen hopes Jia Jia will become an intelligent “robot goddess.” He added that the prototype was “priceless” and that he would not yet consider mass production.
