Samsung envisions a world filled with AI assistants, but not the sort of chatbot-powered assistants to which we’ve become accustomed. In a press release late this evening, the Seoul-based company unveiled Neon, a project developed by subsidiary STAR Labs that ambitiously aims to deliver “immersive … services” that “[make] science fiction a reality.”
Pranav Mistry, a human-computer interaction researcher and former senior vice president at Samsung Electronics, explained that the Core R3 software engine underlying Neon animates realistic avatars designed to be used in movies, augmented reality experiences, and web and mobile apps. “[Core R3] autonomously create[s] new expressions, new movements, [and] new dialog … completely different from the original captured data [with latency of less than a few milliseconds],” he wrote in a tweet.
Neon’s avatars look more like videos than computer-generated characters, and that’s by design: beyond media, they’re intended to become “companions and friends” and stand in for concierges and receptionists in hotels, stores, restaurants, and more. That said, they’ve been engineered with strict privacy guarantees, such that private data isn’t shared without permission.
And they won’t be as capable as your average AI assistant. STAR Labs makes explicit in an FAQ shared with reporters that Neon avatars “don’t know it all” and aren’t an interface to the internet to “ask for weather updates” or “play your favorite music.” They’re instead meant to have conversations and help with “goal-oriented” tasks, or to assist with marginally complicated tasks that require a “human touch.”
When the beta launches later this year, businesses will be able to license or subscribe to Neon as a service, according to Mistry. A second component — Spectra, which will be responsible for the Neon avatars’ intelligence, learning, emotions, and memory — is still in development, and it might make its debut at a conference later this year.
“We have always dreamt of such virtual beings in science fictions and movies,” he said in a statement. “[Neon avatars] will integrate with our world and serve as new links to a better future, a world where ‘humans are humans’ and ‘machines are humane’.”
It’s worth noting that AI-generated high-fidelity avatars aren’t exactly the most novel thing on the planet. In November 2018, during China’s annual World Internet Conference, state news agency Xinhua debuted a digital version of anchor Qiu Hao, dubbed Xin Xiaohao, capable of reading headlines around the clock. Startup Vue.ai leverages AI to generate on-model fashion imagery by sussing out clothing characteristics and learning to produce realistic poses, skin colors, and other features. Separately, AI and machine learning have been used to produce videos of political candidates like Boris Johnson delivering speeches they never actually gave.
Neon brings to mind Project Milo, a prototypical “emotional AI” experience developed in 2009 by Lionhead Studios. Milo featured an AI structure that responded to spoken words, gestures, and several predefined actions, along with a procedural generation system that constantly updated a built-in dictionary capable of matching words in conversations with voice-acting clips.
Milo never saw the light of day, but Samsung appears keen to commercialize the technology behind Neon in the coming years. Time will tell.
(Source: VentureBeat https://venturebeat.com/2020/01/06/samsung-subsidiary-star-labs-showcases-neon-an-artificial-human-project/)