What’s the Galaxy S9’s least loved feature? That’s easy: AR Emoji, the Samsung exclusive that whips up 3D avatars. It’s a pretty transparent imitation of the iPhone X’s 3D Animoji characters — fox, monkey, swirl-capped turd — but with one major, absolutely crucial difference: You.
While, yes, you can become a character like Donald Duck, you can also make Samsung’s 3D figures look like you, because they’re made from a selfie that the Galaxy S9 took of your face.
That was the idea, anyway. In reality, AR Emoji’s creepy human avatars look very little like the real people they try to represent.
None of the more than 20 CNET staffers who made an AR Emoji recognized themselves in their avatar. Our 3D selves had plasticky faces that grimaced every time we tried to smile, developed a horrible eye twitch and lacked the customization options to give us curly hair or clothes we’d actually wear. AR Emoji even gave a short-haired editor a man bun.
My colleague and former TechCrunch writer Darrel Etherington, meanwhile, would never be mistaken for twins. But their AR Emoji would have them separated at birth:
In theory, AR Emoji lets you customize everything from hair to skin tone. But in reality, those shades you’re offered fail to represent real life variation, and the default image AR Emoji generates is often disappointing:
How could Samsung, the world’s largest supplier of smartphones, have gone so wrong? Turns out, Samsung didn’t make AR Emoji itself. It snapped up the license to bring these plasticky, unsettling faces to life from a two-year-old startup called Loom.ai. That company’s Hollywood-heavy pedigree includes resumes that read like a who’s-who of CG animation: Lucasfilm, Dreamworks, Disney Research. And its CTO and principal engineer are each Oscar-winners who won SciTech Academy Awards for separate achievements in computer animation.
So how did people who worked on the tech that controversially brought a major Star Wars character back to life create such a lackluster flop in AR Emoji?
Because making a fast, accurate avatar from your phone’s selfie camera is really hard.
AR Emoji creation is fast. Maybe too fast
If and when you set up your AR Emoji, the Galaxy S9 will create your 3D avatar in a matter of seconds. The front-facing camera takes a photo of your face, and the software analyzes that photo to approximate what your features really look like.
“In the movies, [making a 3D avatar is] a multimillion dollar effort with lots of specialized hardware,” said Kiran Bhat, Loom.ai’s co-founder and CTO.
He knows all too well the painstaking and time-consuming work that goes into Hollywood-level motion capture from his time at Lucasfilm.
Motion-capture cameras and sensors are used to scan actors’ body movements and even specific facial expressions, which are then mapped onto computer-generated characters such as the Incredible Hulk (The Avengers), Gollum (the Lord of the Rings movies), Caesar (the recent Planet of the Apes trilogy) or Leader Snoke (The Last Jedi). (In fact, three of those four characters are brought to life by the same actor.)
But AR Emoji was designed to be the ultimate express version of the Hollywood process. No day-long motion capture shoot, no green tracking dots on your face; just snap a selfie instead.
“We’ve simplified it into a single photograph,” said Bhat. And here’s why: “People’s attention span is low. People lose it after 5 seconds.”
If speed weren’t one of the most important ingredients in making AR Emoji, the software could generate more convincing figures. Loom.ai’s initial software made much more photorealistic avatars, if you were willing to stick around for the seven minutes it took to process them.
Indeed, the company’s website shows what could have been: computer-rendered avatars made with the same software tools as AR Emoji take the place of employee photos. They’re still images, not animated videos, but they look good.
When you know the backstory, Loom.ai’s switch from Hollywood motion capture to a bare-bones phone photo shows off some impressive work. But if the end result gives you the heebie-jeebies, does it really matter?
Samsung needs to shoulder AR Emoji blame, too
This is the part of our story where Samsung comes in. With AR Emoji, Loom.ai isn’t calling all the shots. It simply created the framework for Samsung, and then worked with the mobile giant to tailor the experience to whatever Samsung wants.
More specifically, Loom.ai supplied the SDK (software development kit) that drives the avatars, Bhat said, explaining that AR Emoji “uses a 2D tracker provided from Samsung on how the face moves, which is what gets fed into the SDK.”
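To make that division of labor concrete, here’s a toy sketch of the pipeline Bhat describes: a 2D tracker (Samsung’s piece) produces facial landmark positions, which are fed into an avatar SDK (Loom.ai’s piece) that turns them into animation data. Neither company’s actual API is public, so every name and number below is hypothetical.

```python
# Illustrative sketch only: the real Loom.ai SDK and Samsung 2D tracker
# are not public, so all names and values here are hypothetical.

from dataclasses import dataclass


@dataclass
class FaceLandmarks:
    """2D landmark positions tracked in a frame from the selfie camera."""
    points: list  # [(x, y), ...] in pixel coordinates


def track_face_2d(frame) -> FaceLandmarks:
    # Stand-in for the phone maker's 2D tracker: it estimates where facial
    # landmarks sit in a flat image. Stubbed with fixed points here.
    return FaceLandmarks(points=[(120, 80), (200, 80), (160, 140)])


class AvatarSDK:
    """Stand-in for the avatar engine: maps 2D landmarks onto a 3D rig."""

    def animate(self, landmarks: FaceLandmarks) -> dict:
        # Convert tracked landmark motion into blendshape weights
        # (e.g. how wide the mouth is stretched), which drive the avatar.
        x0, _ = landmarks.points[0]
        x1, _ = landmarks.points[1]
        mouth_stretch = abs(x1 - x0) / 200.0  # toy normalization
        return {"mouth_stretch": mouth_stretch}


frame = object()  # placeholder for a camera frame
weights = AvatarSDK().animate(track_face_2d(frame))
print(weights)
```

The point of the split: if the tracker feeds the SDK noisy or sparse landmarks, the avatar animates badly no matter how good the SDK is, which is why Loom.ai points at the tracking layer when the results disappoint.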
Remember, too, that unlike the iPhone X, the Galaxy S9 doesn’t have the hardware to scan a 3D image of your face — its camera can only shoot in 2D. As a result, the Galaxy S9’s selfie photo has fewer details to grab onto, which explains why AR Emojis are fairly generic and don’t track your expressions as well. The software is working with a limited topography.
According to Loom.ai, Samsung picked how AR Emojis are styled and created the face-tracking software that makes those avatars move. Improving this tracking software would, in theory, make these digital characters look a lot more like real-life you.
Samsung declined to comment on this story.
A different manufacturer or app, meanwhile, could generate a better (or worse) result by plugging its own tracking algorithm into Loom.ai’s framework. This would be instructive… if Loom.ai had any other partners ready to bring their wares to market. (Right now, Loom.ai is also working with Chinese phone and app brand Meitu, but there’s no public product yet.)
How AR Emojis will get better
It’s still early days for animated emojis on any phone, but right now Apple is far ahead of the game.
That infamous notch on the iPhone X is crammed full of key sensors not yet found on other phones, including a dot projector, flood illuminator and infrared camera. In addition to enabling FaceID, they work in concert with the selfie camera, mapping the user’s face in real-time.
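A rough way to see why those extra sensors matter: a plain 2D selfie gives the software only pixel positions, while a depth camera adds a distance value per pixel, so the face can be fit in 3D instead of guessed from a flat image. The sketch below (not Apple’s actual pipeline; the camera parameters are made up) back-projects a tiny depth map into a 3D point cloud using a standard pinhole-camera model.

```python
# Toy illustration, not Apple's actual pipeline: a depth map gives each
# selfie pixel a z value, letting software fit the face in 3D.

def to_point_cloud(depth_map, fx=500.0, fy=500.0, cx=2.0, cy=2.0):
    """Back-project a small depth map into 3D points (pinhole model).

    fx/fy/cx/cy are hypothetical camera intrinsics: focal lengths in
    pixels and the principal point of this tiny 4x4 image.
    """
    points = []
    for v, row in enumerate(depth_map):
        for u, z in enumerate(row):
            if z > 0:  # zero means "no depth reading at this pixel"
                x = (u - cx) * z / fx
                y = (v - cy) * z / fy
                points.append((x, y, z))
    return points


# 4x4 depth map in meters: a crude "nose sticks out" bump in the middle,
# geometry a flat photo alone cannot reveal.
depth = [
    [0.40, 0.40, 0.40, 0.40],
    [0.40, 0.35, 0.35, 0.40],
    [0.40, 0.35, 0.35, 0.40],
    [0.40, 0.40, 0.40, 0.40],
]
cloud = to_point_cloud(depth)
print(len(cloud))  # one 3D point per valid depth pixel
```

With that extra dimension, expressions that look identical in 2D (say, puckered lips versus a head tilted toward the camera) become distinguishable, which is the gap Bhat alludes to below.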
As a result, the facial expressions and mouth movements of Apple’s Animojis sync much better with your movements, even when they’re transferred onto a robot face or a cartoonish pile of poop.
“The ability to give the system more data is helpful,” Bhat said. “If [a phone has] extra sensors, that’s an active area that would help the algorithm outline your face better.” He added, however, that depth-sensing cameras run the risk of creating digital noise.
In addition to better cameras with more sensors, Loom.ai said that tweaking the computing algorithms that make your avatar’s face move can help AR Emoji become more lifelike, with less lag time to boot.
The founders also acknowledge that AR Emoji needs more options for body types, hair colors and other markers of self-identification, which will come, they promise, in some future Galaxy S9 update.
“The exact timing is up to Samsung,” said Ramasubramanian, Loom.ai’s CEO, “but it is something you will see evolve.”
The Loom.ai team will probably never win an Oscar for Galaxy S9’s AR Emoji, but a much-improved version 2.0 is exactly the sort of comeback story that Hollywood is built on.