NATALYA BASHNYAK

new media design and
visual-spatial arts theory practice

CGI, Simulations & Research

  1. IDIOGLOSSIA
  2. MACHINIMA GTA V
  3. DIGITAL AVATARS
  4. ALDI SÜD

  5. PERMACOMPUTING
  6. THE UNCANNY VALLEY
  7. GENDER&ART IN THE GDR
  8. AI - GENERATED ANIMATION

INDEX

CONTACT 
╭( ・ㅂ・)و

DIGITAL AVATARS
Year: 2022
Format: 3D Animation

The computer graphics community has long been preoccupied with simulating the human face: it is enormously complex, with subtle muscle movements, imperceptible fine details, subsurface elements, and a mixture of moist, hairy and soft surfaces. Rendering such objects in a lifelike way is no small feat. The first experiments date back to 1972, when Ed Catmull (later a co-founder of Pixar) and his colleagues created the world’s first 3D rendered, digitally animated film, A Computer Animated Hand: a minute-long animation of Ed’s left hand, made as a class assignment while they were students at the University of Utah. Since then, the challenges of creating lifelike digital humans have for the most part been solved. With software such as iClone 7, MetaHuman and the Unreal Engine 5 rendering pipeline, we can even create lifelike humans that render in real time. We can also track our physical motions and facial expressions with an iPhone and the Live Face app and transpose them onto virtual avatars. This allows for a wholly new human experience of inhabiting distinctly nonhuman bodies.
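The tracking step described above can be sketched as a simple retargeting loop: per-frame blendshape weights from the phone’s face tracker are mapped onto an avatar’s morph targets. A minimal sketch, assuming ARKit-style shape names on the tracking side; the mapping table and avatar morph-target names are illustrative, not the Live Face API:

```python
# Sketch of blendshape retargeting: tracked facial weights (0.0-1.0)
# are remapped, frame by frame, onto an avatar's morph targets.

def retarget(tracked_weights, mapping):
    """Map tracked face blendshape weights onto avatar morph targets.

    tracked_weights: dict of tracked shape name -> weight in [0, 1]
    mapping: dict of tracked shape name -> avatar morph-target name
    """
    return {
        avatar_shape: tracked_weights.get(tracked_shape, 0.0)
        for tracked_shape, avatar_shape in mapping.items()
    }

# One example frame; "jawOpen" / "browInnerUp" are ARKit-style names,
# the "Avatar_*" targets are hypothetical.
frame = {"jawOpen": 0.8, "browInnerUp": 0.3}
mapping = {"jawOpen": "Avatar_JawOpen", "browInnerUp": "Avatar_BrowUp"}
pose = retarget(frame, mapping)
```

Unmapped or untracked shapes simply fall back to a neutral weight of zero, which is why a dropped tracking frame leaves the avatar at rest rather than in a broken pose.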

While working with digital avatars I used the above-mentioned tools to bring the characters to life; the complexity, however, arose from the need to blend multiple digital avatars seamlessly, creating the illusion of a shapeshifting yet believable, non-uncanny human. Morphing video seamlessly from one frame to another is another task that has kept computer scientists up at night. One of the first successfully deployed seamless morphing effects appears in Michael Jackson’s music video Black or White (1991).


The Live Face app running on the iPhone, tracking my head and face in real time to animate a digital avatar
Customized Blueprints in Unreal Engine 5 make it possible to go beyond the MetaHuman default aesthetics
Morphing animation
Guides

For these projects, the experimentation went through many different approaches before settling on the Re:Vision Motion Morph plugin for After Effects, which provided the smoothest, most seamless transitions with minimal clean-up required and no artefacts. I tracked guides onto the face of each avatar using a 2D tracking tool called Mocha. These guides told the software which feature points corresponded between the faces, and from these correspondences the morphing transition could be animated.
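The core idea behind guide-based morphing can be shown in two steps: the corresponding feature points are interpolated over time, and the (already warped) frames are cross-dissolved. A minimal sketch of only these two steps; the actual pixel warp is done internally by the plugin and is not reproduced here:

```python
import numpy as np

# Guide-based morphing, reduced to its two visible ingredients:
# interpolated feature points and a cross-dissolve of the frames.

def morph_points(src_pts, dst_pts, t):
    """Interpolate corresponding feature points at time t in [0, 1]."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    return (1.0 - t) * src + t * dst

def cross_dissolve(frame_a, frame_b, t):
    """Blend two already-warped frames into the in-between frame."""
    return (1.0 - t) * frame_a + t * frame_b
```

At t = 0 the result is the first face, at t = 1 the second; the guides exist so that eyes dissolve into eyes and mouths into mouths instead of ghosting past each other.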

Some challenges I encountered while recording the performances for these digital humans were that blinking, raising my eyebrows and moving my forehead caused many problems in the recording. Cleaning up these issues required additional manual work on the keyframes to smooth the animation and make it more natural. The exciting part for me was experimenting with and modifying the Unreal Engine 5 Blueprints. Taking a punk approach to the look, I adjusted my MetaHuman’s appearance by adding gleaming details such as braces and a nose ring, painting their hair and giving them make-up.

→ Featuring excerpts of the Unreal Engine press release (2021), Joel McKim’s opinion piece Animation without Animators: From Motion Capture to MetaHumans (2022), and the study by Sujin Bae, Timothy Jun, Justin Cho and Ohbyung Kwon, Effects of meta-human characteristics on user acceptance: from the perspective of uncanny valley theory (2024)

Related Writing:

The Uncanny Valley Phenomenon - why do more realistic characters look less human? read

📖



CGI emerged nearly 50 years ago, but it wasn't until the mid-90s that significant research breakthroughs enabled rendering approaching photorealism. This research, coupled with the PC revolution, empowered studios like ILM, Pixar, Digital Domain and Weta to lead the way in industrial-scale photorealistic CGI production for cinema.
Between 1995 and 2015, new techniques emerged, and computer scientists developed libraries of simulated phenomena. The exploration of the Uncanny Valley phenomenon further accelerated progress. Software companies then packaged these advancements into multipurpose 3D animation programs. Today, CGI programs offer comprehensive off-the-shelf toolkits for creating natural motion and photorealistic imagery. Recent research finds that MetaHumans outperform other real-time digital humans and are more accepted by users: “When digital humans trigger the uncanny valley effect in appearance and function, meta-humans only face it in function.” (Effects of meta-human characteristics on user acceptance)

My paper, ‘The Uncanny Valley Phenomenon’, is on the highly realistic animated figures and avatars of screen-based media, such as animation, cinema, video games, and synthetic worlds. It explains the feelings that unnatural bodies awaken in us, from disgust to acceptance, and examines the challenges the Uncanny Valley poses for designers attempting to achieve consistent levels of realism.

Credits:
Research, Animation: Natalya Bashnyak
3D Assets: MetaHuman customized in Unreal Engine, Substance 3D Painter
Software: 3D Facial MoCap Animation and Lip Syncing with iClone 7 using iPhone Live Face app for tracking, positioned 40 cm in front of the monitor
Morphing: After Effects
Audio: AI generated voice
Produced with the support of BUS.GROUP

2024 NATALYA BASHNYAK