AI-based Character Animator features launched by Adobe

By ET Bureau - August 20, 2020

Adobe has launched new AI-powered features of Adobe Character Animator in beta. The latest features enable the desktop software to combine live motion capture with recording technology to manipulate 2D puppets created in Illustrator or Photoshop. Tools such as Lip Sync and Speech-Aware Animation are powered by Sensei, Adobe's cross-platform machine learning technology, which deploys algorithms to generate animation from pre-recorded speech and accurately align mouth movements with the audio.

Speech-Aware Animation was previewed as Project SweetTalk at last year's Adobe Max conference. It generates eyebrow and head movements that match the animated character's recorded speech. The latest feature, Lip Sync, improves automatic lip-syncing and the timing of mouth shapes, also referred to as "visemes".

Source: Venturebeat



AUTHOR

ET Bureau

The Enterprise Talk Bureau has five well-trained writers and journalists, well versed in the B2B enterprise technology industry and constantly in touch with industry leaders for the latest trends, opinions, and other inputs, to bring you the best and latest in the domain.
