AI-based Character Animator features launched by Adobe

Adobe has launched new AI-powered features for Adobe Character Animator in beta. The latest features let the desktop software capture live motion with recording technology to animate 2D puppets created in Illustrator or Photoshop. Tools such as Lip Sync and Speech-Aware Animation are powered by Sensei, Adobe's cross-platform machine-learning technology, which uses algorithms to generate animation from pre-recorded speech and accurately align mouth movements with the corresponding audio.

Speech-Aware Animation was previewed at last year's Adobe Max conference as Project SweetTalk. It generates eyebrow and head movements that match a character's recorded speech. The new Lip Sync feature improves automatic lip-syncing and the timing of mouth shapes, also known as "visemes".

Source: Venturebeat
