Oculus Lipsync integration
Apply Oculus Lip Sync facial animations to the Ready Player Me avatars
Oculus Lipsync is a plugin provided by Meta that can be used to sync avatar lip movements to speech sounds and laughter. Meta provides public documentation and an example project for the OVRLipSync integration.
Important: To integrate OVRLipSync you first need to read and agree to Meta's terms and conditions of using the plugin.
Ready Player Me avatars come with Oculus Viseme morph targets needed for the OVRLipSync plugin.
Follow the steps described in the OVRLipSync setup documentation. Once you have completed them, the OVRLipSync plugin will be added to your project. You can also check the public example provided by Meta to better understand how it works.
Note: In UE4 everything works out of the box, but in UE5 the OVRLipSync plugin requires two manual fixes, because the project hasn't been updated for a while. Open the OVRLipSyncEditorModule.cpp file and add the following line at line 90: `SoundWave->LoadingBehavior = ESoundWaveLoadingBehavior::ForceInline;`. Then open the OVRLipSync.Build.cs file and add the AndroidPermission module to the PublicDependencyModuleNames list.
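The added line forces the sound wave's raw audio data to be loaded into memory, since UE5 streams sound waves by default and the sequence generator needs the PCM data available. A sketch of where the line fits in OVRLipSyncEditorModule.cpp (the surrounding code is paraphrased for illustration, not an exact excerpt of the plugin source):

```cpp
// OVRLipSyncEditorModule.cpp, around line 90, inside the
// "Generate LipSyncSequence" handler (surrounding code paraphrased).
USoundWave *SoundWave = ...; // the selected sound wave asset

// UE5 fix: force the raw audio data to be resident in memory so the
// sequence generator can read it (UE5 streams sound waves by default).
SoundWave->LoadingBehavior = ESoundWaveLoadingBehavior::ForceInline;

// ... decode the wave and feed the PCM frames to the OVRLipSync
// context to produce the viseme frame sequence ...
```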
Follow the steps for setting up the Ready Player Me Unreal SDK for the created project.
Now that all of the plugins are added and the project compiles, we can start adding the Blueprint files.
There are two ways you can integrate the OVRLipSync:
By live capturing the sound from an external audio source and applying it to OVRLipSync.
By playing an existing audio track and applying it to OVRLipSync.
The plugin provides two actor components: OVRLipSync, used for live capturing, and OVRLipSyncPlayback, used for playing existing sounds. In this example, for simplicity, we will use OVRLipSyncPlayback with an existing audio track.
In the first step, we import the speech audio file into the project. Right-click the imported audio file and select Generate LipSyncSequence from the dropdown. A new FrameSequence asset will be created next to the audio file; it will be used by the OVRLipSyncPlayback component.
To load avatars that contain the Oculus viseme morph targets, we need to create an avatar config and a morph target group. We simply duplicate DA_RPM_AvatarConfig and DA_RPM_StandardMorphTargetGroup from the ReadyPlayerMe plugin into our project and modify them, adding the Oculus viseme morph target group to the duplicated data asset.
The next step is to create a BP_RPM_Actor actor Blueprint that will represent the talking avatar. We add four components to it: SkeletalMesh, Audio, ReadyPlayerMe, and OVRLipSyncPlayback.
In the SkeletalMesh component, we set the skeletal mesh to RPM_Mixamo_SkeletalMesh.
In the Audio component, we set the sound to the speech audio we want to play and disable the Auto Activate flag, so that the sound only plays after the avatar is loaded. We can also position it close to the avatar's mouth in the viewport and make it a child of the SkeletalMesh component, so that the sound is attached to the mesh.
In the ReadyPlayerMe component, we set the Avatar URL to the avatar we want to load, the Target Skeleton to RPM_Mixamo_Skeleton, and the Avatar Config to the newly created DA_VisemeAvatarConfig.
In the OVRLipSyncPlayback component, we set the sequence property to the generated FrameSequence blueprint file.
In the Begin Play event, we hide the skeletal mesh and start loading the avatar. When the avatar has loaded, we show the skeletal mesh and call the Start function of the OVRLipSyncPlayback component to start playing the animation.
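The Begin Play flow can be sketched in C++ terms as follows. Apart from Start, which the plugin's playback component provides, the class, component, and callback names here are assumptions standing in for the Blueprint nodes described above, not exact SDK API:

```cpp
// Sketch of the Begin Play flow; names other than Start are
// illustrative stand-ins for the Blueprint nodes described above.
void ATalkingAvatarActor::BeginPlay()
{
    Super::BeginPlay();

    // Hide the placeholder mesh while the avatar downloads.
    SkeletalMeshComponent->SetVisibility(false);

    // Kick off the avatar load; the completion callback fires once the
    // avatar mesh (with its viseme morph targets) has been applied.
    ReadyPlayerMeComponent->LoadAvatar(/* on-loaded callback */ ...);
}

void ATalkingAvatarActor::OnAvatarLoaded()
{
    // Reveal the updated mesh and start the lipsync playback, which
    // drives the viseme morph targets in sync with the audio.
    SkeletalMeshComponent->SetVisibility(true);
    LipSyncPlaybackComponent->Start();
}
```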
The morph target names on Ready Player Me avatars differ slightly from the names hardcoded in the OVRLipSync plugin: they carry a `viseme_` prefix. To apply the correct names, select the OVRLipSyncPlayback component, open the properties tab, and click the + button next to On Visemes Ready. This creates an event in the Event Graph. In this event, we call the Assign Visemes to Morph Targets function of the OVRLipSyncPlayback component and set the list of morph target names. Below is the list of morph target names in the required order.
We are done with the integration. Place the actor in the level and press the Play button, and the avatar's lips will animate in sync with the audio.