
Audio & Video Sync Issue #1649

Open
nirmala-ncompass opened this issue Nov 21, 2024 · 1 comment

nirmala-ncompass commented Nov 21, 2024

Hi @pedroSG94 My Custom Implementation is:
I am recording audio and video to a file. I extended and modified the AndroidMuxerRecordController class so that I receive the audio and video buffers in one method (onBufferAvailable). There, based on my video frame count, I split the stream into chunks and write each chunk to a different file, resetting the buffer PTS values for each file (code below). Audio and video are in sync in the local recording, but when I stitch those chunk files together in MediaLive, the video is fine while the audio plays with a slight delay; they are out of sync. Could you please help me set new PTS values for the buffers so that audio and video stay in sync within each file?

```java
private static final int VIDEO_TIME_BASE = 1000 * 1000; // microseconds
private static final int AUDIO_TIME_BASE = 1000 * 1000; // microseconds
private static final int VIDEO_FRAME_RATE = 30;
private static final long VIDEO_FRAME_DURATION = VIDEO_TIME_BASE / VIDEO_FRAME_RATE;
private static final int AUDIO_SAMPLE_RATE = 32000; // 32 kHz
private static final int AUDIO_FRAME_SIZE = 1024;   // AAC frame size
private static final long AUDIO_FRAME_DURATION =
        (long) AUDIO_FRAME_SIZE * AUDIO_TIME_BASE / AUDIO_SAMPLE_RATE;

private long videoPtsCounter = 0;
private long audioPtsCounter = 0;

public void onBufferAvailable(ByteBuffer byteBuffer, MediaCodec.BufferInfo bufferInfo, boolean isVideo) {
    synchronized (this) {
        // Copy the ByteBuffer to prevent overwriting
        ByteBuffer bufferCopy = ByteBuffer.allocate(byteBuffer.remaining());
        bufferCopy.put(byteBuffer);
        bufferCopy.flip();

        long calculatedPts;
        if (isVideo) {
            // Calculate video PTS from the frame counter
            calculatedPts = videoPtsCounter * VIDEO_FRAME_DURATION;
            videoPtsCounter++;
        } else {
            // Calculate audio PTS from the frame counter
            calculatedPts = audioPtsCounter * AUDIO_FRAME_DURATION;
            audioPtsCounter++;
        }

        // Set the final adjusted PTS
        bufferInfo.presentationTimeUs = calculatedPts;

        // Create buffer data object
        BufferData bufferData = new BufferData(bufferCopy, new MediaCodec.BufferInfo(), isVideo);
        bufferData.bufferInfo.set(bufferInfo.offset, bufferInfo.size, bufferInfo.presentationTimeUs, bufferInfo.flags);

        // Track frame counts
        if (isVideo) {
            videoFrameCounter++;
        } else {
            audioFrameCounter++;
        }

        // Add buffer data to the active buffer list
        if (useFirstArray) {
            bufferList1.add(bufferData);
        } else {
            bufferList2.add(bufferData);
        }

        // Process the chunk once we hit the frame limit
        if (videoFrameCounter >= MAX_VIDEO_FRAMES) {
            processBufferInBackground(useFirstArray ? bufferList1 : bufferList2, false);
            if (useFirstArray) {
                bufferList1.clear();
            } else {
                bufferList2.clear();
            }
            audioPtsCounter = 0;
            videoPtsCounter = 0;
            audioFrameCounter = 0;
            videoFrameCounter = 0;          // Reset frame counters
            useFirstArray = !useFirstArray; // Switch buffer list
        }
    }
}
```
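For reference, one likely cause of the drift is that the two independent counters discard the real offset between the audio and video streams at each chunk boundary. An alternative is to rebase both streams against a single shared origin per chunk: subtract the original PTS of the first buffer (audio or video) in the chunk from every subsequent buffer, so relative timing is preserved. A minimal sketch under that assumption (the class name PtsRebaser is illustrative, not part of the library):

```java
// Hypothetical helper: rebase PTS per chunk by subtracting the chunk's first
// original timestamp, instead of regenerating PTS from independent counters.
public class PtsRebaser {
    private long chunkBasePtsUs = -1; // original PTS of the first buffer in the chunk

    /** Returns the PTS to write into the chunk, preserving A/V relative timing. */
    public long rebase(long originalPtsUs) {
        if (chunkBasePtsUs < 0) {
            chunkBasePtsUs = originalPtsUs; // first buffer (audio or video) anchors the chunk
        }
        return originalPtsUs - chunkBasePtsUs;
    }

    /** Call when a chunk is closed so the next chunk restarts near zero. */
    public void startNewChunk() {
        chunkBasePtsUs = -1;
    }
}
```

With this approach, an audio buffer that originally arrived 32 ms after the chunk's first video frame keeps that 32 ms offset in the new file.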

pedroSG94 (Owner) commented:

Hello,

Did you try using the clock to create the PTS instead of calculating it from the config? You can try it for both (video/audio) or for only one of them, to test:

```java
bufferInfo.presentationTimeUs = System.nanoTime() / 1000 - presentTimeUs;
```

Where presentTimeUs is captured when you start the encoder or receive the first frame:

```java
presentTimeUs = System.nanoTime() / 1000;
```
