
Gapless playback is not working #8594

Closed
adoudech opened this issue Feb 16, 2021 · 27 comments

@adoudech

adoudech commented Feb 16, 2021

Issue description

When playing two MP3 files back to back with gapless playback, a noticeable gap is heard between them.
The issue also reproduces in the ExoPlayer demo app with the test.mp3 file already present in the assets.
I added the following snippet to PlayerActivity (around L299) in the ExoPlayer demo app for testing, and the issue reproduces: a gap between the two media sources is audible.

    val concatenatingMediaSource = ConcatenatingMediaSource()
    val defaultMediaSourceFactory = DefaultMediaSourceFactory(context)
    concatenatingMediaSource.addMediaSource(defaultMediaSourceFactory.createMediaSource(
        MediaItem.fromUri("asset:///test.mp3")))
    concatenatingMediaSource.addMediaSource(defaultMediaSourceFactory.createMediaSource(
        MediaItem.fromUri("asset:///test.mp3")))
    player.setMediaSource(concatenatingMediaSource)
    player.prepare()

Please note that the encoder delay/padding information is correctly detected by ExoPlayer for the test.mp3 file in the demo app assets.
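A minimal sketch to confirm this, assuming SimpleExoPlayer.getAudioFormat() and the Format.encoderDelay/encoderPadding fields (illustrative only, not part of the demo app):

    // Log the gapless metadata that ExoPlayer parsed for the currently playing MP3.
    // For test.mp3 this should print non-zero frame counts if the LAME/Xing header was read.
    import android.util.Log;
    import com.google.android.exoplayer2.Format;
    import com.google.android.exoplayer2.SimpleExoPlayer;

    final class GaplessMetadataLogger {
      static void log(SimpleExoPlayer player) {
        Format audioFormat = player.getAudioFormat();
        if (audioFormat != null) {
          Log.d("Gapless", "encoderDelay=" + audioFormat.encoderDelay
              + " encoderPadding=" + audioFormat.encoderPadding);
        }
      }
    }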

  • ExoPlayer version number: 2.13.0
  • Android version: Android 7 & Android 10
  • Android devices: Samsung Galaxy S7 Edge & Samsung Galaxy S10

Please let me know if you need more details.
Thank you

@christosts
Contributor

@krocard can you take a look?

@krocard
Contributor

krocard commented Feb 16, 2021

I can't reproduce on a Pixel 3 on Android 10. Here are my exact steps:

git clone -b r2.13.0 https://github.com/google/ExoPlayer.git && cd ExoPlayer
ANDROID_SDK_ROOT=~/Android/Sdk/ ./gradlew installNoDecoderExtensionsDebug
adb push testdata/src/test/assets/media/mp3/test.mp3 /sdcard/
adb shell am start -a com.google.android.exoplayer.demo.action.VIEW_LIST --es uri_0 file:///sdcard/test.mp3 --es uri_1 file:///sdcard/test.mp3

The file plays twice without any audible gap. I also tried to apply a change similar to yours (though test.mp3 is not in the demo assets, so I used a file URI):

-    player.setMediaItems(mediaItems, /* resetPosition= */ !haveStartPosition);
+    ConcatenatingMediaSource concatenatingMediaSource = new ConcatenatingMediaSource();
+    DefaultMediaSourceFactory defaultMediaSourceFactory = new DefaultMediaSourceFactory(this);
+    concatenatingMediaSource.addMediaSource(defaultMediaSourceFactory.createMediaSource(
+        MediaItem.fromUri("file:///sdcard/test.mp3")));
+    concatenatingMediaSource.addMediaSource(defaultMediaSourceFactory.createMediaSource(
+        MediaItem.fromUri("file:///sdcard/test.mp3")));
+    player.setMediaSource(concatenatingMediaSource);

Please follow those steps (preferably the first ones, as they do not involve any code change) and attach a bug report if you still observe the issue.

@adoudech
Author

I'm still facing the issue after following your steps (the first ones) on my Samsung Galaxy S10.
I also tested on a Pixel 3 emulator running Android 10 and still got the gap between the two test MP3 files.

@adoudech
Author

Please take a look at the attached file:

Screen_Recording_20210216-185257_ExoPlayer.mp4

@krocard
Contributor

krocard commented Feb 17, 2021

I was able to reproduce on the emulator (Pixel 3, Android 10). I have not yet traced the issue, but I can hear underruns, though those seem unrelated.
Longer gapless media do not exhibit the problem, so the issue could be caused or revealed by the short duration of this test media.

@GeoffreyMetais

I confirm this behavior with test.mp3 and ExoPlayer r2.13.0.

I clearly hear a gap with a Samsung Galaxy S8 (Android 9).

But no gap at all on a Pixel 4a and a Galaxy S20+ (both Android 11).

@adoudech
Author

adoudech commented Feb 17, 2021

I upgraded my Samsung Galaxy S10 to Android 11 and I can no longer reproduce the issue => gapless playback seems to work on Android 11.
/!\ I still have the issue on a Samsung Galaxy S7 running Android 7.

@adoudech
Author

adoudech commented Feb 22, 2021

@krocard Any updates on this issue please? Thanks for any suggestions/clues

@krocard
Contributor

krocard commented Feb 25, 2021

I found what is happening but I have not root-caused it yet.

Normally (for example on Pixel) during a track transition here is what happens:

  1. Extractor reaches the end of the 1st track. The last MP3 packet has presentation time τ.
  2. Renderer is notified that the current stream will end at τ.
  3. Renderer pulls from the MP3 buffer queue the last buffer of the 1st track (it has presentation time τ) and queues it for decoding.
  4. Renderer pulls from the MP3 buffer queue a format change (due to the switch to the second track). It stops pulling from the MP3 buffer queue until τ is played and saves the new input format.
  5. Renderer pulls from the decoder a buffer with time τ and plays it. From step 2 it knows that τ was the last buffer of the stream, so it notifies the AudioSink that there is now an audio discontinuity.
  6. Renderer pulls from the MP3 buffer queue the first buffer of the second track, which has presentation time τ'.
  7. Renderer pushes the τ' buffer for decoding.
  8. Renderer pulls from the decoder a buffer with time τ'. From step 6, because of the buffer presentation time, the renderer knows that it is the first buffer of a new format. It reconfigures the AudioSink with the new format and pushes the τ' buffer to it.
  9. AudioSink receives the τ' buffer. It knows that a reconfiguration is pending, so it drains its AudioProcessors before reconfiguring.
  10. AudioSink's TrimmingProcessor knows from step 8 that a reconfiguration is pending. It concludes that the previous track has ended and therefore drops the amount of data required by the gapless metadata.
  11. AudioSink then handles the buffer, but as there is a discontinuity pending from step 5, it drains the AudioProcessors again*. This is a no-op, but confusing.

The issue I'm observing in the emulator is at step 5. The renderer assumes that if the last buffer of the track queued for decoding has a presentation time of τ, then the last buffer decoded by the codec will also have a presentation time of τ. This is not the case on the emulator: an additional decoded buffer with presentation time τ'' is output by the codec even though no encoded buffer with presentation time τ'' was ever pushed.
Note that presentation time τ'' == τ'. I don't think that is significant, though.

Here is what happens on the emulator:

  1. same
  2. same
  3. same
  4. same
  5. same, except that because the Renderer assumes (incorrectly, from step 2) that τ is the last buffer of track 1, it informs the audio sink that there is a discontinuity. This is incorrect and leads to the following:
    1. Renderer pulls the extra τ'' buffer (which is the real last buffer of track 1) from the decoder and sends it to the AudioSink for playing.
    2. AudioSink receives the τ'' buffer. It was told in step 5 that there is a discontinuity, so it drains its internal processors.
    3. AudioSink's TrimmingProcessor has no reconfiguration pending, so it concludes that this is a seek, not the end of the track. Therefore it does not drop the amount of data required by the gapless metadata and plays it instead => a gap is heard.

Following the reproduction steps from #8594 (comment), the presentation times consistently observed are:

  • τ = 1018775 us
  • τ' = τ'' = 1044897 us

I don't know why the decoder (OMX.google.mp3.decoder) outputs one more buffer than was queued. We might want to rely on the EOS flag on the buffer rather than the presentation time.
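
A rough sketch of what relying on the EOS flag could look like when draining the decoder (illustrative only, not ExoPlayer's actual renderer code; it assumes an EOS-flagged input buffer is queued at the end of the first track, which is the trade-off discussed further down):

    import android.media.MediaCodec;

    final class EosDetection {
      // Returns true when the dequeued output buffer carries the codec's end-of-stream flag.
      // Unlike a presentation-time comparison, this marks the real last buffer of the track
      // even if the codec emits more output buffers than encoded buffers were queued.
      static boolean drainOneBufferAndCheckEos(MediaCodec codec) {
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int index = codec.dequeueOutputBuffer(info, /* timeoutUs= */ 0);
        if (index < 0) {
          return false; // No output buffer available yet.
        }
        boolean endOfStream = (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
        codec.releaseOutputBuffer(index, /* render= */ false);
        return endOfStream;
      }
    }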

Regardless, we should avoid the systematic double onQueueEndOfStream call on the processors on each track transition in handleBuffer: one due to reconfigurationPending (step 10), and a second due to startMediaTimeUsNeedsSync (step 11), caused by handleDiscontinuity being called on the track transition (step 5).

@krocard
Contributor

krocard commented Feb 25, 2021

Log of the Emulator at the track transition (gap is heard)
MRend.drainOutputBuffer: new outputBuffer pt=835918
MRend.feedInputBuffer: Queuing input buffer ts=1018775
MRend.drainOutputBuffer: new outputBuffer pt=862040
MRend.onStreamChanged: Queueing stream changed for pt=1018775
MRend.feedInputBuffer: sample stream format change
MRend.onInputFormatChanged: waitingForFirstSampleInFormat = true
MRend.drainOutputBuffer: new outputBuffer pt=888163
MRend.drainOutputBuffer: new outputBuffer pt=914285
MRend.drainOutputBuffer: new outputBuffer pt=940408
MRend.drainOutputBuffer: new outputBuffer pt=966530
MRend.drainOutputBuffer: new outputBuffer pt=992653
MRend.drainOutputBuffer: new outputBuffer pt=1018775
MRend.onProcessedOutputBuffer: calling onProcessedStreamChange pt=1018775 pendingOutputStreamOffsetCount=1 presentationTimeUs=1018775 pendingOutputStreamSwitchTimesUs[0]=1018775
ASink.handleDiscontinuity: startMediaTimeUsNeedsSync = true
MRend.drainOutputBuffer: new outputBuffer pt=1044897
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: drainingAudioProcessorIndex=0 audioProcessorNeedsEndOfStream=true
TrimP.onQueueEndOfStream: end of stream queued but no reconfiguration!
ASink.drainToEndOfStream: return false as AP is not ended
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: drainingAudioProcessorIndex=0 audioProcessorNeedsEndOfStream=false
ASink.drainToEndOfStream: return false as AP is not ended
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: drainingAudioProcessorIndex=0 audioProcessorNeedsEndOfStream=false
ASink.drainToEndOfStream: return false as AP is not ended
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: drainingAudioProcessorIndex=0 audioProcessorNeedsEndOfStream=false
TrimP.getOutput: outputted the endBuffer, NO trimming!
ASink.drainToEndOfStream: return false as still buffer to write
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: return false as still buffer to write
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: return false as still buffer to write
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: return false as still buffer to write
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: completed
TrimP.onFlush: without reconfiguration pending
MRend.feedInputBuffer: Adding new format to queue for ts=1044897 due to waitingForFirstSampleInFormat
MRend.feedInputBuffer: Queuing input buffer ts=1044897
MRend.feedInputBuffer: Queuing input buffer ts=1071019
MRend.feedInputBuffer: Queuing input buffer ts=1097141
MRend.feedInputBuffer: Queuing input buffer ts=1123264
MRend.feedInputBuffer: Queuing input buffer ts=1149386
MRend.feedInputBuffer: Queuing input buffer ts=1175509
MRend.feedInputBuffer: Queuing input buffer ts=1201631
MRend.feedInputBuffer: Queuing input buffer ts=1227754
MRend.drainOutputBuffer: new outputBuffer pt=1044897
MRend.updateOutputFormatForTime: Output format changed from format queue pt=1044897 Format(null, null, null, audio/mpeg, null, -1, null, [-1, -1, -1.0], [1, 44100])
ARend.onOutputFormatChanged: reconfigure sink due to output format change
TrimP.onConfigure: trim=576, 1404
ASink.drainToEndOfStream: drainingAudioProcessorIndex=0 audioProcessorNeedsEndOfStream=true
TrimP.onQueueEndOfStream: Trimming 529 trimmedFrameCount=1105
ASink.drainToEndOfStream: completed
TrimP.onFlush: will now skip 576 at beginning and 1404 at end
TrimP.queueInput: All start bytes have been trimmed
MRend.drainOutputBuffer: new outputBuffer pt=1071019
MRend.drainOutputBuffer: new outputBuffer pt=1097141
MRend.drainOutputBuffer: new outputBuffer pt=1123264
MRend.drainOutputBuffer: new outputBuffer pt=1149386
MRend.drainOutputBuffer: new outputBuffer pt=1175509
MRend.drainOutputBuffer: new outputBuffer pt=1201631
MRend.feedInputBuffer: Queuing input buffer ts=1253876
MRend.feedInputBuffer: Queuing input buffer ts=1279999
MRend.feedInputBuffer: Queuing input buffer ts=1306121
Log of the Pixel 3 (gapless)
MRend.feedInputBuffer: Queuing input buffer ts=980657
MRend.feedInputBuffer: c2Mp3TimestampTracker changed ts from=1018775 to=1006780
MRend.feedInputBuffer: Queuing input buffer ts=1006780
MRend.drainOutputBuffer: new outputBuffer pt=876167
MRend.drainOutputBuffer: new outputBuffer pt=902290
MRend.drainOutputBuffer: new outputBuffer pt=928412
MRend.drainOutputBuffer: new outputBuffer pt=954535
MRend.drainOutputBuffer: new outputBuffer pt=980657
MRend.onStreamChanged: Queueing stream changed for pt=1018775
MRend.feedInputBuffer: sample stream format change
MRend.onInputFormatChanged: waitingForFirstSampleInFormat = true
MRend.drainOutputBuffer: new outputBuffer pt=1006780
MRend.drainOutputBuffer: new outputBuffer pt=1032902
MRend.onProcessedOutputBuffer: calling onProcessedStreamChange pt=1032902 pendingOutputStreamOffsetCount=1 presentationTimeUs=1032902 pendingOutputStreamSwitchTimesUs[0]=1018775
ASink.handleDiscontinuity: startMediaTimeUsNeedsSync = true
MRend.feedInputBuffer: c2Mp3TimestampTracker changed ts from=1044897 to=1044897
MRend.feedInputBuffer: Adding new format to queue for ts=1044897 due to waitingForFirstSampleInFormat
MRend.feedInputBuffer: Queuing input buffer ts=1044897
MRend.feedInputBuffer: c2Mp3TimestampTracker changed ts from=1071019 to=1059023
MRend.feedInputBuffer: Queuing input buffer ts=1059023
MRend.feedInputBuffer: c2Mp3TimestampTracker changed ts from=1097141 to=1085146
MRend.feedInputBuffer: Queuing input buffer ts=1085146
MRend.feedInputBuffer: c2Mp3TimestampTracker changed ts from=1123264 to=1111268
MRend.feedInputBuffer: Queuing input buffer ts=1111268
MRend.feedInputBuffer: c2Mp3TimestampTracker changed ts from=1149386 to=1137391
MRend.feedInputBuffer: Queuing input buffer ts=1137391
MRend.feedInputBuffer: c2Mp3TimestampTracker changed ts from=1175509 to=1163513
MRend.feedInputBuffer: Queuing input buffer ts=1163513
MRend.feedInputBuffer: c2Mp3TimestampTracker changed ts from=1201631 to=1189636
MRend.feedInputBuffer: Queuing input buffer ts=1189636
MRend.drainOutputBuffer: new outputBuffer pt=1044897
MRend.updateOutputFormatForTime: Output format changed from format queue pt=1044897 Format(null, null, null, audio/mpeg, null, -1, null, [-1, -1, -1.0], [1, 44100])
ARend.onOutputFormatChanged: reconfigure sink due to output format change
TrimP.onConfigure: trim=576, 1404
ASink.drainToEndOfStream: drainingAudioProcessorIndex=0 audioProcessorNeedsEndOfStream=true
TrimP.onQueueEndOfStream: Trimming 1404 trimmedFrameCount=1980
ASink.drainToEndOfStream: completed
TrimP.onFlush: will now skip 576 at begining and 1404 at end
ASink.handleBuffer: calling drainToEndOfStream() due to startMediaTimeUsNeedsSync
ASink.drainToEndOfStream: drainingAudioProcessorIndex=0 audioProcessorNeedsEndOfStream=true
TrimP.onQueueEndOfStream: end of stream queued but no reconfiguration!
ASink.drainToEndOfStream: completed
TrimP.onFlush: without reconfiguration pending
TrimP.queueInput: All start bytes have been trimmed
MRend.drainOutputBuffer: new outputBuffer pt=1059023
MRend.drainOutputBuffer: new outputBuffer pt=1085146
MRend.drainOutputBuffer: new outputBuffer pt=1111268

[internal: patch adding logging at cl/359504935]

@krocard
Contributor

krocard commented Feb 25, 2021

@tonihei what is your opinion on this presentation time mismatch and extra codec buffer?
It could also be due to an incorrect presentation time calculation in the codec, similar to why c2Mp3TimestampTracker was introduced.

@krocard
Contributor

krocard commented Feb 25, 2021

I just tried forcing c2Mp3TimestampTracker for OMX.google.mp3.decoder, and it solved the issue (no gap).
Nevertheless, this doesn't seem to be a scalable solution, given that the implementation seems to be specific to a given codec. Samsung phones might use a different codec, and thus have a different timing offset.

@Samrobbo do you think generalizing c2Mp3TimestampTracker is possible?

@ojw28
Contributor

ojw28 commented Mar 1, 2021

@Samrobbo @andrewlewis - Assuming the answer to Kevin's question is no, which I think it probably is, we might want to rethink the changes that made C2Mp3TimestampTracker necessary in the first place.

Is its primary purpose to avoid the need to pass end-of-stream through the decoder as a way of signalling the point at which the format changes? If so, we may want to consider reverting back to that approach.

I think that its other purpose of making sure decodeOnlyPresentationTimestamps work correctly is unimportant, at least for MP3, due to the way SampleQueue uses MimeTypes.allSamplesAreSyncSamples. We should probably also pick up [Internal ref: go/exo-decode-only-simplification] again, to get rid of decodeOnlyPresentationTimestamps.

@Samrobbo
Contributor

Samrobbo commented Mar 1, 2021

Generalising the c2Mp3TimestampTracker is troublesome, because the logic for determining the timestamp is designed to directly match how the c2 codec calculates timestamps. Using a different decoder means we cannot be certain that exactly the same timestamp is calculated.

If I'm remembering correctly, the timestamp tracker was introduced because the codec output timestamps that did not match the input, and as a result we were losing track of the Format changes and such. I don't recall us ever passing EOS to signal format changes, though.
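
A heavily simplified sketch of the kind of adjustment c2Mp3TimestampTracker performs (illustrative only; the real class parses each MP3 frame header to get the exact per-frame sample count): the expected output timestamp is rebuilt from the number of samples decoded so far, so it matches what the C2 decoder will report.

    import com.google.android.exoplayer2.C;

    final class SimplifiedMp3TimestampTracker {
      private long firstInputTimeUs = C.TIME_UNSET;
      private long processedSamples;

      // Rewrites an input buffer timestamp to the value the C2 decoder is expected to
      // attach to the corresponding decoded output buffer.
      long adjustTimestampUs(long inputTimeUs, long samplesInFrame, int sampleRate) {
        if (firstInputTimeUs == C.TIME_UNSET) {
          firstInputTimeUs = inputTimeUs;
        }
        long adjustedUs = firstInputTimeUs + processedSamples * C.MICROS_PER_SECOND / sampleRate;
        processedSamples += samplesInFrame;
        return adjustedUs;
      }
    }

As noted above, this only holds for a decoder that computes its output timestamps this exact way; a different decoder implementation breaks the match.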

@krocard
Contributor

krocard commented Mar 2, 2021

@andrewlewis pointed me to the fact that Codec 2 is now aliased to OMX: https://cs.android.com/search?q=alias.*OMX.google.*.decoder
Thus the name of the codec does not seem to be a reliable way to determine its implementation.
Additionally, from the title of one of those changes, it seems this alias is part of the decoder APEX module.
As a result, it might be that this change is pushed in APEX updates, making identification of this behavior change more difficult than just checking the API level. I have not confirmed this theory, though.

Finally, as noted in this internal comment, codecs are not required to match input and output timestamps. Google's C2 MP3 decoder and QC's AAC decoder are confirmed to have such a mismatch, but many other codecs might too.

I envision 2 possible solutions:

  1. Detect that a Codec 2 implementation is aliased by OMX in some way and use the current c2Mp3TimestampTracker algorithm.
    This is made difficult by the fact that the codec name is aliased and that detection based on timestamps is non-trivial.
    This also has the drawback of only working with this specific codec.

  2. Stop matching the input and output timestamps of the decoder.
    Another mechanism will need to be found to propagate stream changes and format changes through the decoder. See @ojw28's previous comment in #8594 (comment).

@adoudech
Author

Thanks for the feedback. Will this issue be fixed in a future release of ExoPlayer? If so, how far down the line will that be?

BR,
Amir

@krocard
Contributor

krocard commented Mar 16, 2021

ExoPlayer currently relies on the timestamps matching on both sides of the decoder. Removing this assumption is a pretty significant change. The current fix is to compensate for the only known codec with such a discrepancy:

if ("c2.android.mp3.decoder".equals(codecInfo.name)) {
c2Mp3TimestampTracker = new C2Mp3TimestampTracker();
}

The issue is that on some devices, the name is aliased to OMX.google.mp3.decoder, leading to incorrect detection.

Digging into the platform codec alias code, I realized that a canonicalName field was added to retrieve the real codec name. I'm testing whether this could let us tell the real codec name.
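
A sketch of the check being tested, assuming the platform populates the canonical name correctly (which, as the next comment shows, is not always the case on API 29):

    import android.media.MediaCodecInfo;
    import android.os.Build;

    final class C2Mp3Detection {
      // MediaCodecInfo.getCanonicalName() (API 29+) should see through the OMX alias and
      // return the underlying codec name.
      static boolean isC2Mp3Decoder(MediaCodecInfo codecInfo) {
        String name = Build.VERSION.SDK_INT >= 29
            ? codecInfo.getCanonicalName()
            : codecInfo.getName();
        return "c2.android.mp3.decoder".equals(name);
      }
    }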

@krocard
Contributor

krocard commented Mar 17, 2021

Looking into the emulator codec config, it is clear that the c2 decoders are aliased to OMX.

The config is the same for the emulated Pixel 3 XL on API 29 & 30 and a real Pixel 3 XL on API 29 and 30:

$ adb shell "grep -A5  c2.android.mp3 $(find / -name media_codecs* 2>/dev/null)"
/apex/com.android.media.swcodec/etc/media_codecs.xml:        <MediaCodec name="c2.android.mp3.decoder" type="audio/mpeg">
/apex/com.android.media.swcodec/etc/media_codecs.xml-            <Alias name="OMX.google.mp3.decoder" />
/apex/com.android.media.swcodec/etc/media_codecs.xml-            <Limit name="channel-count" max="2" />
/apex/com.android.media.swcodec/etc/media_codecs.xml-            <Limit name="sample-rate" ranges="8000,11025,12000,16000,22050,24000,32000,44100,48000" />
/apex/com.android.media.swcodec/etc/media_codecs.xml-            <Limit name="bitrate" range="8000-320000" />
/apex/com.android.media.swcodec/etc/media_codecs.xml-        </MediaCodec>

Nevertheless, on API 29 on the emulator, when querying MediaCodecInfo, the canonical name OMX.google.mp3.decoder is returned, not c2.android.mp3.decoder as expected. As a result the codec is not recognized as aliased.

The weird thing is that this only affects an emulated Pixel 3 XL on API 29. Emulating a Pixel 3 XL at API 30 doesn't show the bug: all codecs are correctly listed as both OMX (aliased) and C2.
A real Pixel 3 XL flashed with the last public release of API 29 also correctly exposes the codecs twice (aliased and not aliased).
I can't explain the behaviour difference.

So whatever is causing aliased codecs to only show under their aliased name seems to affect only some devices on API 29 (Android 10).

As a result, if we find a way to differentiate the OMX and C2 codecs, it seems it will have to be by analysing their decoding behaviour rather than their metadata.

Alternatively, we could consider all API 29 OMX.google.mp3.decoder decoders to be aliases of C2, as that was the default config on Android 10. Nevertheless, such a change would be nearly impossible to test and I'm afraid of regressing any devices that really do use OMX decoders on Android 10.
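
For completeness, that alternative would amount to a heuristic roughly like this (a sketch only; as noted above it is hard to test and risks regressing devices that genuinely use an OMX MP3 decoder on Android 10):

    import com.google.android.exoplayer2.util.Util;

    final class Mp3TimestampTrackingHeuristic {
      // Treat OMX.google.mp3.decoder on API 29 as an alias of the C2 decoder, since that was
      // the default software codec config on Android 10.
      static boolean shouldUseC2TimestampTracking(String codecName) {
        return "c2.android.mp3.decoder".equals(codecName)
            || (Util.SDK_INT == 29 && "OMX.google.mp3.decoder".equals(codecName));
      }
    }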

@ashutoshgngwr

@krocard I am unable to reproduce the original issue (#7994) with ExoPlayer r2.14.1. I am playing the same sounds with the ExoPlayer demo (main) on Xiaomi Mi A3 running Android 11 (Security patch May 2021). What's happening?

@krocard
Contributor

krocard commented Jun 24, 2021

Investigations suggest that only some devices running Android 10 are affected by this issue, which is caused by the platform incorrectly reporting the decoder's name. The platform issue seems to have been fixed in Android 11.

@ashutoshgngwr

ashutoshgngwr commented Jun 24, 2021

@krocard Sorry, I missed it between the two issues. How is this not a regression bug? It was working prior to ExoPlayer r2.12.0 (it works with r2.11.8, to be precise). If it is a regression bug, wouldn't the platform conditions be the same for both ExoPlayer versions? And if they are, how does it not occur in r2.11.8 and prior releases? I hope you see my point. I am not familiar with the project internals, but it shouldn't be so hard to fix this.

@krocard
Contributor

krocard commented Jun 24, 2021

There are 2 main mp3 software decoders:

  • OMX.google.mp3.decoder
  • c2.android.mp3.decoder

Unfortunately they don't handle timestamps in the same way [1]. Prior to 2.12.0 we treated both the same as the OMX one, so there was no issue for most phones, as only a few at the time had the c2 decoder.
In 2.12.0 we added code to detect c2.android.mp3.decoder and account for the timestamp differences.

The bug is that some devices name their decoder OMX when the implementation is actually a C2 one. Thus they did not benefit from the 2.12.0 fix.

As 2.12.0 changed the code for the C2 codec only, for this bug to be a regression the device would have to be exposing a C2 codec name for an OMX implementation. That is not what I have observed on the only device on which I could reproduce the issue (a Pixel 3 XL on API 29, which exposes OMX for a C2 implementation). Which device are you testing with?

[1] All MP3 decoders remove the first 529 samples. In C2, the output timestamp is calculated based on the number of samples returned in output, whereas the OMX plugin didn't account for the initial samples removed. The OMX frame count is thus shifted by 529 frames.
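For reference, at 44.1 kHz those 529 frames correspond to 529 × 1,000,000 / 44,100 ≈ 11995 µs, which matches the ~12 ms timestamp rewrites visible in the Pixel 3 log above (e.g. 1018775 → 1006780 µs).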

@krocard
Contributor

krocard commented Jun 24, 2021

@ashutoshgngwr Could you post the output of adb shell "grep -A5 c2.android.mp3 $(find / -name media_codecs* 2>/dev/null)" on the device where you identified the regression? Please give us the device name too.

@ashutoshgngwr

ashutoshgngwr commented Jun 24, 2021


@krocard I ran both versions of ExoPlayer (r2.11.8 and r2.14.1) on a Pixel 3 emulator running system-images;android-29;google_apis;x86 image. The following is the output that you requested.

/system/apex/com.android.media.swcodec/etc/media_codecs.xml:        <MediaCodec name="c2.android.mp3.decoder" type="audio/mpeg">
/system/apex/com.android.media.swcodec/etc/media_codecs.xml-            <Alias name="OMX.google.mp3.decoder" />
/system/apex/com.android.media.swcodec/etc/media_codecs.xml-            <Limit name="channel-count" max="2" />
/system/apex/com.android.media.swcodec/etc/media_codecs.xml-            <Limit name="sample-rate" ranges="8000,11025,12000,16000,22050,24000,32000,44100,48000" />
/system/apex/com.android.media.swcodec/etc/media_codecs.xml-            <Limit name="bitrate" range="8000-320000" />
/system/apex/com.android.media.swcodec/etc/media_codecs.xml-        </MediaCodec>

Before this encounter, when I originally reported issue #7994, my device (Xiaomi Mi A3) was running Android 10, and I observed the same behaviour on it as well. Gapless playback worked fine with r2.11.8, but it didn't on r2.12.0 and onwards.


I guess we're back to square one. Thanks for taking the time to explain the issue. It seems to me that there isn't much that can be done from ExoPlayer's end to mitigate this issue?

@giladgotman

giladgotman commented Feb 14, 2022

I am having the same issue: gapless playback is not working; there is a short gap between tracks.
Tested on:

  • Samsung S10, Android 11
  • OnePlus 5, Android 10

Is there any update on this?

Did anyone find a workaround?

@Tolriq
Contributor

Tolriq commented Mar 14, 2022

I tested using ffmpeg for MP3 gapless playback for the users who had issues, and it seems that for some of them it still occurs.

Do I misunderstand the explanation here that this is tied to the device codecs and should therefore work with ffmpeg, or are there other components involved?

@WendelVolm

Any updates on this issue? It would be great to have this fixed. We still run 2.11.8, along with others affected by this bug.

tonihei added a commit that referenced this issue Feb 28, 2023
MediaCodecRenderer currently has two independent paths to trigger
events at stream changes:
 1. Detection of the last output buffer of the old stream to trigger
    onProcessedStreamChange and setting the new output stream offset.
 2. Detection of the first input buffer of the new stream to trigger
    onOutputFormatChanged.
Both events are identical for most media. However, there are two
problematic cases:
  A. (1) happens after (2). This may happen if the declared media
     duration is shorter than the actual last sample timestamp.
  B. (2) is too late and there are output samples between (1) and (2).
     This can happen if the new media outputs samples with a timestamp
     less than the first input timestamp.

This can be made more robust by:
 - Keeping a separate formatQueue for each stream to avoid case A.
 - Force outputting the first format after a stream change to
   avoid case B.

Issue: #8594

#minor-release

PiperOrigin-RevId: 512586838
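
A rough sketch of the per-stream format queue idea described in the commit message (illustrative only, not the actual MediaCodecRenderer change):

    import com.google.android.exoplayer2.Format;
    import java.util.ArrayDeque;
    import java.util.Map;
    import java.util.TreeMap;

    final class PerStreamFormatQueues {
      // Head = stream currently being output, tail = stream currently being fed.
      private final ArrayDeque<TreeMap<Long, Format>> queues = new ArrayDeque<>();

      PerStreamFormatQueues() {
        queues.addLast(new TreeMap<>());
      }

      // A new stream gets its own queue, so formats of the old and new stream never mix (case A).
      void onStreamChanged() {
        queues.addLast(new TreeMap<>());
      }

      void onInputFormat(long presentationTimeUs, Format format) {
        queues.getLast().put(presentationTimeUs, format);
      }

      // Called once the last buffer of the old stream has been output.
      void onProcessedStreamChange() {
        if (queues.size() > 1) {
          queues.pollFirst();
        }
      }

      // Output buffers only look up formats queued for the stream currently being output.
      Format pollOutputFormat(long presentationTimeUs) {
        Map.Entry<Long, Format> entry = queues.getFirst().floorEntry(presentationTimeUs);
        return entry != null ? entry.getValue() : null;
      }
    }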
tonihei added a commit to androidx/media that referenced this issue Feb 28, 2023
Same commit message as above.

Issue: google/ExoPlayer#8594
PiperOrigin-RevId: 512586838
tonihei added a commit to androidx/media that referenced this issue Mar 2, 2023
Same commit message as above.

Issue: google/ExoPlayer#8594
PiperOrigin-RevId: 512586838
(cherry picked from commit 3970343)
tonihei added a commit that referenced this issue Mar 2, 2023
Same commit message as above.

Issue: #8594
PiperOrigin-RevId: 512586838
(cherry picked from commit a02c8d8)
tonihei closed this as completed Mar 23, 2023
google locked and limited conversation to collaborators Jun 1, 2023