Saturday, 22 May 2021

ffmpeg performance with hundreds of cuts (atrim)

I have audio files (think ~2 h each) from which I want to cut out a lot of pieces (500+), using an ffmpeg command like this one:

['ffmpeg', '-i', 'pipe:', '-filter_complex', 
'[0]atrim=end=30.69:start=0.0[s0];
 [0]atrim=end=34.31:start=31.18[s1];
 [0]atrim=end=38.65:start=34.43[s2]; 
 (... hundreds more)
 [s37][s38][s39][s40][s41]concat=a=1:n=42:v=0[s42]', '-map', '[s42]']
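For reference, a minimal sketch of how a filter_complex string of this shape can be assembled in plain Python. The three-entry segment list is a hypothetical stand-in for the real 500+ cut points:

```python
def build_filter_complex(segments):
    """Build an atrim/concat filter_complex string of the shape shown above:
    one [0]atrim=...[sN] branch per segment, then a concat of all branches."""
    parts = [
        f'[0]atrim=end={end}:start={start}[s{i}]'
        for i, (start, end) in enumerate(segments)
    ]
    n = len(segments)
    labels = ''.join(f'[s{i}]' for i in range(n))
    parts.append(f'{labels}concat=a=1:n={n}:v=0[s{n}]')
    return ';'.join(parts)

# Hypothetical stand-in for the real segment list.
segments = [(0.0, 30.69), (31.18, 34.31), (34.43, 38.65)]
fc = build_filter_complex(segments)
```

The resulting string goes after '-filter_complex' in the argument list, with '-map', '[s3]' selecting the concat output.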

Stream mapping built with ffmpeg-python:

  Stream #0:0 (mp3float) -> atrim
  (... hundreds more)
  Stream #0:0 (mp3float) -> atrim
  concat -> Stream #0:0 (libmp3lame)

Now, this works as expected, but locally it takes something like 10 minutes for the files I have, and when I deploy it to a server in the cloud it takes about an hour. That obviously depends on the machine, and I can scale up for speed there, but I'd also like to know whether there's a way to speed up the processing in ffmpeg itself.
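One pointer that might be worth benchmarking (an untested assumption, not something from the original post): each atrim branch in the graph above filters its own copy of the full input, so 500+ branches multiply the per-sample work. A single aselect filter with one between() term per segment, followed by asetpts=N/SR/TB to close the timestamp gaps, processes the decoded audio only once. A sketch of building that expression, again with a hypothetical three-segment list:

```python
def build_aselect_expr(segments):
    """Combine (start, end) pairs into one aselect expression:
    a frame is kept if its time t falls inside any segment."""
    return '+'.join(f'between(t,{start},{end})' for start, end in segments)

# Hypothetical stand-in for the real 500+ segment list.
segments = [(0.0, 30.69), (31.18, 34.31), (34.43, 38.65)]
expr = build_aselect_expr(segments)

# Would be used as a single audio filter chain, e.g.:
#   ffmpeg -i pipe: -af "aselect='<expr>',asetpts=N/SR/TB" out.mp3
filter_arg = f"aselect='{expr}',asetpts=N/SR/TB"
```

Whether this actually beats the atrim/concat graph on a given machine would need measuring, but it avoids splitting the input stream hundreds of ways.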

Thanks for any pointers!



