I’m trying to extract the frames of a video as individual images, but it’s really slow for every format except JPEG. The obvious issue with JPEG is the data loss from compression; I want the images to be lossless. Extracting them as JPEGs manages about 50–70 fps, but as PNGs it’s only 4 fps, and it seems to keep getting slower: after 1 minute of the 11-minute video it’s down to 3.5 fps.

I suspect it’s because I’m doing this on an external 5 TB hard drive connected over USB 3.0, and the write speed can’t keep up. So my idea was to use a different image format. I tried lossless JPEG XL and lossless WebP, but both of them are even slower, managing only about 0.5 fps. I have no idea why that’s so slow; the files are a lot smaller than PNGs, so it can’t be the write speed.

I would appreciate it if anyone could help me with this.

5 points

It probably becomes CPU-limited with those other compression algorithms.

You could use something like atop to find the bottleneck.
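As a sketch of how to check that (assuming the sysstat tools are installed; sdb is a placeholder for the external drive’s device name), you could watch disk and CPU utilization side by side while the extraction runs:

```shell
# Extended per-device disk stats every second; %util near 100 while CPU
# cores sit idle suggests the drive is the bottleneck. sdb is a placeholder.
iostat -x 1 sdb

# Per-process CPU usage for the running ffmpeg; one core pinned at ~100%
# suggests the encoder is the bottleneck instead.
pidstat -u -p "$(pgrep -x ffmpeg)" 1
```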

2 points

Yeah, that’s probably the case for those. I looked at CPU usage when using WebP and one CPU core was always at 100%. Even though it doesn’t seem to be able to use multiple cores, that’s still really slow, no? Or is that normal?

Also, my CPU is a Ryzen 5 3600, just to get an idea of what performance would be expected.

2 points

My first thought was similar - there might be hardware acceleration happening for the JPEGs that isn’t happening for the other formats, resulting in a CPU bottleneck. A modern hard drive over USB 3.0 should sustain sequential writes on the order of 100–200 megabytes per second, so it seems unlikely that’s your bottleneck (though feel free to share stats and correct that assumption - if your PNGs are in the 40-megabyte range, 3.5 of them per second would be pretty taxing).

If you are only seeing one CPU core at 100%, perhaps you could split the video clip and process multiple clips in parallel?
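One way to sketch that (filenames and the 60-second segment length are placeholders): have ffmpeg split the file on keyframes without re-encoding, then run one extraction per segment in the background:

```shell
# Split into ~60 s chunks; -c copy avoids re-encoding, so cuts land on
# the nearest keyframe and segment lengths are only approximate.
ffmpeg -i video.mp4 -c copy -f segment -segment_time 60 part%03d.mp4

# Extract each chunk in parallel, one ffmpeg process per segment.
for f in part*.mp4; do
  ffmpeg -i "$f" "extract/${f%.mp4}-%06d.png" &
done
wait
```

The per-segment filename prefix keeps the parallel outputs from colliding; the frames would need renaming afterwards to restore a single global sequence.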

1 point

Coming back to this: what you said at the end was really interesting. I could manually split up the file and run the frame-extraction script on each part at the same time, but do you know if it’s possible to automate this? Or, even better, run each instance of ffmpeg on the same video file and just extract every nth frame, like I said in my earlier reply?
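The every-nth-frame idea can be sketched with ffmpeg’s select filter (output names and the effort setting here are taken from elsewhere in the thread; -fps_mode vfr is the replacement for the older -vsync vfr on ffmpeg 5.1+):

```shell
# Six instances, each keeping every 6th frame at a different offset.
# Note: each instance still decodes the entire video, so this multiplies
# decode work; it only parallelizes the (expensive) encode step.
mkdir -p extract
for i in 0 1 2 3 4 5; do
  ffmpeg -i video.mp4 -vf "select='not(mod(n-$i,6))'" -fps_mode vfr \
         -distance 0 -effort 1 "extract/core$i-%06d.jxl" &
done
wait
```

Each instance numbers its own output from 000001, so the six sequences would need to be merged afterwards to restore global frame order.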

1 point

At this point I’m fairly sure the drive speed actually is the bottleneck, though I’m not sure why it’s so slow. Splitting it is an interesting idea; maybe it’s also possible to tell ffmpeg to only extract every 6th frame, starting at a different offset for each of the 6 cores.

3 points

A) Export using a lower effort; with libjxl, effort 2 or so will be fine.

B) Export to a faster image format like QOI, TIFF, or PPM/PNM.

PNG, JXL, and WebP all have fairly high encode times with ffmpeg’s defaults. Lower the effort or use a faster format.

If you think it really could be a write-speed limitation, encode to a ramdisk first and transfer afterwards, if you have the spare RAM - but using a different, faster format will probably help either way, as PNG is still very slow to encode. (Writing to /tmp is fine for this.)
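A minimal sketch of the ramdisk route (paths are placeholders; /tmp is tmpfs, i.e. RAM-backed, on most distros):

```shell
# Encode into RAM first, then move everything to the USB drive in one
# sequential pass. Watch free RAM: uncompressed frames add up fast
# (a 1080p PPM is roughly 6 MB).
mkdir -p /tmp/frames
ffmpeg -i video.mp4 /tmp/frames/%06d.ppm
mv /tmp/frames/*.ppm /mnt/external/extract/
```

An 11-minute video may not fit in RAM all at once, so it might be necessary to move frames out in batches while ffmpeg is still running.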

2 points

A) I actually didn’t know about this before. Do you know what option I need to use in ffmpeg to set the effort?

B) I tried those, but it’s the same issue as with PNG: the hard drive’s write speed is too slow (or it’s the USB 3.0 connection, but the result is the same).

Edit: Just found out how to set the effort. Setting it to 1 is quite a bit faster, but still slow at only 3.8 fps.

2 points

What are your system specs? At a low effort you should be getting a lot more fps. What CLI command are you using? Either way, it would probably be best to export to /tmp, given enough RAM, and go from there.

EDIT: for context, when encoding with libjxl I would use -distance 0 -effort 2 for lossless output.

2 points

I have a Ryzen 5 3600. My command was ffmpeg -i video.mp4 -threads 12 -distance 0 -effort 1 extract/%06d.jxl.

1 point

I’ll bet that with MPEG to JPEG it doesn’t have to re-encode the image, which it does have to do with the other formats.

2 points

H.264 (the compression algorithm the video uses) and JPEG are entirely different formats, so it does have to re-encode.

1 point

Actually, they both use the Discrete Cosine Transform!

PNG uses DEFLATE, a generic compression algorithm that searches for smaller ways to represent the data.

I would recommend comparing the quality of the images in the different formats against each other to see if there is noticeable lossiness.

If the PNGs are indeed better, try setting the initial compression of the PNGs to zero and come back later to “crush” them smaller.
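As a sketch of that workflow (using ffmpeg’s generic -compression_level option for the zlib level; oxipng is one possible “crusher”, and optipng or pngcrush would work too):

```shell
# Write PNGs with zlib level 0 (fast to encode, but large files) ...
ffmpeg -i video.mp4 -compression_level 0 extract/%06d.png

# ... then recompress them losslessly once extraction is done.
oxipng -o 4 extract/*.png
```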

1 point

Even if they use the same technique, they’re entirely different algorithms, and H.264 also takes information from multiple frames, which is why the video is 1.7 GB but a folder with each frame saved as a PNG is over 300 GB.

The formats with the best compression, where it might be fine, are JPEG XL and WebP, as far as I know. They’re even slower though, because they’re so CPU-intensive and only use one thread.

Setting the PNG compression to 0 doesn’t help, because the bottleneck for PNG is the hard drive’s write speed. I already tried that.


Linux

!linux@lemmy.ml
