Better use of local hardware? #561
There are indeed close and interesting connections between models that run efficiently on GPUs and lossless data compression. But as far as I know, there are several major problems with "productionizing" these ideas in software like OxiPNG that have prevented workable solutions from emerging:
This is just my (hopefully somewhat informed?) take on why there are no good "plug this GPU thing in to make OxiPNG 200% faster" things out there yet, and why it's unlikely there ever will be. Of course, other people are welcome to chime in.
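To make the model/compression connection mentioned above concrete: any model that assigns a probability p to each successive symbol defines an ideal code length of -log2(p) bits for it, which an arithmetic or range coder can approach almost exactly. A minimal sketch (the probabilities are invented purely for illustration):

```rust
// Toy illustration of the model <-> lossless compression link: a better
// predictive model assigns higher probabilities to the data it actually
// sees, and an entropy coder turns probability p into roughly -log2(p) bits.
fn ideal_bits(per_symbol_probs: &[f64]) -> f64 {
    per_symbol_probs.iter().map(|p| -p.log2()).sum()
}

fn main() {
    // Made-up probabilities a hypothetical model assigned to four symbols:
    let probs = [0.50, 0.25, 0.125, 0.125];
    // 1 + 2 + 3 + 3 = 9 bits total; sharper predictions mean fewer bits.
    println!("ideal size: {:.2} bits", ideal_bits(&probs));
}
```

Everything beyond this identity (running the model quickly, reproducibly, and within a fixed format spec) is where the practical problems come in.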
Just to add to the above points:
It's frustrating to have a PNG that's been "optimized" by any (combination of) tools one can throw at it, & then find any basic file compression (even the OS-based transparent filesystem kind) shrinking it further.
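That observation is easy to reproduce. A minimal sketch, assuming the `zstd` crate and a hypothetical `optimized.png` input, that recompresses an already-optimized PNG with a stronger general-purpose codec:

```rust
use std::fs;

fn main() -> std::io::Result<()> {
    // Hypothetical input: a PNG already squeezed by oxipng and friends.
    let png = fs::read("optimized.png")?;
    // Deflate inside a PNG is capped at a 32 KiB window and a dated match
    // finder; zstd at level 19 searches far harder, so it can still shave
    // off a few percent even though the input is "optimized".
    let recompressed = zstd::encode_all(&png[..], 19)?;
    println!(
        "PNG: {} bytes, PNG+zstd: {} bytes ({:.1}% of original)",
        png.len(),
        recompressed.len(),
        100.0 * recompressed.len() as f64 / png.len() as f64
    );
    Ok(())
}
```

Of course, the output is no longer a PNG, which is exactly the frustration: the gains exist but can't be expressed inside the format.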
I would say the answer to your problem is simply "use a more modern image format", but there is one other potential area where advancements in compression and/or performance may be found: AI. Here's a recent news article about a paper discussing the use of AI for lossless data compression, including comparisons with PNG (though the AI isn't actually producing PNGs itself). But this is way out of my league. And I imagine there's little interest from experts in applying AI to older formats, because it's much more exciting to explore what AI can do without being constrained by ancient data specifications. Still, in theory, I'm sure AI could be applied to reorder colour palettes, select filters and construct deflate streams to produce optimised PNGs...
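For a sense of what "select filters" means in practice: libpng-style encoders conventionally use the PNG spec's minimum-sum-of-absolute-differences heuristic per scanline, which is exactly the kind of hand-written scoring function a learned model could replace. A rough, self-contained sketch (function names are illustrative, not oxipng's actual API):

```rust
// Paeth predictor from the PNG specification.
fn paeth(a: u8, b: u8, c: u8) -> u8 {
    let (a, b, c) = (a as i32, b as i32, c as i32);
    let p = a + b - c;
    let (pa, pb, pc) = ((p - a).abs(), (p - b).abs(), (p - c).abs());
    if pa <= pb && pa <= pc {
        a as u8
    } else if pb <= pc {
        b as u8
    } else {
        c as u8
    }
}

// Apply PNG filter type `ft` (0 = None, 1 = Sub, 2 = Up, 3 = Average,
// 4 = Paeth) to one scanline. `prev` is the previous scanline (all zeros
// for the first row) and `bpp` is bytes per pixel.
fn apply_filter(ft: u8, raw: &[u8], prev: &[u8], bpp: usize) -> Vec<u8> {
    (0..raw.len())
        .map(|i| {
            let left = if i >= bpp { raw[i - bpp] } else { 0 };
            let up = prev[i];
            let up_left = if i >= bpp { prev[i - bpp] } else { 0 };
            match ft {
                0 => raw[i],
                1 => raw[i].wrapping_sub(left),
                2 => raw[i].wrapping_sub(up),
                3 => raw[i].wrapping_sub(((left as u16 + up as u16) / 2) as u8),
                _ => raw[i].wrapping_sub(paeth(left, up, up_left)),
            }
        })
        .collect()
}

// The spec's heuristic: interpret filtered bytes as signed and sum their
// absolute values; lower sums tend to compress better under deflate.
fn filter_score(filtered: &[u8]) -> u64 {
    filtered.iter().map(|&b| (b as i8).unsigned_abs() as u64).sum()
}

// Try all five filters on a scanline and keep the lowest-scoring one.
fn pick_filter(raw: &[u8], prev: &[u8], bpp: usize) -> (u8, Vec<u8>) {
    (0u8..=4)
        .map(|ft| (ft, apply_filter(ft, raw, prev, bpp)))
        .min_by_key(|candidate| filter_score(&candidate.1))
        .unwrap()
}
```

An ML model would slot in at the `pick_filter` step, trading this cheap local score for a learned prediction of final compressed size; the palette-reordering and deflate-stream ideas above are analogous search problems.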
I get the feeling the AI concept is much like throwing more hardware (GPU 😅, SIMD, threads, memory) at the problem: until someone actually does a proof-of-concept & a community develops it further from there, it's just… vaporware.
Also, my understanding of the article & chart is that the AI developed some new "modern image format" that's purely custom to the datasets presented, not (necessarily) a general-purpose format like anything developed by humans. But unless the AI can explain it to humans, the algorithm involved in constructing it may remain a "black box" (much like current AI models themselves), so we might never know.
Of interest regarding AI: https://cloudinary.com/blog/jpeg-xl-and-automatic-image-quality
Interesting, in that this use of AI seems to be aimed at more efficient lossy compression.
I almost hate to bring this up (as it seems far-fetched & virtually fantasy @ this point), but is there any way to use locally available hardware resources (if any) to improve oxipng's performance in any aspect? After an hour of Googling, the best compilation of GPU techniques I found is the C repo @ https://github.com/BuzzHari/hp-project. Is there anything useful here or anywhere to apply?