Does gimp avif plugin allow lossless compression?

I posted a similar question on the GIMP subreddit, but I figure this is the more relevant sub for AVIF.

I see the GIMP plug-in only gives me a "nearly lossless" option and a quality slider. If I set the slider to 100 and uncheck "nearly lossless", will it give me a lossless result from the raw picture?
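For what it's worth, a guaranteed-lossless AVIF can also be produced outside GIMP with libavif's avifenc tool, which has an explicit lossless mode. A minimal sketch calling it from Python; the file names are placeholders and avifenc is assumed to be installed:

```python
import subprocess

# Placeholder file names; assumes libavif's `avifenc` is on PATH.
# --lossless requests mathematically lossless AVIF encoding.
subprocess.run(["avifenc", "--lossless", "input.png", "output.avif"], check=True)
```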

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/just1candle
πŸ“…︎ May 08 2021
🚨︎ report
Lossless compression
πŸ‘︎ 12
πŸ’¬︎
πŸ‘€︎ u/gabriel_GAGRA
πŸ“…︎ May 07 2021
🚨︎ report
Any disadvantage to using Lossless Compression?

This may be a very silly question, but is there any downside to using Lossless Compression instead of Uncompressed when shooting RAW?

I have a D500 and always shoot 14 bit RAW (mainly for flexibility when processing images). I shoot sports, aviation and landscapes.

Any time I look into the pros and cons of lossless compression, the only thing I can find is the benefit of smaller files, never any downsides. So help me out, Reddit: what am I missing?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/pmac900
πŸ“…︎ Apr 07 2021
🚨︎ report
Lossless Text Compression [pdf] bilalonureskili.com/files…
πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/qznc_bot2
πŸ“…︎ Apr 09 2021
🚨︎ report
What is the most powerful lossless compression algorithm? And is there any reason it couldn't be stronger?

Just to preface this post: I do not care if the computation required to decode something is high. Let's say my goal is to have the highest efficiency for archiving data, and I can afford to locally decompress whatever is needed.

So I understand some of the basic lossless compression ideas. For example, if you have AAAAAAAAB you could just say (8A)B or something. But how complicated and robust have people made lossless compression algorithms?
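That (8A)B idea is run-length encoding; a minimal sketch of it (illustrative only, real codecs are far more elaborate):

```python
from itertools import groupby

def rle_encode(s: str) -> list[tuple[int, str]]:
    # "AAAAAAAAB" -> [(8, 'A'), (1, 'B')], i.e. the (8A)B idea above
    return [(len(list(g)), ch) for ch, g in groupby(s)]

def rle_decode(runs: list[tuple[int, str]]) -> str:
    return "".join(ch * n for n, ch in runs)

# Lossless round trip: decoding recovers the input exactly.
assert rle_decode(rle_encode("AAAAAAAAB")) == "AAAAAAAAB"
```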

For example, could you represent a string of data purely as a number, and then take the square root of that number to shrink the size? Here is an example: say you have a string representing 1000000. You could then represent that as 1000:1 (where 1 is the number of times the original number was square-rooted). Or if you had 100000000, you could have 100:2. Of course, in practice you'd probably have to break the data up into many numbers, but the idea still stands.

So back to my question: have any super-powerful lossless compression algorithms been created? What is the strongest one out there?

πŸ‘︎ 22
πŸ’¬︎
πŸ‘€︎ u/Jstodd_
πŸ“…︎ Feb 02 2021
🚨︎ report
The Hitchhiker’s Guide to Compression - A beginner’s guide to lossless data compression go-compression.github.io/
πŸ‘︎ 912
πŸ’¬︎
πŸ‘€︎ u/mrfleap
πŸ“…︎ Oct 01 2020
🚨︎ report
[D] Any thoughts on $600K Hutter Prize for lossless language compression?

Compressing a 100 MB Wikipedia excerpt to under 15 MB.
From website: "This compression contest is motivated by the fact that being able to compress well is closely related to acting intelligently, thus reducing the slippery concept of intelligence to hard file size numbers."

I hope people put their mind to this. The most important area of science is the one that can read & understand all the other areas of science, nearly instantly.

πŸ‘︎ 47
πŸ’¬︎
πŸ“…︎ Nov 15 2020
🚨︎ report
Most efficient lossless data compression tool.

I have started downloading YouTube videos, a lot of them. I have a folder full of ones that I would like to keep but won't watch often. Is there a way to losslessly compress that folder really well? I don't mind if it takes ages and I have to keep the program running overnight. I also don't mind paying for some high-end software. I don't need encryption or a password or anything. How can I get the data as small as possible?
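If "as small as possible" and "time no object" are the constraints, one hedged starting point is an XZ/LZMA archive, which Python's standard library can build directly; the paths here are placeholders. (Caveat: video files are already heavily compressed, so general-purpose lossless tools usually shave off only a few percent.)

```python
import tarfile

# "w:xz" compresses the archive with LZMA, one of the stronger
# general-purpose lossless compressors in the standard library.
with tarfile.open("videos.tar.xz", "w:xz") as tar:
    tar.add("youtube_videos/", arcname="youtube_videos")
```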

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/Be_Careful111
πŸ“…︎ Feb 17 2021
🚨︎ report
ImageOptim: extremely useful free Mac app for lossless image compression

What do you use for image compression?

I recently found ImageOptim for Mac (or online) and it's changed my game. I will never upload another image online without first running it through this.

Combined with iResize I'm able to format images for the web in seconds.

How do you all deal with image compression for the web? Let me know your tips and strategies!

πŸ‘︎ 13
πŸ’¬︎
πŸ‘€︎ u/seb-jagoe
πŸ“…︎ Dec 11 2020
🚨︎ report
Lossless Compression Equivalent to Intelligence? (George Hotz Ep. 2)

I just listened to the second George Hotz episode, and I'm very intrigued by one particular idea that was raised.

At around 2:36:50, Lex brings up the Hutter Prize. This quote is part of George's response:

>"It's lossless compression. And... that is equivalent to intelligence."

I'm not sure I fully understand what he means by that statement. Is he saying that the act of losslessly compressing data necessitates intelligence in some fundamental way?

I've googled "lossless compression implies intelligence" and similar phrases, but nothing seems to yield much, and I've read about the Hutter Prize. I understand some of the computational similarities between text compression and AI, and from a conceptual standpoint I can understand why efficient compression of natural language text would be a hard problem.

But I would really like to go further and understand the logic behind all of this. What are the specifics of the connection between natural language compression and general intelligence? For example, can this same concept be extrapolated to other AI problems outside the space of natural language? Is object recognition in digital image processing in some ways a compression algorithm, one that compresses the entire pixel contents of an image down to a single classifier string? That wouldn't be lossless, which is what makes me think I'm framing all of this incorrectly in my mind.
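One concrete way to see the link: an entropy coder driven by a predictive model spends about -log2 p(symbol) bits per symbol, so whatever predicts the data better compresses it better. A toy character-level sketch (the sentence is arbitrary):

```python
import math
from collections import Counter

text = "the cat sat on the mat"  # arbitrary example text

# Model 1: uniform over the observed alphabet.
uniform_bits = len(text) * math.log2(len(set(text)))

# Model 2: character frequencies; the ideal code length is -log2 p
# per symbol, which arithmetic coding approaches in practice.
counts = Counter(text)
unigram_bits = -sum(c * math.log2(c / len(text)) for c in counts.values())

print(f"uniform model: {uniform_bits:.1f} bits")
print(f"unigram model: {unigram_bits:.1f} bits")
```

The better model yields fewer bits; the Hutter Prize view is that driving this process to its limit on natural language requires world knowledge and reasoning, which is where the claimed equivalence with intelligence comes from.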

Would greatly appreciate any insight from others or any recommended reading on the topic. Thanks!

EDIT: Seems like this post is continually gathering a slow flow of new attention over time, as more people listen to the episode. I personally found Lex's episode with Hutter to be especially helpful in understanding this topic, and would recommend it to anyone else who's curious.

πŸ‘︎ 38
πŸ’¬︎
πŸ‘€︎ u/RyanBranch
πŸ“…︎ Oct 23 2020
🚨︎ report
Self Made Zip - File archiver and archive extractor programs based on Huffman’s lossless compression algorithm github.com/e-hengirmen/Hu…
πŸ‘︎ 24
πŸ’¬︎
πŸ‘€︎ u/cepci1
πŸ“…︎ Jan 21 2021
🚨︎ report
Lossless compression > Lossy compression
πŸ‘︎ 29
πŸ’¬︎
πŸ‘€︎ u/realheterosapiens
πŸ“…︎ Dec 06 2020
🚨︎ report
Lossless Compression

I saw this explanation of lossless compression on 4chan.

Take a sequence of 3 numbers with total length L, each occupying a third of it: L/3 + L/3 + L/3 = L.

After a lossless compression algorithm is applied, the total length must remain the same to stay lossless, so the file size isn't reduced: L/4 + L/3 + 5L/12 = L, where one part has been compressed and another stretched (2/3 of the inputs are altered in some way).

Now take the set of all binary strings of length 6 and try to map it to the set of all binary strings of length 5, i.e. map every input to a strictly shorter output.

In other words, this is a mapping from a set with 2^n elements (inputs) to a set with 2^m elements (outputs), where 2^n > 2^m.

Such a function cannot be injective: by the pigeonhole principle, at least two of the 6-bit strings must map to the same 5-bit string. Then there is no way to reverse the function from the output back to the original input, so no lossless algorithm can decrease the size of every possible file (image, video, etc.); real compressors only shrink the kinds of inputs that actually occur in practice.
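The counting step is easy to check mechanically; a small sketch for n = 3:

```python
from itertools import product

n = 3
inputs = ["".join(bits) for bits in product("01", repeat=n)]
# All strictly shorter binary strings, including the empty one: 2**n - 1.
shorter = ["".join(bits) for k in range(n) for bits in product("01", repeat=k)]

print(len(inputs), "inputs vs", len(shorter), "shorter outputs")  # 8 vs 7
# Pigeonhole: 8 inputs cannot map one-to-one into 7 outputs, so a scheme
# that shortens *every* 3-bit string must send two inputs to one output,
# and the decoder cannot tell them apart.
assert len(inputs) > len(shorter)
```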

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/Tak5035
πŸ“…︎ Dec 29 2020
🚨︎ report
Lossless data compression in Go - Kanzi 1.8 released

Release 1.8 brings bug fixes, improved performance, better parallelism, and better compression at levels 1 & 5.

See https://github.com/flanglet/kanzi-go for code and performance numbers.

Warning: The bitstream format has changed (and may change until release 2.0). Also, always keep a backup of your files.

πŸ‘︎ 15
πŸ’¬︎
πŸ‘€︎ u/flanglet
πŸ“…︎ Dec 05 2020
🚨︎ report
Lossy and lossless compression

PeaZip, like other archive manager applications, is a lossless file compressor, even though it also implements routines to convert, resize, and compress graphic files using both lossy and lossless algorithms.

Lossless compression means no information is lost in encoding the larger uncompressed content into the smaller compressed content, so after decompression the resulting output will be 1:1 identical to the original uncompressed content. This makes this class of algorithms suited to encoding data that is not resilient to modification, where even a single different byte would be unacceptable (executables, databases, documents, and so on).
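That 1:1 guarantee is easy to demonstrate with any general-purpose lossless codec; a minimal sketch using zlib from Python's standard library:

```python
import zlib

original = b"even a single different byte would not be acceptable" * 100
compressed = zlib.compress(original, level=9)

# Lossless: the round trip reproduces the input exactly, byte for byte.
assert zlib.decompress(compressed) == original
print(len(original), "->", len(compressed), "bytes")
```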

Read more about the similarities and differences between lossless and lossy compression.

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/peazip
πŸ“…︎ Nov 09 2020
🚨︎ report
Dolphin just got a new ISO compression format that's 100% lossless. We could use it to shrink the Brawl ISO!

Here’s a link to the latest blog updates!

https://dolphin-emu.org/blog/2020/07/05/dolphin-progress-report-may-and-june-2020/

Would this be beneficial to our netplay community?

πŸ‘︎ 96
πŸ’¬︎
πŸ‘€︎ u/Kirby5588
πŸ“…︎ Jul 06 2020
🚨︎ report
Reducing Pandas Memory Usage: Lossless Compression reddit.com/gallery/jti5cs
πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/robofied
πŸ“…︎ Nov 13 2020
🚨︎ report
Better Lossless Compression than FLAC?

Is there a lossless audio codec with a better compression ratio than FLAC? Or is FLAC still as good as it gets? If not, what other lossless audio codecs are comparable (in terms of their compression ratio)?

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/JohnTravolski
πŸ“…︎ Apr 28 2020
🚨︎ report
Is the compression performed on an app bundle guaranteed to be lossless?

I just did an update and users are getting errors that could only make sense if lossy compression was being applied to my asset files.

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/Nimitz14
πŸ“…︎ Jan 31 2020
🚨︎ report
Lossless compression of English messages using GPT-2 textsynth.org/sms.html
πŸ‘︎ 6
πŸ’¬︎
πŸ‘€︎ u/qznc_bot2
πŸ“…︎ Jun 23 2020
🚨︎ report
[D] Need help understanding a research paper: Bit-Swap: Recursive Bits-Back Coding for Lossless Compression with Hierarchical Latent Variables

Research paper at: https://arxiv.org/abs/1905.06845

Implementation on GitHub: https://github.com/fhkingma/bitswap

> The bits-back argument suggests that latent variable models can be turned into lossless compression schemes. Translating the bits-back argument into efficient and practical lossless compression schemes for general latent variable models, however, is still an open problem. Bits-Back with Asymmetric Numeral Systems (BB-ANS), recently proposed by Townsend et al. (2019), makes bits-back coding practically feasible for latent variable models with one latent layer, but it is inefficient for hierarchical latent variable models. In this paper we propose Bit-Swap, a new compression scheme that generalizes BB-ANS and achieves strictly better compression rates for hierarchical latent variable models with Markov chain structure. Through experiments we verify that Bit-Swap results in lossless compression rates that are empirically superior to existing techniques

I have been reading this research paper and have had a look at the implementation. The demo runs as described in the paper. However, I am having trouble wrapping my head around the inference pipeline, as I mostly implement models in Keras while this is a Torch implementation. I would really appreciate it if anyone could explain the paper/implementation in slightly simpler terms. Thanks :)
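For orientation, here is the bits-back accounting in its standard single-latent-layer form (my paraphrase for intuition, not the paper's notation): the sender first decodes z from bits already in the compressed stream using q(z|x), then encodes x with p(x|z) and z with p(z). The net code length is

$$\ell(x) = -\log_2 p(x \mid z) - \log_2 p(z) + \log_2 q(z \mid x),$$

whose expectation under q(z|x) is the negative ELBO. Bit-Swap's contribution, per the abstract, is interleaving these decode/encode steps through a Markov hierarchy of latents so the scheme stays efficient with multiple latent layers.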

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/mischief_23
πŸ“…︎ May 12 2020
🚨︎ report
Is lossless compression a solved problem?

After reading about Shannon's entropy and source-coding theory, it seems like there's no way to progress further in lossless compression. We've already hit the limit with things like Huffman coding. Is my understanding correct?
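One nuance worth a worked example: Huffman coding is optimal only among symbol-by-symbol codes for a given, known symbol distribution; better *models* of the data (contexts, mixing, learned predictors) still lower the achievable size, which is why the field hasn't stopped. A sketch of Huffman meeting the Shannon entropy H = -Σ p·log2 p exactly on a toy dyadic distribution:

```python
import heapq, math

probs = {'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}  # toy distribution

# Shannon entropy: the lower bound on average bits/symbol for any
# symbol-by-symbol lossless code under this distribution.
H = -sum(p * math.log2(p) for p in probs.values())

# Standard Huffman construction via a min-heap; the counter breaks ties
# so the heap never compares tree nodes directly.
heap = [(p, i, sym) for i, (sym, p) in enumerate(probs.items())]
heapq.heapify(heap)
tie = len(heap)
while len(heap) > 1:
    p1, _, left = heapq.heappop(heap)
    p2, _, right = heapq.heappop(heap)
    heapq.heappush(heap, (p1 + p2, tie, (left, right)))
    tie += 1

def code_lengths(node, depth=0):
    if isinstance(node, str):            # leaf: a symbol
        return {node: depth}
    out = {}
    for child in node:                   # internal node: (left, right)
        out.update(code_lengths(child, depth + 1))
    return out

avg = sum(probs[s] * d for s, d in code_lengths(heap[0][2]).items())
print(f"entropy = {H:.3f} bits/symbol, Huffman average = {avg:.3f}")
# Dyadic probabilities make Huffman hit H exactly; in general it lands
# within 1 bit/symbol of H, and arithmetic coding closes that gap too.
```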

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/DeadpanBanana
πŸ“…︎ Jan 17 2020
🚨︎ report
what if we touched hands while using lossless compression
πŸ‘︎ 18
πŸ’¬︎
πŸ‘€︎ u/tunip3
πŸ“…︎ Apr 08 2020
🚨︎ report
Visualize Lossless vs Lossy compression difference ?

Hi,

I would like to know the easiest way to visualize the difference between lossless audio and a lossy-compressed version of the same file. The idea is to visually represent the amount of information lost when compressing to a lossy format like MP3.

I tried the 'Analysis' option in Audacity, but I'm unable to notice any significant difference. Trying to graph the uncompressed data in Python takes a long time and freezes my system.

What are my options ?
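One hedged option: decode both versions to WAV first (e.g. with an external decoder) so they share a sample rate and alignment, then plot a spectrogram of the sample-wise residual. A sketch; file names are placeholders and both clips are assumed mono and time-aligned:

```python
import matplotlib.pyplot as plt
import numpy as np
from scipy.io import wavfile

# Placeholder files: the original and the lossy version decoded back to WAV.
rate, a = wavfile.read("original.wav")
_, b = wavfile.read("lossy_decoded.wav")

n = min(len(a), len(b))
residual = a[:n].astype(np.float64) - b[:n].astype(np.float64)

# The residual's spectrogram shows where (in time and frequency)
# the lossy encoder discarded information.
plt.specgram(residual, Fs=rate, NFFT=2048)
plt.xlabel("time (s)")
plt.ylabel("frequency (Hz)")
plt.title("Residual: information removed by lossy encoding")
plt.show()
```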

πŸ‘︎ 10
πŸ’¬︎
πŸ‘€︎ u/jayaura
πŸ“…︎ Nov 04 2019
🚨︎ report
Using a better iOS app to record this video that offers lossless audio. It’s almost comparable direct to DAO. Nice chime with no noticeable compression compared to the previous video. Pardon the playing errors. v.redd.it/emow22ghva551
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/jjjrrreee
πŸ“…︎ Jun 16 2020
🚨︎ report
xkcd 1683: One of the effects of Article 13 will be people intentionally downsampling or adding filters to content to get around copyright filters, making this comic remain a reality just as lossless compression was finally becoming popular. xkcd.com/1683/
πŸ‘︎ 319
πŸ’¬︎
πŸ‘€︎ u/GeneReddit123
πŸ“…︎ Mar 27 2019
🚨︎ report
Lossless compression of 2-D coordinates

Does anybody know of any algorithms that provide lossless compression of 2-D coordinates? And would you happen to know the compression rate? Say you have 9 points on a 2-D plane, each with its respective X, Y coordinates; currently that means 18 different numbers have to be stored. Are there any algorithms that will consistently (i.e. regardless of whether there are patterns or correlations in the data) reduce that count?
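On the "regardless of patterns" part: no method can shrink arbitrary coordinates (the usual counting argument applies), but when points are spatially clustered, a common lossless trick for integer coordinates is sorting plus delta and varint encoding. A sketch with toy data; it treats the input as a point set, since sorting discards the original order:

```python
def zigzag(n: int) -> int:
    # Map signed deltas to unsigned ints: 0, -1, 1, -2, 2 -> 0, 1, 2, 3, 4
    return -2 * n - 1 if n < 0 else 2 * n

def varint(u: int) -> bytes:
    # LEB128-style: 7 payload bits per byte, high bit means "more follows".
    out = bytearray()
    while True:
        byte = u & 0x7F
        u >>= 7
        if u:
            out.append(byte | 0x80)
        else:
            out.append(byte)
            return bytes(out)

def encode_points(points):
    # Sort, then store small deltas between neighbors instead of absolutes.
    data = bytearray()
    px = py = 0
    for x, y in sorted(points):
        data += varint(zigzag(x - px)) + varint(zigzag(y - py))
        px, py = x, y
    return bytes(data)

pts = [(100, 200), (101, 198), (103, 205), (99, 201)]  # toy clustered points
print(len(encode_points(pts)), "bytes for", len(pts), "points")
```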

πŸ‘︎ 24
πŸ’¬︎
πŸ‘€︎ u/decimated_napkin
πŸ“…︎ Oct 14 2019
🚨︎ report
What are the advantages or disadvantages of using raw lossless compression?

Seems like you get the benefits of raw with a smaller file. Am I missing something?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/jogonzal
πŸ“…︎ Jan 29 2020
🚨︎ report
NNCP: Lossless Data Compression with Neural Networks bellard.org/nncp/
πŸ‘︎ 116
πŸ’¬︎
πŸ‘€︎ u/eberkut
πŸ“…︎ Apr 06 2019
🚨︎ report
Lossless Image Compression Through Super-Resolution github.com/caoscott/SReC
πŸ‘︎ 7
πŸ’¬︎
πŸ‘€︎ u/qznc_bot2
πŸ“…︎ Apr 07 2020
🚨︎ report
Best Peazip lossless compression format for smallest output?

Not sure if this is the right subreddit; if not, please redirect me to the best one.

I am using PeaZip to compress folders that contain many files (doc, pdf, images) and other already-compressed files, hoping to make them smaller.

I am confused about what the best lossless compression format is. I checked PeaZip's benchmark but got confused by the wording.

https://preview.redd.it/xb572qgfc8v41.png?width=1071&format=png&auto=webp&s=53deb1da7d3273d5eebbe3f4af4db2b5eb513c1d

Also, when I try to compress some folders, the output is the exact same size (e.g. a 34 GB folder becomes a 34 GB archive). Am I using the PeaZip software incorrectly? Or is there a particular lossless compression format I should choose?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/ConceptionFantasy
πŸ“…︎ Apr 26 2020
🚨︎ report
Are there known limits to lossless file compression?

Have we hit the known limits for things like lossless text, image, and video compression? Can those limits be calculated somehow?

πŸ‘︎ 12
πŸ’¬︎
πŸ‘€︎ u/seven_seven
πŸ“…︎ Sep 14 2019
🚨︎ report
Lossless compression test: PNG vs WebP vs AVIF dataset docs.google.com/spreadshe…
πŸ‘︎ 20
πŸ’¬︎
πŸ‘€︎ u/Balance-
πŸ“…︎ Dec 28 2018
🚨︎ report
The next focus and big step for video games should be lossless compression

With games like RDR2 and CoD MW being over 150 GB in size, games will soon become too big and you will have to have a separate HDD for every game. It is understandable, because game graphics get more and more detailed every year, but IMHO the growth in size is not sustainable.

So devs should really stop being "lazy" and start considering ways of maintaining graphical fidelity without being wasteful of resources. Sure, games as a streaming service are a big thing and will probably take off, but the problem will remain if you have to download 10 GB of data just to enter one room and move around in it.

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/Adequately_Insane
πŸ“…︎ Oct 10 2019
🚨︎ report
Lossless Image Compression through Super-Resolution arxiv.org/abs/2004.02872v…
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/ShareScienceBot
πŸ“…︎ Apr 10 2020
🚨︎ report
Why are there multiple compression levels to flac if it's lossless?

This is something that has always bothered me, and I want to find out the answer. In FL Studio, when you export as .flac, you can choose between 8 different levels of compression; the more compression you apply, the more it reduces the file size. However, FLAC is supposed to be lossless compression as far as I know, and to sound exactly identical to .wav: basically perfect audio, hence lossless.

If so, then why is there the option of multiple compression levels in the first place? If the highest FLAC compression level is always objectively the best, there would seem to be no benefit to your file taking up more space at levels 1-7. Is there something I'm missing?
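For what it's worth, all FLAC levels decode to bit-identical audio; higher levels only spend more encoder effort searching for a smaller representation, so the trade-off is encode time versus file size (decode speed is roughly unaffected). A hedged sketch comparing levels with the reference flac CLI (assumed installed; track.wav is a placeholder):

```python
import os
import subprocess

# -0 (fastest) through -8 (smallest); -f overwrites, -o names the output.
for level in (0, 5, 8):
    out = f"track_l{level}.flac"
    subprocess.run(["flac", f"-{level}", "-f", "-o", out, "track.wav"],
                   check=True)
    print(f"level {level}: {os.path.getsize(out)} bytes")
```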

πŸ‘︎ 28
πŸ’¬︎
πŸ‘€︎ u/Lastrevio
πŸ“…︎ Jul 18 2019
🚨︎ report
Lossless Compression
πŸ‘︎ 26
πŸ’¬︎
πŸ‘€︎ u/AnGenericAccount
πŸ“…︎ Sep 24 2019
🚨︎ report
