Theoretically, yes.
All files are a sequence of 0s and 1s.
Compression finds patterns so the same data can be represented without writing it all out.
For example: 0000000000000000
Instead of writing out all the zeros, the compressor writes "16 zeros at this point", which the decompressor reads and unpacks.
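Here's a rough sketch of that idea (it's called run-length encoding) in Python; the function names are just made up for illustration:

```python
def rle_compress(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical characters into (char, count) pairs."""
    runs = []
    for ch in bits:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((ch, 1))              # start a new run
    return runs

def rle_decompress(runs: list[tuple[str, int]]) -> str:
    """Expand (char, count) pairs back into the original string."""
    return "".join(ch * count for ch, count in runs)

print(rle_compress("0000000000000000"))  # [('0', 16)] -- "16 zeros at this point"
assert rle_decompress(rle_compress("0000000000000000")) == "0000000000000000"
```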
The same goes for repeating sequences.
Another example: if the sequence 010111010101 appears 8 times in the file, it's saved only once, with a note to repeat it that many times (see the sketch below).
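A toy version of that trick, again with made-up helper names. Real formats use back-references into earlier data rather than a reserved byte, but the idea is the same:

```python
def compress_repeats(data: str, pattern: str) -> tuple[str, str]:
    """Store `pattern` once, replace each occurrence with a 1-char token."""
    token = "\x00"  # assumes this byte never appears in the data
    return pattern, data.replace(pattern, token)

def decompress_repeats(pattern: str, packed: str) -> str:
    return packed.replace("\x00", pattern)

data = "010111010101" * 8  # the sequence, found 8 times
pattern, packed = compress_repeats(data, "010111010101")
assert decompress_repeats(pattern, packed) == data
print(len(data), "->", len(packed) + len(pattern))  # 96 -> 20
```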
You can add as many rules or patterns as you like, but each new one has diminishing returns: it matches an ever more specific bit pattern, so the decompressor ends up growing exponentially while each extra rule saves less and less.
.zip, .rar, etc. have a fixed set of rules and patterns designed for general files, covering the most common and useful patterns for compression.
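You can watch a general-purpose compressor do exactly this. Python's zlib module uses DEFLATE, the same algorithm .zip uses:

```python
import zlib

# 96 bytes containing the sequence from above, repeated 8 times
data = b"010111010101" * 8
packed = zlib.compress(data)

print(len(data), "->", len(packed))  # DEFLATE spots the repetition on its own
assert zlib.decompress(packed) == data
```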
But if size is not a problem, you can imagine a decompressor with a huge set of patterns. For instance:
shrinking 1 GB to 500 MB might take 5 seconds and a 100 GB pattern archive
to 400 MB: 50 seconds and an 800 GB pattern archive
to 300 MB: 18 hours and a 98 TB pattern archive
to 200 MB: 78 days and a 45,309 TB pattern archive
to 100 MB: 49 months and a 29,549,295 TB pattern archive
These numbers are all made up, but they should give you an idea.