
/sci/ - Science & Math



File: 56 KB, 584x574, F_27_9.gif
No.12375154

Premise: images are getting larger and larger with the demand for better quality. They can now take several dozen megabytes of space, and 4K movies can take more than 200 GB.

Solution: an algorithm that takes an image of a given size and outputs a single unique number. That number is what gets saved in the image file. When you open the file, the computer runs the algorithm backwards to recover the original image.

Could this work, /sci/? If not, why?

>> No.12375174

>>12375154
It could work but you'd have to be extremely intelligent to pull it off.

>> No.12375177

Isn't that literally what compression does?

>> No.12375191

>>12375177
Compression algorithms don't manage to compress all the way down to a single number. Well, unless you count concatenating the resulting bits and calling that a number.

>> No.12375197

>>12375177
Yes, and images are already compressed, even if it's just RLE, which is why they don't zip down that well.

>>12375191
You're talking about Shannon's source coding limit. So no, you can't keep compressing an image down to some tiny representation without data loss.
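
To make the RLE point concrete, here's a toy run-length coder in Python (a made-up sketch, not any real codec's format). A mostly-flat scanline collapses to a couple of (value, count) pairs, but a noisy one wouldn't shrink at all:

def rle_encode(data: bytes) -> list:
    # Collapse the byte string into (value, run length) pairs.
    runs = []
    for b in data:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            runs.append((b, 1))
    return runs

def rle_decode(runs: list) -> bytes:
    return bytes(b for b, n in runs for _ in range(n))

row = bytes([0] * 100 + [255] * 28)        # a mostly-black scanline
assert rle_decode(rle_encode(row)) == row  # lossless round trip
print(rle_encode(row))                     # [(0, 100), (255, 28)]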

>> No.12375203

This is similar to what ML does to images with autoencoders.

The input goes through a bottleneck, so the neural network is forced to learn to extract only the most useful information about the image; it then has to restore the image to its original form using only what made it through the bottleneck.

The information that can be read off at the bottleneck layer of such a network is called the latent representation of the image.
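
Here's roughly what that looks like in PyTorch; the layer sizes are arbitrary, just to show where the bottleneck sits:

import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    # 28x28 grayscale image squeezed through a 32-number bottleneck.
    def __init__(self, latent_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),       # the latent representation lives here
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, 28 * 28), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)                   # compress
        return self.decoder(z).view(-1, 1, 28, 28)  # reconstruct (lossily)

model = AutoEncoder()
x = torch.rand(2, 1, 28, 28)
loss = nn.functional.mse_loss(model(x), x)    # training minimizes reconstruction error

Note it's lossy by construction: 784 pixels don't fit in 32 numbers.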

>> No.12375216

>>12375191
A single number 100 digits long isn't any different from 10 numbers of 10 digits each.

>> No.12375221

>>12375203
That is pretty much what Nvidia's DLSS does, but it's a lossy algorithm. The network reproduces an image that looks good to the human eye, but it's not the exact image you'd get by rendering at the higher resolution.

>> No.12375225

>>12375154
lossless compression already does that, retard

>> No.12375258
File: 23 KB, 500x500, 1605028486237.jpg

>>12375154
GUYS GUYS, what if you could send a message via the internet? It would be some kind of electronic mail, e-mail if you will. Is this even possible?

>> No.12375328

>>12375225
Down to one single number?

>> No.12375365

>>12375328
Any finite amount of data can be represented as a single number, anon. Your distinction is a meaningless one.
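
Concretely, in Python (the filename is just a placeholder):

data = open("image.png", "rb").read()                # any file is just bytes
n = int.from_bytes(data, "big")                      # and bytes are just one huge integer
back = n.to_bytes((n.bit_length() + 7) // 8, "big")  # round trip (leading zero bytes aside)

The single number is exactly as big as the file was. Nothing is gained.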

>> No.12375404

>>12375154
You can represent any sequence of numbers by a single number.
Just use the index of that sequence in any normal number.
Just don't expect that index to be easily storable.

>> No.12375443

>>12375404
>>12375154


Pi is conjectured to be normal. Just provide an offset and a length into pi for your data.
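
Here's what that "filesystem" looks like as a Python sketch with mpmath, plus the obvious catch:

from mpmath import mp

mp.dps = 100_000                                  # 100k decimal digits of precision
PI_DIGITS = mp.nstr(mp.pi, 100_000).replace(".", "")

def store(data: str):
    # "Store" a digit string as (offset, length) into pi.
    offset = PI_DIGITS.find(data)
    if offset == -1:
        raise ValueError("not in the first 100k digits, keep computing")
    return offset, len(data)

def retrieve(offset: int, length: int) -> str:
    return PI_DIGITS[offset:offset + length]

print(store("1337"))                              # short strings show up early...

The catch: for a random n-digit string the expected offset is around 10^n, so writing down the offset takes about as much space as the data did.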

>> No.12375453

>>12375216
This. Get wrecked, faggot.

>> No.12375850

>>12375328
Yes, but that single number can be absurdly large. Anything stored digitally is a binary string, and a binary string is just a number. One single number.

>> No.12376018

Okay anon, I have an idea for this system. We will store the image in the digits. We find the position at which this image occurs, then save that position number and the length. Boom. Unlimited storage with only 2 numbers.

>> No.12376025

>>12376018
>in the digits
of pi

>> No.12376032

Sure, but come up with a way to get back to your image uniquely from that single number.

People a billion times smarter than you have thought about this problem; there are huge financial incentives behind it. Go read a book on compression (Sayood's book is often recommended).

>> No.12376035

>>12375197
>OP gets his answer and doesn't reply
Typical

>> No.12376263
File: 448 KB, 1528x1424, high-res-doge.png

>>12375154
>single unique number
Apart from one big number being no different from millions of small ones, a system like this with a fixed number of digits would only have around 9 million possible images, and you would have to add digits for more.
The challenge is getting high lossless compression on 4K material and, if you're using it for movies, keeping the computation time down.
I don't know how normal image compression works but I would think recursion and transforms would be the way to go.

>> No.12376595

>>12375154
No, there's actual theory behind this. In fact, the vast majority of images in image space can't be compressed at all.

>> No.12376670

>>12375154
Technically you can compress a gorillion petabytes down to 1 KB, but doing that and then getting it back would take (not) literally forever. Encoding is all about the ratio of compression to performance.

>> No.12376713

>>12375154
Doesn't work. Look up the Shannon limit, and no, ML/DLSS approaches can't bypass it. Look up the information bottleneck.

>> No.12376892
File: 37 KB, 155x223, zenon1m.gif

>a pic gets compressed to a number that uniquely represents it
>so there need to be as many possible numbers as there are pictures at a given resolution
>so the generated number has the same filesize as the original image
compression is literally impossible
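
The counting is right, and you can sanity-check it in a few lines of Python: there are 2^n bitstrings of length n but only 2^n - 1 strictly shorter ones, so no lossless scheme can shrink every input.

n = 16
longer  = 2 ** n                          # files that are exactly n bits
shorter = sum(2 ** k for k in range(n))   # all files shorter than n bits
print(longer, shorter)                    # 65536 vs 65535: at least one input can't shrink

What real compressors do is shrink the likely files (photos, text) and let the unlikely ones (noise) grow a bit.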

>> No.12376897

>>12375154
DUDE WHAT IF AN ALGORITHM DID IMAGE COMPRESSION?

>> No.12376902

>>12376892
All compression attempts to avoid saving redundant information. So if your pic is just a black screen, you can write down one pixel (black) and say it covers everything. Lots of pixels are repeats of the same color. Pretty sure you can also put this in the language of Fourier transforms and cut off the high frequencies or something.
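
Something like this, with scipy's DCT (a crude hard cutoff for illustration; real JPEG uses 8x8 blocks with quantization tables instead):

import numpy as np
from scipy.fft import dctn, idctn

x = np.linspace(0, 1, 8)
block = np.outer(x, x)                    # smooth gradient, like most photo regions

coeffs = dctn(block, norm="ortho")        # 2D discrete cosine transform
mask = np.zeros((8, 8))
mask[:4, :4] = 1                          # keep only the low-frequency quadrant
approx = idctn(coeffs * mask, norm="ortho")

print(np.abs(block - approx).max())       # small error from keeping 1/4 of the coefficients

Smooth regions survive the cutoff almost untouched; pure noise wouldn't.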

>> No.12376916

>>12376902
Still, for lossless compression, the number of possible compressed files = the number of possible uncompressed files.
Going with the number analogy, does compression essentially just assign images with lots of redundant information a low number, so that leading zeroes can be omitted and the filesize is smaller?
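
Essentially, yes; that's what entropy coding does. A toy Huffman coder (using heapq, with made-up pixel frequencies) shows the mechanism: common symbols get short codes, rare ones get long codes, and the average comes out smaller.

import heapq

def huffman_codes(freqs: dict) -> dict:
    # Heap entries: (total weight, tiebreaker, {symbol: code so far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        w1, _, a = heapq.heappop(heap)
        w2, _, b = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in a.items()}
        merged.update({s: "1" + c for s, c in b.items()})
        heapq.heappush(heap, (w1 + w2, i, merged))
        i += 1
    return heap[0][2]

# A mostly-black image: black pixels dominate, so they get the 1-bit code.
print(huffman_codes({"black": 900, "white": 60, "gray": 40}))
# black gets '1', white '01', gray '00'

So redundant inputs land on short codewords, and the rare incompressible ones eat the longer codewords.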

>> No.12376919

>>12375154
What, you mean like a checksum?
You want it to be reversible though, and you can't do that if you throw that much information out.

>> No.12376926

>>12375154
Why not just keep a dictionary on a server and store the video files there? You could then look up the movie index and a timestamp and fetch that particular part from the server when you need it.
You could call it streaming and make a service out of it.

>> No.12377166

>>12375328
> 10 38 86 45 54
> 10.38.86.45.54
A single number

>> No.12377202
File: 350 KB, 368x450, 1602018288870.gif

>>12375328

>> No.12377249

>>12377202
Wow how did you make the cat dance?

>> No.12377272

>>12377249
Lots of motivation and training

>> No.12377304

Every file can be interpreted as a single number, so you are asking for a bijective function f from the natural numbers to the natural numbers with log(n) < log(f(n)). It doesn't take a genius to see that such a bijective function can't exist. Compression is generally impossible; it can only be done when the file contains some pattern the algorithm can detect, and when the description of that pattern is smaller than its occurrences in the file.

>> No.12377305

>>12377304
shit I got that backwards.
should be log(f(n)) < log(n)

>> No.12377541

>>12375174
>>>12375154 (OP)
>It could work but you'd have to be extremely intelligent to pull it off.
for you

>> No.12378711

>>12375154
Nobody gives a fuck; just increase the maximum dictionary size and porn works like magic again.

>> No.12378914

>>12375154
then you would need a huge pre-installed library on the user's device, but i will assume you are smart enough to figure that out.