
/vr/ - Retro Games



File: 130 KB, 1680x1050, steamworkshop_webupload_previewfile_244861148_preview.jpg
No.6124985 [DELETED]

Why is texture filtering disliked by so many in the retro gaming community? What's not to like about creating smooth gradients out of low-res textures?

>> No.6125107

No one cares, you fucking autist.

>> No.6125119

>>6125107
You are not everyone, you fucking egotist.

>> No.6125125

>>6124985
It's mostly due to texture filtering taking away the HE HE I'M PLAYING A RETRO GAME factor, which is really what most of these posers care about

>> No.6125126

>>6124985
Pixelated textures are sharper and more visually defined.

Texture filtering on games with pixelated textures/graphics fucks all of that up by creating blurry blobs of shit. If you like big blobs of blurry shit, then game on, fag.

>> No.6125135
File: 99 KB, 344x128, 14395515919464.png

>>6125126
>visually defined.
Except that's not how texture filtering works; most dumbshits think it just applies a random blur over the texture.

Texture filtering works by reading additional samples from the texture. So while unfiltered would just grab the nearest texel and turn it into a pixel, filtered grabs the 4 nearest texels, does a weighted average on them, and then a pixel is drawn from that. Due to the additional sampling, the image is OBJECTIVELY more visually defined than unfiltered.

If I were to give two people a piece of paper to try and accurately trace the shape of this lava blob, with one person getting the filtered image and the other getting unfiltered, the person who got the filtered version would do better because you can obviously better make out the shape.
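
The 4-texel weighted average described above can be sketched in Python. This is a minimal toy, not any engine's actual code: `sample_nearest`, `sample_bilinear`, and the 2x2 `tex` grid are made up for illustration, and edge handling is simple clamping.

```python
# Toy sketch of nearest vs. bilinear texture sampling.
# `tex` is a 2D grid of texel values; (u, v) are continuous
# texture coordinates in texel units. Illustrative only.

def sample_nearest(tex, u, v):
    # Grab the single closest texel and use it directly.
    x = min(int(round(u)), len(tex[0]) - 1)
    y = min(int(round(v)), len(tex) - 1)
    return tex[y][x]

def sample_bilinear(tex, u, v):
    # Grab the 4 surrounding texels and blend by fractional distance.
    x0 = min(int(u), len(tex[0]) - 1)
    y0 = min(int(v), len(tex) - 1)
    x1 = min(x0 + 1, len(tex[0]) - 1)
    y1 = min(y0 + 1, len(tex) - 1)
    fx, fy = u - x0, v - y0                      # fractional position
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0, 100],
       [100, 200]]
print(sample_nearest(tex, 0.5, 0.5))   # snaps to a single texel: 0
print(sample_bilinear(tex, 0.5, 0.5))  # weighted average of all 4: 100.0
```

Same coordinates, four samples instead of one: the filtered result reflects all four source texels instead of just whichever one happens to be closest.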

>> No.6125136 [DELETED] 
File: 31 KB, 474x570, th.jpg

>>6125125
>Hahaha, yes! I love playing retro games, b-b-but I hate all that pixely stuff, that's why I use texture filtering, I am so friggin based!

>> No.6125141

>>6125136
This strawman literally doesn't exist. Retro hipsters love their pixels.

>> No.6125142

>>6124985
Texture filtering is better than pixelated textures, except in most games that weren't designed for it, like PlayStation games. Quake looks good with it, though; if you aren't playing at 1080p, the original textures look better.

>> No.6125146

>>6125135
Humans don't perceive things that way, Dunning-Kruger-kun. The picture on the right looks blurrier any way you slice it.

>> No.6125161

>>6125146
It looks blurrier but the shape is better defined, so it is more detailed.

Left just has noise from aliasing making it look sharper, but noise is empty detail.

>> No.6125170

>>6125135
No retard. You have an objectively false view of how texture filtering works.

http://www.shawnhargreaves.com/blog/texture-filtering-mipmaps.html

It removes pixels from the fucking game, since it's objectively down-scaling the image.

You have no idea what "visually defined" means, which isn't surprising, since you're on fucking /vr/.

>> No.6125175
File: 662 KB, 933x522, AaIAW.png

>>6125135
>the image is OBJECTIVELY more visually defined than unfiltered.
No, not really. It's multiple samples of the same information; you don't gain anything new by averaging neighbours together.
What it does do is effectively apply a low-pass filter that rids the image of spurious high-frequency transitions. A Gaussian filter would be even better.
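
The low-pass point is easy to see in one dimension. Here's a toy 2-tap moving average (the 1D analogue of neighbour-averaging; `box_filter` is an illustrative name, not a real API): it flattens alternating-texel content while leaving a smooth gradient almost untouched.

```python
# Toy 1D low-pass: a 2-tap moving average over adjacent samples,
# the 1D analogue of what bilinear neighbour-averaging does.

def box_filter(signal):
    return [(a + b) / 2 for a, b in zip(signal, signal[1:])]

checker = [0, 255, 0, 255, 0, 255]   # highest-frequency content
ramp = [0, 50, 100, 150, 200, 250]   # smooth, low-frequency content

print(box_filter(checker))  # flattened: [127.5, 127.5, 127.5, 127.5, 127.5]
print(box_filter(ramp))     # barely changed: [25.0, 75.0, 125.0, 175.0, 225.0]
```

The high-frequency checker pattern is destroyed while the gradient survives, which is exactly what "low-pass" means.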

>> No.6125176

>>6125135
But that's wrong. You're not creating any new information; what you described is literally just blurring. Resampling is not the same as filtering, and neither is necessarily appropriate for all artwork. Pixel art is designed to show certain things without filtering, which is why you need crazy-ass filter algorithms to make NES games not look like dick compared with what plain bilinear filters do.

>> No.6125179
File: 575 KB, 1168x409, 1447422653521.png

>>6125170
>I actually think texture filtering and mipmaps are the same thing
Haha holy shit

>> No.6125181

>>6125179
Texture filtering and mipmaps are objective visual improvements.

>> No.6125183

>>6125179
>doesn't know what mipmaps are

Not surprising honestly.

>> No.6125187

>>6125179
>being this stupid
Are you fucking kidding me, you dumb faggot?

https://en.wikipedia.org/wiki/Texture_filtering

>> No.6125189

>>6125161
>looks blurrier
>it is more detailed
tell me more about how looking like shit makes games look good

>> No.6125191
File: 85 KB, 600x894, 1448679830207[1].jpg

>>6125179
>>6125135
>>6125141
>these dipshits are the people you argue technically with on /vr/ every day

>> No.6125193

>>6125183
Why? That image perfectly showcases what mipmaps do.
While it's technically wrong to say that mipmapping isn't filtering, it's effectively the exact opposite of what bilinear interpolation does.
Mipmapping is downscaling; bilinear interpolation is upscaling.

>> No.6125196

>>6125170
i mean, mipmaps exist because realtime texture filtering is usually dogshit

>> No.6125197
File: 869 KB, 1920x1080, q2_0004.jpg

>>6125170
Keep in mind that old games, MAINLY early Quake 2, had the retarded habit of compressing their textures when applying filtering, mainly OpenGL games, resulting in OP's image.
Source ports and patches fixed this fuckery, along with nearest upscaling, which works perfectly at higher resolutions; you also have AI upscaling for some stuff.
The only games that broke this paradigm were fucking UT99 and Unreal 1, since they used 3dfx and later on Direct3D.

>> No.6125198

>>6125175
>You don't gain anything new by averaging neighbours together.
You've got it completely backward. It's not about what you gain, it's about what you don't lose. Unfiltered, texel data falls through the cracks.

>resampling is not the same as filtering
Filtering involves additional samples.

Unfiltered = 1 texel sample per pixel
Filtered = 4 texel samples per pixel

It doesn't take a high IQ to work out which one is going to yield a more accurate representation of the source.

>> No.6125202
File: 400 KB, 488x519, 9f5.gif

>>6125189
>Less blur is better, even if the details are fake

>> No.6125213

>>6125202
>even if the details are fake

>literally describing texture filtered mipmaps

>> No.6125220

>>6125198
>Unfiltered, texel data falls through the cracks.
There is no loss of information. Even if it's an inaccurate representation, there is no informational difference between nearest and linear filtering.
>More samples are better
That's a fallacy. How you sample matters more than how much you sample. You can make a 9-sample box filter, but it will look like shit.

>> No.6125224
File: 82 KB, 645x729, 1506656526143.png

>>6125183
>>6125187
>>6125191
Mipmaps aren't filtering, you goddamn drooling imbeciles. Wikipedia isn't an accurate source; it's edited by literal brainlets.

All mipmaps are is scaled-down versions of the original texture. They are just small copies of the texture. If the mipmap is a filter, then the original source texture is a filter (which is obviously a completely incoherent argument).

What mipmapping does is provide an accurate set of texel-to-pixel mappings without filtering being required, assuming the camera is positioned at just the right distance for that mipmap. Really, mipmaps are used in conjunction with filtering to produce trilinear filtering, but mipmaps are not *themselves* filtering.
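
The "small copies" claim can be sketched directly: building a mipmap chain is just repeatedly halving the base texture. This toy uses a 2x2 box average, one common way mips are generated; `next_mip_level` and `build_mip_chain` are illustrative names, and power-of-two dimensions are assumed.

```python
# Toy sketch: a mipmap chain is just progressively halved copies
# of the base texture. Each level averages 2x2 blocks of the level
# above. Assumes power-of-two dimensions; illustrative only.

def next_mip_level(tex):
    h, w = len(tex), len(tex[0])
    return [[(tex[2*y][2*x] + tex[2*y][2*x+1] +
              tex[2*y+1][2*x] + tex[2*y+1][2*x+1]) / 4
             for x in range(w // 2)]
            for y in range(h // 2)]

def build_mip_chain(tex):
    chain = [tex]
    while len(chain[-1]) > 1 and len(chain[-1][0]) > 1:
        chain.append(next_mip_level(chain[-1]))
    return chain

base = [[0, 0, 255, 255],
        [0, 0, 255, 255],
        [255, 255, 0, 0],
        [255, 255, 0, 0]]
chain = build_mip_chain(base)
print(len(chain))   # 3 levels: 4x4, 2x2, 1x1
print(chain[1])     # [[0.0, 255.0], [255.0, 0.0]]
print(chain[2])     # [[127.5]]
```

Each level is just a smaller copy of the texture; the filtering only happens when the renderer picks a level (and optionally blends between two, which is the trilinear case mentioned above).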

>> No.6125226

>>6124985
A pixelated picture makes my brain perceive it as if the original was a realistic picture that was then degraded (as if I were looking at the world through a bad camera). A smoothed picture makes my brain perceive it as if the original picture was poor and was then artificially improved (which leaves no room for me to imagine there was ever a realistic picture).

>> No.6125236
File: 30 KB, 474x729, OIP.jpg

>>6125224
>REEEE, MIPMAPS AREN'T FILTERING EVEN THOUGH THEY ARE A METHOD OF FILTERING

>REEEEEE

Holy dog-shit, Batman, a literal brain-void! Grab the wiki entries, they're like poison to its negative intellect!

>> No.6125238

>>6125213
>Filtered mipmaps are blurry, but they also have fake detail
That makes no sense.

>> No.6125239

>>6125224
So many anons would have stopped at the first paragraph. Thanks for providing an explanation!

>> No.6125241

>>6125238
Blur IS fake detail, brainlet.

>> No.6125242

>>6125220
>there is no informational difference between nearest and linear filtering.
Aliasing is a visual representation of a loss of detail.

>How you sample matters more than how much you sample
True, but bilinear filtering is about as simple a filtering algorithm as you can get; there's no fancy distortion of the sampled data.

>> No.6125246

>>6125241
>Purging details adds detail

>> No.6125248
File: 75 KB, 960x960, 1518053192720.jpg

>>6124985
>>6125135
>>6125179
>>6125193
>>6125224
>combining pixels to create an amorphous blob is somehow not only superior, but combining these separate, manually-placed pixels is more detailed

>> No.6125250

>>6125246
So you admit that blurring the pixels is purging actual detail then?

Glad we can agree on it.

>> No.6125254

>>6125242
>Aliasing is a visual representation of a loss of detail
Not if there wasn't any detail to begin with, which is the case with interpolation. You can perfectly reconstruct data after using either nearest or linear upscaling.
It is only when you downsample that information is lost.
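
For nearest-neighbour, the reconstruction claim is easy to demonstrate: integer-factor upscaling just repeats each sample, so decimating the result recovers the original exactly. A toy 1D sketch (illustrative function names, not any real API):

```python
# Toy 1D demo: integer nearest-neighbour upscaling loses nothing,
# because decimating the result recovers the original signal.

def upscale_nearest(signal, factor):
    # Repeat each sample `factor` times.
    return [s for s in signal for _ in range(factor)]

def downsample(signal, factor):
    # Take every `factor`-th sample.
    return signal[::factor]

original = [3, 1, 4, 1, 5]
big = upscale_nearest(original, 3)
print(big)                             # [3, 3, 3, 1, 1, 1, 4, 4, 4, 1, 1, 1, 5, 5, 5]
print(downsample(big, 3) == original)  # True: nothing was lost
```

Linear upscaling is likewise invertible in principle (the original samples sit at known positions), whereas averaging 2x2 blocks away during downsampling genuinely discards information.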

>> No.6125259

>>6125250
>So you admit that blurring the pixels is purging actual detail then?
Purging details, yes.
Actual details, no.
The details that are lost are the hard borders between texels. They are just artifacts of limited information, not a property of the actual represented surface.

>> No.6125295

>>6125254
By "loss" I don't mean permanent loss, but loss as represented in the framebuffer.

>> No.6125323

>>6125295
You'd still theoretically be able to reconstruct data from a framebuffer. In fact, it is MUCH easier with nearest neighbour filtering since the original values aren't blended together.
Going back to the original claim, I don't disagree with the notion that linear filtering is more "defined", at least in the Fourier frequency domain sense, but in the end, it is just shuffling around existing data.

>> No.6125330

>>6125323
It's kind of a strange point you're making. It's like encrypting a string of text (rather than hashing it): sure, it's not permanently lost, but in a temporary sense the loss of the original string is near complete. Anyway, I barely even know what the point of my post is.

>> No.6125337

>>6125330
I'd use an analogy of language. Say you have to translate Cthulhu-speak into human languages. Perhaps one language resembles the original more closely, but as long as both of them are unambiguously interchangeable, they contain the exact same amount of information.