
/vr/ - Retro Games



File: 49 KB, 370x278, quake-software-vs-hardware-composite.jpg
No.3323953

Are there any 3D-accelerated-era games that actually look better in software mode, like the original Quake?

>> No.3323964

VGA quake looks terribad

>> No.3323978

>>3323953
The best rec I can give you is to try the super8 source port; it gives Quake a very cool mixture of nice OpenGL lighting and software-rendered graininess.

[also, not retro but try Devil Daggers for a modern pseudo-software rendered arcade shooter]

>> No.3324017

>>3323953
Back when GLQuake was new to me I thought it was the most amazing thing. Now I realise it still looks better. This unfiltered texture meme is stupid. Quake's pixelated textures are fuck ugly, fuck you. Turning filtering off is for newer games with higher resolution textures.

>> No.3324030

>liking chunky quake

>> No.3324035

>>3323953
I always thought software Quake 2 looks much better than OpenGL Quake 2. Finally I see this opinion somewhere else.

>> No.3324039

>>3324017
>Turning filtering off is for newer games with higher resolution textures

You've got it backwards, mate.

>> No.3324040

>>3323953
Buy a PS1.

>> No.3324041

>>3323953
Does quake have any effects and what not that only work in software mode?

I know Doom does.

>> No.3324048
File: 79 KB, 720x687, 1359567693140.jpg

>>3323964
>>3324017
Well, tastes differ. But didn't they fuck up the whole game palette with GLQuake? Maybe it's only me, but the whole GL port feels more like an afterthought, or rather an experiment.
>>3323978
Thanks mate! Gonna check that unfiltered pixelated goodness out.

>> No.3324058

>>3324048
They fucked up lighting. For years, anybody playing Quake in OpenGL never experienced it anywhere near as dark as it is supposed to be. Ports like Quakespasm fix that, and you can truly have the best of both worlds.

>> No.3324067

>>3324041
Underwater distortion

>> No.3324078

>>3324067
Is that all? No mapping tricks or colormap/palette related things?

>> No.3324082

>>3324040
PS1 is accelerated though, just poorly.

>> No.3324084

I prefer software Thief 1.

>> No.3324095

>>3323953
Carmageddon 1. And IMO all DOS Glide games, especially if they are run at a higher resolution than expected back then.
There's just not enough geometry detail or texture detail for them to look good with filtering and high res. These games look cheap and inconsistent when run like that.
Plus games like Q1 or C1 even lose some atmosphere, with the rendering being somewhat washed out. The gritty look isn't there anymore.

>> No.3324108

GLQuake fucked up the lighting.
https://www.quaddicted.com/engines/software_vs_glquake

>> No.3324113

GL_NEAREST_MIPMAP_LINEAR master race
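
For anyone who wants to try it, that mode is just a texture parameter; a minimal sketch, assuming a valid GL context and a texture with an uploaded mipmap chain already bound (the function name is made up):

#include <GL/gl.h>

/* Sketch: apply GL_NEAREST_MIPMAP_LINEAR to whatever texture is currently
   bound to GL_TEXTURE_2D. Assumes a GL context exists and the texture's
   mipmap chain has already been uploaded or generated. */
static void use_nearest_mipmap_linear(void)
{
    /* minification: nearest texel within a mip level, linear blend between levels */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_LINEAR);
    /* magnification has no mip levels, so plain GL_NEAREST keeps the chunky look */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
}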

>> No.3324198

GLQuake would be better if they didn't fuck it up.

By the way, in a two-way contest between good filtered and good unfiltered, the former is objectively better. Only nostalgia for shitty pixels can contest that.

>> No.3324220

>>3324108
That was quite a fascinating read. Not surprised by many of its shortcomings since it was created primarily as a dry run for Quake 2's renderer. I always hated the washed-out look that GLQuake's lack of overbright support gave the game.

>> No.3324678

Half-Life looks better in software mode I'd say; the water effects especially.
https://www.youtube.com/watch?v=DHgtWODyUUo

>> No.3324691

>>3324678
>Half-Life looks better
certainly not

>> No.3324737

>>3323978
The aesthetic looks frickin' awesome

https://www.youtube.com/watch?v=XSoRbNs9bDk

>> No.3324740

>>3324678
Unreal can do something very similar in 3D-Accelerated mode too.

>> No.3325201
File: 1.57 MB, 1280x2880, 1458422838561.jpg

>>3324220
Speaking of overbright: Half-Life fixed this for acceleration, but it doesn't work on all systems.
Comparison attached, including software mode.

>> No.3325204

>>3324691
Are you saying Half-Life is so ugly, it can't look better than anything, including versions of itself? Deep.

>> No.3325265

>>3323953
Outcast was all software mode, because muh voxels and Ken Silverman's dick is delicious.

>> No.3325271

>>3325201
I know people will use >anisotropic filtering
as an argument, but I really can't tell what's wrong with the lack of mipmapping aside from performance issues.

>>3325265
Ken Silverman didn't invent voxels, but he was very much an early pioneer of their use in video games.

>> No.3325658

>>3325271
he worked at Novalogic?

>> No.3325709
File: 38 KB, 600x300, pic06.jpg

>>3325271
memes

>> No.3325979

>>3325201
>Half-Life fixed this for acceleration, but it doesn't work on all systems.
So, is it GPU/drivers dependent or something?

>> No.3325997

Software rendering all the way. The mess of pixels implies more detail than is actually there. The hardware smoothing shows what actually is there: not much.

>> No.3326128

Not truly a 3D accelerated game, but I absolutely don't get the appeal of OpenGL rendering in Doom.

>> No.3326137

>>3324017
The whole argument against texture filtering is that in old games there really isn't enough texture to filter, so you get a very obvious loss of detail (smudged, Vaseline-covered textures). At higher texture resolutions it makes sense to use filtering, because the loss in detail is pretty much unnoticeable and it instead smooths out rougher textures. Different strokes for different yolks, but you really do have it ass backwards

>> No.3326146

>>3323964
OpenGL looks like a shitty N64 game

>> No.3326464

>>3326128
The megawad trilogy BTSX definitely looks a whole lot better in Software rendering.

>> No.3326524

>>3326137
You literally have it backwards.

On an absolute technical level filtering preserves more detail. To produce filtered textures you have to sample more texels.

>> No.3326537

>>3323953
Might & Magic 7 and 8 looked worse when running in hardware mode, with shit like entire enemy sprites being recolored to represent tougher versions instead of only certain areas of them.

>> No.3326540

mdk

>> No.3326641

>>3325979
Not sure, but I had zero luck with Windows 7 + Nvidia. It works fine in D3D, though.

>> No.3326653

>>3326524
What? The amount of texels stays the same, filtering just does something different to them.
The problem is probably that low-res textures barely have any details, so unfiltered textures give the illusion of sharpness.

>> No.3326659

>>3326653
>What?
don't bother. That anon for some reason believes the interpolated values of bilinear filtering contain more information, because they sample from four neighboring pixels, instead of just one.

>> No.3326720

>>3325709
Why does linear look best of the three in that picture?

>> No.3326750

>>3326128
It's good in Zandronum where you want to be able to aim vertically.

>> No.3326756

>>3326720
I have no idea what "no mipmapping linear" is supposed to be

First one is point sampling, which means you just sample the texture once and call it a day. Works well up front, but in the distance the samples are so far apart, the pattern of the texture disappears and you're left with noise.

Third one is standard bilinear filtering, with mipmaps. Bilinear filtering only works if the texture is larger than the screen output, as it samples the four adjacent texels. If neighboring pixels on the screen are multiple texture samples apart, you get the same noise. So the mipmaps are introduced to prevent that. Whenever a threshold is reached, the texture gets swapped for a lower resolution mipmap. The change is sudden, at the threshold. So whenever a new mipmap shows up, it's all blurry up close, and gets sharper in the distance, before the next mipmap. This very effectively avoids the noise, but you got the very visible jumps.

Trilinear mipmap was the standard for a long time. Instead of using a hard threshold to swap out mipmaps, the mipmaps themselves are interpolated. So the transition happens very smoothly. Unfortunately it also exposes another issue. The way mipmaps are chosen leads to textures that get overly blurry in the distance, especially at low angles.

The solution is anisotropic filtering. In a very simplified way it works like this: normally mipmaps are scaled squarely. A 512x512 texture gets a 256x256, 128x28 mipmap and so on. Unfortunately, due to the low angle you don't really want that. You want the vertical (depth) axis to be really squished, but for the horizontal (width) to retain detail. Something like a 256x64 mipmap would be perfect for this cobblestone. AF effectively allows for doing just that. The multiplier is an indicator of how far the mipmap can be squished. In this case, instead of a square mipmap it can be treated like a 1:16 mipmap. The result is crispy clear textures in the distance. No mipmap mush, no point sampling noise.
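
To make the nearest vs. bilinear part of that concrete, here's a minimal CPU-side sketch of the two lookups on a small RGB texel grid. Purely illustrative: the texel_at helper is made up, and real hardware also deals with wrap modes, mip selection and so on.

#include <math.h>

typedef struct { float r, g, b; } Rgb;

/* Hypothetical helper: fetch texel (x, y) from a w*h texture, clamping at the edges. */
static Rgb texel_at(const Rgb *tex, int w, int h, int x, int y)
{
    if (x < 0) x = 0; if (x >= w) x = w - 1;
    if (y < 0) y = 0; if (y >= h) y = h - 1;
    return tex[y * w + x];
}

/* Point sampling: take the single texel containing (u, v), with u and v in [0,1]. */
static Rgb sample_nearest(const Rgb *tex, int w, int h, float u, float v)
{
    return texel_at(tex, w, h, (int)floorf(u * w), (int)floorf(v * h));
}

/* Bilinear: weight the four surrounding texels by how close (u, v) is to each. */
static Rgb sample_bilinear(const Rgb *tex, int w, int h, float u, float v)
{
    float fx = u * w - 0.5f, fy = v * h - 0.5f;       /* texel centres sit at +0.5 */
    int   x0 = (int)floorf(fx), y0 = (int)floorf(fy);
    float tx = fx - x0, ty = fy - y0;                 /* fractional position between texels */
    Rgb c00 = texel_at(tex, w, h, x0,     y0);
    Rgb c10 = texel_at(tex, w, h, x0 + 1, y0);
    Rgb c01 = texel_at(tex, w, h, x0,     y0 + 1);
    Rgb c11 = texel_at(tex, w, h, x0 + 1, y0 + 1);
    Rgb out;
    out.r = (1-tx)*(1-ty)*c00.r + tx*(1-ty)*c10.r + (1-tx)*ty*c01.r + tx*ty*c11.r;
    out.g = (1-tx)*(1-ty)*c00.g + tx*(1-ty)*c10.g + (1-tx)*ty*c01.g + tx*ty*c11.g;
    out.b = (1-tx)*(1-ty)*c00.b + tx*(1-ty)*c10.b + (1-tx)*ty*c01.b + tx*ty*c11.b;
    return out;
}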

>> No.3326765

>>3326756
>128x28
make that 128x128. Squares

I tend to use something like 4x or 8x AF. Any higher and the textures end up "too sharp" again in the distance, usually exposing tiling. You can actually see it in that picture, where the cobblestone pattern repeats into the distance. The blurrier variants before that hide this.

It's kind of interesting how every single technique in that picture was motivated by shortcomings of the previous technique. It's kind of a natural progression.
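
If anyone wants to experiment with those AF levels in an OpenGL port, anisotropy is exposed through the EXT_texture_filter_anisotropic extension. A minimal sketch, assuming the extension has been checked for and a mipmapped texture is bound (the function name is made up):

#include <GL/gl.h>
#include <GL/glext.h>   /* GL_TEXTURE_MAX_ANISOTROPY_EXT, GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT */

/* Sketch: request up to 'wanted' anisotropy (e.g. 4.0f or 8.0f) for the bound
   texture, clamped to whatever the driver reports as its maximum. */
static void set_anisotropy(float wanted)
{
    GLfloat max_supported = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported);
    if (wanted > max_supported)
        wanted = max_supported;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, wanted);
}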

>> No.3326781

>>3326653
>The amount of texels stays the same

Absolutely not. The amount of pixels stays the same. When you play a game you do not see texels, you only see the pixels that have been generated from the texels. And when you use filtering more texels are INVOLVED in the generation of those pixels (just under four times more texels involved to be exact).

>unfiltered textures give the illusion of sharpness.

Unfiltered textures just produce noise artifacts from jarring colored pixels.

>>3326659
>the interpolated values of bilinear filtering contain more information, because they sample from four neighboring pixels, instead of just one.

Which is self-evidently correct, even in the description you have provided.

Much in the same way a 1000-word speech by an intelligent person contains more information than a 1000-word speech by a stupid person.

>> No.3326785

>>3326781
>even in the description you have provided
both interpolations draw from the same pool of information. Interpolation does not add information

>> No.3326806

>>3326765
What is the downside of AF? I never noticed any significant drop in frame rate, even if the game was struggling with other settings.

>> No.3326808

>>3323953
>Interpolation does not add information

Interpolation does not add information in itself because alone it is nothing more than a mathematical method you fucking imbecile.

However, when you interpolate four values, you encode the information of those four values into a single number.

For a more advanced example, look at compression, which can encode a larger number of bits into a smaller number of bits. What you're doing is basically saying that a file that is 4096 KB compressed and a file that is 4096 KB raw hold exactly the same amount of information. Physically, yes. Logically, no.

>> No.3326817

>>3326806
AF has a minor performance impact, though less on modern hardware. So it's largely a matter of preference. As said, high AF will lead to sharper textures in the distance, which may expose tiling artifacts.

>if the game was struggling with other settings
in general, nearest neighbor and bilinear filtering are free. Mipmapping is largely a memory thing (you have more than enough nowadays), AF is a very minor performance impact. Multisampling is a somewhat larger performance impact. Supersampling eats framerate for lunch. Texture size is free, thanks to plenty of RAM. Model complexity can cost, but it's usually marginal. Shaders are a real killer, as bad as supersampling, or worse.

>> No.3326820

>>3326808
the texture contains all the information there is. Interpolation is not decompression, it's making up data, which does not increase the amount of information.
A compressed file usually has a higher information density than an uncompressed one, because that's all that compression does. Decompression does not increase the information.

>> No.3326827

>>3326820
>the texture contains all the information there is

Jesus fucking christ. Please learn how framebuffers work and the difference between pixels and texels.

>> No.3326829

>>3326817
What are some of the first cards to do AF? Does it kill the "retro" to add it in early 3D accelerated games?

>> No.3326836

>>3326829
I don't know, unfortunately. I think even my old Diamond Monster did AF, but my memory is too hazy to say for sure. At the beginning of AF though, its performance impact was quite strong.

As for it "killing the retro", resolution probably does more for that

>> No.3326840

>>3326827
the framebuffer contains data derived from the information in the texture. Data and information are not the same. Derived data never adds information, by virtue of being derived.

>> No.3326847

>>3326827
Pixels aren't even pixels, they are just samples that are technically infinitely small.

>> No.3326858

>>3326847
>Pixels aren't even pixels
that statement makes no sense.

>they are just samples that are technically infinitely small
logical pixels, yes. Physical pixels, no.

You thought pixels are little squares?
http://alvyray.com/Memos/CG/Microsoft/6_pixel.pdf

>> No.3326871

>>3326840
>Derived data never adds information

That's right, but that wasn't the argument. The argument was that filtering is more representative of the original information. So you have it backwards again - it isn't about information added, it's about information lost when texels and pixels are misaligned.

When you don't use filtering in that situation, not all texel data in a texture is accessed - while filtering ensures that values of texels are accounted for.

>> No.3326892

>>3326871
>filtering is more representative of the original information
conjecture. The information is missing, so any mechanism to fill in the data is equal

>When you don't use filtering in that situation, not all texel data in a texture is accessed
That whole statement requires the assumption that linear interpolation is closer to the original data the texture is derived from. That is a guess. If it does not hold, linear interpolation will produce worse output

>filtering ensures that values of texels are accounted for
linear interpolation ensures that adjacent samples are used. Whether that represents the missing information is unknown.

for the record, nearest neighbor and bilinear are both interpolations, often called filtering. You must use an interpolation to sample a texture.

>> No.3327015

Question: What games were among the last to have software rendering?

>> No.3327597

>>3326750
Meh. That's another thing. Freelook is too weird for me in Doom.

>> No.3327678

>>3326836
Well, I definitely don't consider resolution to be killing the retro, at least not with 3D games like Quake. It supported high resolutions, though unplayable at the time, straight out of the box.

>> No.3327684

>>3327015
Unreal Engine 2.

>> No.3327695

wow are people actually defending blurry bilinear filtered textures? are you fucking crazy?

a world made out of squares is better than a world made out of smears.

>> No.3327706

>>3327695
t. Minecraft

>> No.3327713

>>3327706
Notice how nobody plays minecraft with fucking bilinear filtered textures, though I'm sure they could.

The ONLY people who defend that shit are douches who were BLOWN AWAY by how UNPIXELATED the graphics were when they first got a 3D card. It was a gimmicky novelty that looked terrible.

>> No.3327764

>>3327695
>>3327713
ur takin this nearest filtering meme way too seriously famarino

>> No.3327845

>>3326892
>The information is missing

The fuck? It's not missing. As you said above in your own post, all of the texture data is available. The problem is that texel and pixel mappings do not align.

With filtering, the texture unit ends up reading all of the texels from the sampled texture, while without filtering some texels from the texture will inevitably not be read.

>any mechanism to fill in the data is equal

You can't honestly believe this idiotic logic.

>That whole statement requires the assumption that linear interpolation is closer to the original data the texture is derived from

It is, because equal or more texels end up being read, that's a guarantee. It's not at all a guess that it is usually more representative, it's simply mathematics.

More texels read = more information preserved when re-sampling. It's that bloody simple.

>> No.3327907

>>3327845
>It's not missing
there is no known information between the texels. Whenever a lookup hits fractional points of the texture their value needs to be derived.

>ends up reading all of the texels
it's possible to completely skip texels regardless of the interpolation used. Linear interpolation does not necessarily represent the original lost information

>equal or more texels end up being read
the number of texels used to derive missing data is completely irrelevant. There is no guarantee the missing data has any correlation with these samples. That's an assumption

>> No.3327910

>>3327678
>I definitely don't consider resolution to be killing the retro, at least not with 3D games like Quake
You asked for my opinion and you got it. Resolution has a pretty strong impact on 3D visuals, as it shifts the distance at which objects turn from pixel messes into something recognizable. It also changes the distance at which the individual polygons stand out more, and the distance at which the texture on screen surpasses the resolution of the screen and effects like interpolation are required to begin with.

>though unplayable at the time
Wasn't that the underlying reason against AF in >>3326829 ?

>> No.3328052

>>3326146
Not if you turn off texture filtering.

>> No.3328494

>>3327907
>it's possible to completely skip texels regardless of the interpolation used

Yeah, and it's far less likely to happen when you're using bilinear filtering since every bordering texel is also read with every texel.

>Linear interpolation does not necessarily represent the original lost information

No, it just tries to approximate the lost information with the bilinear interpolation method based on values pulled from texel data, instead of just outright discarding entire texels.

>the number of texels used to derive missing data is completely irrelevant

That's like saying the number of samples taken in multi-sample anti-aliasing (MSAA) is irrelevant. The more times the memory interface bangs on the original texture data, the more likely it is that it will be accurately resampled into pixels.

>There is no guarantee the missing data has any correlation with these samples. That's an assumption

The bilinear interpolation method hardly requires any mathematical proofs. It's very simple. Choosing not to believe in it just makes it your opinion.

Literally no expert of computer graphics would agree with you on your claim that nearest-neighbor texture 'filtering' more accurately resamples textures than bilinear filtering.

The only reason some people do think so is because they confuse aliased artifacts in nearest-neighbor for actual detail. That's perfectly fine, the subjective eye of the beholder is free to judge those artifacts as more detail.

However, in a mathematical objective sense, it is not a more 'accurate' resampling.

>> No.3328512

I get what OP is thinking. GL mode makes it look like the monitor is smeared with a layer of grease. Everything is blurry.

>> No.3328771

>>3328494
>it's far less likely to happen when you're using bilinear filtering
it's just about as likely when the texture is smaller than the screen pixels available for it

>it just tries to approximate the lost information with the bilinear interpolation method
just like nearest neighbor approximates the missing information using the 4 surrounding samples

>That's like saying the number of samples taken in multi-sample anti-aliasing (MSAA) is irrelevant
different situation, the image MSAA is based on contains more information

>The bilinear interpolation method hardly requires any mathematical proofs
correct, and it says nothing about the information not in the texture

>your claim that nearest-neighbor texture 'filtering' more accurately resamples textures than bilinear filtering
Not a claim I am making. I am saying it is no worse than bilinear interpolation. The information between samples is unknown.

>they confuse aliased artifacts in nearest-neighbor for actual detail
On quite a few occasions the nearest neighbor interpolation represents a more accurate interpolation. For example any time the original data has sharp edges.

>> No.3328872

>>3328771
>it's just about as likely when the texture is smaller than the screen pixels available for it

It's closer, but bilinear still has a very small edge in those situations. Anyway, that's what mipmaps are for, but you'll probably argue that they make graphics look worse due to some aversion to better technology.

>just like nearest neighbor approximates the missing information using the 4 surrounding samples

Hope this is sarcasm because nearest neighbor does no such thing. It only determines the relative distance of a texel between pixels, chooses only the closest texel, and nearby texels have a good chance of being ignored completely.

>the image MSAA is based on contains more information

Oh? So do you want to define how much information is your threshold?

>it says nothing about the information not in the texture

Sure, but it says a lot about the texels nearest neighbor ignored.

>I am saying it is no worse than bilinear interpolation

Except, y'know, the shimmering, aliasing, pixelization.

>For example any time the original data has sharp edges.

That's a very special case. It's because the bilinear filtering does not know whether a sharp edge is part of a larger silhouette or not, and therefore attempts to interpolate that edge with the background. (There would be no problem if bilinear considered the edge of a silhouette the same as the edge of the entire texture and stopped sampling past that point.) But an artist creating such a texture, with sharp edges against high-contrast backgrounds, would be considered to be doing bad texturing for use with bilinear filtering.

>> No.3328887

>>3328872
>Hope this is sarcasm because nearest neighbor does no such thing
it needs to know the distance to all immediately adjacent neighbors to decide which sample represents the queried position

>So do you want to define how much information is your threshold?
interpolation does not introduce information. Averaging combines samples that were taken from the area of the output pixel.

>it says a lot about the texels nearest neighbor ignored
and it may all be wrong

>It's because the bilinear filtering does not know
It's because interpolation can not add information

>> No.3328889

>>3326137
Then turn off texture filtering but keep mipmapping. Problem solved. Sharp pixelated textures and high detail from a distance.

>> No.3328937

>>3328887
>it needs to know the distance to all immediately adjacent neighbors to decide which sample represents the queried position

But that's just coordinates, it has nothing to do with the texel value except for the one chosen texel. Bilinear does exactly the same thing, except with two extra benefits: it reads the surrounding texels, and it uses the relative texel/pixel positions for a weighted average of their values.

>interpolation does not introduce information

No, but it does hash that information.

>and it may all be wrong

It is objectively less wrong because it reads more of the source texels.

>It's because interpolation can not add information

Nobody ever said it did. The only thing that was ever said was that it's less lossy in resampling information.

>> No.3328949

>>3328937
>it has nothing to do with the texel value except for the one chosen texel
because that particular interpolation will pick the color of the nearest texel

>it uses the relative texel/pixel positions for a weighted average of their values
indeed, the name kind of gives it away. There is no telling which value is closer to the actual lost information, or if any is at all. They're two different interpolations, not more.

>hash
that word has no meaning in this context

>less wrong
no telling, the original information is not available

>Nobody ever said it did
you do. You said several times, in multiple variations, that a bilinear interpolated image contains more information than a nearest neighbor interpolated image. That is not the case.

>less lossy
interpolation is not lossy. It adds data, retains all information

>> No.3328986

>>3328949
>because that particular interpolation will pick the color of the nearest texel

There's no interpolation in nearest neighbor. It just chooses the closest texel coordinate and that's it.

>There is no telling which value is closer to the actual lost information,

You're treating this like some kind of Schrodinger's cat quantum mechanics type shit, which is ridiculous. It's obvious that a resampling system which reads more texels is going to represent the original texture more accurately. With almost complete certainty, the bilinear filtered texture will be closer to the original texture's RGB values as a whole than the NN alternative.

>that word has no meaning in this context

A hash function is a type of resampling where you convert an arbitrary number of values to a fixed number of values. It can be used as a good example here, as an arbitrary number of texels have to be resampled into a fixed number of pixels. The thing is that the arbitrary number of texels (as a subset of the total number of texels) is higher with bilinear filtering so the hash is more representative of the original texture.

>that a bilinear interpolated image contains more information than a nearest neighbor interpolated image

All I said was that it logically contained more information (which is true) because of a more accurate representation, not that it physically had more information.

>interpolation is not lossy. It adds data, retains all information

You've lost me here because it doesn't follow on from your own arguments. Interpolation in this case IS lossy because it's not reversible (much like hashing is not reversible).

>> No.3328995

>>3328986
>There's no interpolation in nearest neighbor
nearest neighbor is an interpolation. It determines the values between samples, which is what an interpolation does.

>It's obvious
it's obvious to you. It's still wrong. The missing information between texels is unknown.

>logically contained more information
an interpolated image can not contain more information than the source image. Considering input and output are both logical, the distinction between logical and physical is meaningless.

>Interpolation in this case IS lossy because it's not reversible
interpolation produces new data between the original samples, and leaves the original samples intact. To recover the original information, you remove all interpolated values. That works on all interpolations that do not disturb or replace the original samples

>> No.3329017

>>3328995
>It determines the values between samples

I can concede that in a very pedantic sense some interpolation does exist, but only a 2D interpolation (x,y), as opposed to a 3D interpolation like bilinear filtering (x,y,rgb). Considering the whole point is to get at the RGB values and NN does not interpolate these values, in practical terms nobody except the anally retentive refers to NN as a type of interpolation.

>The missing information between texels is unknown.

Except bilinear reads more texels so the texel derived pixel RGB values are still influenced by their missing texel neighbors. Those missing texels live on in some form within their neighbors.

>an interpolated image can not contain more information than the source image.

No, but that wasn't the point being made. It does *logically* contain more information than an equivalent nearest-neighbor re-sample even if it *physically* contains the same amount of information.

>Considering input and output are both logical, the distinction between logical and physical is meaningless.

Input and output are not merely logical, not even in computers. They are ultimately represented in memory bits which are physical.

>That works on all interpolations that do not disturb or replace the original samples

The thing is that the framebuffer does not retain the original texels. The texels are read, resampled into pixels kept in the framebuffer. In theory, if the texture was guaranteed not to be used in the following frame, sections of it could actually be systematically erased after each fragment is sampled and nothing would be adversely affected.

>> No.3329036

>>3329017
NN and BF are both 2D interpolations. They determine the RGB value of a point that sits between known samples. Both of them use the position and all adjacent samples as input (if you drop any of the surrounding samples, NN can not correctly determine the color of the interpolated value, if that missing sample would be the closest)

>It does *logically* contain more information than an equivalent nearest-neighbor
in the case of an interpolation, the amount of information is identical

>re-sample
interpolation and re-sample are different. Re-sampling discards the original samples

>the framebuffer does not retain the original texels
It's not used as input

>> No.3329049

>>3329036
>NN and BF are both 2D interpolations. They determine the RGB value of a point that sits between known samples

No, NN is a 2D interpolation and bilinear is a 3D interpolation.

NN does not interpolate the RGB value, while bilinear does. NN just grabs an existing RGB value from an existing point. Interpolation is about creating new points.

>the amount of information is identical

Yes, the amount of information is identical, but bilinear filtering's information is more representative because unlike NN it has a chance of using RGB values from neighboring texels in interpolation that NN would otherwise completely discard.

>interpolation and re-sample are different. Re-sampling discards the original samples

There is no codified basis for this assertion, as neither term makes reference to what happens to input data. It's a bit pointless to bring it up, really.

>> No.3329056

>>3329049
>NN does not interpolate the RGB value
it derives a new value from the 4 adjacent samples.

>Interpolation is about creating new points
interpolation is the process of deriving data of unknown inputs from known samples

>bilinear filtering's information is more representative
that's unknown

>it has a chance of using RGB values from neighboring texels
usually adjacent pixels are not correlated, so assuming such a correlation is not necessarily valid

>neither term makes reference to what happens to input data
interpolation fills gaps between samples. Re-sampling produces a different set of samples, using an interpolation as a transitional step. Re-sampling is lossy, because it discards the original samples. interpolation is not lossy.

>It's a bit pointless to bring it up
you brought it up, I'm just addressing inaccuracies

>> No.3329079

>>3329056
>it derives a new value from the 4 adjacent samples.

No, it just uses 2D interpolation so it can choose the position of the closest texel and grabs that texel's already existing RGB value from it. Please don't tell me you thought NN generates new RGB values, I will feel bad to have argued at length with somebody that clueless all along.

>interpolation is the process of deriving data of unknown inputs from known samples

Jesus christ, what a convoluted description. Inputs can't be unknown in interpolation, otherwise you'd have no values to average between.

The definition of interpolation is actually bloody simple:
>interpolation is a method of constructing new data points within the range of a discrete set of known data points.
So as you can see, it explicitly says the input data points are known.

>that's unknown

I really don't feel like repeating more texel reading = usually more representative a hundred times more even though it is plainly obvious. I don't have to convince you of anything. You can just be wrong for all I care.

>usually adjacent pixels are not correlated, so assuming such a correlation is not necessarily valid

Yes, you are right, but not in the way you think. The fact that adjacent pixels are not correlated is precisely the reason why bilinear is more representative. In patterns of straight correlated lines, nearest neighbor does perfectly fine, as missing texels wouldn't look obvious (correlated texels would be uniformly missing), and there's no risk of bilinear's one weakness of not knowing where the end of a silhouette of correlated pixels lies.

>interpolation fills gaps between samples

Not in the framebuffer, it doesn't. It just creates pixels with RGB values which have retained input from nearby texels.

>Re-sampling is lossy, because it discards the original samples. interpolation is not lossy.

Still doesn't mean anything to this discussion of NN and bilinear, as they both use framebuffer pixels and texture texels in the same broad way.

>> No.3329094

>>3329079
>it can choose the position of the closest texel and grabs that texel's already existing RGB value from it
that is the way the algorithm works, indeed. Nothing says an interpolation needs to create new values.

>Please don't tell me you thought NN generates new RGB values
it's generating values for unknowns. These values happen to be sampled 100% from the nearest neighbor. The name of the algorithm hints at that.

>what a convoluted description
math is fun

>Inputs can't be unknown
To clarify, the original data defines a grid, usually treated like an integer grid. All fractional values are unknown. interpolation produces values for these otherwise unknown positions.

>you'd have no values to average between
interpolation does not require averaging

>I really don't feel like repeating
thanks. If any readers are still in this exchange, they can probably decide for themselves by now

>why bilinear is more representative
bilinear assumes a correlation

>missing texels
the grid is fully defined

>Not in the framebuffer
interpolation is a sampling technique. It has nothing to do with the framebuffer, or even graphics. It can apply to graphics. The output of an interpolation is an individual value. What's done with that value (like writing it to a framebuffer) is secondary.

>> No.3329127

>>3329094
>Nothing says an interpolation needs to create new values.

Except the whole thing about interpolation drawing new points. The only time the values aren't new is when by coincidence the interpolated value lies along one of the existing points.

There's no chance of that not happening with NN because it will always be an existing RGB value.

>interpolation does not require averaging

But it does when it comes to texture filtering. Again with the anally retentive bullshit on unrelated rubbish.

>If any readers are still in this exchange, they can probably decide for themselves by now

Yes, and if they open virtually any 3D graphics textbook they will see a description along the lines of "nearest neighbor is an inaccurate texture sampling method only used back in the time when computers didn't have enough memory bandwidth to read texels multiple times, and nobody uses it now for obvious reasons of image quality", and probably wonder why there is some kind of nostalgia-ridden fool spending his time defending in vain something that was obsolete even 20 years ago.

>bilinear assumes a correlation

To be more accurate, bilinear assumes all texels will be correlated in their lack of correlation.

> It has nothing to do with the framebuffer, or even graphics. It can apply to graphics

Yes. Graphics. Which is what we are talking about. And your discussion of where input/output is saved is still completely irrelevant.

But I'll give you a mathematical proof for bilinear being more representative.

Take a typical texture without any uniform straight lines (say a nature landscape) and resample those texels into pixels of a different resolution (pixels/texels not aligning); make two images, one with NN and one with bilinear filtering.

Then take the original texture and average across all of its texels' RGB values to get the average RGB value of that texture. Then average the RGB values of each resampled image.

See which one has RGB values closer to the original.
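
A minimal sketch of how that experiment could be run, using a synthetic single-channel "texture" full of random values instead of a real landscape photo. It proves nothing by itself, it just shows the mechanics of the comparison; all sizes and names here are made up.

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define TW 64   /* source "texture" size */
#define OW 96   /* output size, deliberately not aligned with TW */

static float tex[TW * TW];

static float texel(int x, int y)                 /* clamped fetch */
{
    if (x < 0) x = 0; if (x >= TW) x = TW - 1;
    if (y < 0) y = 0; if (y >= TW) y = TW - 1;
    return tex[y * TW + x];
}

static float nearest(float u, float v)           /* single closest texel */
{
    return texel((int)floorf(u * TW), (int)floorf(v * TW));
}

static float bilinear(float u, float v)          /* weighted 2x2 neighborhood */
{
    float fx = u * TW - 0.5f, fy = v * TW - 0.5f;
    int   x0 = (int)floorf(fx), y0 = (int)floorf(fy);
    float tx = fx - x0, ty = fy - y0;
    return (1-tx)*(1-ty)*texel(x0, y0)   + tx*(1-ty)*texel(x0+1, y0)
         + (1-tx)*ty   *texel(x0, y0+1) + tx*ty   *texel(x0+1, y0+1);
}

int main(void)
{
    double src = 0, nn = 0, bl = 0;
    for (int i = 0; i < TW * TW; i++) {          /* random stand-in for a photo */
        tex[i] = (float)rand() / RAND_MAX;
        src += tex[i];
    }
    for (int y = 0; y < OW; y++)
        for (int x = 0; x < OW; x++) {
            float u = (x + 0.5f) / OW, v = (y + 0.5f) / OW;
            nn += nearest(u, v);
            bl += bilinear(u, v);
        }
    printf("source mean    %.4f\n", src / (TW * TW));
    printf("nearest mean   %.4f\n", nn / (OW * OW));
    printf("bilinear mean  %.4f\n", bl / (OW * OW));
    return 0;
}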

>> No.3329129

>>3324058
Quake's default gamma is very dark, I can't play in daylight at all without adjusting it, which washes out the shadows. I think the brightness setting implementation is just flawed.

>> No.3329134
File: 322 KB, 1600x1120, soft_vs_opengl.png

oc

>> No.3329143

>>3329134
>comic sans ms
You're one cruel motherfucker, anon

>> No.3329148

>>3329143
I needed a "soft" looking font, comic sans was just around the corner offering its cheap pleasure service.

>> No.3329154

>>3329127
>interpolation drawing new points
interpolation draws nothing. Interpolation produces values for unknown positions by deriving them from values of nearby positions with known values

>The only time the values aren't new is when by coincidence the interpolated value lies along one of the existing points.
or if at least two adjacent samples have the same value and the data is linearly interpolated

>There's no chance of that not happening with NN because it will always be an existing RGB value.
coming up with new values is not a requirement of interpolation

>defending
been fairly neutral. I've only commented on the claim that bilinear interpolation introduces (or retains more) information, which is wrong

>> No.3329185

>>3329154
>coming up with new values is not a requirement of interpolation

Right, which is why the definition of interpolate is literally "to introduce".

>bilinear interpolation introduces (or retains more) information,

But it does retain average RGB of the original texture more accurately, and you do know it very well.

>> No.3329193

>>3329185
>which is why the definition of interpolate is literally "to introduce"
interpolation defines values for previously undefined inputs. It does not require for these values to be different from already defined values

>does retain average RGB of the original texture more accurately
NN and BF both retain the defined samples completely. They just interpolate differently between them. Which one is closer to the original and missing information is unknown and depends on the use case

>> No.3329220

>>3329193
>It does not require for these values to be different from already defined values

Sure, in a bubble. But it's rare for the values to be the same. Certainly the general expectation is that the values are different.

And NN's RGB values are not directly derived from interpolation. They are derived from the RGB value of a texel, the coordinates of which are found from the result of a "closest to" operation, the data for which is provided by interpolation of coordinates.

>NN and BF both retain the defined samples completely.

If by this you mean that the coordinates are identically interpolated, then yes you'd be right!

But bilinear goes an extra step and also interpolates RGB values, based on the data from the previous interpolation of coordinates, while NN just does a "closest to" operation and does not do any more interpolation. And yes, alright you could even pedantically consider a "closest to" operation to be a form of interpolation but that's a bit like calling NN a form of texture filtering. They're just too simple.

So really even playing along with your definitions, the RGB values of a texel in NN aren't found by interpolation. It's the coordinates of that texel that could be considered to be the product of interpolation. Those coordinates just happen to be assigned an RGB value, but the value itself is not the direct product of interpolation like it is in bilinear filtering.

>> No.3329857

>>3328889
>High detail from a distance
Uhh, no, mipmapping gets rid of details. Of course, it only does it because you get too far away for the details to even be visible anyway. It just removes aliasing.
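
For reference, a mip level is usually just the previous level with every 2x2 block averaged down to one texel (a box filter; offline tools may use nicer filters), which is exactly why it can only ever throw detail away. A minimal single-channel sketch, assuming even dimensions:

/* Sketch: build the next mip level of a w*h single-channel image by averaging
   each 2x2 block. 'dst' must hold (w/2)*(h/2) floats; w and h are assumed even. */
static void next_mip_level(const float *src, int w, int h, float *dst)
{
    for (int y = 0; y < h / 2; y++)
        for (int x = 0; x < w / 2; x++)
            dst[y * (w / 2) + x] = 0.25f * (src[(2 * y)     * w + 2 * x] +
                                            src[(2 * y)     * w + 2 * x + 1] +
                                            src[(2 * y + 1) * w + 2 * x] +
                                            src[(2 * y + 1) * w + 2 * x + 1]);
}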

>> No.3330089

>>3329220
>NN's RGB values are not directly derived from interpolation
NN is interpolation. It does not use it

>If by this you mean
I don't. It means that the interpolated value of a known value is identical to the known value.

>based on the data from the previous interpolation
bilinear interpolation is stateless. You couldn't parallelize it otherwise

>NN just does a "closest to" operation and does not do any more interpolation
that is the interpolation

>They're just too simple
complexity is not a criterion of interpolation

>the RGB values of a texel in NN aren't found by interpolation
there are no known values at the polled position, NN makes them up. That's all that interpolation is. How it makes them up is irrelevant

>It's the coordinates of that texel that could be considered to be the product of interpolation
please look up the basics of interpolation. You're using the term very badly. There's nothing 3D about interpolation of planar data, it is not required to produce new different values or exceed some arbitrary complexity. The coordinates are not interpolated, they are input.

>Those coordinates just happen to be assigned an RGB value
that is interpolation

>the value itself is not the direct product of interpolation like it is in bilinear filtering
that is averaging, which is one of many many mechanisms to interpolate. NN is another one

>> No.3330132

>>3330089
You know, it's pretty hilarious that you've driven the conversation down the tangent of basically admitting that NN is just like bilinear filtering except that it uses a less complex form of interpolation. One that requires fewer values to be read (the RGB values of nearby texels don't have to be read in a piecewise constant interpolation like NN).

Because that was pretty much what I was getting at in the first place. The original poster in this big chain of replies basically tried to imply that NN was a pure representation of the texture, and bilinear was some kind of massively lossy resampling that lost lots of information relative to NN, when the truth is that the two methods are exceedingly similar except that bilinear goes that 'extra' step.

The thing is that for the sake of clarity in discussion one does have to draw the line somewhere when describing the two methods, otherwise it just ends in pedantic red herrings that miss the point of the discussion, like it has here. And the reason you've done it is because you know that bilinear preserves the average RGB value of the original texture better, which is pretty much the only method for objectively determining accurate resampling.

>> No.3330158

>>3330132
>admitting that NN is just like bilinear filtering except that it uses a less complex form of interpolation
NN and BF are both interpolations. They do not use interpolations, they are.

>The original poster in this big chain of replies basically tried to imply that NN was a pure representation of the texture, and bilinear was some kind of massively lossy resampling that lost lots of information relative to NN
So you were reading things into statements that weren't there? NN and BF are both valid and useful interpolations.

>bilinear goes that 'extra' step
bilinear does not add new information. The data produced by bilinear is not necessarily closer to the original information than NNs output.

>one does have to draw the line somewhere when describing the two methods
I did, you dismissed it, because you insist on BF containing more information

>bilinear preserves the average RGB value of the original texture better
BF assumes the missing information is linearly derived from adjacent texels. There is no reason to believe that is what the missing information is like.

>which is pretty much the only method for objectively determining accurate resampling
BF is one of the simpler interpolations, there are more complex ones that are not based on averaging. The whole thing has nothing to do with resampling.

>> No.3330176

>>3330132
>The original poster in this big chain of replies basically tried to imply that NN was a pure representation of the texture, and bilinear was some kind of massively lossy resampling that lost lots of information relative to NN
To clarify a bit here, because NN does not assume a linear correlation and because it does not introduce new values, the sharpness of the texture is preserved, and so is the total output palette (it matters when you're dealing with paletted or high color modes). Especially on low resolution textures it was not uncommon to have sharp objects (in a sci-fi setting you'd find monitors, buttons or pipes modelled with just a handful of texels). These are good examples where BF produces data that is less likely to match the original information, which is why people perceive it negatively.

You are completely obsessed with the idea that more samples are better, and that's the sole problem here. I am not saying few samples are better either. Both are valid interpolations, for different use cases. Both try to recover lost information in different ways. Understanding that is key to understanding why people dislike bilinear filtering on older games with low resolution textures. That you're more interested in calling these people dumb than in actually researching the underlying reasoning is bad enough. That you try to speak with any authority, though, just tops it off.

>> No.3330184

>>3329134
you can run GL without bilinear filtering

>> No.3330186
File: 41 KB, 350x424, Icewind_dale_heart_of_winter_box_shot.jpg

>>3323953
Okay, might not have actually looked better, but it sure ran better!

>> No.3330198

>>3330186
if a game runs better in software mode than 3D accelerated, it must have been an S3 card. The whole purpose of 3D acceleration is taking the burden off the CPU

>> No.3330205

>>3330158
>NN and BF are both interpolations. They do not use interpolations, they are.

They are forms of texture filtering that involve interpolations, exactly as I said. They are not equivalent to interpolations. Now if you want me to get pedantic, you'll notice that bilinear interpolation is not actually a type of 'filter'. The word 'filter' really does not have any meaning when it comes to interpolation. But it is used because the method of bilinear interpolation happens to produce pixels with fewer artifacts, so the result looks 'filtered'. "BF" describes the results, not the method. It's the same for NN, which is really called NNF but nobody in practice calls it that.

>The data produced by bilinear is not necessarily closer to the original information than NNs output

You do know that it can be objectively proven in an experiment that the data produced IS closer to the original than NN's output right? The only exception perhaps is that uncommon situation with the silhouette confusion that I described.

>you insist on BF containing more information

It doesn't contain more information in the output but it does involve more information in the input. You can't deny this.

>There is no reason to believe that is what the missing information is like.

The "missing information" from adjacent texels isn't completely missing at all because BF reads their values.

>BF is one of the more simple interpolations, there are more complex ones

Yes, and in any case, NN is lower down the hierarchy of complexity.

>> No.3330225

>>3330205
>They are forms of texture filtering that involve interpolations
They are interpolations. At this point we're simply on different terms, so it makes little sense to argue these details. If you don't want to use the commonly accepted usage, that's fine. Just don't expect much support, or understanding.

>filter
I have not used that word throughout the conversation for a reason. I see no need to start now.

>"BF" describes the results, not the method
BF describes quite exactly the method. Again, if you don't like the terms, that's your issue. You're entirely on your own though, and will have to accept that people will constantly "misunderstand" you.

>it does involve more information in the input
the coordinates and the available samples, just like NN. That's kind of the challenge with interpolation, the limited input.

>The "missing information" from adjacent texels
the missing information is the value of the position to be sampled. Adjacent pixels are known information, and BF assumes a linear correlation between them and the missing information/unknown data.

>NN is lower down the hierarchy of complexity
No denying that. Not implying it's less useful either.

>> No.3330238

>>3330176
>Especially on low resolution textures it was not uncommon to have sharp objects

Resolution of textures doesn't correlate with the commonality of uniform RGB values in straight lines.

>These are good examples where BF produces data that is less likely to match the original information

And I've described those situations in which they do. I perfectly agree they have their use cases, it's just that:

1) NN is only more accurate for a very specific type of texture

2) In ALL cases that don't involve minification NN will produce a more aliased shimmer output than BF. Minification screws them both up.

>You are completely obsessed with the idea that more samples are better

Because they are better. Not only is that a more intuitive assertion, it's also mathematically provable.

>why people dislike bilinear filtering on older games with low resolution textures

People dislike it for reasons completely unrelated to accuracy, because older games don't necessarily use textures with blocks of similar RGB values in straight lines. They dislike it because they think it takes away detail, due to the visual illusion of aliasing noise being detail. As said way up, I don't begrudge them this visual illusion. All I wanted to say was that BF does not actually cause a loss of detail, because aliased noise is not detail. So when you say that I think BF creates more detail, you have it backwards. I say that it doesn't cause a loss relative to NN.

>> No.3330259

>>3330238
>NN is only more accurate for a very specific type of texture
accuracy is wishful thinking here. NN tends to produce more pleasant results for the types of textures common in these old games.

>Because they are better
subjective

>I say that it doesn't cause a loss relative to NN
interpolation does not lose information.

>I don't begrudge them this visual illusion
that's your angle? Everybody is stupid but you? I'll bow out of this exchange and let the readers come to their own conclusions.

>> No.3330370

>>3330259
>NN tends to produce more pleasant results for the types of textures common in these old games.

It really depends on whether those textures have the kind of uniformity I describe. I think quite a few old games do, but there are more that don't.

>subjective

It's not subjective, it's provable through the average RGB method, but whatever, you've been ignoring that example this whole time.

>interpolation does not lose information.

Which is exactly why I was saying that bilinear does not cause a loss of information.

>that's your angle? Everybody is stupid but you?

Never said that in the slightest, perhaps that's your projection. My point was that people were subjectively observing a loss of "detail" in bilinear filtering (actually a loss of aliasing artifacts) which in an objective sense was not a loss of detail at all. Perceiving things that aren't there through visual illusions has nothing to do with intelligence. I also see this aliased "detail" but I don't like it too much because of its shimmering, but that's subjective.

>I'll bow out of this exchange

Good idea, I will too, because although we agree on many fundamental points, on others we are irreconcilable

>> No.3330403

>>3330370
>we agree on many fundamental points
I'd appreciate if you don't put words in my mouth

>> No.3330419

>>3330198
NOPE. That's not how it worked. The OpenGL mode in HoW added special effects and slowed the game down. This was on top of how the expansion generally tacked shit on and slowed everything down over the original IWD, in which the OpenGL mode actually did speed things up somewhat - HoW reversed that too.

>> No.3330437

>>3330403
It's not up to you

>> No.3331379

>>3330419
So is it still mostly software rendered, while the hardware is just the post-processing engine?
What other games do this?

>> No.3331387

>>3331379
not necessarily. Before T&L was a thing (on PCs), the CPU had to do a lot of preparation work on the data the GPU was being fed. The GPU did little more than rasterizing. So, when you switch an engine to 3D acceleration you gain some quality and free some cycles. Games tend to use that for higher framerate and resolution. It seems though that this particular game is changing models and adding effects. The theory would be that the freed-up CPU cycles could be used for that extra stuff. Unfortunately they might have overdone it, so with acceleration the total CPU burden is even higher.

>> No.3332301
File: 423 KB, 1600x1200, icewind_dale_screenshot_37d148c3.jpg

>>3331387
Indeed. Even without the expansion pack, where the OpenGL mode did still manage to speed the game up, there observably wasn't all that much of the standard kind of work for the GPU to offload.

>> No.3332509

>>3328052
This, you fucking numbnuts. Learn some shit about 3D programming.

>> No.3332510

>>3330419
That's not OpenGL's fault, that's the programmer being shit at his job and not knowing how to properly use the tools he's been given.

>> No.3332559

>>3332510
No, you can't really blame the HoW programmers; BG2 was also similarly slow, and likely also had extra slowdown from the "accelerated" mode, though I can't really remember. And I suspect HoW was kind of rushed.

>> No.3333184

>>3332509
Is it possible to control texture filtering per texture on all retro OpenGL hardware?

>> No.3333190

>>3332301
As pretty as the Infinity Engine can be, being pre-rendered makes it all the less impressive.
Age Of Empires 2 can dynamically generate levels while looking good too, without any 3D acceleration.

>> No.3333194

>>3333184
min_filter and mag_filter could be declared per texture since OpenGL 1.0 (1992). Possible values include nearest, all the way up to linear_mipmap_linear
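
A minimal sketch of what that per-texture control looks like with plain glTexParameteri calls. The texture names are made up, texture objects as used here are technically OpenGL 1.1, and both textures are assumed to already have their image data and mipmaps uploaded:

#include <GL/gl.h>

extern GLuint wall_texture;   /* hypothetical: keep this one chunky */
extern GLuint sky_texture;    /* hypothetical: let this one be smoothed */

static void set_per_texture_filtering(void)
{
    glBindTexture(GL_TEXTURE_2D, wall_texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    glBindTexture(GL_TEXTURE_2D, sky_texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}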

>> No.3333209

>>3333194
So did anyone ever make use of this?
It would be cool to apply modern knowledge and techniques of game graphics to old retro hardware. If one were to pick a specific card to target and create an engine that would look as good as possible with 60 FPS, which one would be just right for the job? Preferably 3dfx due to Glide, unless late Nvidia OpenGL drivers are better.

>> No.3333216

>>3333209
>So did anyone ever make use of this?
I think so. Can't recall specific examples right now though

>If one were to pick a specific card to target and create an engine that would look as good as possible with 60 FPS, which one would be just right for the job?
cards aren't specifically made for 60fps or not. That's entirely a developer decision. In general, the later cards are more powerful, naturally. So, a Voodoo 5 or something with Glide, or one of the GeForces with T&L support. Got to make up your mind what you really want though.

>> No.3333324

>>3333216
>cards aren't specifically made for 60fps
True, but the first Voodoo could barely hit 60 FPS unless you made your game look worse than software Quake. What I want is something that is balanced between "Too new and powerful to be retro" and "Too old and limited to provide aesthetically pleasing 3D with high speed".
I think the Voodoo 2 is the bare minimum for this, but only if one is happy with 16-bit colour. Perhaps the GeForce 256 would be a good fit too. Released in 1999, it can surely claim the throne of the last millennium, and thus be retro by /vr/ standards.

>> No.3333351

>>3333324
>What I want is something that is balanced between "Too new and powerful to be retro" and "Too old and limited to provide aesthetically pleasing 3D with high speed".
that would be every /vr/ level card then. Also, "aesthetically pleasing 3D" is one hell of a can of worms you're opening up there. You'll find a lot of people on this very board, including me, that love the early accelerator period, and even (especially) the pre-accelerator period, for various reasons.

>only if one is happy with 16-bit colour
it does fairly retro-happy dithering in 16-bit mode, not bad.

>Released in 1999, it can surely claim the throne of the last millennium, and thus be retro by /vr/ standards.
right, but then you're just looking for the strongest of the weak. You can certainly do that, but it's a bit boring, no?

If you want to see what you can do on a tight polygon budget and fairly small textures without bilinear filtering, the DS is actually an excellent showcase. I'm not sure where the 3DS falls, actually. Its graphics were too complex for my taste, so I didn't pay close attention.

>> No.3333478

>>3333351
Sure, aesthetically pleasing is subjective. I like the Gouraud-shaded and lightmapped aesthetic of the late 90's. GoldSrc and id Tech 3 did this and aged well; not sure if Unreal uses lightmaps, but it was nice looking too. Half-Life and Quake 3 didn't support Glide, and Unreal's engine had a weakness with CPU off-loading. If one took the best ideas out of these and combined them into an ultimate showcase of "Look at what this puny hardware from 1998 can do at 60 FPS", my dream would come true.