/vr/ - Retro Games

File: 50 KB, 1200x984, 3dfx_logo.svg.png
No.8861051

Was Voodoo really as good as people made it out to be?
16-bit color even in its third iteration seems like a gaping flaw that makes the demise of 3dfx no surprise. It couldn't even run Quake fast until they added multitexturing.

>> No.8861064

>>8861051
for a short period, yes

>> No.8861069

>>8861051
they put out one good graphics card, the rest were out of date trash thanks to superior nvidia and eventually ati offerings. They didn't even do anything interesting; they just had a proprietary API which was better than the early clusterfuck that was direct3d and opengl. That's their major contribution. Fanboys exist for everything, but 3dfx still having them today is comical.

>> No.8861074

It was the first "good enough" 3D accelerator in a market where every 3D accelerator maker and their dog had their own API, and Glide was better. It didn't conform well to DirectX or OpenGL and was left behind by Nvidia + ATi when those two unified APIs took off. Bad business sense and dumb investments didn't help, but they were dead cards walking far before their bankruptcy.

>> No.8861079

>>8861051
It was. Even at lower resolutions, the output looked better than the competition at the time.

>> No.8861081

>>8861051
Yep. Really hard to explain though, something you had to experience; hardware progressed super fast during those few years.

>> No.8861084

>>8861074
>It didn't conform well to DirectX or OpenGL
How so? It had one of the most performant miniGL drivers, and there was eventually a full OpenGL implementation released.
Direct3D also "just worked".

>> No.8861089

>>8861069
>they put out one good graphics card, the rest were out of date trash thanks to superior nvidia and eventually ati offerings. They didn't even do anything interesting; they just had a proprietary API which was better than the early clusterfuck that was direct3d and opengl. That's their major contribution. Fanboys exist for everything, but 3dfx still having them today is comical.
not much has changed, now we have two major players: one has shit hardware and uses drivers to push performance, one has shit drivers and tries to make up for it with hardware, and both are barely making any progress

>> No.8861130

>>8861051
>the demise of 3dfx
they rebranded as Nvidia you stupid fuck

>> No.8861136

>>8861130
And then they died their second death after they released the FX series you abortion.

>> No.8861138

>>8861136
>nvidia are now dead
do people pat you on the head and call you "special"?

>> No.8861140

>>8861081

This. You had to be there. It was really cool when you got a card, and the maker of a game you'd been playing for a while released a 3dfx patch for it.

>> No.8861142

>>8861130
Technically true. Nvidia bought them and merged them into their company, taking the IP, patents, and better developers and engineers.
Things like SLI were trademarks of 3dfx that Nvidia reused, albeit with a slightly different meaning.

>>8861136
Nobody forced you to buy FX though. It's bad because it was a common GPU for lower-end off-the-shelf machines and people thought it was better than it actually was, but you could always just buy a better GPU yourself.

>> No.8861157

>>8861051
A third iteration in a new market created entirely thanks to 3dfx. Because of the way dithering works on the Voodoo 3, it's more like 22-bit.

>> No.8861425

>>8861138
You absolute humongous retard.
>A majority of the engineering and design team working on "Rampage" (the successor to the VSA-100 line) that remained with the transition, were requested and remained in house to work on what became the GeForce FX series.

>> No.8861468

>>8861051
Voodoo 3 was pretty cash in the late 90s and didn't need a new motherboard. It was all ogre with GeForce and AGP 1.5v slots in the 2000s

>> No.8861487

>>8861142
>just buy a better GPU yourself
yeah, from ATI

>> No.8861504

>>8861051
They really fucked themselves up by buying STB and selling their own cards

>> No.8861545

even if one omits all the graphics hype, now it's almost forgotten that Voodoos provided a great performance boost: P133/Voodoo 2 could be as fast as P200 with some Maxtor

>> No.8861621

>>8861545
kek, you mean Matrox? Maxtor made hard drives
(Maxtor Fireball, what a stupid name)

>> No.8861694

>>8861621
yes, pardon me, Matrox of course (the Mystique was everywhere)

>> No.8861759

>>8861621
>Maxtor Fireball
*disgusted sigh*

>> No.8861807
File: 146 KB, 2800x476, scam.png
8861807

>>8861142
>but you could always just buy a better GPU yourself.
Read: Avoid FX altogether.
There was no competing with ATI. Their lineup delivered a punch so hard it would only ever be matched by the G80 a few years later.

>> No.8861827

>>8861051

story tiem:

>be me, circa 1998
>family buys first puter
>gaymen, 28.8 modems, printing porn on printer commences
>learn about GPU
>buy one
>mostly computer illiterate, so take new GPU to place we bought comp from to have it installed
>come back a week later
>get tower back
>have a feeling....
>open that shit up
>no GPU installed
>say so. I may have been tech illiterate at the time, but momma didn't raise an imbecile
>fat neckbeard shuffles out from shop, mumbles apology, and installs GPU

fat neckbeard tried to rip me off....

>> No.8861831

>>8861504
Why is that bad in principle? It certainly is a lot more friendly to the customer not having to deal with shitty OEM drivers and idiosyncrasies.

>> No.8861919

>>8861831
every other company from Diamond and Creative to Taiwanese OEMs jumped ship to Nvidia, so they had all the retail presence

>> No.8861949

>>8861051
When I finally moved off of my aging Voodoo card, framerates in most games went up a little, but Descent: Freespace didn't look as good as before.

>> No.8861984

>>8861949
I think that game supported Glide. Some old games had worse Direct3D implementations at the time, like Unreal.

>> No.8862016

>>8861984
Yes, it had native Glide support, and some effects were only available under Glide. It didn't look better until the SCP.

>> No.8862019
File: 2.42 MB, 3008x2000, DSC_0003.jpg
8862019

Do you still use them?

>> No.8862051

>>8862019
I probably would if I still had a CRT monitor, then the limited resolution wouldn't matter as much.

>> No.8862054

>>8862019
I saw an S3 Trio something at a flea market. I've heard that they are good for 2D. Was it a mistake not to buy it?

>> No.8862061

>>8862054
depends on the price, but they are really common
that's a Virge DX btw

>> No.8862082

>>8861130
>rebranded
I don't think that accurately describes the situation, some of them were brought on as extra talent for Nvidia
>>8861136
>second death
Eh, were they fired or anything? I'm sure they still had some good ideas that were implemented over the years and were kept on board. Apparently no one at Nvidia could into DX9, neither the old guard nor the 3dfx newbies, and ATI just knocked that shit out of the park. I don't think the 3dfx guys were solely responsible for FX sucking. ATI just delivered an unprecedented god-class product.

>> No.8862121
File: 196 KB, 512x384, c1a1d0000.png
8862121

>>8862019
Yep.

>> No.8862130

>>8862082
The next generation was much more competitive for nvidia, with the 6000 series.

>> No.8862140
File: 84 KB, 512x384, mechwarrior2.png
8862140

>>8862121
real screenshot?

>> No.8862143
File: 303 KB, 640x480, c1a1f0001.png
8862143

>>8862140
Yes. From a Voodoo 1 4MB. HL wasn't exactly playable on a Pentium 200MMX processor but I tried.

>> No.8862151

>>8862140
man i want to play the old mechwarriors so badly after playing the new one.

>> No.8862176

>>8862121
What VGA timings does the Voodoo output at that resolution? Is it just a 480p signal with some sort of scaling?

>> No.8862207

didn't have one back in the day but managed to buy a pair of Voodoo 2's just before everyone and their dog became aware of the "RETRO VINTAGE CULT UNIQUE PC HARDWARE" craze and started demanding ten times what the cards are worth.

>>8861051
there was some article describing how Voodoo's "16 bit" color is actually processed as 22-bit or something, so it looked better than actual 16-bit color and only insignificantly different from true 32-bit

>> No.8862208

>>8861051
Yeah it just werked. Also games of the era were more optimized for Glide so they both tended to run better and look better too. I recall gaming friends talking about how much D3D and every other API of the era were terrible. It went something like Glide > OpenGL > D3D > everything else.

>> No.8862215

>>8862207
If you're talking about dithering, that's something they all did. If you mean the special filter used by the Voodoo 3, it comes at the cost of some sharpness.
Besides, 16-bit color extends to textures. Deep-color textures such as lightmaps will show significant banding.
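
To make the "22-bit" arithmetic from earlier in the thread concrete: averaging a 2x2 block of dithered RGB565 pixels adds two bits of effective precision per channel (5+2, 6+2, 5+2 = 22). A toy C sketch of the principle; the actual Voodoo 3 filter used different taps and weights, so take this as the idea, not the implementation:

```c
#include <stdint.h>

typedef struct { unsigned r, g, b; } RGB;

/* Split a 16-bit RGB565 pixel into its 5/6/5-bit channels. */
static RGB unpack565(uint16_t p) {
    RGB c = { (p >> 11) & 0x1F, (p >> 5) & 0x3F, p & 0x1F };
    return c;
}

/* Sum a 2x2 block of dithered pixels. The sums span 0..124 / 0..252 /
   0..124, i.e. 7/8/7 bits of tonal resolution recovered from four
   5/6/5-bit samples -- hence "22-bit". A real filter keeps these extra
   bits on the way to the DAC instead of re-quantizing back to 16-bit. */
static RGB filter2x2(const uint16_t px[4]) {
    RGB sum = { 0, 0, 0 };
    int i;
    for (i = 0; i < 4; i++) {
        RGB c = unpack565(px[i]);
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    return sum;
}
```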

>> No.8862234

>>8862176
It can do 60, 75, 85 or 120 in 640x400, 640x480 and 800x600 (only possible on 8MB Voodoo cards). At 512x384 its minimum refresh rate is 72.

>> No.8862260

>>8862234
>At 512x384 its minimum refresh rate is 72.
Interesting. So it's actually outputting that raw resolution and upping the frequency to fall within the limits of 31.5KHz? I didn't think that was possible.

This would almost explain why Half-Life has an FPS cap of 72, except that this is inherited from Quake which was only 70Hz in its default display mode.

>> No.8862274

Lots of tech minded people in here so I have a somewhat related question:

Even though many MS-DOS games ran natively in 320x200, how common was it for people to be playing them in that resolution in the 90s? Since most monitors were VGA by then it would seem that people were playing these games in double-scanned mode at 640x400, right?

>> No.8862289

>>8862274
I saw a photo of someone's CRT playing Doom, and it had a 1:1 ratio between scan lines and rastered lines.

I have no idea how. My own attempt at running Doom natively looked quite odd with its line doubling.

>> No.8862364

>>8862274
>Since most monitors were VGA by then it would seem that people were playing these games in double-scanned mode at 640x400, right?
usually people ran the best resolution that still had a playable framerate
which would vary wildly between computers - some could run higher (e.g. 640x480 or even 800x600), some could not and had to stick to 320x200
and I don't think anyone cared about technical details like "double-scanning"; they just focused on performance vs fidelity

>> No.8862445

>>8862260
CRTs don't really care about resolution as long as it falls within its scan rate limits. That's how some people are able to drive true 240p out of PC CRT monitors for emulation purposes: by outputting at 120Hz so the scan rate is 31KHz.
>>8862274
If they were playing on a card from the VGA era or later and a VGA monitor, they were most assuredly playing them double-scanned at 70Hz to conform to the minimum 31KHz scan rate.

Now, with CGA and most EGA games played on actual CGA and EGA-era hardware, they would've displayed in actual 320x200 at 60 Hz, but VGA-era Mode 13h games were always double-scanned.
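
Side note: the arithmetic behind both of those claims is easy to check. Horizontal scan rate is total lines per frame (active plus blanking) times the vertical refresh. A minimal C sketch; the 525-line figure for standard VGA is the real one, while the total-line counts for the non-standard modes are rough assumptions rather than measured timings:

```c
#include <stdio.h>

/* Horizontal scan rate in kHz: (active + blanking) lines per frame,
   multiplied by frames per second. */
static double hscan_khz(int total_lines, double refresh_hz) {
    return total_lines * refresh_hz / 1000.0;
}

int main(void) {
    printf("640x480 @  60 Hz: %4.1f kHz\n", hscan_khz(525, 60.0));  /* 31.5  */
    printf("320x240 @ 120 Hz: %4.1f kHz\n", hscan_khz(262, 120.0)); /* ~31.4 */
    printf("512x384 @  72 Hz: %4.1f kHz\n", hscan_khz(410, 72.0));  /* ~29.5 */
    return 0;
}
```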

>> No.8862472

>>8862151
The new Mechwarrior 5 game is horrible. It's the same levels in different colors (red/brown = lava world, white = snow, green = forest, etc.) and you're fighting in a shitty sandbox. It feels rushed, as if they ran out of money. The older games have more depth, and I'm not just being nostalgic because this is /vr/.

>> No.8862625

>>8862215
i think most CRTs have sharpness settings in their GUI.

>> No.8862882

>>8862274
>>8862289
>>8862364
>>8862445
What's double scanning and line doubling? Google is failing me.
I played Quake on a 1995 era machine with a 14" CRT back in the day, and had to run at 320x240 down from the 640x480 I used for Windows. If anything was secretly doubling any lines, it was certainly transparent to the user.

>> No.8862893

>>8861130
being bought out isn't the same as "rebranding" you fucking autist

>> No.8862901

>>8862882
You know how people here and elsewhere salivate over CRT scanlines, 240p and all that? Well, that wasn't a thing on PCs from the VGA era onwards, at least not like on consoles anyway.

See, unlike CRT TVs and older CGA-era monitors, which had a scan rate of roughly 15.7 KHz and could display between 192 to 240 progressive lines at 60 Hz (though TV and movie content was, of course, 480 interlaced lines), VGA-era PC monitors had a minimum scan rate of 31 KHz (so double that of TVs and older monitors) and could not physically display so few lines, at least not at that low of a refresh rate. So what they did for backwards compatibility with CGA and EGA games that ran at 320x200 is they duplicated every line and set the vertical refresh rate to 70 Hz to meet the minimum 31 KHz threshold for VGA monitors.

There was also a 256 color VGA mode called Mode 13h that also ran at line-doubled 320x200 at 70 Hz, which is what the vast majority of VGA-era DOS games ran at. The end result was games that looked blocky and pixelated due to the line-doubling, kind of like unfiltered emulator shots on modern monitors.

So internally, they ran at 320x200 or sometimes 320x240, but the output was scaled 2x vertically, so it was really 320x400 or 320x480.
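
For anyone curious, the Mode 13h described above is a single BIOS call away. A minimal DOS-era sketch in Turbo/Borland-style C (the doubling to 400 scanlines then happens invisibly inside the VGA hardware):

```c
#include <dos.h>

/* Switch to 320x200 256-color Mode 13h via the video BIOS (INT 10h, AH=00h). */
static void set_mode13h(void) {
    union REGS r;
    r.x.ax = 0x0013;
    int86(0x10, &r, &r);
}

int main(void) {
    /* Mode 13h exposes a flat 64,000-byte framebuffer at segment A000h. */
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    unsigned int i;

    set_mode13h();
    for (i = 0; i < 64000u; i++)
        vga[i] = (unsigned char)(i & 0xFF);  /* paint all 256 palette indices */
    return 0;
}
```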

>> No.8862936

>>8862901
Ah thanks. I assume this was done quietly by the VGA adapter, which is why I never knew it was happening

>> No.8863165

>>8861807
FX wasn't the whole 5000 series, anon.

>> No.8863169

>>8862176
You can pick, even custom ones, as long as it's 31+ kHz.
There are environment variables you can use.
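
For example, per 3dfx's SST documentation the refresh override is the SST_SCREENREFRESH variable, set in DOS before launching a Glide game (exact variable support varies by driver version, so treat this as illustrative):

```
SET SST_SCREENREFRESH=85
```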

>> No.8863171
File: 556 KB, 2000x1500, 1637251490025.jpg
8863171

>>8862019
Based

>> No.8863173
File: 696 KB, 1500x2000, 1643132532492.jpg
8863173

>>8863171

>> No.8863181
File: 1.18 MB, 2000x1500, 1622833954144.jpg
8863181

>>8862901
Good explanation.

>>8862289
You mean this? It's a 15kHz CRT, that's how.

>> No.8863243

>>8862882
>If anything was secretly doubling any lines, it was certainly transparent to the user.
you could tell if your monitor had an OSD that showed its current resolution, and there was a difference between what you set in the game and what the monitor displayed (e.g. 320x200 in-game and 640x400 on the monitor)

>> No.8863259

>>8861807
>Avoid FX altogether.
while FX cards had poor performance in games newer than 2001, they're perfectly fine for running anything from 2000 and older
a FX5200 is like a newer S3 ViRGE - excellent compatibility and dirt cheap

>> No.8863267

>>8863259
Still no real point in getting a FX 5200 card. Cheap? Sure, but so are 4000 or 6000/7000 cards, which offer either even better compatibility or performance, depending on what you need. There are still AGP options too, if that's the factor.

>> No.8863274

>>8862274
>>8863243
Not much has changed, GPU scaling is the default again for non 1:1 modes
Render resolution and display resolution were always two different concepts for me

>> No.8863336

>>8863267
last time I checked they tend to be scarcer and way more expensive
FX5200 cards are in the "free to $10" range

I think 6000/7000 cards dropped some compatibility for something, but what it was has completely slipped my mind

>> No.8863338

>>8863336
Here everything from GeForce4 Ti to 7800 GS is in the 10€ - 20€ range.

>> No.8863651

>>8862445
I have never understood why anyone would use 240p @ 120Hz. Motion clarity is far more important than scanlines, and 120Hz looks like complete garbage when displaying 60fps content.

>> No.8863658

>>8863651
Just use BFI.
Both 240p at 120Hz with BFI and 480p at 60Hz with a scanline filter will look identical.

>> No.8863662

>>8863658
120hz will have the wrong scan-out slant which makes it look wrong.

>> No.8863708

>>8862143
>HL wasn't exactly playable on a Pentium 200MMX
Nonsense, I played it on a 166 at like 10fps when I was a wee nipper

>> No.8863712

>>8863336
8-bit paletted textures and table fog. Does anyone know if any modern patches/ports for Thief replicate the fog properly, or close enough? Never looked up a comparison on this.

>> No.8863762

>>8863662
What are you on about? It will look identical, it's even the same timings for the most part, just double the lines.

>> No.8863827

>>8863762
120 will take 8ms to scan the screen, and then 8ms of black. 60hz will take 16 ms to scan the screen. It does not look the same.

>> No.8863829

>>8863827
BFI dims the screen a little, that's a given but so do fake blank lines to get scan lines
also re-check your math, anon, obvious mistake

>> No.8863845

>>8861064
/thread

It was king in 1997-1998 and that's about it.

>> No.8863869

>>8863827
first you said it makes it look wrong
now you said it will not look as bright? which one is it

>> No.8863897

>>8863869
Do you see the word bright anywhere in my comment? I never said a thing about it. It's the timing which is wrong, and the different timing results in a different apparent slant on motion. I want the authentic 60hz scan-out slant.

>> No.8863903

>>8863897
that's what you implied, since anything else would not make sense
no, the slant will not be wrong; you're drawing to a framebuffer while emulating, you can't control the beam directly anyway, it makes no difference

>> No.8863905

>>8863827
>120 will take 8ms to scan the screen, and then 8ms of black.
What? No back or front porch? No beam travel time? What kind of logic is this? CRTs don't work this way.
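
For rough numbers, here is the arithmetic both sides are gesturing at, as a C sketch. The ~7% vertical-blanking share is an assumed illustrative figure, not a measured one:

```c
#include <stdio.h>

int main(void) {
    const double blank_frac = 0.07;            /* assumed blanking share */
    const double rates[] = { 60.0, 120.0 };
    int i;
    for (i = 0; i < 2; i++) {
        double period_ms = 1000.0 / rates[i];  /* one field, top to top */
        double active_ms = period_ms * (1.0 - blank_frac);
        printf("%3.0f Hz: %5.2f ms per field, ~%5.2f ms of active scan-out\n",
               rates[i], period_ms, active_ms);
    }
    return 0;
}
```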

>> No.8863963

I owned an FX5500 from 2006 to 2014. Jesus fucking Christ, what a shit VGA. I remember I couldn't even open Oblivion.

>> No.8863974

>>8863963
>2000-fucking-14
JFC I at least had a 360-tier laptop from 2010-2014.

>> No.8863976

>>8863963
Hell, I had a GF Ti 4200 in 2002 that smoked that.

>> No.8863980

>>8863976
GF4 series was better than FX, not a secret
Except the really high end 5000 series, which aren't actually FX GPUs

>> No.8863997

>>8863980
I'd argue a 5800 or 5900 were the ONLY FX cards.

>> No.8864018

>>8863997
Depends how you look at it I guess

>> No.8864137

>>8861051
Soldier of Fortune

>> No.8864205

>>8863181
Is that native Doom? What display adapter and output?

>> No.8864287

>>8863897
>I want the authentic 60hz scan-out slant.
Why would you want a deficiency like that?

>> No.8864308

>>8864287
Because that's how it's been my entire life, and at this point my brain is very hard-wired to expect it. It just looks wrong any other way. I have spent more time in front of a CRT than doing anything else. And I always use 60hz. I even sit in front of a 60hz CRT this very second typing this message.

>> No.8864313

>>8863963
I think I had an FX 5600 that I played Source games with until The Orange Box, which ran much worse even if I turned the settings down to mud.
I should have bought a fucking 8800 GT, but the in-game advertisement made me go for ATI instead, and it gave me tons of issues.

>> No.8864321

>>8863259
>a bunch of diarrhea dooky
Where do I even begin? A FX5200 sucked at everything. It was Geforce 2 tier in 2003. It sucked then and it's levels of shitty beyond comprehension now. It was a fucking girlfriend computer/casual user/PCI mobo card at BEST. A last-gen 4200 or even a GF3 was far better at everything across the board. The FX series was actually super-good at DX8/pre-2001 shit but sucked cock at DX9 even with all the cheats and hacks. Unless you wanna play DX7-era shit at newest, it's useless and not at all recommended.

>> No.8864326
File: 5 KB, 1920x1080, Untitled.gif
8864326

>>8864308
>I have spent more time in front of a CRT than doing anything else. And I always use 60hz.

>> No.8864342

>>8864321
I'm pretty sure some 5200 variant with compromised memory runs even worse than a Geforce 256 in some instances. That's barely good enough for even DX7.

>> No.8864356

>>8864326
I don't even see the flicker anymore. Staring at mostly text for 14 hours a day for a decade will do that.

>> No.8864531

>>8864205
Chocolate Doom at 60Hz. GeForce RTX 2070 Super, DP to VGA with a VGA to RGB cable. Output is 2560x240p@60Hz; Doom is running at 200p with 1:1 scaling, so 20 pixels on the bottom and top are blank, like a virtual back and front porch, and the geometry is adjusted on the CRT itself.

>> No.8864545

It was the best there was and only cost $200; now, for the best graphics card, expect your wallet to hurt.

>> No.8864696

>>8864531
>60Hz
>Doom
Yikes.

>> No.8864801

>>8864696
Chocolate Doom doesn't have the 35FPS limit for 1/2 v-sync of 70Hz. You can actually run it at 60 FPS.

>> No.8864940

>>8864801
That doesn't sound very chocolaty, but sure.

>>8863980
>>8863997
Elaborate. Is this just pedantry? FX has always been synonymous with Geforce 5 in my book.

>> No.8864950

>>8864940
>FX has always been synonymous with Geforce 5 in my book.
FX is GF5, but the majority of the series is really bad; a mid-range GF5 FX card is worse than an entry-level GF4 card. The high-end GF5 FX cards were actually decent though

>> No.8864973

>>8864940
>That doesn't sound very chocolaty, but sure.
Technically Crispy Doom but the only option used from there was the limit remover, still running at 200p.

>> No.8865000
File: 33 KB, 738x977, smug.jpg
8865000

>>8864950
FX is FX.

And it wasn't just a few bad apples. Even the high-end suffered. Forget the Fermi housefire, this is the real disaster waiting to happen.

>> No.8865004

>>8864973
I wish Crispy Doom went above 480p.

>> No.8865008

>>8865000
No, high end were better than GF4's high end, so it was progress. If you had a high end FX card, it would have still be a worthwhile GPU.
It just sucked compared to the competition, since you could get a equal Radeon card for cheaper that didn't require as beefy cooling.

>> No.8865016

>>8861051
>Was Voodoo really as good as people made it out to be?
Yes. It was the first 3D accelerator that actually worked and its API was the first to become a viable industry standard.

>>8861069
>they put out one good graphics card, the rest were out of date trash
>implying the Voodoo 2 was trash
lmao. Go home, zoomer. 3dfx's contribution was basically creating the market for 3D accelerators and proving they weren't just snake oil.

>> No.8865112

>>8864321
nothing you posted contradicts the post you quoted
you're just making an angry venting post at a 20 year old graphics card, in a thread about a 25 year old graphics accelerator

>> No.8865124

>>8865000
>Forget the Fermi housefire, this is the real disaster waiting to happen.
https://www.youtube.com/watch?v=WOVjZqC1AE4

>> No.8865151

>>8862274
Every 320x200 VGA game was line-doubled in hardware. SD monitors were not compatible with VGA. Even the basic VGA text mode used by the boot screen was 720x400.

>> No.8865156

>>8865151
So in the case of someone wanting to play 80s or early 90s DOS games on their mid-to-late-90s VGA set-up, they would be seeing 400p.

Hmm.

>> No.8865158

>>8863827
True, but faster scanout looks better. Games assume frames are temporal point samples. The real drawback to BFI is the lower brightness.

>> No.8865164

>>8865156
Yes, not to mention at 70Hz rather than 60Hz as on original CGA/EGA hardware. Though I have to say I'm not sure how much of a problem that would have been pertaining to framerate and such.

>> No.8865170

>>8865158
I consider scan-out skew to be an important part of the authentic experience.

>> No.8865182

Also, IIRC hi-res 640x350 EGA games played on VGA hardware still ran at that exact resolution, except letterboxed at 1:1 PAR as opposed to 4:3 on EGA hardware, and also at 70Hz. Can someone confirm?

>> No.8865245

>>8865008
Tinnitus isn't progress.

>> No.8865257
File: 341 KB, 600x593, sharpening.jpg
8865257

>>8862625
>Sharpness settings

>> No.8865805

>>8861051
This was the best 3D card 25 years ago. You could run Deus Ex just fine with that

>> No.8865823

>>8863963
I had a Radeon 9200 with some kind of AMD CPU from 2005 to early 2011. Jumping to a system with a first-gen i7 and GTX 465 was pretty amazing, even if I already owned an Xbox 360 by that point.

>> No.8865829

>>8861468
SLI Voodoo2s were more powerful than the Voodoo3. Being stuck with only 16-bit color didn't help either, when ATI and Nvidia were releasing 32-bit color cards.

>> No.8865989

>>8865805
True. I was playing Deus Ex on my Voodoo 4 4500. Beautiful game, and it ran exceedingly well on it.

>> No.8866153

>>8863171
>>8863173
Looks nice! What components did you use?

>> No.8866231

>>8865170
There's no scan-out skew, why would there be.

>> No.8866234

>>8865182
Yes, because you had 25 blank lines at the top and bottom

>> No.8866243
File: 800 KB, 750x1000, 1624047897206.jpg
8866243

>>8866153
First picture is a general up to early 2000's jack-of-most-trade build.

>1GHz P3
>512MB RAM
>GeForce 7800 GS AGP
>2x Voodoo 2 SLI 12MB
>Audigy 2 ZS
>1GbE NIC
>Sound Blaster 16 with real OPL3
>VIA SATA controller
>Raptor 10k RPM SATA HDDs

Second is a more era appropriate year 2000 build.

>1GHz P3
>384MB RAM
>GeForce 2 GTS/Pro
>Voodoo 2 12MB
>Adaptec SCSI 29160N controller
>1GbE NIC
>USB 2.0 cards because onboard was 1.1 only
>Sound Blaster Live!
>Sound Blaster AWE32
>Several 15k RPM SCSI HDDs

>> No.8866253

>>8862019
I still have my Win98 PC from 1998, but I don't have space to set it up with a proper monitor and all (I don't really care about CRT for that, any VGA LCD will do)

>> No.8866257

>>8866243
That first build sounds like an absolute dream. Have you run into any outstanding compatibility issues with it that requires the second rig or something older?

>> No.8866310
File: 888 KB, 1000x750, 1646073721668.jpg
8866310

>>8866257
Those were more fun projects that I built over time and then, after using them for a while, sold to the highest bidder. I didn't own them both at the same time.

But compatibility was pretty good; there weren't really any major issues I ran into trying and playing dozens of games from the 80's to early 2000's, and both machines did that fine. The 7800 GS should technically have a few problems with a couple of DOS games and early Windows games, but as far as I remember, there wasn't anything that wasn't solvable, plus it let me play at high resolutions and refresh rates / FPS on a nice CRT that could handle it. The multiple graphics cards and sound cards really shine by giving a huge selection of APIs and features to pick from for games on the same machine, without having to resort to wrappers or emulation.

>> No.8867401

>>8861051
Before 3dfx, even decent PCs were weaker than the Sega Saturn; just compare Toshinden on DOS to the PS1 version. Duke Nukem, the biggest game of 1996, didn't even use polygons. Then suddenly the 3dfx Voodoo 1 was out and PCs could do graphics 2 to 4 times better than the PS1. The argument at the time was that it didn't matter that consoles were better at games, because the PC could do all this other stuff.

>> No.8867439
File: 39 KB, 258x387, 1622110448005.jpg
8867439

>>8867401
>Duke Nukem, the biggest game of 1996, didn't even use polygons.

>> No.8868016

>>8863259
>>8863267
I refuse to use anything newer than Geforce 4 on Windows 98. DX9 just doesn't "feel" right.
For that purpose, I bought a 7900 GTX for a Windows XP machine. I also got a Radeon of that era, but it's not the absolute high-end, so I might as well trash it.

I'm hoping to find a Geforce 2 Ultra for a 98 machine and replace my TNT2 pro.

>> No.8869221

>>8867401
>>8867439
Kek

>> No.8869226

>>8868016
>I'm hoping to find a Geforce 2 Ultra for a 98 machine and replace my TNT2 pro.
Get a GTS or Pro and OC the memory; they're a better fab and the core overclocks much better than the Ultra. The only difference is that the Ultra had better memory clocks out of the box, but if you have a nice GTS or Pro that manages to easily match the Ultra's memory clocks, you can get much more out of the core. Get an aftermarket cooler for it too.

>> No.8869689

>>8869226
Why would the lower-end model clock better? The high-end SKU's usually exist because they were the best out of the fab.

>> No.8869717

>>8869689
I actually meant the GeForce 2 Ti and not the GTS/Pro, my mistake; there were a lot of good models in the series. But you're still right, on paper the Ti is a lower end SKU.

But the Ti came out later and used a higher density node than the Ultra, while the Ultra has 30MHz faster memory chips; they both have the same core clock. So if your Ti overclocks 30MHz higher on the memory, it's already identical to an Ultra, except the core is lower power and generates less heat, so you can push its clock more than an Ultra's - both default to 250MHz.

The actual difference between GTS, Pro, Ti and Ultra is just the clocks and how high they go; they all use the same actual cores, NV15/NV16, just a different fab.

>> No.8869747

>>8867439
>256 colors

Oof.

>> No.8870320

>>8869747
>what is glquake with texture smoothing
there enjoy :^)

>> No.8870448

>>8862274
The popularity of 320x200 in the DOS days was a software thing rather than a hardware thing, really. The thing was, VGA was the last real hardware video standard for PC. 320x200 was the only standard resolution (by setting the video mode with the BIOS) that supported 256 colors. VGA also supported 640x480 (and so everyone had a monitor that supported at least that resolution), but only at 16 colors.
Later on, some people figured out how to get higher resolutions with 256 colors by skipping the BIOS and setting the video mode directly, but if I remember correctly, the highest resolution was 360x480. The highest you could get with 256 colors and square pixels was 320x240. But, in the standard 320x200, the pixel buffer was flat, whereas all the higher resolutions required paging, so a lot of people stuck to 320x200 anyway.
By the mid-90s, everyone had "Super VGA" cards that supported higher resolutions at 256 colors, but if you were writing a program to use those higher resolutions, you had to write special code for each card, because they were all incompatible. But even then you ran into the problem that the standard video buffer at A0000 was only 64 kilobytes (only enough for 320x200), so you couldn't access everything without paging.
Later on, the graphics card manufacturers got together and came up with a BIOS extension giving you a standard way to set a higher resolution and also get a buffer larger than 64K, but then you needed to get into protected mode, which DOS didn't support, etc.
It was basically a big mess trying to use anything beyond 320x200, until Windows 95 solved all that, with support for 32-bit protected mode and video drivers provided by the video card manufacturer.
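
The VBE extension mentioned above boils down to one more INT 10h function. A hedged real-mode sketch in the same Turbo C style (mode 0x101 is the standard VBE number for 640x480 at 256 colors):

```c
#include <dos.h>

/* Ask the VESA BIOS Extension to set a SuperVGA mode: INT 10h, AX=4F02h,
   BX=mode number. Returns nonzero on success. */
static int vbe_set_mode(unsigned int mode) {
    union REGS r;
    r.x.ax = 0x4F02;
    r.x.bx = mode;
    int86(0x10, &r, &r);
    /* AL=4Fh: function supported; AH=00h: call succeeded. */
    return r.h.al == 0x4F && r.h.ah == 0x00;
}

int main(void) {
    return vbe_set_mode(0x101) ? 0 : 1;  /* 0x101 = 640x480, 256 colors */
}
```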

>> No.8870464

>>8861051
It wasn't as good as people said it was. It was even better than that.

>> No.8870536

>>8870320
Not 1996.

>> No.8870541

>>8870320
glquake looked like complete ass, software rendering looks way better.

>> No.8870548

>>8870541
What about GL ports with restored effects?

>> No.8870554

>>8870548
Well obviously newer source ports are better. But the original glquake from the era looked like ass.

>> No.8870557

>>8870448
What were these special codes for each card? I've never heard of any post-VGA display mode that wasn't VESA or on Windows.

>> No.8870670
File: 21 KB, 508x260, 3dvide1.jpg
8870670

>>8870464
Voodoobros...

>> No.8870778

>>8870541
That was the joke.

>>8870536
Yeah, but the game came out in 1996. GLQuake came out in early 1997, but that's not related to anon pretending Duke3D was a bigger game than Quake in 1996.

>>8870548
>>8870554
You could just turn off texture filtering and there's a patch that fixes lighting, from back in the day already.

>> No.8870791

>>8870670
Oh no no no a GPU that came out a year later is a little bit faster.

>> No.8870862
File: 38 KB, 450x343, image78.gif
8870862

>>8870778
>but that's not related to anon pretending Duke3D was a bigger game than Quake in 1996.
It's related to the notion that what PCs had to offer against consoles was limited.
>You could just turn off texture filtering and there's a patch that fixes lighting, from back in the day already.
Filtering isn't the issue, the bad scaling is. Quake 1 and 2 both have textures that don't align with powers of 2.
>>8870791
Just in time for Quake 2 :^)

Now here's a far more fair comparison with just a few months apart.

>> No.8870957

Were there any issues with old CGA and EGA games running at 70Hz rather than the 60Hz they originally ran at? I don't think they ran faster, so maybe just screen judder and/or tearing?

>> No.8870985

>>8870957
Are there any tools/games with smooth scrolling and support for different modes so this can be tested?

>> No.8871021
File: 3.84 MB, 204x204, disappointment.gif
8871021

>>8870985
>smooth scrolling
>CGA/EGA

>> No.8871146

>>8870862
At the same resolution and frame rate, the voodoo card would have had a noticeably better picture. Even at a lower resolution, the voodoo card would have probably looked better.

>> No.8871181

>>8871146
>Even at a lower resolution, the voodoo card would have probably looked better.
LOL

Voodoo 2 textures are limited to 16-bit at 256x256. No such issues on the TNT.

No nasty Z-fighting if using a full framebuffer either.

>> No.8871185

>>8871021
Just need the horsepower.

>> No.8871267

>>8865829
Ehhh, it traded blows, and was hardly worth using that janky 3-card Frankenstein over a 1-card solution. Voodoo3s didn't even cost that much.

>> No.8872471

>>8870862
>It's related to the notion of what PC's had to offer against consoles being limited.
Yeah, true. Quake ports didn't even come out on consoles until 1997.

>> No.8872476

>>8870862
>Now here's a far more fair comparison with just a few months apart.
That's Voodoo 2 already though, I thought the discussion was about Voodoo and Quake. 3dfx was pretty much irrelevant by that time. The only card they released that made sense and started the 3D accelerator craze was the original Voodoo.

Pretty much what I said in >>8861064

>> No.8872482
File: 48 KB, 436x450, 1625321421739.gif
8872482

>>8871267
Totally worth it; you could have a nice 2D card like a Matrox, and the V2's in SLI did pump out better performance, sometimes significantly so.
I still have both V2 SLI and V3 rigs, and if I had to pick back in the day, I'd go with V2 SLI over V3.

>>8870862
>Just in time for Quake 2 :^)
>Now here's a far more fair comparison with just a few months apart.
Just like the V2. Next time post the whole page and not just one cherry-picked benchmark. I know, bait, but whatever.

>> No.8872506
File: 1.89 MB, 1482x1358, When_we_were_5.jpg
8872506

>>8861051
It was the graphics card of choice at the time amongst all my friends. The performance gains and resolution improvements made 3D accelerator cards so desirable. Unreal Tournament overnights were made possible when we all connected up our computers.

>> No.8872604

>>8861084
DirectX's design arguably was heavily influenced by a certain company beginning with n that had a vested interest in 3dfx not being the dominant standard. 3dfx was really, really good at rasterisation, so good that nobody could touch them for performance. However, there was no way for them to support the full OpenGL spec, and no one expected them to, since it was a workstation spec that was more AutoCAD than Quake. DirectX on the other hand didn't even attempt to compete with OpenGL in the pro space, yet somehow it curiously managed to avoid supporting things that 3dfx was really good at, instead filling itself with stuff that would be rather difficult for them to implement but that nVidia had (lacklustre) support for... Big think...
So yeah, without the tinfoil hat the unvarnished reality was that Voodoo 3/4/5 cards had rough Direct3D support. Games would have rendering errors and crash bugs that didn't happen on competing cards. Stuff would also render slowly if the game made use of too many features that 3dfx didn't directly support in hardware. And the company was mismanaged as fuck, bleeding money out the ass, so they couldn't afford to pivot to this new world order where drivers had to support games as opposed to games supporting cards.

>> No.8872613

>>8872604
What, you're saying that even early DirectX was made with Nvidia in mind? I doubt DirectX 5 had anything to do with Nvidia. Did they even have a DirectX-capable card at that time?

>> No.8872615

>>8872604
>So yeah, without the tinfoil hat the unvarnished reality was that Voodoo 3/4/5 cards had rough Direct3D support.
I have a voodoo 3 and 5 machine here, DX works great, what are you on about?

>> No.8872654

>>8872613
Not the super early ones, those were just shit all on their own. But starting with DX5 it started to get weird: the list of things 3dfx could do that DirectX flat out didn't support, compared to the competition, kept growing. Once the T&L stuff started, it became rather obvious that a certain company had far too much say in how the spec was going to evolve from then on.
>>8872615
Good for you. The reputation of the Voodoos at the time was the same as the reputation of AMD today. Games would crash or glitch out, drivers were late if they came at all, and games themselves tended to need "voodoo detection" in their code to work around flaws.

>> No.8872665

>>8872604
both table fog and 8-bit paletted textures are in Direct3D precisely because 3dfx cards had those hardware features. So no, Microsoft wasn't trying to destroy 3dfx
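
For reference, here is roughly what the paletted-texture half of that pair looked like to a programmer, sketched against OpenGL's EXT_paletted_texture extension (the Direct3D path differed in detail; on most platforms glColorTableEXT must be fetched through the extension loader, and whether the driver advertises it at all is exactly what later GeForces dropped):

```c
#include <GL/gl.h>
#include <GL/glext.h>

/* Upload one 8-bit paletted texture: a 256-entry RGBA palette plus a
   w*h buffer of palette indices. Assumes GL_EXT_paletted_texture is
   advertised and glColorTableEXT has been resolved. */
static void upload_paletted(const unsigned char *palette_rgba, /* 256*4 bytes */
                            const unsigned char *indices,      /* w*h bytes */
                            int w, int h) {
    glColorTableEXT(GL_TEXTURE_2D, GL_RGBA8, 256,
                    GL_RGBA, GL_UNSIGNED_BYTE, palette_rgba);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, w, h, 0,
                 GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);
}
```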

>> No.8872674

>>8872665
Cards that predated 3dfx also had that feature. And I didn't say Microsoft; it was 3dfx's competition who seemed to have more say in the future direction of the spec than was healthy. Still true today. AMD had to create Mantle in order to show what GCN could do because DirectX was dragging its heels on the ground-up rewrite that would modernise the whole stack. Remember that full shader-based DirectX was supposed to be in version 10? And then 10.1, 11... And given that nVidia was incredibly invested in their existing codebase, it's mighty suspicious that Microsoft kept having reasons to delay and delay and delay right up until nVidia had a new chipset that was actually performant at full shader pipelines... We all know, because it was reported from the horse's mouth, that OpenGL 3 was not the modernised API it was supposed to be because of AutoCAD, so why is it so hard to believe that the kingpin of GPUs manipulated the roadmap to ensure their lead?

>> No.8872779

>>8872604
What features could Direct3D possibly have missed or pushed?
I went through the Glide spec, and the only things out of the ordinary I recall are some special texture formats and additive vertex alpha channels. I don't know about Direct3D, but Glide is pretty damn close to being a subset of OpenGL.

>> No.8872785

>>8872482
>Banshee literally getting crushed
>Voodoo 2 can't even do this test without doubling up on cards (fucking THREE cards in total!)

I... rest my case?

>> No.8872794
File: 72 KB, 580x368, 3dfx.jpg
8872794

>>8872476
>I thought the discussion was about Voodoo and Quake
Ah yes, let's add this $300 piece of hardware so you can improve the performance of Quake from 30 FPS to... 30 FPS, now with an extra frame of latency.

Oh, but at least the game looks pretty at 480p, I guess.

>> No.8872847

>>8872785
Voodoo 2's point was always SLI. It's for the same type of people who buy 3090's today, it's not about price, it's about having the best.

>> No.8872851

>>8872794
See >>8872482
Doesn't matter, you're wrong in any case. We were talking about performance, nobody ever mentioned money until you started getting assfucked for being a dumbass on /vr/.

>> No.8872882

>>8872851

>We were talking about performance, nobody ever mentioned money
And? The thread is about the perception of certain hardware during its relevance. You can't just handwave economics away from that discussion. Besides, the post I was quoting was specifically addressing the first iteration of Voodoo, so what's the point of linking to a post talking about the Voodoo 2?
>>8872847
See, if that were really the case, why would you not explicitly compare against the 3090 in an SLI setup?
Nobody in their right mind would do that. GPUs are normally measured on their own merit, not by some esoteric scalability with a Frankenstein setup, whether it's 1998 or 2022.

>> No.8872895

>>8861807
FX cards are good cheap Win98 cards. They support everything and have good driver support

>> No.8872930

>>8872794
>at least
>480p
You know, just more than twice the resolution, you fps-obsessed zoomer.

>> No.8872950

>>8872930
>More than twice
I think you mean more than quadrupled, but that's technically also more than doubling, so I will let that one slide.
>you fps-obsessed zoomer
The notion that nobody gave a shit about performance in the 90's is zoomer revisionism. A game as competitive as Quake demanded performance, and that shows with the ludicrously low configurations seen by pro Quake 3 players.
Performance always helps with responsiveness, even with FPS caps in place.

>> No.8873008

>>8861051
Briefly good. Incompetent company beyond that. Just like Commodore.

>> No.8873203

>>8872950
>I think you mean more than quadrupled
I stand corrected.
>nobody gave a shit about performance
Of course we cared about performance, but running Quake at 640x480 and 30 FPS was very good for anybody outside of the ultracompetitive scene.
And that sweet texture filtering that we all loved.

>> No.8873236

Sold off most of my vintage computer gear last month. Don't get me wrong, I still have a ton of stuff and a way to play 98 games. With Glide wrappers you can use a later, cheap GeForce and get better results than a Voodoo. I normally don't like wrappers and whatnot, but I had two machines side by side, and aside from my AGP GeForce having triple the frame rate, the wrapper gave me all the effects perfectly.

We need a Matrox wrapper to get them water effects you could only get in a few games. That would be cool, since Matrox cards are kinda slow in the games that have the feature

>> No.8873260

>>8873203
And poorly upscaled textures, washed out lighting, no texture glows.

Not to mention that gl_flashblend had to manually be turned off to restore dynamic lights, but nobody did that because it ran like fucking dogshit.

>> No.8873265

>>8873260
First generation of 3d acceleration had some growing pains, yeah. And Quake in particular was full of novel software rendering trickery that Carmack cooked up which was not supported by accelerator cards for many years. I seem to recall that underwater effect being missing until shaders became available.

>> No.8873271

>>8873265
>I seem to recall that underwater effect being missing until shaders became available.
Which was quite a few generations down the line.
Even DX8 did not have dependent texture reads, so that would have been hard to pull off.

The more glaring issue with the lighting was already solved by the first TNT. Modern ports now use the combiner extension that was missing on Voodoo 3, if not also Voodoo 5.

>> No.8873309

>>8873271
"Solved" by one company, and I think their solution was proprietary so other companies had to figure out their own solutions.
I'm saying I don't think it's fair to shit on 3dfx for not having some sort of "industry standard" features when the industry standard was just starting to be hammered out.

>> No.8873396

>>8872882
Exactly, hence why Voodoo made sense at the time. Wanted the best of the best? 3dfx.

>> No.8873578

>>8873309
It wasn't strictly proprietary. Only a superset remained in the hands of NVIDIA, while the neutral subset dating from 1998-99 was also adopted by ATI, IIRC.
It then became an official part of OpenGL 1.3

>> No.8873609

>>8873396
If you strictly champion its extensibility, sure, but you gotta realize how much grasping at straws one is doing by promoting a solution as superior when it has to outnumber the competition.
And in spite of this, it was still behind in capabilities: 1024x768 at most, no deep color, and the same small pool of texture memory as a single card, so there's hardly any room to flex its muscles. This could potentially fill a performance-at-all-costs niche, except that 3dfx mysteriously refused to support 320x240, so that edge is kind of lost as well. You can get 300+ FPS in GLQuake on a TNT, if you so desire.

>> No.8874025

>>8872895
TRUE. FX cards are a solid choice, just don't expect high performance at high resolutions.

Performance depends on the model. I have three FX5200's, all 128MB 64-bit: an ASUS tmagic 9520, a Palit and an MSI. The MSI performs the best, as it will run Taito Type X games at full speed where the ASUS wouldn't run them at all.
I compared the Palit to a Radeon 9250 256MB 128-bit in Max Payne 2 at 1024x768 on max settings. Both were passively cooled and I had some jumping when they heated up; a small fan helped a lot. Don't think I had 60 fps, but it was good enough and looked pretty.
On both cards performance was on par despite the Radeon having double the VRAM/bus of the FX. All cards were AGP. Downsides to the Radeon are that it only supports DX8 and only has drivers for Win98 up.

I went for the FX5200 for DX9 and driver support from Win95 up to Win7 using Vista drivers in compatibility mode. Think I had it working in Win10 as well, I forget.

>> No.8875130

>>8872895
>>8874025
I just don't get why you'd get an FX card if a GF4 Ti or GF6 card is the same price.

>> No.8875391

>>8875130
GF6 forewent texture palettes, which were commonly used in old games.

>> No.8875419

My FX5200 64-bit card died; it gives no picture. I wonder if it's the caps that died, they're not the solid ones. It was by Chaintech and it was great; it managed to do 1920x1200 digital, which was a rarity on those old Nvidia cards, which almost all max out at 1600x1200 on digital.

>> No.8875639

>>8875130
The other guy said it: GeForce 6 dropped a few things that make effects not work in a few games. Also, the more modern you go, the fewer features you have in the Nvidia drivers. The 4 and FX series are the sweet spot. GeForce 4s are climbing in price; when they get too high, people will buy up the FX series.

I'm just saying, get them now while they're cheap.

>> No.8875856

>>8865016
First good CONSUMER accelerator.

>> No.8875863

>>8872604
DirectX was basically written for the Rendition Verite. Nvidia was still pretending that quads were the future at the time.
>>8861084
minigl is basically just a hack for a missing feature

>> No.8875874

>>8861759
>>8861621
You mean Quantum Fireball you fucking retards?

>> No.8875913

>>8870557
I think SimCity 2000 for DOS had custom drivers for cards that didn't support VESA. Later, when 3D accelerators were first released, there were proprietary APIs like Glide and Speedy3D for DOS.

>> No.8875921

>>8868016
I hope you mean a real geforce 4, not the rebranded MX crap

>> No.8875930

>>8875639
The good FX cards are already being scalped. I sold a Quadro one for 70€. Wonder how much some idiot would pay for my 5950 Ultra

>> No.8875976

>>8875930
That's how it goes; if someone is new to retro PC gaming, they'd better buy up now

>> No.8875986

>>8875976
If I wanted a GeForce 5 I'd buy a PCX variant. Those don't require AGP.

>> No.8876020

>Wanted a bunch of different GPU's some years ago
>Found a Geforce 3 Ti 500, so I was satisfied
>It's now dead
>Everything else good is expensive

Fuck, I should just have bought everything when I could.

>> No.8876068

>>8875921
MX is not bad. It has the Z culling feature that older DX7 hardware lacked.

>> No.8876072

>>8876068
still not a DX9 card, has no pixel shader support and so on

>> No.8876279

>>8875856
>First good CONSUMER accelerator.
given that non-consumer 3D accelerators cost like 10x as much, that isn't really a useful statement

>> No.8876305
File: 58 KB, 563x529, Screenshot_20220504_200716.png
8876305

Accelerated Quake in 1996.

>> No.8876308
File: 110 KB, 565x851, Screenshot_20220504_200636.png
8876308

>>8876305

>> No.8876313
File: 76 KB, 550x634, Screenshot_20220504_200655.png
8876313

>>8876308

>> No.8876537

>>8870985
The smoothest-scrolling EGA games I know are the later Apogee or id platformers, but even they don't run at a smooth 60fps. I'd love to be proven wrong, though.

>> No.8876920

>>8875130
like the others said, GF6 upwards was missing features compared to older cards; I forget exactly, but I think it was something to do with textures, fog or T&L. In games like Thief 1 or 2 there's missing fog and no stars in the night sky, which some of the Radeon cards also didn't support. There are posts about it on VOGONS.

>> No.8878707
File: 67 KB, 1024x755, 3dmark.jpg
8878707

>>8876308
>Verite at half price
Not bad, even though vQuake was a piece of shit; then again, arguably so was GLQuake.

>> No.8878916

I didn't have a gaming PC in the 90's; was OpenGL used back then?

>> No.8878928

>>8878916
Most used subsets of OpenGL in the form of miniGL drivers, but that's still technically OpenGL, so yes.

>> No.8880538
File: 1.91 MB, 2013x1446, Pyramid_scheme.jpg
8880538

>>8872895
I got a couple of really cheap Chinese PCI graphics cards claiming to be "GeForce FX5500" for $25 USD each a while back. They seem to be new, and I wonder if the nGlide wrapper would be too taxing for a 1.4GHz P3. I read that the advantage of Glide was the palettized textures and fog tables, but I've never seen the difference. I just accepted that the 3Dfx look was how 98 games are supposed to be, but is it really better than how it's rendered by more powerful nVidia cards? Any comparison images?

>> No.8880545

>>8880538
I've wondered this as well. I've heard Glide games have a distinct look, and while you can get wrappers to preserve compatibility and effects, it may not be 1:1 in terms of output. I'd like to know if this is indeed the case.

>> No.8880665

>>8880538
i saw those on ebay a while back and thought about getting one for an old Socket 7 PC. i know you can flash an FX5200 to a 5500, not sure of the benefits, maybe higher clock speeds. guessing the 5500 has a higher bus speed, though through PCI i'm not sure if it would be any faster

>> No.8880907

>>8880538
The fog tables and palettized textures are effects only supported up to the FX series. Thief is a good example: look up at the sky, and if you see stars you're good. I haven't seen a wrapper reproduce the effect. I don't think your 1.4 is going to be able to do Glide wrappers; I run mine on a P4 with a GeForce 4 Ti 4800. I compare that to my Voodoo 3, and as far as I can tell it's a 1:1 copy using Glide wrappers, and the more modern GeForce is way faster. Was thinking of making a YouTube video about it. That, and how EAX only works right with accelerated sound cards; the emulation for it now is flat out not working even if the checkbox is ticked in game
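
And for the fog-table half, this is roughly how a game enabled it on a DirectX 7 device, in C COM style (a sketch with error checks omitted; whether the driver actually honors the D3DFOG_* table modes is precisely the hardware dependency being described):

```c
#include <d3d.h>

/* Enable classic per-pixel table fog on a DX7 device. */
static void enable_table_fog(IDirect3DDevice7 *dev, float start, float end) {
    IDirect3DDevice7_SetRenderState(dev, D3DRENDERSTATE_FOGENABLE, TRUE);
    IDirect3DDevice7_SetRenderState(dev, D3DRENDERSTATE_FOGCOLOR, 0x00808080);
    IDirect3DDevice7_SetRenderState(dev, D3DRENDERSTATE_FOGTABLEMODE, D3DFOG_LINEAR);
    /* DX7 passes float fog parameters as their raw DWORD bit patterns. */
    IDirect3DDevice7_SetRenderState(dev, D3DRENDERSTATE_FOGSTART, *(DWORD *)&start);
    IDirect3DDevice7_SetRenderState(dev, D3DRENDERSTATE_FOGEND,   *(DWORD *)&end);
}
```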

>> No.8881520

>>8880538
>>8880545
I really like the look of Diablo 2 in Glide. It's not a 3D game, but still. I think Blade of Darkness also looked alright in Glide.

Don't know any of the technicalities and never had the actual hardware myself, so that's that.