
/3/ - 3DCG



File: 1.31 MB, 1344x674, godot.png
No.965820

The Godot main dev is literally making next gen graphics on a fucking 2014 GPU.

Unreal and AAA faggots are a fucking joke.

>> No.965821

>>965820
gigabased
if your "cutting edge tech" needs a cutting edge graphics card your tech is shite

>> No.965823

meanwhile my 1650 is chugging because of a 4 million tri scene :(

>> No.965829

>>965820
With the latest release it seems they are committed to the path of compatibility with lower end machines, very smart move.

>> No.965833

>>965820
wow that looks amazing

>> No.965867

>>965820
[*] real time
(*not actually real time, just baked lighting, which has been available in every game engine since the 2000s)

>> No.965871
File: 65 KB, 603x498, nitter_hdda.png

>>965867
>baked lighting
oops, brainlets wrong again

>> No.965872

>>965871
It says it in the shitter thread:
https://twitter.com/reduzio/status/1730279205248672191

https://docs.godotengine.org/en/stable/tutorials/3d/global_illumination/using_sdfgi.html

So no matter what they call it, it's not dynamic GI but static GI.

>> No.965873

>>965872
>nooo I wasn't talking about baked lighting
I accept your concession.

>> No.965876

>>965873
Baked lighting == precalculating lighting and storing it in a data structure rather than calculating it every frame.
It's baking; it doesn't matter if they say "no baking" when it fits the definition of baking.
This is doubly deceptive: the dev overpromising in the shitter thread for hype, and (You) implying it's a competitor to UE5 Lumen.
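To make the distinction being argued concrete, here's a minimal toy sketch (hypothetical illustration only, not Godot or UE code; the point light, probe grid, and falloff are made up): "baked" means lighting is computed once into a stored structure and only looked up per frame, while "dynamic" means it is recomputed every frame.

# Toy illustration of baked vs per-frame lighting. All values are made up.
def direct_light(point, light_pos):
    # inverse-square falloff from a single point light
    d2 = sum((p - l) ** 2 for p, l in zip(point, light_pos)) or 1e-6
    return 1.0 / d2

# "Baked": precompute lighting once into a probe grid (offline / at load time).
def bake_probe_grid(light_pos, size=4):
    return {(x, y, z): direct_light((x, y, z), light_pos)
            for x in range(size) for y in range(size) for z in range(size)}

probes = bake_probe_grid(light_pos=(1.0, 2.0, 0.5))

def shade_baked(point):
    # per frame: just a lookup, so moving the light has no effect
    return probes[tuple(int(c) for c in point)]

def shade_dynamic(point, light_pos):
    # per frame: recomputed, so it reacts to a moving light
    return direct_light(point, light_pos)

print(shade_baked((2, 2, 1)), shade_dynamic((2, 2, 1), (3.0, 2.0, 0.5)))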

>> No.965877
File: 10 KB, 769x66, sdfgi.png

>>965876
>Baked lighting == precalculating lighting and storing it in a data structure rather than calculating it every frame
Then according to your definition, Lumen and every other accumulating lighting tech is "baked lighting", since it stores around a second's worth of previously calculated data.

Btw, SDFGI does work with dynamic lighting, so since you're wrong even with the moved goalposts, you can stop posting entirely now.
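For what it's worth, the "stores previously calculated data" part both sides keep arguing about is usually just temporal accumulation: each frame's noisy lighting estimate gets blended into a running history so the result converges over several frames. A minimal sketch of that idea (assumed blend factor and sample values, not Lumen's or Godot's actual code):

# Hypothetical temporal accumulation: blend each new per-frame lighting sample
# into a running history; roughly 1/alpha frames of old data influence the result.
def accumulate(history, new_sample, alpha=0.1):
    return (1.0 - alpha) * history + alpha * new_sample

radiance = 0.0
for sample in [0.9, 1.1, 1.0, 0.95, 1.05, 1.0, 1.0, 1.0]:  # noisy per-frame estimates
    radiance = accumulate(radiance, sample)
print(radiance)  # converges toward ~1.0 as more frames accumulate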

>> No.965878

>>965877
We'll see how it turns out then won't we.

>> No.965880

hello nodevs, how is your xitter argument going?

This looks like something you could get in ancient OpenGL demos. The industry needs to look back before it tries to advance, to see that it is already failing to do what was done in the past with less hardware.

>> No.965881

>>965877
Don't get me wrong, moving the lighting baking (for you: not-baking) step into an asynchronous process is a good idea.
It's still not an alternative to Lumen, which converges over a few frames (not a second, as you claim).

>> No.965882

all next gen rendering methods are fundamentally broken just to sell cards, because they run things at "acceptable" framerates rather than push visual boundaries.


Just make a very poorly done rendering technique, then sell it as a feature while slapping giant heatsinks on the card. It worked with Crysis 2 and it'll keep working with the next set of releases till the end of time. Why optimize when you can just fake it?

>> No.965884

>>965820
Btw this is not SDFGI. HDDA is a term coined by Juan which stands for "Hierarchical Digital Differential Analyzer".
It looks like he is voxelizing the scene into a hierarchical structure (I'm guessing it's essentially overlapping voxel cascades, with more resolution closer to the camera, as in cascaded shadow maps) and then stepping through it with something like a multi-resolution Bresenham/DDA line algorithm.
It can use the lower-res cascades to take large steps through empty voxels.
So instead of using distance fields to estimate the closest possible hit like ray-marched SDFGI, it's doing voxel-marching in a 3D cascaded mip structure.
See his (limited) explanation: https://twitter.com/reduzio/status/1726692512965017728

And yes, it's real time: it's blasting out millions of rays per second to compute dynamic lighting and reflections.
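For anyone curious what "using the lower-res cascades to take large steps through empty voxels" could look like, here is a heavily simplified sketch of that general idea: a fine occupancy grid plus one coarse mip, with bigger ray steps wherever the coarse level says a whole region is empty. This is an assumption-filled toy, not Juan's HDDA; the grid sizes, step lengths, and test scene are made up, and a real implementation would use exact per-cell DDA traversal and multiple cascades.

# Heavily simplified, hypothetical sketch of coarse/fine voxel marching.
# Not Godot's HDDA; grid sizes, step sizes and the test scene are made up.
import numpy as np

FINE = 64                                    # fine occupancy grid resolution
C = 4                                        # coarse cell = 4x4x4 fine voxels
fine = np.zeros((FINE, FINE, FINE), dtype=bool)
fine[40:48, 20:28, 30:38] = True             # a solid box somewhere in the scene

# Coarse mip: a cell is occupied if any of its fine children are occupied.
coarse = fine.reshape(FINE // C, C, FINE // C, C, FINE // C, C).any(axis=(1, 3, 5))

def trace(origin, direction, max_t=200.0):
    """March a ray and return the first occupied fine voxel, or None."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    t = 0.0
    while t < max_t:
        p = np.asarray(origin) + t * d
        idx = np.floor(p).astype(int)
        if np.any(idx < 0) or np.any(idx >= FINE):
            return None                       # ray left the grid
        if coarse[tuple(idx // C)]:
            if fine[tuple(idx)]:
                return tuple(int(i) for i in idx)   # hit an occupied voxel
            t += 0.5                          # small step inside occupied region
        else:
            t += C * 0.5                      # big step: whole coarse cell is empty
    return None

print(trace((0.0, 0.0, 0.0), (1.0, 0.55, 0.8)))   # -> (40, 22, 32)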

>> No.965889
File: 317 KB, 908x758, 1682033795382597.jpg

>>965882
because there are autists who care about their craft as an art form, chud.

have you ever tried to make shit run on literal old as fuck machines?

https://www.youtube.com/shorts/G87eGOSCl4A

you're fucking stupid if you're ok with e-waste because Nvidia needs more money.

NIGGER.

>> No.965894

>>965889
Touch grass you angry manchild

>> No.965895

>>965894
I don't consider the opinions of people who don't admire the retro demoscene to be valid opinions worthy of me giving a shit about.

https://www.youtube.com/watch?v=dOqxLBZiBRA

Educate yourself on coding autismo first, pleb.

>> No.965896

>>965895
I don't care, touch grass

>> No.965899

I love how godot fags are, they can't distinguish this crap from good looking graphics, shit, I can get this on rpg maker.
99% of the godot userbase never made and never will make a game, but you guys sure spam a lot of crap all over the internet.

>> No.965902
File: 6 KB, 240x240, 1678974674342019.jpg

>>965899
retard can't understand what it means to be running 1080p at 30 fps on a 10 year old gpu.

>> No.965906

>>965902
And? You can play real games, actual games, not a single 3D scene, on old GPUs. Valorant runs better than 30 fps on a mobile GPU from almost 10 years ago too, so what is your point? Also, last time I tried, Godot ran like shit on a 1060 once the scene started to get bigger.

Also, what game have you made? If none, how much are you paid to defend Godot on the internet?

>> No.965908

>>965906
I can't run Unreal, and I'm not paying 700 USD to play unoptimized GARBAGE.

lmao.

get fucked Nvidia shill.

>> No.965910

>>965906
That entire post can be summed up with "I am unaware of how to optimize, it must be the engine's fault"

>> No.965911

>>965908
>>965910
Why do I care? Why would a paying player care? "The engine runs on the dev's machine" was never a selling point for any game, dude. If even you don't believe your game will make enough money to justify investing in some hardware, that says a lot about the pixel art platformer you are making lol

Also, show your fucking game. If you guys are going to talk about gamedev stuff, you have to at least have made a game.

>> No.965912

>>965911
I care because I don't consider it acceptable for a 3D game to run at 10 fps on a 4080.

retarded goy cattle.

>> No.965913

>>965912
Show your game. And it's not as if Godot would run a real game better than that on any hardware lmao

>> No.965914
File: 53 KB, 1027x632, 1694366834195585.jpg

>>965913
show your AAA render, then pleb.

>> No.965929

>>965911
>Why do I care? Why would a paying player care?
>"The engine runs on the dev's machine" was never a selling point for any game, dude
Painfully unaware of how shitting out unoptimized garbage will shrink your playerbase. Take a look at e.g. the Steam hardware survey to see how many people actually run "good" hardware, tard.

>> No.965948

>>965889
Brother if you could not read between the lines of that post why bother posting at all?

Yes you are out of line you emotional fool.

>> No.965949

>>965914
All those filters and still pushing out low texture density, desu. You could get better looking wood than that and still keep clean normals. Also, the lack of consistency in texture density all around could use some balancing. I hate seeing the screen filter super clean while everything else is how it is. It's very jarring.

>> No.966010

>>965914
>Cris is the only one on /3/ who has tried to make an appealing render in godot

>> No.966054

>>965820
thanks to the MIT licence, every other engine is able to copy the global illumination code into their codebase and turn it proprietary. so nobody loses

>> No.966065

>>965820
I honestly give no fucks what engine someone uses, as long as their techniques and processes are well optimized and done properly without a billion shortcuts.

Of course you still find gigantic teams throwing out background assets as far as the eye can see, with multiple 2048 x 2048 texture maps where only a single fucking strip of the entire UV space is actually used, with multiple materials and such, but hey, that's the "next" gen experience I suppose.
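Rough numbers for why that matters (illustrative arithmetic only; the size of the "single strip" actually referenced is an assumption): an uncompressed 2048 x 2048 RGBA8 texture is 16 MiB before mips, so if only a 2048 x 128 strip of the UV space is used, roughly 94% of each map is dead weight.

# Illustrative arithmetic only; the used-strip size (2048 x 128) is assumed.
def texture_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

full  = texture_bytes(2048, 2048)   # uncompressed RGBA8, no mips
strip = texture_bytes(2048, 128)    # the part of the atlas actually referenced
print(f"{full / 2**20:.1f} MiB allocated, {strip / 2**20:.1f} MiB used, "
      f"{(full - strip) / full:.0%} wasted")
# -> 16.0 MiB allocated, 1.0 MiB used, 94% wasted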

>> No.966087

>>966065
>Of course you still find gigantic teams throwing out background assets as far as the eye can see, with multiple 2048 x 2048 texture maps where only a single fucking strip of the entire UV space is actually used, with multiple materials and such, but hey, that's the "next" gen experience I suppose.
That's the problem with mix-and-match asset store projects, whether they're asset flips or have some substance.
Before that, games simply re-used tiled textures in multiple places, and shaders were something global to the whole project.

>> No.966113

this entire map is one big fucking normal map

>> No.966904 [DELETED] 
File: 197 KB, 1080x1510, Screenshot_20231212_054238.jpg

>>965949
Post your work you inbred chud incel.

>> No.967001

>>965948
Don't reply to cris you dumb newfag

>> No.967012

if you think that's next gen you're fucking blind.

>> No.967278

2014 GPUs can do a whole lot with basic scenes

>> No.967439

>>965820
I do professional previz for the live entertainment and motion picture industries. I'm doing pretty massive real-time visualization of stage lighting, fireworks, lasers, and performers on an old as shit Quadro P5000 in a micro ATX workstation. You don't always need the latest and greatest GPU. I'd rather invest in more software modules until I hit a point where I'm forced to upgrade the GPU.

>> No.968139

>>965820
global illumination was a thing in 2014 you tech illiterate retard
this is only groundbreaking for godot, the "engine" that's 10 years behind everyone else

>> No.969792

>>967278
bakechads still win overall

>> No.971001

>>965820
that has existed since 2010, stop licking pablo's cock godotard