
/3/ - 3DCG



File: 293 KB, 633x758, 1445867773932.png
No.733004

>Short film is 720 frames long
>Each frame at 1080p takes 45 minutes to finish
I will fucking die of Corona before this shit is done

>> No.733017

I decided to take a big brain approach and make it with an "old school" aspect ratio, so since it's not that wide, it cuts the amount of pixels by, like, half. I'm sick of 21:9 anyway.

>> No.733019
File: 77 KB, 380x349, boomer.png

>>733004
your film is 30 seconds long? we don't call that a short film where I'm from boi, that's a scene.

>> No.733020

What kind of potato are you using? 400 frames at 4k usually finishes in 15 minutes for me

>> No.733022

>>733020
Render times depend on a million things, you can't compare it like this.

>> No.733024

>>733004
You should probably have a render time budget and cut features from your scene until you hit it. Also see if you can do some things in comp; for example, if you have volumetric fog, try zdepth.
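Quick arithmetic on the numbers in the OP, as a sketch of what a render time budget looks like (the 48-hour target is just an example figure, not from the thread):

```python
def total_render_hours(frames: int, minutes_per_frame: float) -> float:
    """Total sequential render time in hours."""
    return frames * minutes_per_frame / 60.0

def max_minutes_per_frame(frames: int, budget_hours: float) -> float:
    """Per-frame time needed to finish within budget_hours."""
    return budget_hours * 60.0 / frames

current = total_render_hours(720, 45)     # 540 hours, i.e. 22.5 days
needed = max_minutes_per_frame(720, 48)   # 4 min/frame for a 2-day budget

print(f"current: {current:.1f} h ({current / 24:.1f} days)")
print(f"needed:  {needed:.1f} min/frame for a 48 h budget")
```

So hitting any reasonable deadline here means cutting per-frame cost by roughly 10x, which is why the comp/zdepth suggestion matters.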

>> No.733025

>>733020
>t. someone rendering some text slide on a white background

>> No.733035

>>733004
Show a frame from your animation, i want to see why it takes 45 minutes.

>> No.733041
File: 27 KB, 720x303, f7300ccad0575bc19d9c8ee76acd5e82.jpg

>those volumetrics look nice I guess let's keep them for the whole shot

>> No.733046

>>733041
Yeah something like that, but who knows, maybe it's even worse and more stupid than that.

>> No.733048

>>733004
use a real app not fucking Blunder

>> No.733053

this tells me you have no idea what you're doing

>> No.733054

Even when I do a lot of effects, color editing etc 4k DCI takes nowhere near 45min lmao. I think I'm usually 120 frames an hour? In resolve

>> No.733056

>>733048
Don't be a massive retard, other renderers would probably also suffer with his scene if he doesn't know how to optimise. Also, you can use other renderers in Blender, I use Octane.

>> No.733058

>>733056
i usually never render video with raytracing unless it's absolutely necessary.
i also keep motion blur to a minimum. most of my video renders are done in blender internal.

wish op would give us more details

>> No.733059

>>733058
Have you tried using a game engine? Surely it would improve your visuals and give you even faster render times.

>> No.733062

>>733059
i thought internal and the game engine are relatively the same.
i have not used eevee for video yet

>> No.733165

>>733062
He means something like UE or Unity for 60fps renders. They'll look better and render faster than internal, which doesn't even exist anymore.

>> No.733396

Is it even worth using a pathtracer if your scene is set in a foggy location? It is beyond painful, and realtime renderers are pretty good in that area these days. I'm seriously reconsidering my entire story because I would require chimneys, thick fog and diffuse dark moody lighting on top of it. I could achieve it all with a realtime renderer, but doing it with a pathtracer is slow as fuck. I'm not sure I could use z-depth effectively either. Dark interiors hurt just as much.

For sunny scenes even pathtracing is comfy and fast though. I know it's ridiculous to put volumes everywhere if they are not a must, but they would be necessary for every shot in my case.

>> No.733458

>>733396
does depth buffer fog not cut it for you? that's more or less what real-time renderers are doing. That and drawing fake light scattering for light sources.

It's nice to have something like the pokemon renderer in this case, because you can easily switch between renders as the shot requires, without having to try too hard to match materials and things.
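The depth-buffer fog being described here boils down to blending each pixel toward a fog colour by an exponential falloff of its z-depth. A minimal per-pixel sketch (the fog colour and density values are arbitrary, made up for illustration):

```python
import math

def depth_fog(pixel_rgb, depth, fog_rgb=(0.6, 0.65, 0.7), density=0.05):
    """Blend a pixel toward the fog colour using exponential extinction
    of its camera-space depth -- the classic z-depth fog approximation
    real-time renderers use."""
    f = 1.0 - math.exp(-density * depth)  # ~0 near the camera, -> 1 far away
    return tuple(c * (1.0 - f) + fc * f for c, fc in zip(pixel_rgb, fog_rgb))

# A near pixel is barely touched; a far pixel converges to the fog colour:
near = depth_fog((1.0, 0.0, 0.0), depth=1.0)
far = depth_fog((1.0, 0.0, 0.0), depth=200.0)
```

This is exactly what overlaying a zdepth pass in comp gets you, which is why it looks so close to what game engines do for distance fog.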

>> No.733561

>>733458
Hmm, I don't actually know how the tech behind it works, so it's interesting to learn that realtime renderers basically use zdepth. But even if I manage to make it look decent with zdepth, what about light scattering? If I overlay a zdepth pass I don't get scattering as well, right? How do realtime renderers do that part, and can I do it in post if I render with a pathtracer?

And also, what's the actual advantage real volumetrics have compared to zdepth fuckery realtime renderers do?

"Pokemon renderer"? What's that, I can't find it on google?

>> No.733566

>>733561
realtime renderers fake light scattering by ignoring the global illumination effects and having optimized approximate solutions for specific things. If your scene is simple enough for example, you can probably get away with additively blended billboards with radial gradients on them. On the other hand, you might have a pathological shot with very dramatic lighting and materials (combination of caustics, varying density fog, mirrors, translucency) that can't be achieved without tracing, or very specialized shaders.
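The additively blended billboard trick mentioned above, as a toy sketch: a radial-gradient sprite is added (clamped) on top of the frame wherever a light source sits. Everything here is illustrative, not any real engine's API:

```python
import math

def glow_billboard(size, intensity=1.0):
    """A radial-gradient sprite: brightest at the centre, falling off
    quadratically toward the edges."""
    c = (size - 1) / 2.0
    img = []
    for y in range(size):
        row = []
        for x in range(size):
            d = math.hypot(x - c, y - c) / (c or 1)  # 0 at centre, 1 at edge
            row.append(intensity * max(0.0, 1.0 - d) ** 2)
        img.append(row)
    return img

def additive_blend(base, sprite):
    """dst = min(dst + src, 1) -- additive blending, clamped to white."""
    return [[min(b + s, 1.0) for b, s in zip(br, sr)]
            for br, sr in zip(base, sprite)]

sprite = glow_billboard(5)
frame = additive_blend([[0.2] * 5 for _ in range(5)], sprite)
```

Cheap, stable, and good enough for glows around street lamps and chimneys -- but, as noted, it falls apart once you need caustics, varying-density fog or mirrors interacting with the scattering.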

>> No.733572

>>733566
Interesting, thanks. I'd really like to see one example of a fog rendering which can't be achieved in game engines. I guess you can't animate it and vary density, at least afaik, but I haven't actually tried doing that. But that's all I can think of.

>> No.733599

>>733165
i mostly do 2D animations and effects.
internal is perfect in that regard, they even made commercials for comic books with blender internal.

if i really want to integrate 3D graphics i would use cycles in a different file and composite it into the video later. and hey, i know cycles isn't fast or pretty but it's enough for what i do.

>> No.734351
File: 928 KB, 1280x1280, 133Eevee.png

>>733004

>> No.734358

>>734351
Kek

>> No.734362
File: 61 KB, 750x802, 1577384915865.jpg

>>734351
based

>> No.734363

>>733561
>>734351

>> No.734384
File: 165 KB, 800x450, crying-cat-know-your-meme-61.jpg

>>733004
if only you organized and set up everything well it wouldn't take you 3/4 of a month to render 720 frames.
If you're simulating physics, e.g. destruction of a building, set up the cuts first and THEN simulate it all crumbling down, not both at the same time.
That way the expensive cutting happens once, on the first frame, instead of being re-solved alongside the simulation every single frame.
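The caching idea above, as a toy sketch: the expensive fracture step runs a single time up front, and the per-frame loop only advances the cached pieces. The function names and the trivial integration are made up for illustration, not any real physics API:

```python
def fracture(mesh, n_pieces):
    """Expensive one-off step: pretend each piece is a pos/vel pair."""
    return [{"pos": float(i), "vel": 0.0} for i in range(n_pieces)]

def step_rigid_bodies(pieces, dt=1.0 / 24, gravity=-9.81):
    """Cheap per-frame step: just integrate, never re-cut the mesh."""
    for p in pieces:
        p["vel"] += gravity * dt
        p["pos"] += p["vel"] * dt
    return pieces

pieces = fracture("building", 4)   # once, before the sim starts
for frame in range(720):           # per frame: integration only
    step_rigid_bodies(pieces)
```

Same principle as baking any sim cache: pay for the hard geometry work once, then every frame is just playback plus cheap dynamics.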

>> No.734400

>not getting free bucks from google cloud platform
>not using your free bucks on zync rendering to render your entire film in a few hours

>> No.734514

>>733004
are you using corona render haha

>> No.734516

>>734514
omg haha thats so funny hey! maybe hes i drinking corona BEER also haha

edit: thanks for the subs and upvotes! didnt expect it :)))

>> No.734517
File: 34 KB, 407x405, 1c3.jpg

I also came to laugh at "45 minutes a frame".

>> No.735234

>>733004
>wojakposter makes another worthless thread

>> No.735711

>>735234
>bumping a worthless thread

>> No.736503

>>733048
>app