/3/ - 3DCG

>> No.375599

>>375484
RIGHT. Well then, it'd be a butt-ton longer.

>> No.375480

>>375410
My own rendering experience has been purely with Blender Cycles. On my twin 660s, it would take about 2.8 years to render a film-length movie at near-Pixar quality. Your best bet for rendering film at that quality would be render farms or Beowulf clusters; either way you're probably looking at tens of thousands of dollars.
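
For scale, here's the rough math behind an estimate like that, as a Python sketch. The per-frame time is an assumed figure for twin 660s, not a benchmark; plug in your own numbers.

# Back-of-the-envelope estimate of single-machine render time.
# All inputs are assumptions; adjust for your own scene and hardware.
FPS = 24                  # standard film frame rate
RUNTIME_MIN = 90          # a short feature film
MIN_PER_FRAME = 12        # assumed Cycles time per frame on twin GTX 660s

frames = FPS * RUNTIME_MIN * 60
total_min = frames * MIN_PER_FRAME
years = total_min / (60 * 24 * 365)
print(f"{frames} frames -> {years:.1f} years on one machine")

# Frames render independently, so a farm of N nodes divides this by N:
print(f"on a 100-node farm: {years / 100 * 365:.0f} days")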

>> No.375404

>>375387
If you're going to state something that goes against common sense, common agreement, and (most importantly) the documentation, could you please post some proof?

I have a much more powerful CPU than that i7 (4 GHz, eight cores) and I still get significantly slower render times on the CPU than on a GTX 660. Did you properly configure Cycles to use the CUDA cores on your GPU?
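
If you want to sanity-check that setting, here's a minimal sketch using the modern (Blender 2.8+) Python API; note this is just one way to do it, and older 2.6x builds of the 660 era kept the same switch under User Preferences > System instead.

import bpy

# Tell Cycles to use CUDA and enable every CUDA device it finds.
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()          # refresh the detected-device list
for device in prefs.devices:
    device.use = True

# Point the current scene's renderer at the GPU.
bpy.context.scene.cycles.device = 'GPU'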

>>375400
Your Xeons are not likely to be capable of rendering any photorealistic animation in a timely manner. They might pass for still shots, though, if you have patience.

>> No.375382
File: 48 KB, 575x305, small_gtx-660-ti-16.jpg

Let's take a moment to talk hardware, /3/. Specifically, consumer-grade hardware: GTX- and Radeon-tier products.

Which card do you use, which are you realistically looking at upgrading to, and why?


Currently, I'm running twin GTX 660s in SLI. They're not too bad and were a major step up from my previous single 8800 GT. But after some poking around, I've found that I'd be better off with twin 570s than 660s, mainly because between the 5xx and 6xx series Nvidia decided GPGPU performance wasn't important for selling product to the average consumer, i.e. gamers, and cut it back. I'll gladly take a hit in gaming power for a boost in rendering.
