Finis' Experiments and Scribbles

pugman 1
Captain
Posts: 1533
Joined: 21 May 2009, 19:26
Location: Germany

Re: Finis' Experiments and Scribbles

Post by pugman 1 » 13 Nov 2017, 15:23

You are on the right track with the 10-series GPUs. Wish I could get one :mrgreen:

Draise
Captain
Posts: 2900
Joined: 21 Sep 2009, 19:33
Location: Bogota, Colombia

Re: Finis' Experiments and Scribbles

Post by Draise » 14 Nov 2017, 13:42

I use a GTX 1060 now, very happy.

Finis
Captain
Posts: 4473
Joined: 21 May 2009, 18:26
Location: America!

Re: Finis' Experiments and Scribbles

Post by Finis » 22 Dec 2017, 21:27

Thinking about animations, I ran a test of render time and storage use. Using Blender with the "pabellon barcelona" benchmark, I did a 4K (4096x2160) Cycles GPU render with 200 samples. Result: 26 minutes on my GTX 1060 3GB.

So for a 10-minute 4K 60fps animation, that's 26 minutes x 36000 frames, which is about 15600 hours or 1.78 years of render time, plus 117 GB of disk storage for the images.

Guesstimate from that for 1080p (1/4 the pixels), 30 fps (1/2 the frames), and a GTX 1070 (about 2x the speed): 975 hours, or about 40 days of render time. Much time could be saved with clever settings and maybe the new denoiser, but that is still days of rendering. The benchmark is realistic and may be more demanding than the non-photoreal things I have in mind, but it would still take a long time.
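A back-of-the-envelope script makes those numbers easy to check (the 2x speedup for a GTX 1070 is the rough guess from the post, not a measured figure):

```python
# Render-time estimate for a 10-minute animation, based on the
# measured 26 minutes per 4K frame on a GTX 1060 3GB.
minutes_per_frame_4k = 26
frames = 10 * 60 * 60              # 10 minutes at 60 fps = 36000 frames

total_hours = minutes_per_frame_4k * frames / 60
total_years = total_hours / (24 * 365)
print(f"4K 60fps: {total_hours:.0f} hours (~{total_years:.2f} years)")
# -> 4K 60fps: 15600 hours (~1.78 years)

# Guesstimate: 1080p is 1/4 the pixels, 30 fps is 1/2 the frames,
# and a GTX 1070 is assumed to be about 2x the speed of the 1060.
scaled_hours = total_hours * (1 / 4) * (1 / 2) * (1 / 2)
print(f"1080p 30fps on a 1070: {scaled_hours:.0f} hours (~{scaled_hours / 24:.1f} days)")
# -> 1080p 30fps on a 1070: 975 hours (~40.6 days)
```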

Maybe the Eevee realtime renderer can do it faster. Otherwise that means a render farm, and I doubt that could be free, and this is a hobby.

Other stuff: I'll hopefully be responsible and not get a GTX 1070, at least until later when prices may go down. However, if I do decide to do that, would any of you want to buy my MSI GTX 1060 3GB (1152 cores)? I have the instructions and software that came with it, but I tossed the box. https://msi.com/Graphics-card/GeForce-G ... -X-3G.html

UPDATE 12-24-17

To render a full 4K 60fps 10-minute animation in 48 hours would require each frame to take 4.8 seconds. I don't have to have that awesomeness, of course; 1080p 30 fps is great. Also, a 4K image as an RGBA PNG is 14 MB.
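The budget arithmetic, sketched out (note that at 14 MB per frame the total storage comes out higher than the earlier 117 GB estimate, which implied a smaller average frame size):

```python
# Per-frame time budget: render 36000 frames (10 min at 60 fps) in 48 hours.
frames = 36000
budget_seconds = 48 * 60 * 60          # 172800 seconds total
per_frame_seconds = budget_seconds / frames
print(f"{per_frame_seconds} s per frame")      # -> 4.8 s per frame

# Disk use at 14 MB per 4K RGBA PNG (figure from the post):
total_gb = frames * 14 / 1000
print(f"~{total_gb:.0f} GB for all frames")    # -> ~504 GB for all frames
```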

Fooled with settings for more speed. The only things that make a big difference are samples and image size. Using only 30 samples plus the denoiser with its default settings produced a satisfactory (for my purposes) 4K image in 3.6 minutes with the benchmark scene. That's about 90 days for 10 minutes of 4K 60 fps. A test with my boy-hoop-stick scene exported from Daz to Blender took 2 minutes to render with no materials and many instances not present.

So if I do an animation it needs to be 1080p 30 fps. That's fine for me. I'll test at that resolution soon.

What am I supposed to be doing? Baking a pie for the family Christmas Eve celebration. You all have a Merry Christmas!
Why assimilate a species that would detract from perfection? -- Seven of Nine

RAYMAN
Captain
Posts: 2104
Joined: 21 May 2009, 18:56

Re: Finis' Experiments and Scribbles

Post by RAYMAN » 29 Dec 2017, 23:00

Like I always said in the past: it's all about the right software and the right workflow!
I am into realtime animation now! I went for a 1070 with 8 gig, and the solution is now a mix of Vue, Blender, SketchUp, Cinema 4D, and, most important, the realtime animation of iClone!
Looking forward to the possibilities of Eevee inside Blender!
The trick is first of all to shave off a lot of modelling time! Moi3D is a good start, in a workflow with any kind of SDS modeler!
SketchUp is the turntable that lets you do kitbashing in the easiest and most efficient way!
I updated Vue from Infinite to Stream in subscription form! It costs me $300 per year, which is still a lot but not earth-shattering! The advantage of Vue is that for the really big scenes I can use ecosystems to drive thousands of instances! Through Stream I can export all of those, while keeping the instance hierarchy, to either Blender or Cinema 4D, and then on to Thea for almost-realtime GPU-based rendering! For simple scene animation I use iClone! It's the most modern realtime animation tool, and it lets you use mocap and also record mocap!
Most of the mocap comes from our old TS friend Truebones! iClone with CrazyTalk and Daz is awesome!
All that is a workflow that shaves off tons of time and lets you produce almost in realtime!

Finis
Captain
Posts: 4473
Joined: 21 May 2009, 18:26
Location: America!

Re: Finis' Experiments and Scribbles

Post by Finis » 31 Dec 2017, 17:39

My nephew said his workplace has a machine with four Tesla K40s. No private use allowed, of course.

Yeah, game/realtime rendering looks like the way to go for the projects I have in mind. Eevee should be great for that. I'll see when it lands in the main release.

This is an interesting idea https://www.sheepit-renderfarm.com/. I don't think I could provide enough service to earn many points but it might be worth investigating. I'd be concerned about security. If it grows huge it could be amazingly fast with thousands of machines participating.

For a team project I hope to do, if render time is an issue it could be split among the participants' computers. Each would render a different set of frames to be composited later.
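A minimal sketch of that kind of split; the machine names and the contiguous-chunk scheme are my own illustration, not any particular tool's behavior:

```python
def split_frames(first, last, workers):
    """Divide an inclusive frame range into contiguous chunks, one per worker."""
    total = last - first + 1
    base, extra = divmod(total, len(workers))
    assignments = {}
    start = first
    for i, worker in enumerate(workers):
        count = base + (1 if i < extra else 0)  # spread the remainder evenly
        assignments[worker] = (start, start + count - 1)
        start += count
    return assignments

# Example: 36000 frames split across three participants' machines.
for machine, (a, b) in split_frames(1, 36000, ["pc1", "pc2", "pc3"]).items():
    print(f"{machine}: frames {a}-{b}")
# -> pc1: frames 1-12000
#    pc2: frames 12001-24000
#    pc3: frames 24001-36000
```

Each machine renders only its own range, and the image sequences are concatenated afterwards.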

Update Jan 1:

Looks like Eevee doesn't use CUDA/OpenCL compute cores. That means Eevee and the real-time renderers Rayman mentioned work without such expensive hardware. We still need Cycles/V-Ray/iray for ultimate accuracy and such, but real-time is, or will soon be, good enough for many users, including animators.

I have thought about the GPU trend: people are throwing more and more hardware at the same old rendering methods rather than finding more creative or elegant software solutions.
Why assimilate a species that would detract from perfection? -- Seven of Nine

RAYMAN
Captain
Posts: 2104
Joined: 21 May 2009, 18:56

Re: Finis' Experiments and Scribbles

Post by RAYMAN » 03 Jan 2018, 12:44

The idea is both! You need to be picky about every MB you spend on objects in your scene! Use any kind of trick! Use instancing a lot; all that helps!
Then comes the hardware!
You can save a lot on hardware if you know what you are doing! Very important is the power supply!
Try to get around 1000 watts! It's not much more money, but I'll explain later!
The processor doesn't need to be the very best!
You can get a decent i5 for $200!
If you don't want to overclock, a very simple board for $100 will do! RAM isn't very important, since only Adobe products eat up lots of memory!
Just get 8 gig, that's plenty; 16 is even better, but not necessary! Try to get an SSD rather than more HD!
They are a lot faster than the old HDs!
And here comes the deal: get 2x graphics cards with 8 gig of RAM each, i.e. 2x 1080! No Ti necessary!
The motherboard has to support it though! Nvidia because of CUDA! Why 2x 8 gig? Because 8 or even 6 gig won't even load big scenes, let alone render them!

Finis
Captain
Posts: 4473
Joined: 21 May 2009, 18:26
Location: America!

Re: Finis' Experiments and Scribbles

Post by Finis » 03 Jan 2018, 18:50

By elegant solutions (to rendering) I mean software engineers should address the need for render speed by inventing new and better ways to render. GPU cards are just more hardware power thrown at the same PBR methods. There are such things being done, including real-time methods.

Tricks! Yes! I got the Cycles render time for the pavilion benchmark down to 3.5 minutes for a 4K image with decent quality, and 52 seconds at 1080p, using tricks and compromises. The Cycles denoiser plus fewer samples reduces time with little effect on quality. Kerkythea's photon mapping did a similar thing. Still, I will probably use Eevee for animations. It looks real enough for me, and I'm usually less interested in realism than in creative ideas, art, etc.
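If render time scales roughly linearly with the sample count (a simplification that ignores denoiser overhead), the measured numbers line up:

```python
baseline_minutes = 26       # 4K render at 200 samples (earlier benchmark)
baseline_samples = 200

# Predicted time at 30 samples under a linear-in-samples model:
predicted_minutes = baseline_minutes * 30 / baseline_samples
print(f"{predicted_minutes:.1f} minutes")   # -> 3.9 minutes, close to the measured 3.6
```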

Here's a puzzling thing. It looks like the real-time renderers don't need all those GPU cores. Game engines are real time, right? So why do gamers want their mommies to buy them $1000 video cards with 3500 cores? The big cards obviously improve game performance, but does the game engine not use the GPU cores?

Nvidia's Tesla cards can combine their memories. I think the GTX cards don't do that.
Why assimilate a species that would detract from perfection? -- Seven of Nine

RAYMAN
Captain
Posts: 2104
Joined: 21 May 2009, 18:56

Re: Finis' Experiments and Scribbles

Post by RAYMAN » 04 Jan 2018, 09:48

Because I guess that Eevee runs on the GPU!
It practically is a game engine! The reason I said 2x 1080 was that it gets you 16 gig of RAM, and that is needed for bigger scenes! They don't even load on 8 gig, that's the problem!
Thea is, software-wise, the better solution, because it's hybrid! You throw a scene at it and it uses the GPU until that reaches its limit, then it starts to use the CPU on top! So you don't run out of memory!
Drawback: it sets you back $399!
But it's worth it! What I prefer in Cycles, though, is its PBR node-based shader standard! But you can't have everything, I'm afraid! 😏

Finis
Captain
Posts: 4473
Joined: 21 May 2009, 18:26
Location: America!

Re: Finis' Experiments and Scribbles

Post by Finis » 04 Jan 2018, 18:35

"Would multiple GPUs increase available memory? No, each GPU can only access its own memory." from https://docs.blender.org/manual/en/dev/ ... ering.html I've seen this noted in several places. More cards do add more cores or computing power. Each additional card probably adds overhead so that less power is added with each additional card. I found that adding my gt640 to the gtx1060 is no faster and may be slower. Same for the cpu. Probably best if multiple cards are similar in size/power.

If you have two GTX cards you can test the memory-combining claim. Run a "GPU memory burner" benchmark, or a test big enough to fail on one card, then try it with two.

I think the Tesla cards can combine their memories, at least using Nvidia's systems for scientific computing. Those cards aren't worth the money for CG rendering, though; they are built for high-accuracy scientific computing.

Thea, yeah baby! I'd get that if I could spend more on toys, or if I were using it in business. Thea has a Blender plugin, so it is easy to use there.

The Ti versions have more cores for little more cost than the non-Ti ones, so they are a bargain. A GTX 1070 Ti is only a little less powerful than a (non-Ti) GTX 1080.
Why assimilate a species that would detract from perfection? -- Seven of Nine

RAYMAN
Captain
Posts: 2104
Joined: 21 May 2009, 18:56

Re: Finis' Experiments and Scribbles

Post by RAYMAN » 05 Jan 2018, 14:30

No, it actually does! That's the reason I will get 2x 1080 and not the expensive Ti version!
It also works with 2x 1060 with 6 gig!
The only deal is you need a mainboard that supports double-card use! Greetings!
