Evidence? I suspect you will be disappointed, though I hope you are right. There are always new things, and it would be great to get this ability with our regular cards. When you have two cards, please report back with the results of your combined-memory tests.
People ask the GPU memory question all over the web and are constantly told that memory is not shared on GTX/gaming cards.
"The available memory for rendering will not be the sum of all the memory installed on the cards, but will be limited to the vRAM of the smallest card. In other words: adding a new GPU with 8GB to a system that has an old GPU with only 2GB, will make the new GPU only use 2GB of vRAM."
-- https://blender.stackexchange.com/quest ... imultaneou
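That quoted rule amounts to taking the minimum over the cards, because each GPU must hold its own full copy of the scene data. A minimal sketch (the function name is my own, not from any renderer's API):

```python
def usable_vram(card_vram_gb):
    """Effective VRAM when every GPU holds a full copy of the scene."""
    # The scene is replicated on each card, not split across them,
    # so the smallest card sets the limit for the whole setup.
    return min(card_vram_gb)

print(usable_vram([8, 2]))  # an 8 GB card paired with a 2 GB card -> 2
```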
Expensive? None of these cards are cheap. The Ti versions of the 10xx series aren't about memory; most have the same memory as the non-Ti versions. EVGA 1070 (non-Ti): $520 for 1,920 cores = $0.27 per core. EVGA 1070 Ti: $540 (only $20 more) for 2,432 cores = $0.22 per core. They have the same memory and the same base clock rate (the non-Ti has a faster boost clock). The 1070 Ti is a better deal than the non-Ti, and it's similar for the 1060s and 1080s, Ti vs. non-Ti. If you have over $1,000 to spend on two cards, adding $40 for all those extra cores is not a significant expense.
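The cost-per-core arithmetic above is easy to check (the prices and core counts are the figures quoted in this post):

```python
# Dollars per CUDA core for the two cards discussed above.
cards = {
    "EVGA 1070 (non-Ti)": (520, 1920),
    "EVGA 1070 Ti":       (540, 2432),
}
for name, (price_usd, cores) in cards.items():
    print(f"{name}: ${price_usd / cores:.2f} per core")
# EVGA 1070 (non-Ti): $0.27 per core
# EVGA 1070 Ti: $0.22 per core
```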
Finis' Experiments and Scribbles
- Finis
- Captain
- Posts: 5262
- Joined: 21 May 2009, 18:26
- Type the number ten into the box: 0
- Location: North Venezuela or West Korea
- Contact:
Re: Finis' Experiments and Scribbles
The more laws, the less justice. -- Marcus Tullius Cicero
Re: Finis' Experiments and Scribbles
No, it's a technical fact that they share the RAM!
It doesn't stack up in speed; that's the point!
It stacks up in scene size! The scenes that I want to load don't fit in 8 GB if they are big!
Some do, but many don't! If I use 2x 1060 or 2x 1070,
then I can use and build the scene I want; even if they are a tad slower than a 1080 or 1080 Ti, the scenes load into 16 GB! On the more expensive single 1080 Ti the scene doesn't load! But it only works if the mainboard is made for it! Many of the newer boards for 8th-series processors are, and so are the MSI boards! The alternative is a dual-GPU graphics card, but those are even more expensive! That's the story!
Re: Finis' Experiments and Scribbles
https://www.lifewire.com/multiple-graphics-cards-834088
Here is the full story. The technology is called CrossFire for AMD, or SLI for the other system (Nvidia)!
The mainboard and software have to support the technology, so ask for an SLI-compliant motherboard!
Blender and Cycles support it! Eevee should too!
Thea definitely does, and so do KeyShot and Octane!
Don't forget that you want an SLI card: since most of these renderers are written for CUDA rather than OpenCL, you're into the Nvidia SLI world!
Here is the list of SLI-compatible cards:
https://www.geforce.com/hardware/techno ... orted-gpus
Here are the SLI-compatible Nvidia mainboards:
http://www.nvidia.com/object/sli-ready- ... oards.html
Further reading, and the wiki:
https://en.m.wikipedia.org/wiki/Scalable_Link_Interface
Here is another awesome way of addressing multiple GPUs in Cycles and Blender, via GPUBOX!
https://youtu.be/atdcvMA0ULg
Here is the discussion on Iray! It does use all cores, but somehow it doesn't simultaneously use shared memory!
https://www.daz3d.com/forums/discussion ... ng-in-iray
Re: Finis' Experiments and Scribbles
Now I'm concerned for you. Your wishful thinking will cause you to spend a considerable amount of money without getting the benefit you hope for. It is well known that SLI does not combine the memory of the connected cards: https://www.nvidia.com/object/slizone_ask_mmm013.html ... and many others all over the web.
None of the links you provided (or the links I followed from them) support your hypothesis. It takes a huge amount of wishful thinking to see even the slightest vague hope of it in the ones that aren't completely irrelevant.
Multiple cards operating as in that Daz post is normal. There is nothing new or surprising there; it is the way Cycles works too. That is how it really works.
On another note: someday, when prices go down (probably not soon), I might get a GTX 1070 Ti. Anyone who might want to buy my MSI GTX 1060 3GB then?
The more laws, the less justice. -- Marcus Tullius Cicero
Re: Finis' Experiments and Scribbles
It depends on the programming of the render engines!
Cycles does, and so does Thea! Iray doesn't, and many others don't either! Thea is even programmed to use all the RAM it can get, including the CPU RAM! It's the only one that distributes the workload across the GPU and the CPU together! I know people who already have the combination, and it works!
Iray simply doesn't support it in the code!
Re: Finis' Experiments and Scribbles
You are a victim of self-deception.
On the subject of my experiments:
This talk of GPU memory made me wonder about the limits of my 3 GB card. Tests with Daz showed that I could add 24 Gen 8 figures (no instances, high poly, no hair, no clothes) before crashing during an Iray render.
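A back-of-the-envelope reading of that test, assuming (optimistically) that all 3 GB were usable; in practice the driver and framebuffer take a share, so the real per-figure cost is somewhat lower:

```python
# Rough upper bound on VRAM per Gen 8 figure from the crash test above.
vram_mb = 3 * 1024   # 3 GB card
figures = 24         # figures loaded before the Iray render crashed
print(f"at most {vram_mb / figures:.0f} MB per figure")
# at most 128 MB per figure
```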
The more laws, the less justice. -- Marcus Tullius Cicero
Re: Finis' Experiments and Scribbles
Yup? What does that have to do with big scenes with tons of instances? As long as one renders on the CPU there is no problem: the scene loads into RAM until it is full (which it mostly isn't) and then spills to the HD or SSD! On the GPU you can't do that in real time! Big scenes don't work, and large renders cause trouble!
You will see pretty soon what runs on 8 GB and what needs 16...
Re: Finis' Experiments and Scribbles
My test above was only about my curiosity regarding how much I could fit into my card's VRAM. Your talk of huge scenes inspired that curiosity, but the test was obviously not intended to test the combined-memory hypothesis. It is strange that you thought it was.
Yes, if you have a lot of RAM, then Cycles, Iray, and probably others can render a huge scene using only the CPU and RAM. No GPUs. No combined memory. Slow. Very slow if virtual memory (the HD) is engaged, but it would probably still work. My VRAM test had nothing to do with CPU rendering.
Let me be sure I understand your claim. You claim that there is a way, existing now, for GTX cards (not Teslas) to use the memories of multiple GPUs as one big memory for renders? For example, that two 8 GB cards could behave as one 16 GB memory and render a scene needing more than 8 GB? Is that correct, or do I misunderstand what you claim?
I'm tired of discussing this, since you have provided zero evidence of such combined memory. If this much-wanted ability currently existed, it would be "front page news" all over the CG world, and it would be easy to find many articles about it. Yet it is not, and you have only provided links to things that don't support the hypothesis.
The more laws, the less justice. -- Marcus Tullius Cicero
Re: Finis' Experiments and Scribbles
Finis! It is front-page news! I will dig through all the articles I read in the 3D mags over the past few months! A whole new development in 2- and 3-GPU cards is coming! One has 16 GB of RAM at the moment! They are around $3,000 each, but they are going to revolutionize the industry! Real-time rendering is here to stay! If I have time I will copy out all the snippets that actually talk about it!
Don't be so... when I come and show you guys what the industry is up to! Ten years ago I posted a link to a nodal texture creator that was free back then! It was called MaPZone; it's now Substance! https://www.awportals.com/aw/archives/t ... read_1951/
It's subscription now, and it's the de facto industry standard: Substance Designer and Substance Painter! Greetings
PS: Look at this! It's the latest from the trade fair: the new ProRender with the Radeon dual Pro card on the latest C4D release 19!
https://youtu.be/XdRKgT8BOIg
https://www.blendernation.com/2017/06/2 ... available/
Just because we are only a select few here doesn't mean we shouldn't stay current!