Just Cause 2 performance increase tip (major): how to increase Just Cause 2 performance with a dedicated CUDA card
I knew immediately when I installed Just Cause 2 that it was not utilising resources correctly, but there was no way to tweak it. I had not played games for several months, but now I have tried a new CUDA setting and it works.
I have been doing some work in CUDA and had asked whether it is possible to enable only specific adapters (for my app, and also for this misbehaving game). Well, now it is, with the environment variable CUDA_VISIBLE_DEVICES. I have a GTX 295 and a 9800 GTX, so I use CUDA_VISIBLE_DEVICES=2 (to use only the 9800 GTX; this can be verified, as I have implemented a benchmark in my app and I am using it for verification).
For me the performance increase is major (roughly from an average of 25 fps to an average of 30 fps when looking into vegetation), and gameplay is also fluid, not micro-choppy.
Enjoy.
----------
I have tested it with the Concrete Jungle benchmark and it is real: fps went from 17 to 20 (which is brilliant, because in that benchmark every fps is paid for in blood).
Also brilliant is the fact that with this I can distribute PhysX and CUDA to the devices I want, so everything is utilised correctly.
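For anyone curious what CUDA_VISIBLE_DEVICES actually does: its value is a comma-separated list of physical device indices, and a CUDA application only enumerates the devices listed, renumbered from 0 in the order given. A rough Python sketch of that filtering (the device names are just placeholders for TrekCZ's setup, where the dual-GPU GTX 295 takes indices 0 and 1):

```python
def visible_devices(physical, env_value):
    """Mimic how the CUDA runtime filters devices via CUDA_VISIBLE_DEVICES.

    physical  -- list of all devices, in enumeration order
    env_value -- the variable's value, or None when it is unset
    """
    if env_value is None:            # unset: every device is visible
        return list(physical)
    seen = []
    for token in env_value.split(","):
        token = token.strip()
        if not token.isdigit() or int(token) >= len(physical):
            break                    # an invalid entry ends the visible list
        seen.append(physical[int(token)])
    return seen

# The GTX 295 exposes two GPUs, the 9800 GTX is device 2, so
# CUDA_VISIBLE_DEVICES=2 leaves only the 9800 GTX visible (as device 0).
print(visible_devices(["GTX295-A", "GTX295-B", "9800GTX"], "2"))  # -> ['9800GTX']
```

Note that which physical index belongs to which card depends on the driver's enumeration order, so it is worth verifying with a benchmark or a GPU monitor, as TrekCZ did.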
[quote name='TrekCZ' date='24 November 2010 - 02:35 AM' timestamp='1290566128' post='1150782']
I knew immediately when I installed Just Cause 2 that it was not utilising resources correctly, but there was no way to tweak it. I had not played games for several months, but now I have tried a new CUDA setting and it works.
I have been doing some work in CUDA and had asked whether it is possible to enable only specific adapters (for my app, and also for this misbehaving game). Well, now it is, with the environment variable CUDA_VISIBLE_DEVICES. I have a GTX 295 and a 9800 GTX, so I use CUDA_VISIBLE_DEVICES=2 (to use only the 9800 GTX; this can be verified, as I have implemented a benchmark in my app and I am using it for verification).
For me the performance increase is major (roughly from an average of 25 fps to an average of 30 fps when looking into vegetation), and gameplay is also fluid, not micro-choppy.
Enjoy.
----------
I have tested it with the Concrete Jungle benchmark and it is real: fps went from 17 to 20 (which is brilliant, because in that benchmark every fps is paid for in blood).
Also brilliant is the fact that with this I can distribute PhysX and CUDA to the devices I want, so everything is utilised correctly.
[/quote]
Sorry to bump an old thread... :sad: but I want to offload compute processing to my GTX 275 and have my GTX 470 as my main card for JC2. How do I go about setting this env variable? I am totally clueless... any help would be appreciated.
Thanks,
Walrus
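Since this question keeps coming up: CUDA_VISIBLE_DEVICES is not set in any game file; it is an ordinary environment variable, so it just has to exist in the game's environment when the game starts. You can set it system-wide (System Properties > Environment Variables, or `setx CUDA_VISIBLE_DEVICES 1` in a command prompt), or per-game with a small launcher. A Python sketch of the launcher approach (the install path below is a placeholder, and which index is which card depends on the driver's enumeration order):

```python
import os
import subprocess

def cuda_env(devices):
    """Copy the current environment with CUDA_VISIBLE_DEVICES overridden."""
    env = os.environ.copy()
    env["CUDA_VISIBLE_DEVICES"] = devices
    return env

def launch(exe_path, devices):
    """Start a program that will only see the listed CUDA devices."""
    return subprocess.Popen([exe_path], env=cuda_env(devices))

# Placeholder path -- point this at your own JustCause2.exe:
# launch(r"C:\Games\JustCause2\JustCause2.exe", "1")
```

A two-line .bat file next to the exe does the same job: `set CUDA_VISIBLE_DEVICES=1` followed by `start JustCause2.exe`.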
I've been testing this recently, but I had nothing but crashes in 3D. However, I saw a huge increase in the benchmark test: it went from an average of 84 FPS to 130 FPS with a dedicated CUDA card, using a GTX 580 + GTX 460 setup. Performance was so awesome it even topped the 200 FPS limit with everything maxed out.
But... it's crashing, so it's useless, lol.
[quote name='Majorgamer' date='17 January 2011 - 02:36 PM' timestamp='1295242617' post='1178219']
I've been testing this recently, but I had nothing but crashes in 3D. However, I saw a huge increase in the benchmark test: it went from an average of 84 FPS to 130 FPS with a dedicated CUDA card, using a GTX 580 + GTX 460 setup. Performance was so awesome it even topped the 200 FPS limit with everything maxed out.
But... it's crashing, so it's useless, lol.
[/quote]
How can you enable it? Could you explain the procedure in more detail? I want to test it.
i7 8700K @4.9
GTX1080Ti
Asrock Z370 Gaming K6
Windows10 64bit
LG OLED UHD 3dtv 55E6K
[quote name='ndlrjajdlfo' date='17 January 2011 - 05:44 AM' timestamp='1295243080' post='1178221']
How can you enable it? Could you explain the procedure in more detail? I want to test it.
[/quote]
Yeah, I wanna test it too, just for the hell of it. Can you tell us how to configure it correctly, please? :biggrin:
Thanks
[quote name='Majorgamer' date='17 January 2011 - 05:36 AM' timestamp='1295242617' post='1178219']
I've been testing this recently, but I had nothing but crashes in 3D. However, I saw a huge increase in the benchmark test: it went from an average of 84 FPS to 130 FPS with a dedicated CUDA card, using a GTX 580 + GTX 460 setup. Performance was so awesome it even topped the 200 FPS limit with everything maxed out.
But... it's crashing, so it's useless, lol.
[/quote]
Have you OC'd your card at all? My JC2 crashes as soon as it sniffs an OC, especially during cut-scenes, but it's super-duper stable at factory clocks.
Lord, grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.
-------------------
Vitals: Windows 7 64bit, i5 2500 @ 4.4ghz, SLI GTX670, 8GB, Viewsonic VX2268WM
Handy Driver Discussion
Helix Mod - community fixes
Bo3b's Shaderhacker School - How to fix 3D in games
3dsolutionsgaming.com - videos, reviews and 3D fixes
Yes, of course. It's in the Nvidia Control Panel: the "Manage 3D settings" page, then the "CUDA - GPUs" setting.
Then select your 2nd GPU so it can be dedicated to CUDA.
Good testing!
And no, my GTX 580 isn't OC'd, but I've heard this card and Just Cause 2 aren't going well together. Same thing with Metro in DX11. Probably because the GTX 580 is an overclocked version of the GTX 480; it's actually the same architecture.
English is my 2nd language...
Vista 64
ASUS P8P67Pro
I-7 2600k @ 3.4Ghz
Msi GTX 580
Msi GTX 460 PhysX
4G ram HyperX DDR3
3D vision user
Panasonic 3DHDTV VT25
[quote name='Majorgamer' date='18 January 2011 - 12:00 AM' timestamp='1295308849' post='1178671']
Yes, of course. It's in the Nvidia Control Panel: the "Manage 3D settings" page, then the "CUDA - GPUs" setting.
Then select your 2nd GPU so it can be dedicated to CUDA.
Good testing!
And no, my GTX 580 isn't OC'd, but I've heard this card and Just Cause 2 aren't going well together. Same thing with Metro in DX11. Probably because the GTX 580 is an overclocked version of the GTX 480; it's actually the same architecture.
[/quote]
Hmm... if I set that in the Control Panel, there is no difference whatsoever in the game... it is still using my 470 for compute processing. What am I doing wrong? Afterburner shows no usage on my GTX 275 either, so it is not being used even though I changed the CUDA - GPUs setting in the Nvidia Control Panel to the GTX 275.
[quote name='FormulaRedline' date='18 January 2011 - 12:10 AM' timestamp='1295309416' post='1178675']
I think they are asking which file (and where to find it) you change the CUDA_VISIBLE_DEVICES line in?
[/quote]
Yeah, that's what I meant, lol, I just poorly worded my question, I suppose. :biggrin:
[quote name='emperorwalrus' date='17 January 2011 - 06:19 PM' timestamp='1295309963' post='1178681']
Yeah, that's what I meant, lol, I just poorly worded my question, I suppose. :biggrin:
[/quote]
I feel ya. I tried to do this when it was first posted but gave up because I never did find the file. Then I couldn't find this post to ask the question, lol. Glad it came back around, hopefully we'll get an answer!
Intel i7-4770k
EVGA GTX 780 Ti SC
ASRock Z87 Extreme4
8GB DDR3, 240GB Intel SSD, 3TB HDD
Cooler Master Seidon 120M Liquid Cooling
Dell 3007WFP-HC 30" 2560x1600
Alienware OptX AW2310 23" 1920x1080 with 3D Vision
Acer H5360 720p Projector with 3D Vision
ONKYO HT-S5300 7.1 Sound System
Logitech G19 Keyboard, G9 Mouse, G25 Wheel
Saitek X52 Pro and Rudder Pedals
[quote name='FormulaRedline' date='18 January 2011 - 12:24 AM' timestamp='1295310286' post='1178684']
I feel ya. I tried to do this when it was first posted but gave up because I never did find the file. Then I couldn't find this post to ask the question, lol. Glad it came back around, hopefully we'll get an answer!
[/quote]
Lol, let's hope so :)