As I understand it, NVIDIA 3D Vision supports up to 1080p.
Test at 1080p to see if you still get the same error.
I once asked NVIDIA and they said the glasses support a maximum resolution of 1920 x 1080, so anything above that screen resolution should give errors. Correct me if I'm wrong.
It's a shame: having bought Trine 3 myself, I bought a 3D Vision kit a while ago but couldn't use it, as the glasses caused me immense nose bridge irritation. I had to return them :(
Trine 3 looks perfect for 3D Vision, and the graphics are amazing.
See if you can pick up a set of old 3D Vision 1 glasses - they are compatible with 3D Vision 2 (the main difference is in the monitor, not the glasses), but are much lighter and more comfortable.
I have a set of each and used to use the 3D Vision 1 glasses until the little rubber nose rest popped off and refused to stay on, but by that point I had got used to the weight on my nose and could wear the 3D Vision 2 glasses with no problems.
2x Geforce GTX 980 in SLI provided by NVIDIA, i7 6700K 4GHz CPU, Asus 27" VG278HE 144Hz 3D Monitor, BenQ W1070 3D Projector, 120" Elite Screens YardMaster 2, 32GB Corsair DDR4 3200MHz RAM, Samsung 850 EVO 500G SSD, 4x750GB HDD in RAID5, Gigabyte Z170X-Gaming 7 Motherboard, Corsair Obsidian 750D Airflow Edition Case, Corsair RM850i PSU, HTC Vive, Win 10 64bit
If you can find a full kit for the 3D Vision 1 glasses, they also come with 3 different nose pieces that you can try to find the one that's most comfortable. I prefer Ver 1 of the glasses too. The nose piece on mine popped off as well, but I like them so much better that I superglued the nose piece in place.
@Jose: I didn't understand your Trine 3 post. Are you saying that Trine 3 doesn't run in 1080p?
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
[quote="bo3b"]If you can find a full kit for 3D Vision 1 glasses, they also come with 3 different nose pieces that you can try to find which is most comfortable. I prefer the Ver 1 of the glasses too. The nose piece on mine popped off as well, but I like them so much better I superglued the nose piece in place.
@Jose: I didn't understand your Trine 3 post. Are you saying that Trine 3 doesn't run in 1080p?[/quote]
I never even realized they came with different nose pieces. My default one kept popping off after a while. Then I tried it (Ver 1) with nothing on the nose, and it was a revelation. It was like 20X more comfortable and I never looked back. I've kept it bare ever since.
Yeah, this plastic is really a bit annoying. Mine pops off too. But with nothing on it? No way, that is super uncomfortable. :) Ah, I like that we're already discussing the comfort of the glasses. ;D
[quote="mrorange55"][quote="Foulplay99"]Seems to be depth and convergence locked, and I cant get the game's own 3D menu to become active rather than the bars being greyed out.
Anyone else having this problem?[/quote]
No, it is not locked. You have to change the 3D settings in the menu. When it is greyed out, you might have problems with the hardware, that the game does not recognizes your 3D vision set. I have an inbuilt transmitter an no problem.[/quote]
I tried all settings, and unplugging my 3D emitter and plugging it back in, and nothing made any difference. I had to alter the 3D values manually in the config file in %appdata% which was a pain. I've only tried it on my ROG SWIFT, haven't had the chance to try it on my projector yet, because I would have to re-arrange the living room and its too much of a pain.
Beautiful game.
Anyone else having performance issues in 3D?
Runs smooth as butter when I turn 3D off (Ctrl+T), but very poor with 3D Vision engaged.
Most recent drivers (250.xx I think it is)
[quote="Foulplay99"]
I tried all settings, and unplugging my 3D emitter and plugging it back in, and nothing made any difference. I had to alter the 3D values manually in the config file in %appdata%, which was a pain. I've only tried it on my ROG SWIFT; I haven't had the chance to try it on my projector yet, because I would have to re-arrange the living room and it's too much of a pain.[/quote]
Do you have an SLI setup? SLI in combination with 3D Vision and the ROG Swift does not work. You have to disable SLI.
My thread about this topic:
[url]https://forums.geforce.com/default/topic/827372/sli-totally-useless-with-g-sync-or-3d-vision-/#4517380[/url]
[quote="vaelo"]Anyone else having performance issues in 3D?
Runs smooth as butter when I turn 3D off (Ctrl+T), but very poor with 3D Vision engaged.
Most recent drivers (250.xx I think it is)[/quote]
3D costs 50% of your performance. How big is your frame drop from 2D to 3D?
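For anyone new to 3D Vision reading along: that 50% figure is just the both-eyes-rendered rule of thumb. A quick back-of-the-envelope sketch, with a made-up 2D framerate:
[code]
# Rule of thumb from this thread: stereo renders every frame twice
# (once per eye), so expect roughly half your 2D framerate in 3D.
fps_2d = 80          # made-up example 2D framerate
fps_3d = fps_2d / 2  # both eyes rendered each frame
print(f"{fps_2d} fps in 2D -> expect ~{fps_3d:.0f} fps in 3D")
[/code]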
[quote="clammy"]I dont know...I bought Trine 2 last month and it crawled in 3D Vision..like unplayable[/quote]
We made some Stereo 3D fixes last week and released a patch for Trine Enchanted Edition and Trine 2. Hopefully this will fix some performance issues which some users have experienced.
Please do continue reporting if there are still problems with stereo 3D! And to make everything easier, mention things like your OS, GPU + drivers etc.
-Kai/Frozenbyte
For better performance:
- Find the options.txt file in the %appdata% folder and change "glow" from "true" to "false" (see the sketch just after this list).
- Try a slightly lower resolution. The difference between 900p and 1080p on my gtx760 is surprisingly huge.
- Remember you have the option to create a 100 Hz or 110 Hz 3D mode. It flickers, but it's worth considering, since a choppy 55 fps in the 60 Hz mode is awful compared to 50 fps at 50 Hz; the difference in animation and detail in moving objects/scenes is huge on low-persistence displays (all 3D Vision 2 monitors). See the numbers at the end of this post.
- In the options/control (or input) menu, disable "reduce input lag". It basically enables triple buffering, boosting my framerate from 43 fps to 52 fps instantly (don't forget to switch triple buffering on in the NVIDIA panel, although in my case it doesn't make any difference).
- This game is demanding. Don't turn on any AA just "because I have a fast GPU and this should be no problem".
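Not 100% sure of the exact folder name, but here's a minimal sketch of that glow edit, assuming options.txt really sits in a Trine 3 folder under %APPDATA% (the "Trine3" folder name below is a guess; adjust it to whatever you find on your machine):
[code]
# Sketch: turn off glow in Trine 3's options.txt.
# The "Trine3" folder name is an assumption; point the path at
# wherever the file actually lives under %APPDATA% on your system.
import os

path = os.path.join(os.environ["APPDATA"], "Trine3", "options.txt")

with open(path, "r", encoding="utf-8") as f:
    lines = f.readlines()

with open(path, "w", encoding="utf-8") as f:
    for line in lines:
        # Only touch the glow setting; leave every other line alone.
        if line.strip().lower().startswith("glow"):
            line = line.replace("true", "false")
        f.write(line)
[/code]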
Trine 2 works [b]much[/b] faster on my gtx760 than Trine 3.
You might try enabling/disabling the shader cache, but in my case it doesn't make any difference.
My current settings for the gtx760 for ultimate fluidity:
- 900p 60Hz or 1080p 50Hz (120/100Hz mode)
- AA off, input lag reduction off, very high settings with glow manually disabled in options.txt
I was testing and testing, so I didn't even finish the first level; it might be different in other levels.
Convergence settings are not really well thought through. There are annoying objects really close to the screen, forcing me to keep convergence just 2 positions away from the minimum on the in-game slider. :( I hope Frozenbyte will change that before releasing the retail version. I love high convergence; without it, all 3D looks flat in comparison :(
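To put numbers on the custom refresh mode tip above: active shutter glasses alternate eyes, so each eye only sees half the display refresh rate. A quick sketch of the per-eye numbers for the modes mentioned:
[code]
# Shutter glasses show each eye every other frame, so per-eye
# refresh is half the display mode's refresh rate.
for display_hz in (120, 110, 100):
    per_eye = display_hz // 2
    print(f"{display_hz} Hz 3D mode -> {per_eye} Hz per eye")
# 120 Hz -> 60 Hz per eye (an unlocked ~55 fps here looks choppy)
# 100 Hz -> 50 Hz per eye (a locked 50 fps looks smooth)
[/code]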
[quote="KaiFB"]-Kai/Frozenbyte[/quote] Welcome to the forums, Kai! We pretty much never see developers stop by, but we really appreciate it when they do. I'd actually love to know what caused the 3D slowdowns for Trine 2, and how you fixed it. I haven't picked up T3 yet, but if you're able to come back and answer that question, I'll pick up a copy to say thanks!
[quote="bans3i"][quote="Foulplay99"]
I tried all settings, and unplugging my 3D emitter and plugging it back in, and nothing made any difference. I had to alter the 3D values manually in the config file in %appdata%, which was a pain. I've only tried it on my ROG SWIFT; I haven't had the chance to try it on my projector yet, because I would have to re-arrange the living room and it's too much of a pain.[/quote]
Do you have an SLI setup? SLI in combination with 3D Vision and the ROG Swift does not work. You have to disable SLI.
My thread about this topic:
[url]https://forums.geforce.com/default/topic/827372/sli-totally-useless-with-g-sync-or-3d-vision-/#4517380[/url][/quote]
No, I just run a single 980. I'm due to upgrade to Titan X SLI soon though and get a secondary 4K monitor for anything 2D.