My original 3D Vision gaming rig is a Core 2 Duo E8600 with 2 GB of DDR2 and two GeForce 9800 GTs in SLI. I tested several games and, overall, walked away completely disillusioned with 3D Vision. In particular, the ghosting was just too distracting for the games to be enjoyable. I boxed up my 3D Vision gaming rig and my 3D Vision kit (glasses and USB emitter).
Recently, I built a low-cost, low-power desktop based on a dual-core Celeron and replaced the crappy onboard video with a GeForce G210, the cheapest PCI-E card at Best Buy that day. Since I still enjoyed 3D movies and pictures, I decided to bring my 3D Vision kit back out of the closet. Then I had the idea to re-run several older games at relatively low quality settings just to see how strong or how ghosted the 3D Vision experience would be with a single-card solution.
Several rigorous tests later: I'm in love with 3D Vision. Manhunt, UT99, Deus Ex, Morrowind, and Stalker SoC all look incredible in 3D Vision, even on this pitifully slow G210. Yet the same games hurt my eyes when I played them on a much more powerful SLI setup.
My theory: according to various sites, both SLI and CrossFire suffer from unavoidable frame-timing artifacts (micro-stutter) because the two cards deliver alternate frames unevenly. Those artifacts may not be easy to notice in a typical 2D gaming session, but in frame-sequential 3D every 120 Hz refresh belongs to one eye, so uneven frame delivery could be the cause of the additional ghosting I noticed. Please share more accurate or detailed theories.
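To make the timing part of that theory concrete, here is a rough toy model in Python. It only demonstrates classic AFR (alternate-frame rendering) micro-stutter: two cards alternating frames can hit the same average frame rate as a single card while actually delivering frames in a short-gap/long-gap pattern. The numbers (3 ms CPU step, 20 ms vs 10 ms GPU frame times) and the little present_times_* helpers are made up purely for illustration, and whether this uneven pacing really turns into extra ghosting through the 120 Hz glasses is exactly the speculative part of my theory.

```python
# Toy model of AFR micro-stutter -- the "timing artifacts" mentioned above.
# All numbers are made up; this is a sketch of the argument, not a claim
# about how the 3D Vision driver actually schedules frames.

def present_times_single(n_frames, gpu_ms):
    """One GPU renders every frame back to back."""
    return [(i + 1) * gpu_ms for i in range(n_frames)]

def present_times_afr(n_frames, gpu_ms, cpu_ms):
    """Two GPUs alternate frames; the CPU hands out a new frame every cpu_ms,
    or as soon as that GPU is free, whichever is later."""
    gpu_free = [0.0, 0.0]          # when each GPU can start its next frame
    times = []
    for i in range(n_frames):
        g = i % 2                  # alternate-frame rendering: even/odd GPU
        start = max(gpu_free[g], (i + 1) * cpu_ms)
        finish = start + gpu_ms
        gpu_free[g] = finish
        times.append(finish)
    return times

def intervals(times):
    """Gaps between consecutive frame presentations, in ms."""
    return [b - a for a, b in zip(times, times[1:])]

# Same average frame rate (~100 fps) in both cases:
single = intervals(present_times_single(1000, gpu_ms=10.0))
afr = intervals(present_times_afr(1000, gpu_ms=20.0, cpu_ms=3.0))

for name, iv in (("single GPU", single), ("2x AFR", afr)):
    print("%-10s  avg %.1f ms   min %.1f ms   max %.1f ms"
          % (name, sum(iv) / len(iv), min(iv), max(iv)))
```

With these made-up numbers, both setups average about 10 ms per frame, but the single card delivers a frame every 10 ms like clockwork while the AFR pair alternates roughly 3 ms and 17 ms gaps.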
@Raptor. Reasonable, since that would isolate whether SLI or the 9800GT is the culprit. However, I'll just wait until the GTX 480 drops in price a bit and buy a single one.
No ghosting on my SLI rig... I just tested a single card and found the experience inferior (much slower). I wonder what is going on. Are you testing the same games on the same monitor?
I bet the drivers now have better support for your monitor and have simply improved. When I first got my DLP TV 7 or 8 months ago, I saw ghosting at max depth; then, when I updated to newer drivers, the ghosting went away for good.
System:
Intel Core i7 920 overclocked to 4 GHz
ASUS Rampage II Extreme
2x GeForce GTX 480 in SLI
GTX 295 as a dedicated PhysX card
12 GB DDR3-2000 RAM
Intel SSDs in RAID 0
Blu-ray RW drive
1000 W Sony surround sound
NVIDIA 3D Vision
3D displays tested:
Mitsubishi 65" DLP 3D HDTV (good old 1080p checkerboard since 2007!)
Panasonic VT25 (nice 2D, but I returned it due to crosstalk)
Acer H5360 720p projector on a 130" screen (the best 3D)
23" Acer LCD monitor (horrible crosstalk; sold it)
Samsung 65D8000
I have never had any significant ghosting, except maybe in high-contrast areas, but that's just the nature of how the 3D Vision system works.
I don't think there should be any difference with 3D Vision. I have run it on the following systems:
1x 8800 GTX with Vista 64
2x 8800 GTX in SLI with Vista 64
2x 8800 GTX in SLI with Windows 7 64
1x GTX 480 with Windows 7 64
2x GTX 480 in SLI with Windows 7 64
The same motherboard and memory were used in all of the above. There shouldn't be any difference aside from frame rate, as far as I can tell. The only thing I had to do was update the drivers when switching to the GTX 480.