OH MY GOD! Best GFX EVER! I bought glasses and a monitor
Ok, so I decided to buy a stereoscopic system. After seeing this post on Usenet: [url="http://pymol.sourceforge.net/stereo3d.html"]http://pymol.sourceforge.net/stereo3d.html[/url], I bought a Dell P1230 off of eBay. I paid $165 for one, new in box, obvious surplus from some corporation somewhere. I remember wanting a 22" monitor sooo bad years ago...
I have these: the i-ware! 3-D system. The good ones, the ones certified by Nvidia and mentioned in that Tom's Hardware ad.
The effect is INCREDIBLE. It can make a game with somewhat flat, dull graphics, like Prince of Persia: The Two Thrones, look 10 times better. In 3d it somehow enriches the colors, smooths out the edges, and makes the game look much better than the original textures. Further, I don't notice low-resolution textures nearly as much.
I think this is because of the processing our brains do to fuse the left and right channels into a single 3d image, which creates a smoother, unified picture. God's version of anti-aliasing.
With the same grade of graphics hardware, stereoscopic 3d looks better to me than many of the other advanced effects that take more horsepower.
Problems: Ghosting. This, I understand, is inherent to CRT displays. While the display may refresh often, excited phosphors take too long to fade.
Solutions: DLP projectors don't have this problem, but they suffer from bad flickering. The only solution that does just about everything is a TARDIS like "nubie" here describes. It has no flicker, high contrast, and no ghosting, as well as much cheaper glasses. I think I could build a decent one for $600.
If only I hadn't spent my budget on a $1400 53" HDTV. While it's good, the enormous screen doesn't have nearly the increase in realism that 3d does.
This form of gaming has SOOO much potential. But it'll only take off if a decent 3d display becomes available. There's one based on 2 LCDs that works like a TARDIS and requires polarized glasses coming out in May. This puppy, assuming it uses 2 1680x1050 panels, absolutely REQUIRES an 8800 GTS or GTX to run recent games at a decent framerate.
Nvidia needs to write better driver support. There's an obvious way to implement 3d that should result in no loss in frame rate: SLI mode! Have each graphics card render the scene for one eye! Each card would work independently, with the two alternating use of one card's output jacks each frame drawn. Per-eye framerates would be exactly the same as in non-3d mode with just 1 card.
Certainly, all of this costs more: if 3d took off, your gaming box would need two graphics cards, and 3d displays would cost about double what a regular display costs at any given time because of the extra LCD panel inside. However, the graphical quality would far exceed what consoles could do in a million years (since consoles are unlikely to ever migrate to 3d HDTVs).
I PM'ed you, you don't have to build a TARDIS; you can mimic the Planar setup. Just have a machine shop turn you out a stand that can hold both screens and the mirror.
[url="http://www.planar.com/products/flatpanel_monitors/stereoscopic/"]http://www.planar.com/products/flatpanel_monitors/stereoscopic/[/url]
It is just a stand that puts one of the displays at 110° to the other and bisects that angle with a semi-transparent mirror (see the sketch at the end of this post). So a stand with two monitor mounts, a base, a bend between the monitors, and a mirror frame is all you really need.
And flip up the mirror when you aren't doing 3D.
If you want to get tricky you could even mount the spare LCD so that you can unlatch it and put it next to the other one for use in Windows as a second display.
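For anyone who wants to sanity-check the geometry before having the stand made, here is a toy 2D sketch of it (my own illustration with made-up display sizes, not anything from Planar's specs). Seen edge-on, the mirror sits 55° from each display, and reflecting the second display across the mirror line lands its virtual image exactly on the first display:

[code]
# Toy 2D check of the mirror-rig geometry described above (hypothetical numbers).
# Display A lies flat along the x-axis; display B sits 110 degrees away from it.
# A half-silvered mirror bisects that angle (55 degrees from each display), so
# the mirror's virtual image of display B should coincide with display A.
import numpy as np

def rotate(points, angle_deg):
    """Rotate 2D row-vector points about the origin by angle_deg."""
    t = np.radians(angle_deg)
    r = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])
    return points @ r.T

def reflect(points, angle_deg):
    """Reflect 2D row-vector points across a line through the origin at angle_deg."""
    t = np.radians(angle_deg)
    m = np.array([[np.cos(2 * t),  np.sin(2 * t)],
                  [np.sin(2 * t), -np.cos(2 * t)]])
    return points @ m.T

display_a = np.array([[0.1, 0.0], [0.5, 0.0]])  # edge-on endpoints of display A
display_b = rotate(display_a, 110.0)            # display B, 110 degrees away
virtual_b = reflect(display_b, 55.0)            # where B appears in the mirror

print(np.allclose(virtual_b, display_a))        # True: the images superimpose
[/code]

If the mirror drifts off the exact bisector the two images no longer superimpose, which is presumably why a rigid machined stand matters.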
[quote name='hrznblack' date='Apr 21 2007, 06:36 AM']I don't think you'd double the frame rate by doing that. Games still use your CPU a lot, and a lot of games aren't well multi-threaded.
[/quote]
It would help if the cards weren't CPU-starved right now, and if Nvidia would ever get around to writing SLI drivers for stereo.
In reality a single 8800 GTS 320MB can kill a 7900-series setup, even for 3D, and all these cards are dual-output, so no problems there.
So if Nvidia just gets the 8800-series drivers out, it will be fine without the need for SLI. Of course, all of us were expecting SLI to do stereo as a logical extension.
The game/program doesn't need to be multi-threaded; the driver for the card will handle the 3D. It just needs to set up an identical scene and textures on each card, then give each card a different "camera" location. Of course, that seems like it should be easy, and it probably isn't. If it were easy, Nvidia probably would have done it; it would likely take some work to sync the two cards, and perhaps the result is unsatisfactory in terms of performance.
I would be happy with a 50-75% increase in performance from dual cards in SLI. Unfortunately, I can get that with a more powerful single card, or one with more memory, or more system memory, or a faster CPU, or even some overclocking on the CPU or GPU. Perhaps the minimum frame rates could also be improved with a faster hard drive and making sure the CPU isn't running at 10-20% just servicing the current hard drive controller "driver".
As long as your framerates don't routinely drop by 50% or average much under 60fps you should be fine, and the 7900 series does that fine on a single card. At least my 7900GS is OK, but I clock it to 650MHz on the core and 1.58GHz on the RAM.
Nubie and hrznblack: actually, you would precisely double your frame rates in every situation, were it properly implemented. Stereoscopic 3d is the ideal application for SLI.
Here's why: the reason SLI doesn't double frame rates now is that rendering the same scene faster means animated objects (such as 3d models) need more interpolated frames generated for the motions they are doing. A moving arm on a character, say, requires querying the CPU and the main game thread about what position the arm will be in for each extra frame being drawn.
With SLI for stereoscopy, the game thread needs to know nothing about the fact that 2 perspectives are being generated. Basically, one card receives the EXACT same instructions it normally does. The other one receives a clone of this data, probably without using any more CPU time (the PCIe bus might be able to send a clone of the data to two different devices, or there is the inter-card link cable), with a simple program loaded onto it to offset the perspective of all the geometry drawn.
I haven't taken a 3d graphics course yet, but I am pretty sure a very simple transform can change the point of view for every polygon with minimal processing time. This program would run on the card, and the SAME program would be loaded onto the other card, using up the same amount of processing time, so the cards never get out of sync.
So both cards use no more host CPU and run minimally slower than they would normally with just 1 card rendering non-stereo.
If your framerates were 60 without stereo on a single card, they'd be 58 for EACH CARD with it.
Perceptually, you'd get the same framerates - it could seem like the extra card wasn't doing anything - but everything would be in 3d rather than 2d :P
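To make "offset the perspective" concrete, here is a minimal sketch of the transform in question (plain NumPy, with a made-up 6.4 cm eye separation and camera position; this illustrates the idea and is not anything from Nvidia's actual driver). The left and right view matrices differ only by a half-IPD slide along the camera's right vector, one cheap 4x4 matrix per eye:

[code]
# Parallel-axis stereo: build a left and a right view matrix that differ only
# by sliding the camera half the eye separation along its "right" vector.
import numpy as np

def look_at(eye, target, up):
    """Standard right-handed view matrix (world space -> camera space)."""
    f = target - eye
    f = f / np.linalg.norm(f)                        # forward
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)                        # right
    u = np.cross(r, f)                               # true up
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ eye                # move world opposite the eye
    return view

def stereo_views(eye, target, up, ipd=0.064):
    """Left/right eye view matrices, eyes ipd meters apart (parallel axes)."""
    f = (target - eye) / np.linalg.norm(target - eye)
    r = np.cross(f, up)
    r = r / np.linalg.norm(r)
    offset = r * ipd / 2.0
    # Shifting eye AND target keeps the two viewing axes parallel.
    left = look_at(eye - offset, target - offset, up)
    right = look_at(eye + offset, target + offset, up)
    return left, right

# Example: a camera at head height looking down -z.
left, right = stereo_views(np.array([0.0, 1.7, 5.0]),
                           np.array([0.0, 1.7, 0.0]),
                           np.array([0.0, 1.0, 0.0]))
print(left[:3, 3] - right[:3, 3])  # the matrices differ only in translation
[/code]

Each card rendering with one of these two matrices, everything else identical, is the whole trick.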
Hi, after reading this post I decided to get a newer monitor, and I am glad I did. I bought a 22" NEC MultiSync FP2141SB and the ghosting is gone. I usually had BAD ghosting in Falcon 4, but with this new monitor it has been eliminated!!
I did read that ghosting happens mostly with "older" CRT monitors, and with this newer 2003 monitor the ghosting has been eliminated. Thanks for posting this thread and the Usenet link!!
Oh, and the colors are much brighter, and the whites are actually white, not an off-eggshell white!!
So if you're in the market for a newer monitor, check out the link from Usenet, as they have some monitor suggestions, and most of them can be had on eBay!!
Nephilim: are you sure the ghosting is gone? Close one eye while in 3d mode and look at a HIGH-contrast scene (one with very bright and very dark areas). You will see a double image.
Why? Because the monitor you bought uses the same phosphors as my monitor.
I suspect if you are getting less ghosting, it is because you are running at a lower refresh rate.
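A quick back-of-the-envelope model of why the refresh rate matters here (my own toy calculation, with a hypothetical decay constant, since real phosphor curves vary by tube): treat phosphor brightness as an exponential decay, and the ghost is whatever is still glowing when the other eye's shutter opens one frame later. Lowering the refresh rate gives the phosphor more time to fade before the wrong eye gets to see it:

[code]
# Toy model: ghost = phosphor brightness left over when the other eye's
# shutter opens, one eye-frame after the bright pixel was drawn.
# tau_ms is a hypothetical decay constant, not a measured value.
import math

tau_ms = 2.0
for refresh_hz in (85, 100, 120):
    eye_frame_ms = 1000.0 / refresh_hz   # delay until the other eye's view
    ghost = math.exp(-eye_frame_ms / tau_ms)
    print(f"{refresh_hz} Hz: {ghost:.2%} of the bright pixel still glowing")
[/code]

With these numbers, dropping from 120Hz to 85Hz cuts the leftover glow by roughly a factor of five.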
[quote name='GeraldMonroe' date='Apr 21 2007, 07:03 PM']Nubie and hrznblack: actually, you would precisely double your frame rates in every situation, were it properly implemented. Stereoscopic 3d is the ideal application for SLI.
Here's why: the reason SLI doesn't double frame rates now is that rendering the same scene faster means animated objects (such as 3d models) need more interpolated frames generated for the motions they are doing. A moving arm on a character, say, requires querying the CPU and the main game thread about what position the arm will be in for each extra frame being drawn.
With SLI for stereoscopy, the game thread needs to know nothing about the fact that 2 perspectives are being generated. Basically, one card receives the EXACT same instructions it normally does. The other one receives a clone of this data, probably without using any more CPU time (the PCIe bus might be able to send a clone of the data to two different devices, or there is the inter-card link cable), with a simple program loaded onto it to offset the perspective of all the geometry drawn.
I haven't taken a 3d graphics course yet, but I am pretty sure a very simple transform can change the point of view for every polygon with minimal processing time. This program would run on the card, and the SAME program would be loaded onto the other card, using up the same amount of processing time, so the cards never get out of sync.
So both cards use no more host CPU and run minimally slower than they would normally with just 1 card rendering non-stereo.
If your framerates were 60 without stereo on a single card, they'd be 58 for EACH CARD with it.
Perceptually, you'd get the same framerates - it could seem like the extra card wasn't doing anything - but everything would be in 3d rather than 2d :P
[/quote]
In a perfect world, yes. But PCI-E works by packetizing PCI instructions and then transmitting them over the PCI-E "network", causing very high overhead.
Unfortunately for us, we never get properly written software; that is why the hardware keeps getting faster and faster, to make up for the bad programming.
The CPU is involved in everything the GPU is doing, unfortunate but true. Even quad-core PCs at 4.5GHz need more CPU power for 8800 SLI; it must be something to do with memory transfers or control of the stream processors.
The Nvidia cards don't seem to have an intelligent control unit onboard. Neither does ATI, but they seem more concerned with it and have reduced the communication load to the card. See the article "PCI Express scaling analysis" for tests of an 8800 card at x16, x8, x4, and x1 bus links; you will see that the Nvidia card needs more CPU time and bandwidth.
I hope they get around to it someday. But as it stands, a single 8800GTX ACS3 edition should more than do it for any game you want to run, at any resolution you would want to run it, on as many screens as you would run it.
Funny that SLI is even around at all; it is a marketing gimmick, and I guess it works as well as the dual-core gimmick (Core 2 Solo desktop processors aren't being released because it would eat into the dual-core market, even though a 4GHz overclocked single core would easily handle most people's needs).
I have a 3GHz Socket 939 and I can watch 3 720p streams at one time with no lag. Dual-core that.
[quote name='Habeed' date='May 3 2007, 05:09 PM']Nephilim: are you sure the ghosting is gone? Close one eye while in 3d mode and look at a HIGH-contrast scene (one with very bright and very dark areas). You will see a double image.
Why? Because the monitor you bought uses the same phosphors as my monitor.
I suspect if you are getting less ghosting, it is because you are running at a lower refresh rate.
[/quote]
YES, I am sure the ghosting is GONE in Falcon 4: Allied Force!! I tried your suggestion and did not see a double image.
I was using a 120Hz refresh rate but am now using 85Hz, and at BOTH they are the same: NO ghosting!!
EDIT: I suspect that you are using a higher stereo separation, which will also lead to ghosting. I would suggest turning down the stereo separation.