Anybody else thinking of switching to ATI?
The AMD HD3D supported-hardware list contains mainly 3DTVs and DLP Link projectors. Yes, the 3D monitor section is empty, but I think it's good that AMD does not split the market between 3DTVs and other displays the way Nvidia does.

Also, AMD is really bad at explaining to the public how their system works. The technical presentations and PDFs for the AMD open stereo initiative are very clear (but long and not aimed at a general audience), while their public web page is just full of confusing nonsense.

According to user reports, the ATI HD3D feature works with some DLP Link projectors, while others have display-detection issues. The only 3DTV report we have at the moment is the one from the 3D Vision Blog, and there seems to be a big bug with the Panasonic 3DTV; we don't know whether it's specific to Panasonic TVs or a general HDMI 1.4 support problem.
Anyway, I have an ATI Radeon card with a Zalman Trimon monitor, and the HD3D standard does not work with this monitor... So for the moment it's the good old usual way of making apps work (iZ3D and DDD directly, without any assistance from the AMD driver).
It's as if they had to announce it before the feature was really finished, because the new HD 6800 graphics cards are out. Their 3D output support is quite obviously still under heavy development.

Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter

#31
Posted 10/30/2010 01:54 PM   
Heck, I still haven't figured out why you would want to hook your computer up to your TV at all. I hooked mine up, confirmed that it worked, and never bothered to use the feature again. Sure, the TV is bigger, but the monitor, sitting a foot and a half in front of me, appears bigger.

I would also worry about the too-many-cooks factor. Whenever a problem could be caused by two different companies, you've got trouble, because each is going to want to blame the other instead of shelling out the time and money to investigate. (See the monitor ghosting issue we already have, for example; I bet we would see a lot more progress on it if nVidia actually owned one of the monitor companies.) The ATI partnership worries me a lot on that front.

But I'll definitely echo the others with regard to competition. Right now, nVidia's only 'competition' is people's fear of S3D. Niceties like making it easier to adjust convergence aren't going to get more people into the stereo-vision camp, so why bother? If ATI can offer an even better interface, nVidia will need to match it to stay competitive.

#33
Posted 10/30/2010 01:55 PM   
My two pennies

Nvidia has been working closely with some devs to get the best possible 3D experience from their games. I presume the developers must spend a lot of time working with Nvidia to get certain things working well (the Bad Company crosshair fix, for example).

With ATI in the picture, does this mean developers will spread their time optimizing for both, rather than just for Nvidia?

On the flip side, competition is often a great way of keeping costs down and quality up.

Asus Rampage Extreme II | Quad core intel I7 2.6 | 6gig ram | Geforce GTX 680 | Samsung 22" 2233RZ | Acer 5360 | win8

3D website dedicated solely to nvidia 3D Vision

Visit 3dSolutionGaming.com for an A-Z listing of 3D streaming videos, automated slideshows, download packs and common fixes for 3D Vision gamers.

Facebook Page: https://www.facebook.com/3dsolutiongaming
Twitter page: https://twitter.com/solutiongaming
Youtube Channel: www.youtube.com/user/SolutionGaming

Keenly supporting the Helixwrapper
http://helixmod.blogspot.com/

#35
Posted 10/30/2010 02:47 PM   
[quote name='solutiongaming' post='1139496' date='Oct 30 2010, 04:47 PM']Nvidia has been working closely with some devs to get the best possible 3D experience from their games. I presume the developers must spend a lot of time working with Nvidia to get certain things working well (the Bad Company crosshair fix, for example).

With ATI in the picture, does this mean developers will spread their time optimizing for both, rather than just for Nvidia?[/quote]
You presume too much.
The way Nvidia currently does things is to tell game developers to build their games in 2D, let the 3D Vision driver do its magic, and then correct bugs. It's an approach that takes less time to implement than doing it the full way (the developer making the game render stereoscopically from scratch), even though it costs more GPU power, and some bugs cannot be corrected in time for release, or cannot be corrected at all.
They do it because it's a cheaper way to try 3D without making the investment of time and money that a full-blown native 3D approach requires. A bit like Hollywood converting movies because they don't want to spend the time shooting everything with a dual-camera rig.

So when developers just do the 3D Vision optimisation, you know you won't get the "best possible" 3D experience; you'll just get "better than if they had done nothing".
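
If you're wondering what that driver "magic" actually is: the usual description is that the driver renders each frame twice and shifts each vertex's clip-space position horizontally, once per eye. Here's a minimal sketch of that commonly cited formula; this is only an illustration, not anything from Nvidia's actual driver:

[code]
// A minimal sketch, NOT Nvidia's actual driver code: the commonly
// cited formula for driver-side stereoisation of a mono game. The
// driver draws each frame twice and shifts the clip-space position
// output by the vertex shader, once per eye.
struct Vec4 { float x, y, z, w; };

// eyeSign: -1.0f for the left eye, +1.0f for the right eye.
// 'separation' and 'convergence' are the user-tunable driver settings.
Vec4 stereoiseClipPos(Vec4 clipPos, float eyeSign,
                      float separation, float convergence)
{
    // Geometry at depth w == convergence gets zero offset (it appears
    // at screen depth); everything nearer or farther gets shifted
    // horizontally, which is what produces the parallax.
    clipPos.x += eyeSign * separation * (clipPos.w - convergence);
    return clipPos;
}
[/code]

You can see from the formula why it breaks on things the game draws without real depth information (crosshairs, shadows, post-processing effects): they end up shifted to the wrong apparent depth, which is exactly the class of bug that then has to be "corrected" per game.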

AMD does not provide a 2D-to-3D conversion driver for games, and the iZ3D and DDD drivers react differently from Nvidia's driver, so optimisations for one don't necessarily work with the others. AMD instead chose the long-term approach: have game developers render their games natively in stereo, and simply provide a way to present the left- and right-eye views on whatever display the user has (DisplayPort, HDMI 1.4, 120Hz DLP Link projectors, or the few other displays listed on their web page).
Nvidia also provides such a feature in the 3D Vision drivers, but so far only Avatar: The Game has used it.

More upcoming games will use these features because they do their 3D internally: Crysis 2 (announced support for everything available), CoD: Black Ops (only Nvidia API support announced so far), and TrackMania 2 (not officially announced; I asked them myself, and they support Nvidia's API and are waiting for other APIs before implementing them).
For these games, supporting AMD HD3D will be very easy and won't require any third-party drivers; the developers just need a very small patch to send the left- and right-eye views to the new AMD HD3D driver instead of Nvidia's.
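
To give an idea of what "sending the two views to the driver" looks like in code, here is a minimal sketch using quad-buffered OpenGL (GL_BACK_LEFT / GL_BACK_RIGHT). Note this is only an illustration of the native-stereo idea: AMD's HD3D path on Direct3D goes through a vendor-specific API, and the camera/scene helpers below are hypothetical placeholders for the game's own code.

[code]
// A minimal sketch of native stereo submission via quad-buffered
// OpenGL. Illustration only; AMD HD3D on Direct3D uses a different,
// vendor-specific API. The helpers below are hypothetical placeholders.
#include <GL/gl.h>

struct Camera { float pos[3]; };     // the game's camera type (assumed)
Camera leftEyeCamera();              // mono camera offset half the eye
Camera rightEyeCamera();             //   distance to each side (assumed)
void renderScene(const Camera &cam); // draws the whole frame (assumed)
void swapBuffers();                  // platform swap call (assumed)

void renderStereoFrame()
{
    // Left eye: draw the full scene into the left back buffer.
    glDrawBuffer(GL_BACK_LEFT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    renderScene(leftEyeCamera());

    // Right eye: same scene from the offset camera.
    glDrawBuffer(GL_BACK_RIGHT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    renderScene(rightEyeCamera());

    // The driver then packs or sequences the pair for the display
    // (HDMI 1.4 frame packing, DLP Link, etc.).
    swapBuffers();
}
[/code]

The point is that the game already has both views; which vendor API receives them is a detail, which is why the patch for these games is so small.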

Passive 3D forever
110" DIY dual-projection system
2x Epson EH-TW3500 (1080p) + Linear Polarizers (SPAR)
XtremScreen Daylight 2.0
VNS Geobox501 signal converter

#37
Posted 10/30/2010 03:42 PM   