They're not really screwing their customers though, are they? They're screwing Nvidia customers, if anything.
If I'm understanding this correctly, to their own customers they're giving a vastly more efficient and powerful gaming platform. (Nvidia customers will probably get this benefit too, to a lesser degree.) From what I'm reading, it seems like the only losers in this will be 3D Vision users, and anything else that relies on DirectX.
The actual principle of ditching an inefficient DirectX that Microsoft barely cares about in favour of a much better pipeline sounds terrific to me. Though I'd feel more comfortable if a 3rd party was doing it, rather than the company with their finger in every hardware pie at the moment.
No, they are screwing their customers too. Any time a company makes a walled garden instead of an open market they screw their customers. If you care about interoperability, having every corporation avoid standards like DirectX is a terrible thing.
If they are successful with their ploy, this will lead directly to things like Deus Ex 3D only working on AMD cards, and never on NVidia. Conversely, their customers will get screwed when something comes out that [i]requires[/i] PhysX, which is <cough> only slightly broken on AMD.
It's bad news.
I'm not at all sure they'll be successful though. I find it unlikely that big developers are going to skip half the PC market and program directly to a proprietary API.
On the other hand, if big developers decide it's just too much trouble or not enough money, they could just skip the non-AMD PC market altogether, and consoles ultimately win with a proprietary API.
I had thought that having x86 hardware for this generation of consoles would be a plus, but not if AMD succeeds in poisoning the well.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
I agree it's aggressive and is bad news for us. And ultimately, for gaming in general, since monopolies stifle progress. But I still don't see how it directly screws their customers in any way.
Apple customers are doing just fine in their walled garden. (I personally wouldn't buy an iOS device if you paid me. But the average iOS user is very happy where she is.) So are xbox customers and PS3 customers.
I don't see why AMD customers wouldn't be happy in their walled garden either - especially now that that garden has had yet another layer separating PCs and consoles removed.
PhysX? What PhysX? If AMD "are successful with their ploy", there will be no PhysX, because Nvidia will have been reduced to a minor has-been.
I agree that standards are good. But in this case the standard (DirectX) kind of sucks. It moves at a snail's pace, has had barely any significant improvements in a decade or so, has a single owner who barely cares about it anymore, and is clearly inefficient.
Someone coming and making a better standard sounds like a good idea to me. Making that new standard proprietary is not good, but that's a separate issue.
The walled garden hurts even their customers because it makes it hard for them to share with other people, the larger world. As long as you stay in the garden, you don't notice that you are being blocked.
An example is Facetime. It works great if the only people you know also have iOS devices. If someone is outside your circle, too bad. Could be an open standard and better for everyone, directly including the customers, but no, it has to be locked down to encourage you to stay in the "eco-system." Skype runs on all devices. Why not Facetime?
A lot of times it's subtle, which is even worse. Something simple like email should be universal, right? An example on MacOS is the way they send pictures. If you use the Mail app, it will encode pictures as TIFF files. That pretty much blows up any non-Mac or non-iOS device, and makes the Apple person seem like a dick. The Apple person is confused as to why their non-Apple friends can never see the pictures.
How about when somebody in the other world sends a Mac/iOS person a docx file? How many Mac people can figure that out? Why isn't there interoperability there by default? I mean, damn, it's even XML now.
How about when someone sends a movie they took with their iPhone to a PC user? How many PC users have that stupid mp4 codec installed? Then the iOS person just hears that the end user couldn't view it, oh well.
Mostly people I see just think it's some weird computer thing and move along. They typically don't even realize they are being jacked.
This came to a head with the Maps debacle, where even the happiest iOS customer was confused that Apple would force the good one off, and the bad one on.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
Hey, you don't have to convince me about the crapness of walled gardens. I loathe Macs (I've been forced to use one at work for 5 years, and I hate it more with each year). iOS devices annoy me, and the loyalty they inspire bewilders me. I love my android, because I can customise the crap out of it, and because it doesn't treat me like an idiot.
But you said it yourself: most happy iOS users aren't even aware that there's a problem. Most of them get what they want from Apple, and the walled garden helps make that happen (by keeping the system closed and controlled, predictable, making sure stuff works by not allowing lots of variables etc.)
And while your Facetime and TIFF examples sound like arbitrary hurdles that have been placed for purely cynical (or thoughtless) reasons by Apple, the same cannot be said for Mantle.
Forgetting for a moment the sneakiness of it, Mantle would give real, tangible (and quite possibly huge) benefits to AMD customers. Firstly, it would give them better performance and potentially less buggy games. Secondly, it could mean that one of the biggest gripes of this generation - terrible console ports - could well and truly be a thing of the past (for them... I shudder at the thought that we DirectX users will now be the ones getting bad PC-to-PC ports - ouch), since all 3 game systems (Xbox, PS4 and PC) would be speaking pretty much the same language.
An open standard would probably not cut it here, since an open standard (that worked happily on nvidia and intel hardware) would have to be high-level. If there's to be a low-level API with the benefits that a low-level API brings, it's going to have to be an AMD-specific one, since that's what's going to run on both consoles. So it's not as simple as closed iOS vs open Android, or closed Facetime vs open Skype. It's a case of AMD being in a unique position to consolidate a pipeline in a way that was impossible last gen, and that opportunity by its very nature is destined to benefit AMD more than anyone else.
I think the pros will outweigh the cons for AMD customers. And frankly, as much as I'm worried about the consequences, I can't really blame AMD for doing it. This move makes perfect sense, even if you exclude all malicious motives: most games will now be made for AMD hardware, on consoles which use a low-level API... it would almost be crazy of AMD to continue to rely on inefficient middleware like DirectX rather than just carry through the low-level stuff from the consoles.
Hopefully it won't be the revolution we fear it to be. If it is, it will be a real shame that the company with the better hardware (nvidia) got shafted because it thought the superiority of its hardware was enough, while its weaker competitor relied instead on PR, deal-making and cross-branding to paint them into a corner.
[quote="Ricky Martyr"][quote="JnLoader"]@Ricky Martyr.
Fair enough, sorry for the harsh words, it was over the top... sorry for that!![/quote]
No problem, really :)
[/quote]
Alright then, thanks a lot :)
Just how big of an issue is DX? Because I was under the impression my 660ti was about 8 times as powerful as a PS3. Considering most PS3 games were lucky to run locked at 720p@30fps, wasn't I getting 8X the performance by running at 1080p@120fps? I mean, if there were this horrendous a bottleneck, a 660ti shouldn't have been able to consistently do that (sometimes with startling ease).
The cynic in me also focuses on all the talk about not having to load the CPU. That's all fine and dandy for the upcoming consoles (which are basically using awful netbook CPUs that would seriously bottleneck things), but for a real Intel CPU, what do I care if it has to handle some of the load? I'm not running with a netbook CPU.
I guess we'll see. The GPU industry has a long history of over-hyping things though.
[quote="Paul33993"]Just how big of an issue is DX? Because I was under the impression my 660ti was about 8 times as powerful as PS3. Considering most PS3 games were lucky to run locked at 720p@30fps, wasn't I getting 8X the performance by running at 1080p@120fps? I mean, if things were this horrendous bottleneck, a 660ti shouldn't have been able to consistently due that (sometime with startling ease).[/quote]
It's not always that simple though. For one, I'm sure there are plenty of games that you weren't able to run at 1080p@120fps (e.g. the Assassin's Creed series - even the older ones... I wasn't even getting 60fps on my 680 on those). Then there was the odd game which was purported to actually run worse on PC than it did on consoles (e.g. GTA4).
[quote]The cynic in me also focuses on all the talk about not having to load the CPU. That's all fine and dandy for the upcoming consoles (which are basically using awful netbook CPUs that would seriously bottleneck things), but for a real Intel CPU, what do I care if it has to handle some of the load? I'm not running with a netbook CPU.[/quote]
CPU bottleneck has me worried. While GPUs get better each year, CPU development has stagnated. With Haswell, it's almost gone backwards. Yet there are games today that bottleneck on the CPU, like Skyrim and Hitman Absolution.
When I max out the Hitman Absolution settings and run the benchmark, I get about 25fps in 3D. I'd like to think that my twin Titans are not to blame.
[quote]
I guess we'll see. The GPU industry has a long history of over-hyping things though.[/quote]Very true (I'm still waiting for tessellation to impress me, for example). Hopefully this is just a lot of market hype and community overreaction.
My only question is: does allowing developers direct access to the hardware pose a security risk?
Intel Core i9-9820x @ 3.30GHZ
32 gig Ram
2 EVGA RTX 2080 ti Gaming
3 X ASUS ROG SWIFT 27 144Hz G-SYNC Gaming 3D Monitor [PG278Q]
1 X ASUS VG278HE
Nvidia 3Dvision
Oculus Rift
HTC VIVE
Windows 10
[quote="Volnaiskra"]But you said it yourself: most happy iOS users aren't even aware that there's a problem. Most of them get what they want from Apple, and the walled garden helps make that happen (by keeping the system closed and controlled, predictable, making sure stuff works by not allowing lots of variables etc.)[/quote]You were asking how walled gardens are bad for even their customers, and these are some of my examples. It doesn't matter whether they are 'happy' or not, it's still bad for their customers. They've been told repeatedly from marketing experts that they made the right choice, and are simply brilliant people. Of course they aren't going to recognize it.
Like a lot of us, I spend a fair amount of time helping out my less technical family. Even they notice and are unhappy when they run into arbitrary and stupid limitations. All of those examples are from people I've helped, who wanted to be able to connect with their peers properly, not mediated through the walled garden. They still don't recognize that they are being jacked, but definitely recognize they are not happy with it. Curious to me is that they still believe the marketing message more than they believe me.
[quote="Paul33993"]Just how big of an issue is DX? Because I was under the impression my 660ti was about 8 times as powerful as PS3. Considering most PS3 games were lucky to run locked at 720p@30fps, wasn't I getting 8X the performance by running at 1080p@120fps? I mean, if things were this horrendous bottleneck, a 660ti shouldn't have been able to consistently due that (sometime with startling ease)[/quote]I thought the same thing. If DX is so terrible then why does the XBox have higher performance than a PS4 in general?
I don't believe [i]anything[/i] that self-interested parties have to say with regard to performance. You can ruin an algorithm just as easily as optimize it. I seriously doubt that draw calls are some sort of holy-grail performance limit.
If a developer had a draw-call problem, they could easily and simply write HLSL programs to do more work on the GPU without intervening calls.
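As a rough illustration only (a hypothetical sketch, not anyone's shipping code: it assumes a valid ID3D11DeviceContext with shaders, input layout and vertex/index buffers already bound, and the function names are made up):
[code]
#include <d3d11.h>

// Naive version: one draw call per rock, so the fixed CPU/driver
// overhead per call piles up when there are thousands of rocks.
void DrawRocksNaive(ID3D11DeviceContext* ctx, UINT indexCountPerRock, UINT rockCount)
{
    for (UINT i = 0; i < rockCount; ++i)
    {
        // (a per-rock constant buffer update would normally go here)
        ctx->DrawIndexed(indexCountPerRock, 0, 0);
    }
}

// Batched version: per-rock transforms sit in an instance buffer that
// the HLSL vertex shader reads, so the GPU does the per-object work
// and the CPU submits a single call.
void DrawRocksInstanced(ID3D11DeviceContext* ctx, UINT indexCountPerRock, UINT rockCount)
{
    ctx->DrawIndexedInstanced(indexCountPerRock, rockCount, 0, 0, 0);
}
[/code]
Instancing is just one way to do it; the point is that the per-object work moves into the shader, so the CPU issues one call instead of thousands.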
The real reason games sometimes run more poorly on PCs is because they just don't spend any time on it. We all know it's a secondary market for them, free money to skim up, and it's just not a priority. Having played GTA4 extensively on XBox AND on PC, I can say for a fact that it runs better on PC. On XBox they cheat it by lowering the draw distance to 17 or so. On my PC in 3D Vision, I set draw distance to 65. It doesn't affect it that much, but don't be confused by false comparisons.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
[quote="bo3b"][quote="Volnaiskra"]But you said it yourself: most happy iOS users aren't even aware that there's a problem. Most of them get what they want from Apple, and the walled garden helps make that happen (by keeping the system closed and controlled, predictable, making sure stuff works by not allowing lots of variables etc.)[/quote]You were asking how walled gardens are bad for even their customers, and these are some of my examples. It doesn't matter whether they are 'happy' or not, it's still bad for their customers. They've been told repeatedly from marketing experts that they made the right choice, and are simply brilliant people. Of course they aren't going to recognize it.
Like a lot of us, I spend a fair amount of time helping out my less technical family. Even they notice and are unhappy when they run into arbitrary and stupid limitations. All of those examples are from people I've helped, who wanted to be able to connect with their peers properly, not mediated through the walled garden. They still don't recognize that they are being jacked, but definitely recognize they are not happy with it. Curious to me is that they still believe the marketing message more than they believe me.[/quote]Point taken, but my point is that walled gardens don't consist purely of cons. They also contain some pros, which I think you're choosing not to acknowledge. For a user like you or me who values power and freedom above convenience, those pros are not worth the sacrifice. But for many, those pros outweigh the cons. There are prices that need to be paid for open systems, and not everyone wants to pay them.
If you managed to convert your techno-illiterate family members to PC and Android, would they really be happier and have fewer problems? That's hardly guaranteed. They'd have to deal with a whole bunch of new issues, like the mountains of crapware on Google Play, websites that don't render correctly on all browsers, apps that work properly on some Android devices but not others, or games that don't work right until you edit .cfg files, edit the registry, or turn off certain Windows services, etc. etc.
Would you really recommend to your old uncle to stop playing Candy Crush and start playing 3Dvision PC games? Sure, they're much better, but would your uncle really be interested in having to learn how to download drivers, roll them back if necessary, hack profiles with nvidia inspector, and install helixmods?
I've mentioned my strong dislike of Apple many times, but when techno-illiterate people ask me for recommendations for what phone or computer they should get, I recommend Apple almost every time. A walled garden is exactly that: A garden. It's well-maintained, it's pruned, it's carefully designed, and it's easy to find your way around, as long as you don't wonder what's on the other side of the wall. But for a lot of users, that simplicity is exactly what they want.
Anyway, I think this has probably gone off-topic. I reckon you might be right about draw calls not being as important as AMD are making them out to be. Hopefully AMD's stab at monopolising the PC space will be disMantled before it starts. Har har. ;P
Volnaiskra & bo3b both make some valid points, but I agree with bo3b purely from the standpoint that the customer should benefit. Proprietary stuff is good only to a point. Companies want to box customers in and force loyalty through insidious design decisions. I'll use the term "unrealized benefits" for those who've been subtly brainwashed by corporate marketing.
One example sticks in my mind. My father was a staunch Sony fan for years and years, until I started working for Panasonic. At that time, palm-size video was the hottest technology going, and of course my dad was Sony all the way. What he refused to acknowledge was that Panasonic had an open platform with their proprietary tapes (C-size cassettes); Sony was 8mm, I believe. Now here is the thing: with every Palmcorder Panasonic sold, they included an empty full-size VHS cassette designed to accept a C-size cassette. What this meant was that, no matter what, a Panasonic customer could play their filmed content on any VHS system. Not so with Sony: you needed the camera to play the content, or had to transfer it first, what a pain. I worked at Panasonic for years and they did dominate the market for their video equipment, simply because the customer wasn't boxed in. My dad finally "realized" the benefit, and to this day he is more open to other products and ideas.
I have a PC as my gaming platform because it gives me the most freedom. Nuff said: I've avoided, and will continue to avoid, companies that want to box me in. That is why I don't have any Mac products. Are they good products? Sure, from a technical standpoint. Anyway, thought-provoking subject...
Cheers guys!
P.S. Sony and the Betamax...lol
[url]https://www.youtube.com/watch?v=KUiuHGWvRrw[/url]
[quote="Volnaiskra"]Point taken, but my point is that walled gardens don't consist purely of cons. They also contain some pros, which I think you're choosing not to acknowledge. For a user like you or me who values power and freedom above convenience, those pros are not worth the sacrifice. But for many, those pros outweigh the cons. There are prices that need to be paid for open systems, and not everyone wants to pay them.[/quote]Yes, I completely agree. There is no question that a walled garden is actually better for some people, at least some of the time. To do it honestly though, I think you need to concentrate on the user benefit, and not the locking people into your eco-system part. Somewhere we've switched from making it better by limiting problem behavior/apps/people to doing it because it provides automatic lockin.
[quote="Volnaiskra"]If you managed to convert your techno-illiterate family members to PC and Android, would they really be happier and have fewer problems? That's hardly guaranteed. They'd have to deal with a whole bunch of new issues, like the mountains of crapware on Google Play, websites that don't render correctly on all browsers, apps that work properly on some Android devices but not others, or games that don't work right until you edit .cfg files, edit the registry, or turn off certain Windows services, etc. etc.[/quote]Whenever I make recommendations for anyone, not just family, I try make sure they get what is best for them, not best for me. It's hard to recommend XBox, but sometimes it's the right choice.
Interesting story - I switched my Mom, my MOM, from a Mac to Windows Vista. And I really like my Mom. I studied it very carefully to avoid making an egregious mistake, and I concluded that Vista was actually easier to use for naive users than MacOS. That is still true today. She took to it nearly instantly, with no serious problems. Two things about MacOS are fundamentally broken for old/naive users: the menu bar at the top of the screen, out of sight, out of mind, maybe a mile away on a big monitor; and the Dock at the bottom, with its changing size, crazy icons, and mixed metaphors of quick launch and running apps. I know this because my Dad refused to switch, and I watch him struggle with the UI [i]every[/i] time I help him.
BTW, I'm not a Mac/Win hater or lover, I don't really care about stupid corporations. It's just a gizmo, get over it. I worked at Apple for more than 10 years, so I'm not blind.
Back on specific topic, I looked up Mantle and whether this is just marketing or a real thing. I'm a software developer, so I care about APIs and how the industry moves.
Short answer: It's a real thing. There actually is a draw-call problem on PC, but not for the reasons they stated. And it can be hard for the CPU to feed the GPUs fast enough to keep them busy, rather than being the bottleneck itself.
Now naturally, they overstate the case to prove a point and drum up excitement. The draw-call problem exists if you try to send too little data to the GPU per call. So for example, you set up 2 triangles, then issue a draw call to display them, then two more and another draw call. You could program this way, and you will hit the bottleneck. On the other hand, if you build a 10,000 triangle model of a car, you [i]can[/i] send that entire car to the scene with one draw call.
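In rough D3D11 terms, the two submission patterns look something like this (a hypothetical sketch only; it assumes the car's vertex/index buffers and shaders are already bound, and the function names are invented):
[code]
#include <d3d11.h>

// "Two triangles, then a draw call": a 10,000-triangle car becomes
// 5,000 draw calls, and the fixed CPU/driver cost per call adds up.
void DrawCarTwoTrianglesAtATime(ID3D11DeviceContext* ctx)
{
    const UINT kTriangles = 10000;
    for (UINT tri = 0; tri < kTriangles; tri += 2)
    {
        // 2 triangles = 6 indices submitted per call
        ctx->DrawIndexed(6, tri * 3, 0);
    }
}

// The whole car in one call: same 10,000 triangles, one submission,
// so the draw-call overhead is negligible.
void DrawCarInOneCall(ID3D11DeviceContext* ctx)
{
    ctx->DrawIndexed(10000 * 3, 0, 0);
}
[/code]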
So, you can do it poorly, or you can avoid the problem with better design. Now guess how consoles work? That's right, the programming model is better suited for them to do small batches, instead of large batches. So the upshot is that games written and optimized for consoles are going to run fairly poorly on PC.
If you don't do any optimization on PC, you get GTA4 and Hitman Absolution. It runs, but you run into the draw-call problem. The PC is sooo powerful, that you can still do 1080p in 3D Vision, add AA, and extend draw distance, but it's not nearly what you could do. Not surprisingly, it's more of a business decision than a technical one.
Now how does Mantle play into that? Pretty bad for us I still think. It means that a sweet, fast API for both consoles and some PC is available. The temptation to switch to the new-Glide will be too much I expect, and since console is considered to be the real money, I can't see them skipping it.
With next-gen consoles, at least for a little while, we can expect to have NVidia variants actually run slower than on console. And we can expect the API to bifurcate to compatibility with DirectX, and speed with Mantle.
Let's see how Battlefield 4 looks - EA is obviously in bed with AMD, and is pushing it. It will be very interesting to see how it performs on NVidia hardware. And here's the smack talk:
[url]http://www.maximumpc.com/amd_r9_290x_will_be_much_faster_titan_battlefield_4[/url]
Which I would have no problem with if they were actually competing, but this is just lock-in, walled-garden cheating, which is ultimately bad for all users. Open platforms win in the end, but you can have a long period of bad experiences before that happens.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607 Latest 3Dmigoto Release Bo3b's School for ShaderHackers
I've read the recent back and forth posts with interest and I like the fact that there is a serious discussion on the NVIDIA forums, although the 3D Vision board is likely not the best place for it as no moderators read it.
As I tried to say in my previous posts, my major concern is that Mantle may not just kick NVIDIA out of the PC gaming market, but it could ultimately kill PC gaming as a whole.
Right now, thanks to an ultimately bad move from the big N, AMD has already established a monopoly over consoles, which makes it potentially easy to convince developers and even Microsoft to ditch DirectX and focus on the low-level API. Microsoft would use Mantle to compete against the PS4 (which will also use Mantle, as Sony doesn't even have a standard API for their machine) and I'm afraid they'd like this solution: it's no mystery that, since entering the console market, Microsoft has always wanted users to own a PC for professional purposes AND an Eggbox for gaming. This way they would be super happy.
There's more: AMD has always been famous for spending a ginormous amount of money to promote their stuff (the already established deal with EA says it all, and I'm sure Square Enix is also in) and they seem to have Valve and the still-influential has-been John Carmack on their side.
Now let's say some developers don't want this monopoly to happen and they still use DirectX to make their games work on Nvidia and integrated chips: if Mantle is as powerful as it seems, it's pretty obvious that AMD will be able to significantly lower the price of their video cards while keeping strong performance, and ultimately win out over their hated rival.
Massive use of the new API will sooner or later leave AMD as the only serious video card manufacturer in the world. What would the consequences be? I, as a longtime PC gamer, would say: why do I have to spend nearly 4000 EUR on a super PC when I can get the same result from a 500-600 EUR console? This type of thinking may well kill PC gaming, Intel and, [u]ironically[/u], AMD's super processors and pricey Radeon cards.
NVIDIA could come up with a Mantle of their own: do you think most developers will want to work on two radically different wrappers just to make a PC port?
Sorry, I've gone on too long, but I'd like to say one last thing: I don't like the comparison between Mantle and PhysX. The latter is just a gimmick, some additional bling bling that doesn't alter gameplay and doesn't really punish those who can't/don't wanna use it, and let's not even mention the fact that it has a software version which works like any other Havok out there.
Mantle is a declaration of war: win or die trying.
If I'm understanding this correctly, to their own customers they're giving a vastly more efficient and powerful gaming platform. (Nvidia customers will probably get this benefit too, to a lesser degree). From what I'm reading, it seems like the only losers in this will be 3Dvision users, and anything else that relies on directX.
The actual principle of ditching an inefficient DirectX that Microsoft barely cares about in favour of a much better pipeline sounds terrific to me. Though I'd feel more comfortable if a 3rd party was doing it, rather than the company with their finger in every hardware pie at the moment.
If they are successful with their ploy, this will lead directly to things like Deus Ex 3D only working on AMD cards, and never on NVidia. Conversely, their customers will get screwed when something comes out that requires PhysX, which is <cough> only slightly broken on AMD.
It's bad news.
I'm not at all sure they'll be successful though. I find it unlikely that big developers are going to skip half the PC market and program directly to a proprietary API.
On the other hand, if big developers decide it's just too much trouble or not enough money, they could just skip the non AMD PC market altogether, and consoles ultimately win with a proprietary API.
I had thought that having x86 hardware for this generation of consoles would be a plus, but not if AMD succeeds in poisoning the well.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
Apple customers are doing just fine in their walled garden. (I personally wouldn't buy an iOS device if you paid me. But the average iOS user is very happy where she is.) So are xbox customers and PS3 customers.
I don't see why AMD customers wouldn't be happy in their walled garden either - especially now that that garden has had yet another layer separating PCs and consoles removed.
PhysX? What PhysX? If AMD "are successful with their ploy", there will be no PhysX, because Nvidia will have been reduced to a minor has-been.
I agree that standards are good. But in this case the standard (DirectX) kind of sucks. It moves at snail pace, has had barely any significant improvements in a decade or so, has a single owner who barely cares about it anymore, and is clearly inefficient.
Someone coming and making a better standard sounds like a good idea to me. Making that new standard proprietary is not good, but that's a separate issue.
An example is Facetime. It works great if the only people you know also have iOS devices. If someone is outside your circle, too bad. Could be an open standard and better for everyone, directly including the customers, but no, it has to be locked down to encourage you to stay in the "eco-system." Skype runs on all devices. Why not Facetime?
A lot of times it's subtle, which is even worse. Something simple like Email should be universal right? An example on MacOS is the way they send pictures. If you use the Mail app, it will encode pictures as TIFF files. That pretty much blows up any non-Mac or non-iOS device, and makes the Apple person seem like a dick. The Apple person is confused as to why their non-Apple friends can never see the pictures.
How about when somebody in the other world sends a Mac/iOS person a docx file? How many Mac people can figure that out? Why isn't there interoperability there by default? I mean, damn, it's even XML now.
How about when someone sends a movie they took with their iPhone to a PC user? How many PC users have that stupid mp4 codec installed? Then the iOS person just hears that the end user couldn't view it, oh well.
Mostly people I see just think it's some weird computer thing and move along. They typically don't even realize they are being jacked.
This came to a head with the Map debacle, where even the most happy iOS customer was confused that Apple would force the good one off, and the bad one on.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
But you said it yourself: most happy iOS users aren't even aware that there's a problem. Most of them get what they want from Apple, and the walled garden helps make that happen (by keeping the system closed and controlled, predictable, making sure stuff works by not allowing lots of variables etc.)
And while your Facetime and TIFF examples sound like arbitrary hurdles that have been placed for purely cynical (or thoughtless) reasons by Apple, the same cannot be said for Mantle.
Forgetting for a moment the sneakiness of it, Mantle would give real, tangible (and quite possibly huge) benefits to AMD customers. Firstly, it would give them better performance and potentially less buggy games. Secondly, it could mean that one of the biggest gripes of this generation - terrible console ports - could well and truly be a thing of the past (for them......I shudder at the thought that we DirectX users will now be the ones getting bad PC-to-PC ports - ouch), since all 3 game systems (xbox, PS4 and PC) would be speaking pretty much the same language.
An open standard would probably not cut it here, since an open standard (one that worked happily on Nvidia and Intel hardware) would have to be high-level - a low-level API by definition exposes the details of a specific GPU architecture. If there's to be a low-level API with the benefits that a low-level API brings, it's going to have to be an AMD-specific one, since that's what's going to run on both consoles. So it's not as simple as closed iOS vs open Android, or closed FaceTime vs open Skype. It's a case of AMD being in a unique position to consolidate a pipeline in a way that was impossible last gen, and that opportunity by its very nature is destined to benefit AMD more than anyone else.
I think the pros will outweigh the cons for AMD customers. And frankly, as much as I'm worried about the consequences, I can't really blame AMD for doing it. This move makes perfect sense even if you exclude all malicious motives: most games will now be made for AMD hardware, on consoles which use a low-level API. It would almost be crazy of AMD to keep relying on inefficient middleware like DirectX rather than just carry the low-level stuff over from the consoles.
Hopefully it won't be the revolution we fear it to be. If it is, it will be a real shame that the company with the better hardware (nvidia) got shafted because it thought the superiority of its hardware was enough, while its weaker competitor relied instead on PR, deal-making and cross-branding to paint them into a corner.
Alright then, thanks a lot :)
The cynic in me also focuses on all the talk about not having to load the CPU. That's all fine and dandy for the upcoming consoles (which are basically using awful netbook CPUs that would seriously bottleneck things), but with a real Intel CPU, why should I care if it has to handle some of the load? I'm not running a netbook CPU.
I guess we'll see. The GPU industry has a long history of over-hyping things though.
It's not always that simple though. For one, I'm sure there are plenty of games that you weren't able to run at 1080p@120fps (e.g. the Assassin's Creed series - even the older ones; I wasn't even getting 60fps on my 680 in those). Then there was the odd game that purportedly ran worse on PC than it did on consoles (e.g. GTA4).
CPU bottleneck has me worried. While GPUs get better each year, CPU development has stagnated. With Haswell, it's almost gone backwards. Yet there are games today that bottleneck on the CPU, like Skyrim and Hitman Absolution.
When I max out the Hitman Absolution settings and run the benchmark, I get about 25fps in 3D. I'd like to think that my twin Titans are not to blame.
Very true (I'm still waiting for tessellation to impress me, for example). Hopefully this is just a lot of market hype and community overreaction.
Intel Core i9-9820x @ 3.30GHZ
32 gig Ram
2 EVGA RTX 2080 ti Gaming
3 X ASUS ROG SWIFT 27 144Hz G-SYNC Gaming 3D Monitor [PG278Q]
1 X ASUS VG278HE
Nvidia 3Dvision
Oculus Rift
HTC VIVE
Windows 10
Like a lot of us, I spend a fair amount of time helping out my less technical family. Even they notice and are unhappy when they run into arbitrary and stupid limitations. All of those examples are from people I've helped, who wanted to be able to connect with their peers properly, not mediated through the walled garden. They still don't recognize that they are being jacked, but they definitely recognize that they are not happy with it. What's curious to me is that they still believe the marketing message more than they believe me.
I thought the same thing. If DX is so terrible then why does the XBox have higher performance than a PS4 in general?
I don't believe anything that self-interested parties have to say with regard to performance. You can ruin an algorithm just as easily as optimize it. I seriously doubt that draw calls are some sort of holy-grail performance limit.
If a developer had a draw-call problem, they could simply write HLSL programs that do more work on the GPU without intervening API calls.
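For illustration, here's a rough sketch of one standard way to sidestep a draw-call bottleneck in Direct3D 11 - hardware instancing, rather than the pure-HLSL approach described above. This is my own hypothetical example, not code from any real engine: it assumes a valid device context with vertex/index buffers, shaders, and per-instance data already created and bound, and the function and variable names are made up.
[code]
// Hypothetical sketch, assuming ctx points to a valid ID3D11DeviceContext
// with vertex/index buffers, shaders and per-instance data already bound.
#include <d3d11.h>

// Naive path: one draw call per object. With thousands of objects, the
// CPU-side API/driver overhead of each call becomes the bottleneck.
void DrawTreesNaive(ID3D11DeviceContext* ctx, UINT indexCount, UINT numTrees)
{
    for (UINT i = 0; i < numTrees; ++i)
    {
        // (per-object constant buffer update omitted for brevity)
        ctx->DrawIndexed(indexCount, 0, 0);          // numTrees draw calls
    }
}

// Batched path: one instanced call submits every tree at once; per-instance
// transforms come from a second vertex buffer, so the CPU pays the draw-call
// cost exactly once no matter how many trees there are.
void DrawTreesInstanced(ID3D11DeviceContext* ctx, UINT indexCount, UINT numTrees)
{
    ctx->DrawIndexedInstanced(indexCount, numTrees, 0, 0, 0);   // 1 draw call
}
[/code]
Either way, the point stands: the number of API calls per frame is largely under the developer's control.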
The real reason games sometimes run more poorly on PC is that they just don't spend any time on it. We all know it's a secondary market for them, free money to skim up, and it's just not a priority. Having played GTA4 extensively on XBox AND on PC, I can say for a fact that it runs better on PC. On XBox they cheat by lowering the draw distance to 17 or so. On my PC in 3D Vision, I set the draw distance to 65. It doesn't affect it that much, but don't be confused by false comparisons.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
If you managed to convert your techno-illiterate family members to PC and Android, would they really be happier and have fewer problems? That's hardly guaranteed. They'd have to deal with a whole bunch of new issues, like the mountains of crapware on Google Play, websites that don't render correctly on all browsers, apps that work properly on some Android devices but not others, or games that don't work right until you edit .cfg files, edit the registry, or turn off certain Windows services, etc. etc.
Would you really recommend to your old uncle to stop playing Candy Crush and start playing 3Dvision PC games? Sure, they're much better, but would your uncle really be interested in having to learn how to download drivers, roll them back if necessary, hack profiles with nvidia inspector, and install helixmods?
I've mentioned my strong dislike of Apple many times, but when techno-illiterate people ask me for recommendations for what phone or computer they should get, I recommend Apple almost every time. A walled garden is exactly that: A garden. It's well-maintained, it's pruned, it's carefully designed, and it's easy to find your way around, as long as you don't wonder what's on the other side of the wall. But for a lot of users, that simplicity is exactly what they want.
Anyway, I think this has probably gone off-topic. I reckon you might be right about draw calls not being as important as AMD are making them out to be. Hopefully AMD's stab at monopolising the PC space will be disMantled before it starts. Har har. ;P
Cheers guys!
P.S. Sony and the Betamax...lol
" rel="nofollow" target = "_blank">
Whenever I make recommendations for anyone, not just family, I try to make sure they get what is best for them, not what is best for me. It's hard to recommend XBox, but sometimes it's the right choice.
Interesting story: I switched my Mom, my MOM, from a Mac to Windows Vista. And I really like my Mom. I studied it very carefully to avoid making an egregious mistake, and I concluded that Vista was actually easier to use for naive users than MacOS. That is still true today. She took to it nearly instantly, with no serious problems. Two things about MacOS are fundamentally broken for old/naive users. The menu bar at the top of the screen: out of sight, out of mind, maybe a mile away on a big monitor. The Dock at the bottom: changing size, crazy icons, mixing the metaphors of quick launch and running apps. I know this because my Dad refused to switch, and I watch him struggle with the UI every time I help him.
BTW, I'm not a Mac/Win hater or lover, I don't really care about stupid corporations. It's just a gizmo, get over it. I worked at Apple for more than 10 years, so I'm not blind.
Back on specific topic, I looked up Mantle and whether this is just marketing or a real thing. I'm a software developer, so I care about APIs and how the industry moves.
Short answer: it's a real thing. There actually is a draw-call problem on PC, but not for the reasons they stated. And it can be hard for the CPU to feed the GPU fast enough to avoid becoming the bottleneck.
Now naturally, they overstate the case to prove a point and drum up excitement. The draw-call problem exists if you send too little data to the GPU per call. So for example, you set up 2 triangles and issue a draw call to display them, then two more and another draw call to display those. You could program this way, and you will hit the bottleneck. On the other hand, if you build a 10,000 triangle model of a car, you can send that entire car to the scene with one draw call.
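To put rough numbers on that car example, here's a hypothetical sketch in Direct3D 11 terms (my own illustration, not code from any actual game; it assumes the car's 10,000 triangles already live in a single bound vertex/index buffer on a valid device context):
[code]
// Hypothetical sketch of the two extremes, assuming the car's geometry is
// already in one bound vertex/index buffer on a valid ID3D11DeviceContext.
#include <d3d11.h>

// Worst case: 5,000 calls, 2 triangles (6 indices) each. The GPU mostly sits
// idle while the CPU and driver churn through per-call overhead.
void DrawCarTwoTrianglesAtATime(ID3D11DeviceContext* ctx)
{
    const UINT kPairs = 5000;                      // 10,000 triangles total
    for (UINT i = 0; i < kPairs; ++i)
        ctx->DrawIndexed(6, i * 6, 0);             // 2 triangles per draw call
}

// Better: the whole car goes out in a single call, so the per-call overhead
// is paid once per frame instead of 5,000 times.
void DrawCarInOneBatch(ID3D11DeviceContext* ctx)
{
    ctx->DrawIndexed(30000, 0, 0);                 // 10,000 triangles * 3 indices
}
[/code]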
So, you can do it poorly, or you can avoid the problem with better design. Now guess how consoles work? That's right: their programming model is better suited to doing small batches instead of large batches. So the upshot is that games written and optimized for consoles are going to run fairly poorly on PC.
If you don't do any optimization on PC, you get GTA4 and Hitman Absolution. It runs, but you run into the draw-call problem. The PC is sooo powerful, that you can still do 1080p in 3D Vision, add AA, and extend draw distance, but it's not nearly what you could do. Not surprisingly, it's more of a business decision than a technical one.
Now how does Mantle play into that? Pretty badly for us, I still think. It means that a sweet, fast API for both consoles and some PCs is available. The temptation to switch to the new Glide will be too much, I expect, and since console is considered to be the real money, I can't see them skipping it.
With next-gen consoles, at least for a little while, we can expect NVidia variants to actually run slower than the console versions. And we can expect the API landscape to bifurcate: DirectX for compatibility, Mantle for speed.
Let's see how Battlefield 4 looks - EA is obviously in bed with AMD, and is pushing it. It will be very interesting to see how it performs on NVidia hardware. And here's the smack talk:
http://www.maximumpc.com/amd_r9_290x_will_be_much_faster_titan_battlefield_4
Which I would have no problem with if they were actually competing, but this is just lock-in, walled-garden cheating, which is bad for all users in the end. Open platforms ultimately win, but you can have a long period of bad experiences before that happens.
Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers
As I tried to say in my previous posts, my major concern is that Mantle may not just kick NVIDIA out of the PC gaming market, but it could ultimately kill PC gaming as a whole.
Right now, thanks to an ultimately bad move from the big N, AMD has already established a monopoly over consoles, which makes it potentially easy to convince developers and even Microsoft to ditch DirectX and focus on the low-level API. Microsoft would use Mantle to compete against the PS4 (which will also use Mantle, as Sony doesn't even have a standard API for their machine), and I'm afraid they'd like this solution: it's no mystery that, since entering the console market, Microsoft has always wanted users to own a PC for professional purposes AND an Eggbox for gaming. This way they would be super happy.
There's more: AMD has always been famous for spending a ginormous amount of money to promote their stuff (the already-established deal with EA says it all, and I'm sure Square Enix is in too), and they seem to have Valve and the still-influential has-been John Carmack on their side.
Now let's say some developers don't want this monopoly to happen and still use DirectX to make their games work on Nvidia and integrated chips: if Mantle is as powerful as it seems, it's pretty obvious that AMD will be able to significantly lower the price of their video cards while keeping performance strong, and ultimately win over their hated rival.
Massive use of the new API will sooner or later leave AMD as the only serious video card manufacturer in the world. What would the consequences be? I, as a longtime PC gamer, would say: why should I spend nearly 4000 EUR on a super PC when I can get the exact same result from a 500-600 EUR console? That kind of thinking may well kill PC gaming, Intel, and, ironically, AMD's own high-end processors and pricey Radeon cards.
NVIDIA could come up with a Mantle of their own: do you think most developers will want to work on two radically different wrappers just to make a PC port?
Sorry, I've gone on too long, but I'd like to say one last thing: I don't like the comparison between Mantle and PhysX. The latter is just a gimmick, some additional bling that doesn't alter gameplay and doesn't really punish those who can't or don't want to use it - and let's not even mention the fact that it has a software version that works much like Havok or any other physics middleware out there.
Mantle is a declaration of war: win or die trying.