I did tests of a few games last night. Unfortunately, they all still have a CPU bottleneck with 3D Vision enabled. The Witcher 3, for example, showed a CPU usage drop of 20% on the new system; on my old system, the drop was as high as 46%. It seems the lower the IPC, the worse the problem gets.
nVidia also confirmed via email a while back that CPU usage degrades in games when 3D Vision is enabled. They said they found it was due to 3D Vision making the game thread idle on the CPU, as well as unneeded copying of data between cards when using SLI. As for a fix or a time frame, we don't know - all we know is that it has been assigned to an nVidia driver developer at lower priority, and they are working on it.
What does this mean?
Well, if it's not fixed (many people here are skeptical about nVidia fixing it), it probably means that Ryzen may not be as great at 3D Vision gaming, even if it's spectacular at 2D gaming. I hope that I am wrong; I really want AMD to do well :)
FWIW, a lot of people were getting the 6900K up to 4.4GHz from 4.0GHz. Hopefully, we can expect similar from Ryzen!
Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.
The Witcher 3 is one of the games that is not optimised. It has the same problem in 2D:
https://www.computerbase.de/2017-02/cpu-skalierung-kerne-spiele-test/#diagramm-the-witcher-3-fps
It would be relevant to test
BF1, Rise of the Tomb Raider, Shadow Warrior 2, and Deus Ex: Mankind Divided.
Intel i7 8086K
Gigabyte GTX 1080Ti Aorus Extreme
DDR4 2x8GB 3200MHz CL14
TV LG OLED65E6V
Avegant Glyph
Windows 10 64-bit
PSA: http://wccftech.com is banned from a lot of discussion forums and is generally considered scummy trash, as they will print any nonsense to get views.
More info here on r/AMD (the previous post in this thread linked to an r/AMD post, hence the relevance), and other places:
https://www.reddit.com/r/Amd/comments/4pvzc7/hey_mods_can_we_ban_wccftech/
AMD is usually sincere in its claims. It's websites like wccftech.com that hype the products up, and ultimately everyone is disappointed due to unreasonable expectations - through no fault of AMD's.
What we know officially is that:
1800X stock single-core Cinebench score of 163, vs something like 200 for a stock 7700K.
Multi-core performance is:
1601 vs 985 in Cinebench.
[img]http://images.anandtech.com/doci/11143/R-18b_575px.jpg[/img]
http://www.anandtech.com/show/11143/amd-launch-ryzen-52-more-ipc-eight-cores-for-under-330-preorder-today-on-sale-march-2nd
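For rough context, those Cinebench numbers can be turned into ratios. This is just my quick back-of-the-envelope calc from the figures above, not an official comparison:

```python
# Rough ratios from the Cinebench numbers above (1800X vs 7700K, both stock).
ryzen_single, intel_single = 163, 200
ryzen_multi, intel_multi = 1601, 985

single_deficit = (1 - ryzen_single / intel_single) * 100
multi_lead = (ryzen_multi / intel_multi - 1) * 100

print(f"Single-core deficit: {single_deficit:.1f}%")  # ~18.5% behind the 7700K
print(f"Multi-core lead: {multi_lead:.1f}%")          # ~62.5% ahead of the 7700K
```

So roughly an 18% single-thread deficit but a 60%+ multi-thread lead, which is exactly why the gaming-vs-workstation question matters so much.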
Let's be very cautious about what we consume :)
The new Windows Game Mode might make these AMD chips actually better for games.
Especially the feature that assigns cores just to the game.
It's going to be a good year for PC hardware prices and performance. That's for sure.
I bet nVidia are worried!
I'm so glad this has happened. Should start a nice pricing war!
Maybe nVidia won't hold back with the 1080 Ti, seeing as it's so late to the game.
Full-chip Pascal and HBM2 would be nice!
Came across this, seems legit, although not very well done:
https://www.youtube.com/watch?v=YWEHs_R5t9s&feature=youtu.be&t=305
More specifically:
https://youtu.be/YWEHs_R5t9s?t=305
GTA5 is a great benchmark; however, at 1080p, averages being so close while the maxes and mins have a wide disparity means the benchmark is mostly GPU-limited. He needs to turn the resolution down as low as it will go. As it stands, he could have a 50GHz 100-core CPU and it would still only show averages of ~90 fps.
He does mention that he did it this way to show a "real world" gaming scenario *sigh*. Not too useful, especially as we don't know how high Ryzen will overclock. Hopefully above 4.5GHz.
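To illustrate what I mean about spotting a GPU bottleneck from the numbers alone, here's a toy heuristic (the fps figures and the 5% tolerance are my own made-up examples, not anything from his video):

```python
def looks_gpu_limited(avg_a, avg_b, min_a, min_b, tolerance=0.05):
    """Toy heuristic: if two different CPUs produce near-identical average
    fps but clearly different minimum fps, the average is probably capped
    by the GPU while the CPU still dictates the worst-case frames."""
    avgs_close = abs(avg_a - avg_b) / max(avg_a, avg_b) <= tolerance
    mins_differ = abs(min_a - min_b) / max(min_a, min_b) > tolerance
    return avgs_close and mins_differ

# Hypothetical GTA5-style numbers: averages within a few fps, minimums far apart.
print(looks_gpu_limited(91, 89, 72, 58))  # True -> the average tells you nothing
```

That's why dropping the resolution matters: it lifts the GPU cap so the averages can actually separate.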
@GibsonRed
I'm not so sure matey. My view is that the more cores there are (and with core parking disabled), the more Windows can run every background process on the most under-utilised cores while the game's cores remain untouched for maximum game performance. This means Game Mode would matter less on a higher core count CPU.
Also, since SSDs are now faster than what a CPU can decode (game loading times are the same regardless of how fast the SSD is, scaling only with CPU speed), even a Windows update or an installation running in the background should in theory have minimal impact on games, mitigating problems such as micro-stutter, assuming the cores being used for games are untouched by Windows.
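For anyone curious, this kind of core dedication is just processor affinity, which you can already set yourself. A minimal sketch using the Linux stdlib call (on Windows the equivalent is SetProcessAffinityMask via Task Manager / third-party tools like psutil):

```python
import os

# Sketch of what Game Mode-style core dedication boils down to: pin this
# process (imagine it's the game) to a subset of cores, leaving the rest
# free for background tasks like Windows Update.
all_cores = sorted(os.sched_getaffinity(0))              # cores we can use
game_cores = set(all_cores[: max(1, len(all_cores) // 2)])  # first half "for the game"
os.sched_setaffinity(0, game_cores)                      # 0 = current process
assert os.sched_getaffinity(0) == game_cores
os.sched_setaffinity(0, set(all_cores))                  # restore full affinity
```

Game Mode just promises to do this automatically, and to also steer the background processes away from the game's cores, which you can't easily do by hand.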
Yeah the new gaming mode coming out has a feature where it dedicates cores for gaming only.
At the moment everything is still run over all threads in the background.
There are a few more benchmarks out showing it beating Intel in Battlefield 1, and beating an Intel 7700K while playing Dota and streaming at the same time.
I think I'll get an AMD next. It's worth it for twice as many cores, especially for music production software etc.
Either way, it doesn't seem like either would be a wrong choice :)
I was just reading a post by Gibbo from Overclockers.co.uk, who says that after trying out Ryzen 1700s, he can't get them past 3.8GHz on normal boards, and ~4.05GHz on OC boards. Hopefully at least the 1800X will do 4.4GHz, which you might then want over the more popular "gaming" choice of a 1700/X.
[url]https://forums.overclockers.co.uk/threads/amd-zen-thread-inc-am4-apu-discussion.18665505/page-401#post-30533503[/url]
According to another [URL="https://translate.google.co.uk/translate?hl=en&sl=bg&tl=en&u=http%3A%2F%2Fhardwarebg.com%2Fforum%2Fshowthread.php%2F273143-AMD-RYZEN-CPU-(AM4)-2017-%25D0%25B3%3Fp%3D4509548%26viewfull%3D1%23post4509548"]forum[/URL] (post 1222), there is light at the end of the tunnel for Win7 users and Ryzen processors.
For sure I am not going to buy "soon"; if anything, I will wait for prices to settle, or until the GTX 1080 drops in price so I can get a second one.
Then, according to [url="https://www.scan.co.uk/shop/computer-hardware/overclocked-bundles/3xs-professionally-overclocked-amd-ryzen-bundles"]scan.co.uk overclocked bundles[/url],
the 1700X does 4GHz and the 1800X 4.2GHz, where the board of course plays a big role.
Ryzen 1700X 3.9GHz | Asrock X370 Taichi | 16GB G.Skill
GTX 1080 Ti SLI | 850W EVGA P2 | Win7x64
Asus VG278HR | Panasonic TX-58EX750B 4K Active 3D
I already got hyped for Ryzen, and if the Win7 support is true, then my next CPU will definitely be an AMD, because I have no intention of jumping on the Win10 train in the near future (maybe dual boot for special occasions).
Asus Deluxe Gen3, Core i7 2700K @ 4.5GHz, GTX 1080Ti, 16 GB RAM, Win 7 64-bit
Samsung Pro 250 GB SSD, 4 TB WD Black (games)
Benq XL2720Z
So, finally, we have non-GPU limited Ryzen benchmarks from a reputable site:
[url]http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7[/url]
We shouldn't be disappointed. Ryzen is a great CPU, even if not the best choice for gaming.
If someone builds a Ryzen system, please post some 3D Vision benchmarks and we can compare :)
Asus Maximus X Hero Z370
MSI Gaming X 1080Ti (2100 mhz OC Watercooled)
8700k (4.7ghz OC Watercooled)
16gb DDR4 3000 Ram
500GB SAMSUNG 860 EVO SERIES SSD M.2
[quote="RAGEdemon"]So, finally, we have non-GPU limited Ryzen benchmarks from a reputable site:
[url]http://www.gamersnexus.net/hwreviews/2822-amd-ryzen-r7-1800x-review-premiere-blender-fps-benchmarks/page-7[/url]
We shouldn't be disappointed. Ryzen is a great CPU, even if not the best choice for gaming.
If someone builds a Ryzen system, please post some 3D Vision benchmarks and we can compare :) [/quote]
Meh... I am extremely disappointed. It scales extremely weakly, even in games like Watch Dogs 2, which seems to take advantage of 8+ cores. I really thought it would outperform the 7700K in Watch Dogs 2. In my opinion, it's extremely troubling that it didn't.
[quote="clammy"]Def wanna see if there are any 3D Vision performance gains[/quote]
There won't be, compared to modern i7s.
Indeed, I was hoping that it would be on par with a 6900K in gaming, because of all the official hoo-ha about, well, it being on par with a 6900K in gaming.
What I find disappointing is:
[color="green"]"When we approached AMD with these results pre-publication, the company defended its product by suggesting that intentionally creating a GPU bottleneck (read: no longer benchmarking the CPU’s performance) would serve as a great equalizer. AMD asked that we consider 4K benchmarks to more heavily load the GPU, thus reducing workload on the CPU and leveling the playing field ... benchmarking CPUs at 4K would be analogous to benchmarking GPUs at 720p: The conclusion would be that every GPU is “the same,” since they’d all choke on the CPU. Same idea here, just the inverse."[/color]
and
[color="green"]"In the Sniper Elite demo, AMD frequently looked at the skybox when reloading, and often kept more of the skybox in the frustum than on the side-by-side Intel processor. A skybox has no geometry, which is what loads a CPU with draw calls, and so it’ll inflate the framerate by nature of testing with chaotically conducted methodology. As for the Battlefield 1 benchmarks, AMD also conducted using chaotic methods wherein the AMD CPU would zoom / look at different intervals than the Intel CPU, making it effectively impossible to compare the two head-to-head.
And, most importantly, all of these demos were run at 4K resolution. That creates a GPU bottleneck, meaning we are no longer observing true CPU performance. The analog would be to benchmark all GPUs at 720p, then declare they are equal (by way of tester-created CPU bottlenecks). There’s an argument to be made that low-end performance doesn’t matter if you’re stuck on the GPU, but that’s a bad argument: You don’t buy a worse-performing product for more money, especially when GPU upgrades will eventually out those limitations as bottlenecks external to the CPU vanish."[/color]
Basically, they tried to con their fans and consumers with underhandedness, and tried to get supposedly unbiased review sites to do the same. It succeeded at times, as a lot of the amateur reviews aimed at the average Joe Public, such as Linus Tech Tips, are only benchmarking 4K - Linus got some exclusive coverage in the lead-up to the release - bribery is another charge on the list. This is what really gets me. So much for "being the better company" :(
I was really rooting for you AMD!
Still, the uninitiated average person will buy these for one reason or another, and they are generally great investments, if not the best. Hopefully it will lead to better products from both Intel and AMD in the future.
It looks like this is a great chip for workstations for people who are looking for value. Gaming, on the other hand, isn't so impressive...
I feel like if I get a 7700K, I'm going to get Sandy Bridge'd with my purchase when Coffee Lake comes out (6-core/12-thread), like when I got my current chip (i7 860).
Nobody ever talks about the i7 860/920 because of the almighty Sandy Bridge.
[quote="RAGEdemon"]Indeed, I was hoping that it would be on par with a 6900K in gaming, because of all the official hoo-ha, well; about it being on par with a 6900K in gaming.
What I find disappointing is:
[color="green"]"When we approached AMD with these results pre-publication, the company defended its product by suggesting that intentionally creating a GPU bottleneck (read: no longer benchmarking the CPU’s performance) would serve as a great equalizer. AMD asked that we consider 4K benchmarks to more heavily load the GPU, thus reducing workload on the CPU and leveling the playing field ... benchmarking CPUs at 4K would be analogous to benchmarking GPUs at 720p: The conclusion would be that every GPU is “the same,” since they’d all choke on the CPU. Same idea here, just the inverse."[/color]
and
[color="green"]"In the Sniper Elite demo, AMD frequently looked at the skybox when reloading, and often kept more of the skybox in the frustum than on the side-by-side Intel processor. A skybox has no geometry, which is what loads a CPU with draw calls, and so it’ll inflate the framerate by nature of testing with chaotically conducted methodology. As for the Battlefield 1 benchmarks, AMD also conducted using chaotic methods wherein the AMD CPU would zoom / look at different intervals than the Intel CPU, making it effectively impossible to compare the two head-to-head.
And, most importantly, all of these demos were run at 4K resolution. That creates a GPU bottleneck, meaning we are no longer observing true CPU performance. The analog would be to benchmark all GPUs at 720p, then declare they are equal (by way of tester-created CPU bottlenecks). There’s an argument to be made that low-end performance doesn’t matter if you’re stuck on the GPU, but that’s a bad argument: You don’t buy a worse-performing product for more money, especially when GPU upgrades will eventually out those limitations as bottlenecks external to the CPU vanish."[/color]
Basically, they tried to con their fans and consumers with underhandedness, and tried to get supposedly unbiased review sites to do the same. It succeeded at times, as a lot of the amateur reviews aimed at the average Joe Public, such as Linus Tech Tips, are only benchmarking 4K - Linus got some exclusive coverage in the lead-up to the release - bribery is another charge on the list. This is what really gets me. So much for "being the better company" :(
I was really rooting for you AMD!
Still, the uninitiated average person will buy these for one reason or another, and they are generally great investments, if not the best. Hopefully it will lead to better products from both Intel and AMD in the future.[/quote]I'm seeing varying performance across the net. One thing's for certain: the platform appears to be a bit buggy. Check out Joker's results. They are actually very favourable for the 1700.
https://www.youtube.com/watch?v=BXVIPo_qbc4