Switching from ATI 6850 to nVidia 3D
Depends on the chipset. Intel is overall a much more solid brand for gaming.

For a long time, AMD CPUs were just bad for gaming (they still might be; I've been out of the loop for a while). AMD just kept adding cores, and that doesn't matter when they aren't utilized. I really don't see AMD as great for gaming, tbh. People will see it has 1GHz more than an Intel CPU and assume it's faster, but it's not. I stopped following AMD as of about two years ago, though. But it got so bad that at one point dual-core Intel chips were outperforming AMD's 6- and 8-core parts.
I can't find the main Tom's Hardware post that I am looking for, but this sums it up:
http://www.tomshardware.com/reviews/cpu-performance-comparison,3370-16.html


I really recommend looking at some large-scale benchmarks from Tom's Hardware before buying any piece of computer hardware.

Co-founder of helixmod.blog.com

If you like one of my helixmod patches and want to donate. Can send to me through paypal - eqzitara@yahoo.com

#76
Posted 05/08/2013 09:48 PM   
purpletalon said:AMD FX-8350 Vishera 4.0GHz (4.2GHz Turbo) Socket AM3+ 125W Eight-Core

lol hahahahaha

made a long time before the 6200 was

and not as high of performance


What are you smoking there, purple? The FX-6200 was released at the end of Feb 2012, and the FX-8350 & FX-6300 were released at the end of Oct 2012. Are you talking about the FX-6350, released at the end of April this year? And I really think you must have looked cross-eyed at the performance charts O.O
That could also be how you mixed up the release dates :p

AMD vs. Intel. Always an amusing thing to read. Pick one, buy it, smile until your face hurts, and don't let a hardcore CPU/GPU lover kick your puppy.

My junk will be here within the next 16 hours! w00t!
I may be delayed on posting new results as my brother is coming into town. Which means I get to spoil my nieces and maybe take them to the movies :)
But I should definitely have some results posted by Saturday night.

#77
Posted 05/10/2013 05:22 AM   
Switch back. Quickly. This will be painful. Trust me.

#78
Posted 05/12/2013 05:25 PM   
Krighton said:Switch back. Quickly. This will be painful. Trust me.

Judging from your displays, you are a 2D user. You kind of need a 3D display to make that kind of recommendation.

Co-founder of helixmod.blog.com

If you like one of my helixmod patches and want to donate. Can send to me through paypal - eqzitara@yahoo.com

#79
Posted 05/12/2013 06:04 PM   
Krighton said:Switch back. Quickly. This will be painful. Trust me.


You don't know what you're talking about dude...

AsRock X58 Extreme6 mobo
Intel Core-i7 950 @ 4ghz
12gb Corsair Dominator DDR3 1600
ASUS DirectCU II GTX 780 3gb
Corsair TX 950w PSU
NZXT Phantom Red/Black Case
3d Vision 1 w/ Samsung 2233rz Monitor
3d Vision 2 w/ ASUS VG278HE Monitor

#80
Posted 05/12/2013 06:50 PM   
Didn't mean to take so long in updating. Mother's Day, family, etc. Then I've just been backing up files from other computers to store on my new system =)

This new system has increased overall 2D and 3D performance by 10-20 fps. This is still based on WoW & Skyrim. I no longer have access to the 660s to test in this new system for comparison, but I'll have to be happy for now. Skyrim runs smoothly now (I wish they would fix shadows in 3D). It still irritates me when I read others on Amazon and Newegg who have this exact same setup and mention that with 1080p ultra settings they get '80+ fps', '80-100 fps', 'avg 80 fps, never dropped below 60 fps'... while I maintain 50-70 fps and range 35-50 fps in busy Orgrimmar. *They stated no OC.

Last week I sent an email to Gigabyte asking for their input about my 670. The first reply was insulting, as it was obvious they barely read the five sentences I typed; they asked if I could test the card in a second computer... my five sentences stated that I had tested it in four different computers, with a link to this thread. I chewed them out over how their response was insulting to us gamers, enthusiasts, and IT techs who spend days and dollars troubleshooting defective hardware/software/drivers, only to have a lazy ### glance at our text and toss out a suggestion that was already tried. They need to be fired and start a career cooking my burger.
The next response was very impressive. They assembled my exact computer and OS, tested the exact same two GPUs I have been testing (6850 & 670), and sent me screenshots showing the hardware, OS & driver versions, and performance results. That was impressive enough for me to forget all the normal lazy responses most 'tech' departments give us.
Using some of the same GPU software they used, my video card is close enough to their results. I run 1 fps slower than what they show and 2C hotter.

The software they used:
FurMark
FluidMark

If anyone wants to compare results, I will run another test now and post.

For anyone curious whether I'm happy with switching to NVidia: yes, the 3D, like others have stated, is greater than I thought it would be. Everyone I show 3D is amazed. Usually followed by "I hate you", as now they have to spend money on 3D. There are glitches in games such as Skyrim where you need to turn off shadows, but it is well worth it. I am a long-time ATI fan, but the 3D is worth the switch. To any other dedicated ATI fan: I am still an ATI fan, but as long as NVidia rocks the 3D, I will be using NVidia from now on. As an AMD fan, if you can afford it, I suggest you pair NVidia with Intel. My neighbor's i7 does give my 670 better fps than this AMD FX-8350.

#81
Posted 05/16/2013 11:36 PM   
Erasamus said:There is glitches in games such as Skyrim where you need to turn off shadows, but it is well worth it.

news:
http://helixmod.blogspot.com/
list:
http://helixmod.wikispot.org/gamelist
skyrim:
http://helixmod.blogspot.com/2012/03/elder-scrolls-v-skyrim-3d-vision-all-in.html
#82
Posted 05/16/2013 11:50 PM   
Welcome to the storm. Once you're hooked there's no going back; you'll be spending hours fixing and configuring games, likely longer than you'll play them for :)

Go here, http://helixmod.blogspot.co.uk/2012/03/elder-scrolls-v-skyrim-3d-vision-all-in.html
and here http://helixmod.blogspot.co.uk/

Skyrim is beautiful in 3D once set up right. Also, I'm not sure if you know how to set up convergence, so if not, I'll tell you: head to the Nvidia control panel, enter the 3D Vision menu, and from there the hotkey panel. Scroll down and enable advanced controls. If you think of 3D Vision in terms of your eyes, convergence equates to the distance between your eyes, and depth to the maximum distance your eyes need to separate to focus an image.

You can use the Helix tool to create convergence presets. Sometimes in games there are moments when a lower convergence is needed; e.g., in an FPS, if you set convergence high, the sights on a gun will be useless, as it's like looking down the sights with both eyes open and your eyes 12" apart. Use this http://www.lagmonster.org/docs/DOS7/v-ansi-keys.html to find the key you want to apply a preset to, then in the 'DX9 Settings' file in the Helix fix you set that as the preset key. It would end up looking something like this:

[General]
; 57 = the "9" key and 48 = the "0" key, per the ASCII code table linked above
Preset1Key = 57
Preset2Key = 48
; which preset is active when the game starts
DefPreset = 1
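If you'd rather not dig through that key-code table, the numbers in the settings file line up with the ASCII codes of the character keys (for the digit keys these match the Windows virtual-key codes too), so a quick Python check can find them; this is just a lookup sketch, not part of the Helix fix itself:

```python
# Print the code each character key maps to, as used by the
# Preset1Key / Preset2Key entries in the Helix settings file.
for key in "90":
    print(key, ord(key))  # '9' -> 57, '0' -> 48
```

So `Preset1Key = 57` binds the first preset to the "9" key and `Preset2Key = 48` binds the second to "0".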

Then you go into your game, hit the key you want for a preset, adjust convergence, then hit Ctrl-F7 to save it.

This is worth a read too. https://forums.geforce.com/default/topic/519243/maximum-depth-hack-go-beyond-100-/

I personally find the Nvidia default 100% depth to be a little low; I'm using the same display as you and find 150% to be a lot better. Simply follow that if you want it, though if you go too far you can damage your eyes.

Hope I didn't repeat stuff you already know, but it's just stuff I wish I knew from the get-go. Wish I could have seen my face when I first played a game and realized the convergence could be adjusted.

#83
Posted 05/16/2013 11:49 PM   