RTX 2080 incoming...
[quote="Metal-O-Holic"][quote="xXxStarManxXx"][quote="Metal-O-Holic"][quote="Gold444"]Yeah, I can't believe not more people game in Nvidia 3DTV Play. It's a whole other world. Playing games in 2D is like monochrome to me now. I'm not INSIDE that world. I'm just looking at it. Most people don't even know what Nvidia toyification means. Eventhough it was discovered in 2010. I guess most people are just satisfied looking at a 2D screen with no idea of the life changing immersion they are missing out on.[/quote] I have a theory on why 3d gaming/watching is not that popular. I strongly feel most people have poor ”stereo vision”. Yes they see depth and in stereo but the efect Is not that special to them. Second, most people Are too lazy. They just want everything easy as it is, no fiddling. I think most stereo lovers Are people who Are willing to go the extra mile for what ever they wanna do just better result/quality. I Personally can’t game in 2D anymore as all i can think of is how good this what im playing would look in 3d[/quote] I can list a few barriers to why 3D Vision never took off, in order of most pertinent: [/quote] You have good points, but i still believe its mainly due lazyness. When you love Something and experience it better than ever by far you can’t really go back and Price is not major obstacle. Polarized glasses Are great i can easily game 8h and i still have my perscription glasses additionaly to 3d glasses. Also i don’t understand the 60hz issue as you need vsynch with 3d so don’t see lots of reason to go bigger refresh rate as the 3d vision games Are mostly so demanding getting steady 60fps is an archievement on its own already and setting refresh higher while lacking steady framerate over refresh rate creates unnessisery stutter. Though i understand the motion resolution enhances with higher refresh rate if the panel is up to it.[/quote] Wait are you saying that youre using glasses other than 3D Vision 2 for 3D Vision? The difference between 60 and 72 FPS is nearly the same between 45 and 60. If I could run my games at 72 FPS 3D Vision I would be in heaven. I can run them at 60 FPS, here's TW3 (last post, .JPS images) at 60 FPS with 78% load and that's with SweetFX enabled for SMAA and Lumasharpen: https://forums.geforce.com/default/topic/1066696/3d-vision/the-witcher-3-conflicting-with-hd-reworked-5-1/ Utilization does jump to 99% in heavily wooded areas and cities with a lot going on (i.e. Novigrad market / central square) and the FPS may drop to 50-55 but GTX 1180 Ti is right around the corner and will, in all likelihood, be 50% faster than 1080 Ti. We need to be able to use 3D Vision at more than 60 FPS / 120 Hz. I don't know why this is even an issue to be honest.
Metal-O-Holic said:
xXxStarManxXx said:
Metal-O-Holic said:
Gold444 said:Yeah, I can't believe more people don't game in Nvidia 3DTV Play. It's a whole other world. Playing games in 2D is like monochrome to me now. I'm not INSIDE that world. I'm just looking at it.

Most people don't even know what Nvidia toyification means, even though it was discovered in 2010.

I guess most people are just satisfied looking at a 2D screen, with no idea of the life-changing immersion they are missing out on.


I have a theory on why 3D gaming/watching is not that popular.
I strongly feel most people have poor "stereo vision". Yes, they see depth and in stereo, but the effect is not that special to them.
Second, most people are too lazy. They just want everything easy, no fiddling. I think most stereo lovers are people who are willing to go the extra mile for whatever they do, just for a better result/quality.
I personally can't game in 2D anymore, as all I can think of is how good whatever I'm playing would look in 3D.


I can list a few barriers to why 3D Vision never took off, in order of most pertinent:



You have good points, but I still believe it's mainly down to laziness.
When you love something and experience it better than ever by far, you can't really go back, and price is not a major obstacle.
Polarized glasses are great; I can easily game for 8 hours, and I still wear my prescription glasses in addition to the 3D glasses.
Also, I don't understand the 60 Hz issue. You need V-Sync with 3D, so I don't see much reason to go to a higher refresh rate: 3D Vision games are mostly so demanding that getting a steady 60 FPS is an achievement on its own, and raising the refresh rate while the frame rate can't keep up with it just creates unnecessary stutter.
Though I understand motion resolution improves with higher refresh rates if the panel is up to it.


Wait, are you saying that you're using glasses other than 3D Vision 2 for 3D Vision?

The difference between 60 and 72 FPS is nearly the same as between 45 and 60. If I could run my games at 72 FPS in 3D Vision I would be in heaven.

I can run them at 60 FPS. Here's TW3 (last post, .JPS images) at 60 FPS with 78% GPU load, and that's with SweetFX enabled for SMAA and LumaSharpen: https://forums.geforce.com/default/topic/1066696/3d-vision/the-witcher-3-conflicting-with-hd-reworked-5-1/

Utilization does jump to 99% in heavily wooded areas and in cities with a lot going on (e.g. the Novigrad market / central square), and the FPS may drop to 50-55, but the GTX 1180 Ti is right around the corner and will, in all likelihood, be 50% faster than the 1080 Ti.

We need to be able to use 3D Vision at more than 60 FPS / 120 Hz. I don't know why this is even an issue to be honest.
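For anyone who wants the raw numbers behind the 45 / 60 / 72 FPS comparison and the frame-sequential refresh relationship, here is a quick frame-time sketch (plain Python, purely illustrative arithmetic, not benchmark data):

# Frame-time arithmetic for the FPS figures mentioned above (illustrative only).
for fps in (45, 60, 72):
    print(f"{fps} FPS -> {1000.0 / fps:.1f} ms per frame")

# Frame-sequential 3D shows each eye on alternate refreshes, so the display
# refresh rate has to be twice the per-eye frame rate.
for per_eye_fps in (60, 72):
    print(f"{per_eye_fps} FPS per eye needs a {2 * per_eye_fps} Hz refresh rate")

So 60 FPS per eye is exactly the 120 Hz ceiling being discussed, and 72 FPS per eye would need the PG278Q's full 144 Hz.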

i7 8700k @ 5.1 GHz w/ EK Monoblock | GTX 1080 Ti FE + Full Nickel EK Block | EK SE 420 + EK PE 360 | 16GB G-Skill Trident Z @ 3200 MHz | Samsung 850 Evo | Corsair RM1000x | Asus ROG Swift PG278Q + Alienware AW3418DW | Win10 Pro 1703

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

#31
Posted 08/05/2018 12:11 AM   
[quote="masterotaku"][quote="xXxStarManxXx"] 5. 60 Hz is 60 Hz. Why are we still limited to 60 Hz? I mean, technically I should at least be able to run 3D Vision titles at 72 Hz on my PG278Q considering it is a 144 Hz panel? And why does 3D Vision not work with G-Sync? I have not heard a satisfying answer to this. When one experiences 90-120+ Hz in G-Sync, having to drop to 60 Hz with V-Sync is a noticeable downgrade in responsiveness and smoothness.[/quote] The problem here is most probably TN panel limitations. For example, in ULMB, there is a point at higher refresh rates where you can't avoid the crosstalk area (even in 2D) getting bigger and bigger. Even at 125Hz there is some difference compared to 120Hz, and I've heard that 240Hz monitors, which can have ULMB up to ~151Hz, have even more at those high refresh rates. Lightboost has a better tuned overdrive that avoids crosstalk in most of the screen, but I don't think it can be pushed much higher. Meanwhile, IPS/AHVA/AMVA have too slow response times for 3D, OLED monitors (with the features we want) are inexistent at the moment, and it's still too early for microLED. We are in a tough spot where you have TN at 120Hz or nothing (only talking about normal 3D Vision monitors). At least we got 1440p. By the way, there are ways to force 3D Vision at almost any refresh rate (>=64Hz) if you have ULMB and can deal with reversed eyes and crosstalk not as good as Lightboost. Although my limit was 125Hz with ULMB for timing reasons. The glasses can even work fine at 165Hz.[/quote] Ah I see, and I can reverse the eyes with 3D Vision 2 glasses by holding down the power button correct? PG278Q does ULMB, have you tried the above with this monitor? I'm curious to try 70 FPS. What steps exactly would I need to do to accomplish this? Thanks in advance.
masterotaku said:
xXxStarManxXx said:
5. 60 Hz is 60 Hz. Why are we still limited to 60 Hz? I mean, technically I should at least be able to run 3D Vision titles at 72 Hz on my PG278Q, considering it is a 144 Hz panel. And why does 3D Vision not work with G-Sync? I have not heard a satisfying answer to this. When one experiences 90-120+ Hz with G-Sync, having to drop to 60 Hz with V-Sync is a noticeable downgrade in responsiveness and smoothness.


The problem here is most probably TN panel limitations. For example, in ULMB, there is a point at higher refresh rates where you can't avoid the crosstalk area (even in 2D) getting bigger and bigger. Even at 125Hz there is some difference compared to 120Hz, and I've heard that 240Hz monitors, which can have ULMB up to ~151Hz, have even more at those high refresh rates. Lightboost has a better tuned overdrive that avoids crosstalk in most of the screen, but I don't think it can be pushed much higher.

Meanwhile, IPS/AHVA/AMVA panels have response times that are too slow for 3D, OLED monitors (with the features we want) are nonexistent at the moment, and it's still too early for microLED. We are in a tough spot where it's TN at 120Hz or nothing (only talking about normal 3D Vision monitors). At least we got 1440p.

By the way, there are ways to force 3D Vision at almost any refresh rate (>=64Hz) if you have ULMB and can deal with reversed eyes and crosstalk not as good as Lightboost. Although my limit was 125Hz with ULMB for timing reasons. The glasses can even work fine at 165Hz.


Ah, I see. And I can reverse the eyes with 3D Vision 2 glasses by holding down the power button, correct? The PG278Q does ULMB; have you tried the above with this monitor? I'm curious to try 70 FPS. What steps exactly would I need to follow to accomplish this? Thanks in advance.


#32
Posted 08/05/2018 04:03 PM   
[quote="xXxStarManxXx"]Ah I see, and I can reverse the eyes with 3D Vision 2 glasses by holding down the power button correct? PG278Q does ULMB, have you tried the above with this monitor? I'm curious to try 70 FPS. What steps exactly would I need to do to accomplish this? Thanks in advance. [/quote] To clarify, I don't think this is true. It works on some TV based shutter glasses, but not on 3D Vision glasses. That's actually different than how they designed 3D Vision as a whole- they were trying to make it foolproof, not lots of options. Which is why you cannot choose different eyes during the SetupWizard, and why we have to use EDID workarounds. They only wanted to stamp 3D Vision on a high quality experience. I just tried this running the nvidia stereo test using 3D Vision 2 glasses, and it does not switch eyes.
xXxStarManxXx said:Ah, I see. And I can reverse the eyes with 3D Vision 2 glasses by holding down the power button, correct? The PG278Q does ULMB; have you tried the above with this monitor? I'm curious to try 70 FPS. What steps exactly would I need to follow to accomplish this? Thanks in advance.

To clarify, I don't think this is true. It works on some TV-based shutter glasses, but not on 3D Vision glasses. That's actually different from how they designed 3D Vision as a whole: they were trying to make it foolproof, not loaded with options. That's why you cannot choose different eyes during the SetupWizard, and why we have to use EDID workarounds. They only wanted to stamp 3D Vision on a high-quality experience.

I just tried this running the nvidia stereo test using 3D Vision 2 glasses, and it does not switch eyes.

Acer H5360 (1280x720@120Hz) - ASUS VG248QE with GSync mod - 3D Vision 1&2 - Driver 372.54
GTX 970 - i5-4670K@4.2GHz - 12GB RAM - Win7x64+evilKB2670838 - 4 Disk X25 RAID
SAGER NP9870-S - GTX 980 - i7-6700K - Win10 Pro 1607
Latest 3Dmigoto Release
Bo3b's School for ShaderHackers

#33
Posted 08/05/2018 09:35 PM   
[quote="bo3b"][quote="xXxStarManxXx"]Ah I see, and I can reverse the eyes with 3D Vision 2 glasses by holding down the power button correct? PG278Q does ULMB, have you tried the above with this monitor? I'm curious to try 70 FPS. What steps exactly would I need to do to accomplish this? Thanks in advance. [/quote] To clarify, I don't think this is true. It works on some TV based shutter glasses, but not on 3D Vision glasses. That's actually different than how they designed 3D Vision as a whole- they were trying to make it foolproof, not lots of options. Which is why you cannot choose different eyes during the SetupWizard, and why we have to use EDID workarounds. They only wanted to stamp 3D Vision on a high quality experience. I just tried this running the nvidia stereo test using 3D Vision 2 glasses, and it does not switch eyes. [/quote] Ok, I just read this here somewhere a few days ago, I guess it's incorrect.
bo3b said:
xXxStarManxXx said:Ah, I see. And I can reverse the eyes with 3D Vision 2 glasses by holding down the power button, correct? The PG278Q does ULMB; have you tried the above with this monitor? I'm curious to try 70 FPS. What steps exactly would I need to follow to accomplish this? Thanks in advance.

To clarify, I don't think this is true. It works on some TV-based shutter glasses, but not on 3D Vision glasses. That's actually different from how they designed 3D Vision as a whole: they were trying to make it foolproof, not loaded with options. That's why you cannot choose different eyes during the SetupWizard, and why we have to use EDID workarounds. They only wanted to stamp 3D Vision on a high-quality experience.

I just tried this running the nvidia stereo test using 3D Vision 2 glasses, and it does not switch eyes.



Ok, I just read this here somewhere a few days ago, I guess it's incorrect.


#34
Posted 08/06/2018 05:20 AM   
While the 'press a button to reverse the eyes on official 3DV glasses' part is incorrect, as bo3b highlights, the result itself is achievable: 3DV glasses can be made to work in reverse by other means...

In this community, you will find many people using different glasses for 3DV. I myself use DLP-Link glasses, for which the reverse button works. I remember in the old days with CRT monitors, some users ran LCD shutter glasses as high as 144Hz without issue - the longer phosphor fade time of the CRT was offset by a faster scan time, which compensated well for the residual image.

For 3DV, however, I have worn the glasses upside down for testing.

I have 3 pairs of official 3DV glasses - one pair I have even hardware-modified by opening up the LCDs and reversing the wiring so they work in reverse - all quite doable with a little patience.

Also, would you clarify what you mean by 60Hz and 60FPS? You seem to use them interchangeably, but they are very different things.

I and many others, for example, use a 120Hz projector which does 60fps in 3DV but 120fps in 2D gaming. With the right software, it could be used at 120fps 3DV if a 3D driver presented each motion-progression frame as a separate frame (currently 3DV shows the same motion frame over 2 refreshes, one per eye, hence the 3DV frame rate being half the refresh rate), or perhaps 120Hz might be possible in some interlaced mode if it could be made to work.
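To make the presentation order concrete, here is a rough sketch (plain Python; the function names are illustrative, not the actual NVIDIA driver API) of the current scheme versus the hypothetical full-rate one:

# Rough sketch of the two presentation schemes (illustrative, not NVIDIA's API).

def current_3dv(motion_frames):
    # Current 3D Vision: each motion frame is held for two refreshes,
    # shown once per eye, so motion updates at refresh_rate / 2.
    for f in motion_frames:
        yield (f, "left")
        yield (f, "right")

def full_rate_3dv(motion_frames):
    # Hypothetical mode: every refresh carries a new motion frame,
    # alternating eyes, so motion updates at the full refresh rate.
    for i, f in enumerate(motion_frames):
        yield (f, "left" if i % 2 == 0 else "right")

print(list(current_3dv(range(4))))    # 8 refreshes for 4 motion steps
print(list(full_rate_3dv(range(4))))  # 4 refreshes for 4 motion steps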

I think users using a checkerboard pattern might also be able to use >60fps if their display is capable.
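For readers unfamiliar with the checkerboard format: it interleaves the two eye views pixel by pixel, so both eye views can ride in a single refresh and the display does the separation. A minimal packing sketch (plain Python, illustrative only):

# Minimal checkerboard packing: even (x + y) pixels come from the left-eye
# image, odd ones from the right-eye image (illustrative only).
def checkerboard_pack(left, right):
    height, width = len(left), len(left[0])
    return [[left[y][x] if (x + y) % 2 == 0 else right[y][x]
             for x in range(width)]
            for y in range(height)]

left_eye  = [["L"] * 4 for _ in range(4)]
right_eye = [["R"] * 4 for _ in range(4)]
for row in checkerboard_pack(left_eye, right_eye):
    print("".join(row))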

Windows 10 64-bit, Intel 7700K @ 5.1GHz, 16GB 3600MHz CL15 DDR4 RAM, 2x GTX 1080 SLI, Asus Maximus IX Hero, Sound Blaster ZxR, PCIe Quad SSD, Oculus Rift CV1, DLP Link PGD-150 glasses, ViewSonic PJD6531w 3D DLP Projector @ 1280x800 120Hz native / 2560x1600 120Hz DSR 3D Gaming.

#35
Posted 08/06/2018 07:45 AM   
[quote="xXxStarManxXx"]Ok, I just read this here somewhere a few days ago, I guess it's incorrect. [/quote] Age must be catching up =/ Double checked that post of D-Man and cant recall why I thought he was talking about official 3D Vision 2 glasses. Sorry for the confusion.
xXxStarManxXx said:Ok, I just read this here somewhere a few days ago, I guess it's incorrect.
Age must be catching up =/
I double-checked D-Man's post and can't recall why I thought he was talking about official 3D Vision 2 glasses.
Sorry for the confusion.

#36
Posted 08/06/2018 01:02 PM   
[quote="RAGEdemon"]While the 'pressing button for reverse on official 3DV glasses' part is incorrect, as bo3b highlights; the result itself is correct, and 3DV glasses can be made to work in reverse otherwise... In this community, you will find many people using different glasses for 3DV. I myself use DLP-link glasses, for which the reverse button works. I remember in the old days with CRT monitors, some users used some LCD glasses to as high as 144Hz without issue - the longer phosphorous fade time in the CRT was offset by a faster scan time which compensated well for the residual image. For 3DV however, for testing, I have worn the glasses upside down. I have 3 pairs of official 3DV glasses - one pair I have even hardware modified by opening up the LCD's and reversing the wiring so they work in reverse - all quite doable with a little patience. Also, would you clarify what you mean by 60Hz and 60FPS? You seem to use them interchangeably but they are very different things. I and many others, for example, use a 120Hz projector which does 60fps in 3DV but 120fps in 2D gaming. With the right software, it could be used at 120fps 3DV if a 3D driver were to present each motion progression frame as a separate frame (currently 3DV provides the same motion frame over 2 frames, each from a different perspective, hence the 3DV fps being cut in half of the refresh rate), or perhaps 120Hz might be possible in even some interlaced mode if it could be got to work. I think users using a check-board pattern also might be able to use >60fps if their display is capable.[/quote] Thanks for the clarification. If I said 60 Hz is 60 FPS in 3D Vision that was erroneous, 60 FPS 3D Vision = 120 Hz. Is there a way to use different glasses with 3D Vision and PG278Q? Thanks for the reply. [quote="KoelerMeester"][quote="xXxStarManxXx"]Ok, I just read this here somewhere a few days ago, I guess it's incorrect. [/quote] Age must be catching up =/ Double checked that post of D-Man and cant recall why I thought he was talking about official 3D Vision 2 glasses. Sorry for the confusion. [/quote] No worries!
RAGEdemon said:While the 'press a button to reverse the eyes on official 3DV glasses' part is incorrect, as bo3b highlights, the result itself is achievable: 3DV glasses can be made to work in reverse by other means...

In this community, you will find many people using different glasses for 3DV. I myself use DLP-Link glasses, for which the reverse button works. I remember in the old days with CRT monitors, some users ran LCD shutter glasses as high as 144Hz without issue - the longer phosphor fade time of the CRT was offset by a faster scan time, which compensated well for the residual image.

For 3DV, however, I have worn the glasses upside down for testing.

I have 3 pairs of official 3DV glasses - one pair I have even hardware-modified by opening up the LCDs and reversing the wiring so they work in reverse - all quite doable with a little patience.

Also, would you clarify what you mean by 60Hz and 60FPS? You seem to use them interchangeably, but they are very different things.

I and many others, for example, use a 120Hz projector which does 60fps in 3DV but 120fps in 2D gaming. With the right software, it could be used at 120fps 3DV if a 3D driver presented each motion-progression frame as a separate frame (currently 3DV shows the same motion frame over 2 refreshes, one per eye, hence the 3DV frame rate being half the refresh rate), or perhaps 120Hz might be possible in some interlaced mode if it could be made to work.

I think users using a checkerboard pattern might also be able to use >60fps if their display is capable.


Thanks for the clarification. If I said 60 Hz is 60 FPS in 3D Vision, that was erroneous; 60 FPS in 3D Vision = 120 Hz.

Is there a way to use different glasses with 3D Vision and the PG278Q? Thanks for the reply.

KoelerMeester said:
xXxStarManxXx said:Ok, I just read this here somewhere a few days ago, I guess it's incorrect.
Age must be catching up =/
I double-checked D-Man's post and can't recall why I thought he was talking about official 3D Vision 2 glasses.
Sorry for the confusion.



No worries!


#37
Posted 08/06/2018 09:04 PM   
GeForce RTX Turing?:
https://s33.postimg.cc/e4opdnnxb/GF_RTX.png


Anyway, the new Quadro RTX cards with the Turing chip are here (€2,000, €6,000 and €10,000):
https://s33.postimg.cc/5z6nfh4tb/NVIDIA-_RTX-6000-1000x324.jpg
https://s33.postimg.cc/xmjctkxpr/NVIDIA-_Turing-_GPU-1000x362.jpg
https://s33.postimg.cc/eu7hq0gr3/NVIDIA-_Turing-vs-_Pascal-1-1000x594.jpg
https://s33.postimg.cc/jsv04jcu7/NVIDIA-_Turing-vs-_Pascal-3-1000x567.jpg
https://s33.postimg.cc/idtffsolr/Quadro-_RTX-6000-4-1000x469.jpg
https://www.pcgamer.com/rtx-2080-release-date/


Looks like the topic title needs to change.

The very powerful and the very stupid have one thing in common. Instead of altering their views to fit the facts, they alter the facts to fit their views ... which can be very uncomfortable if you happen to be one of the facts that needs altering.

-- Doctor Who, "Face of Evil"

#39
Posted 08/15/2018 01:20 AM   
I think I'll be skipping this gen and waiting for at least a Ti version.

I hope by then I can abandon SLI, because support has gotten to the point where it's pretty much not worth it. =(

i7-4790K CPU 4.8Ghz stable overclock.
16 GB RAM Corsair
ASUS Turbo 2080TI
Samsung SSD 840Pro
ASUS Z97-WS3D
Surround ASUS Rog Swift PG278Q(R), 2x PG278Q (yes it works)
Obutto R3volution.
Windows 10 pro 64x (Windows 7 Dual boot)

#40
Posted 08/15/2018 10:51 AM   
[quote="Necropants"]I'll be skipping this gen I think and waiting for a TI version at least. I hope then around I can abandon SLI because support has been getting to the point where it's pretty much not worth it. =([/quote] Turing won't have SLI support, nvidia seems to be moving to modular virtualization.
Necropants said:I think I'll be skipping this gen and waiting for at least a Ti version.

I hope by then I can abandon SLI, because support has gotten to the point where it's pretty much not worth it. =(


Turing won't have SLI support; Nvidia seems to be moving to modular virtualization.

How to provide valuable feedback to NVIDIA
How to enable NVIDIA Graphics Driver and GeForce Experience installer logging
Wagnard Tools (DDU,GMP,TDR Manupulator)-(Updated 09/19/14)
Fix for Control Panel not saving settings
How to make the NVCP display in English
PASCAL/TURING WARNING: Bundled and Cheap PCI-E Riser cables can cause decoder corruption and TDR's

In Memory of Chris "ChrisRay" Arthington, 1982-2010

OS:Windows 7 SP1, Case:NZXT Phantom 820, PSU:Seasonic X-850, Cooler: ThermalRight Silver Arrow IB-E Extreme
CPU:Intel Xeon x5690 @ 4.2Ghz, Mainboard:Asus Rampage III Extreme, Memory:48GB Corsair Vengeance LP 1600
Video:EVGA Geforce GTX 1080 Founders Edition, NVidia Geforce GTX 1060 Founders Edition
Monitor:ROG PG279Q, BenQ BL2211, Sound:Creative XFI Titanium Fatal1ty Pro
SDD:Crucial MX300 275 and 525, Crucial MX500 2000 and 1000
HDD:500GB Spinpoint F3, 1TB WD Black, 2TB WD Red, 1TB WD Black

OS:Windows 10, Case:NZXT Phantom 410, PSU:Corsair 620HX, Cooler: ThermalRight TRUE Spirit 120M BW Rev.A
CPU:Intel Xeon x5670 @ 3.8Gz, Mainboard:Asus Rampage II Gene, Memory:24GB Corsair Vengeance LP 1600
Video:EVGA Geforce GTX 680+ 4GB
Monitor:55" Thorn TV, Sound:Sony Muteki 7.1
SDD:Crucial MX200 250, HDD: 1TB Samsung Spinpoint F3

#41
Posted 08/15/2018 12:33 PM   
It will have NVLink, which now lets you double the VRAM: NVLink x16 for the RTX 2080 Ti and RTX Titan, and x8 for the RTX 2080.

VRAM will be Samsung GDDR6 with:
12GB 384-bit for the Titan (RT102 chip with 4352 CUDA cores)
11GB 352-bit for the 2080 Ti (RT102 chip with 2944 CUDA cores)
8GB 256-bit for the 2080 (RT104 chip)

All (for the Founders Edition) with 3 DisplayPort, 1 HDMI and 1 USB Type-C.
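If those bus widths are right, a back-of-the-envelope bandwidth calculation looks like this (plain Python; the 14 Gbps per-pin GDDR6 data rate is my assumption, not something confirmed above):

# Rough GDDR6 bandwidth from bus width; 14 Gbps per pin is an assumption.
DATA_RATE_GBPS = 14  # Gbit/s per pin

for name, bus_bits in (("Titan", 384), ("2080 Ti", 352), ("2080", 256)):
    bandwidth_gb_s = bus_bits * DATA_RATE_GBPS / 8  # bits/s -> bytes/s
    print(f"{name}: {bus_bits}-bit bus -> ~{bandwidth_gb_s:.0f} GB/s")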
[quote="Dugom"]It will have NVLink which let you now double the Vram. NVLink x16 for RTX 2080Ti and RTX Titan and x8 for RTX 2080 Vram will be Samsung GDDR6 with: 12GB 384bit for Titan (Chip RT102 with 4352 CUDA) 11GB 352bit for 2080Ti (Chip RT102 wirh 2944 CUDA) 8GB 256bit for 2080 (Chip RT104) Always (for Founder E) with 3 DisplayPort, 1 HDMI and 1 USB Type C. [/quote] Looks like Nvidia has yet again moved their entire product stack down one notch, and will likely raise prices. So, what used to be the "x80ti" is now the "titan" what used to be the "x80" is now the "ti" what used to be the "x70" is now "x80" This is the second time they've moved everything down a notch, too. Originally, the x80ti was the same exact chip as the x80, just with an overclock applied. Often, you didn't get an x80ti in a given gen, just the x80 - that was the top. It's all more than a little insulting to anyone paying attention. I've never bought a lobotomized chip before, I've always waited for the real thing (which was previously the x80, and then then x80ti), but now that'll likely never happen outside of the "titan" cards, which are a joke. So then, I'll have to change my metrics - I'll wait for 2x the performance of my 1080ti at 1000 CAD, or 1.5x at 500 CAD. Which means I'll be waiting a damn long time. Nvidia has taken a page straight from Intel's playbook here: remember when Intel sold 4 core CPUs at 500 dollars for almost a decade straight, and if you wanted 2 or 4 more cores it suddenly became 1500-3000 dollars? Everyone thought "that's fine, that's all you need anyways". And then AMD finally released RyZen with 8 cores for 450, and Intel immediately shit their pants and gave themselves a brain contusion. It'll be a while, but here's hoping Nvidia gets to experience that same thing sometime. I expect this gen of GPUs will become infamous - the time when Nvidia called their x70 an x80 and sold it for x80ti prices... and got away with it.
Dugom said:It will have NVLink, which now lets you double the VRAM: NVLink x16 for the RTX 2080 Ti and RTX Titan, and x8 for the RTX 2080.

VRAM will be Samsung GDDR6 with:
12GB 384-bit for the Titan (RT102 chip with 4352 CUDA cores)
11GB 352-bit for the 2080 Ti (RT102 chip with 2944 CUDA cores)
8GB 256-bit for the 2080 (RT104 chip)

All (for the Founders Edition) with 3 DisplayPort, 1 HDMI and 1 USB Type-C.


Looks like Nvidia has yet again moved their entire product stack down one notch, and will likely raise prices.

So, what used to be the "x80ti" is now the "titan"
what used to be the "x80" is now the "ti"
what used to be the "x70" is now "x80"

This is the second time they've moved everything down a notch, too. Originally, the x80ti was exactly the same chip as the x80, just with an overclock applied. Often you didn't get an x80ti in a given gen, just the x80 - that was the top.

It's all more than a little insulting to anyone paying attention. I've never bought a lobotomized chip before; I've always waited for the real thing (which was previously the x80, and then the x80ti), but now that'll likely never happen outside of the "titan" cards, which are a joke. So then, I'll have to change my metrics - I'll wait for 2x the performance of my 1080ti at 1000 CAD, or 1.5x at 500 CAD.

Which means I'll be waiting a damn long time. Nvidia has taken a page straight from Intel's playbook here: remember when Intel sold 4-core CPUs at 500 dollars for almost a decade straight, and if you wanted 2 or 4 more cores it suddenly became 1500-3000 dollars? Everyone thought "that's fine, that's all you need anyway". And then AMD finally released Ryzen with 8 cores for 450, and Intel immediately shit their pants and gave themselves a brain contusion. It'll be a while, but here's hoping Nvidia gets to experience that same thing sometime. I expect this gen of GPUs will become infamous - the time when Nvidia called their x70 an x80 and sold it for x80ti prices... and got away with it.

#43
Posted 08/15/2018 02:28 PM   
I'd say you're overthinking it a bit!

You're absolutely right when you allude to lack of competition being the issue, but naming conventions are arbitrary and both AMD and Nvidia have rebranded and altered their product lines many times in the past.

I'm excited for the new cards and I expect that they will be expensive, but there will be a general price adjustment, so we will get the current cards a bit cheaper (at least until the GPU stock runs out), and there will be the option of paying a premium for those who want it.

I don't expect that will change until later next year when AMD catch up a bit.

Also bear in mind there's still speculation around specs and price.

Gigabyte RTX2080TI Gaming OC, I7-6700k ~ 4.4Ghz, 3x BenQ XL2420T, BenQ TK800, LG 55EG960V (3D OLED), Samsung 850 EVO SSD, Crucial M4 SSD, 3D vision kit, Xpand x104 glasses, Corsair HX1000i, Win 10 pro 64/Win 7 64https://www.3dmark.com/fs/9529310

#44
Posted 08/16/2018 11:04 AM   
rustyk21 is right. I didn't want to open a can of worms, but what I want to say is that we have to be careful what we consume as consumers, and not swallow BS, i.e. marketing.

The only thing that ought to matter to us is: "Is price X worth paying for performance Y (and possibly feature Z, i.e. 3D Vision)?"

Everything else, including naming convention, die size, disabled parts, memory / core bandwidth / power, shroud colour, etc., is all marketing speak.

Also, one needs to keep in mind that the primary and overriding goal of any company in the world is to make money, full stop (unless it is a non-profit, of course).

We as consumers should not be shocked when a company - any company - tries to maximise its profits by any means. Companies are not your friends - they just attempt to portray themselves as such because the image sells more, e.g. brand loyalty, and hence it is profitable to do so. Frankly, they don't give a toss about you as a consumer, aside from how much you will pay for their product. The only time they will pretend to care is if you expose their BS to more people, e.g. on social media, where they might potentially lose sales from the bad publicity.

The truth is that they have teams of analysts being paid $$$$$$ to try and figure out how much you will be willing to pay, and how much they can fleece you for. If the prices are high, then they have calculated that this is the price that your peers are willing to pay.

No company in its right mind would sell a product for $10 if it could sell it for $20 and shift a similar volume. In fact, it is illegal to do so under the company's and country's own rules and regulations - the articles of association et al.

The people who run the company, i.e. the directors and executives, have one goal above everything else (even the token ethics policies put in place by HR solely to avoid lawsuits): to increase shareholder wealth. If they can't do that, then they won't get their bonuses at the end of the year, and the board will find someone who can (hence the usual short-sighted strategies of companies).

FYI, the only reason AMD is charging low prices for their high-core-count CPUs is that, priced higher, they wouldn't sell. Don't think for a second that they wouldn't raise prices many-fold if their performance were superior to Intel's. In fact, think back to Athlon XP vs. Pentium, when AMD's performance was superior - remember how 'unreasonably' high AMD's prices were and how much profit AMD made back then? Yeah.

AMD/Intel/Nvidia/ATi et al. are all the same - they exist solely to make money for their shareholders. [BTW, the AMD CEO is a relative of the Nvidia CEO.] Let's not buy into their game; let's instead try to play smart. Let's keep it simple:

All one needs to ask oneself is: "Does this product's VALUE TO ME outweigh its price tag?"

If the answer is yes, then we shouldn't care in the least, even if the chip is 99% disabled.


#45
Posted 08/16/2018 12:37 PM   