[quote="Pirateguybrush"]If you do get time to write up a guide, that would be amazing.
The W1070 isn't 3d vision ready, but 3d tv play - would that make a difference?[/quote]
This is not a problem, and neither is the Optoma EH500; this is why I would provide the gutted Acer driver. The driver forces Nvidia to think the projector is actually a 3D Vision Ready Acer LCD (in other words, it enables frame sequential and disables 3D TV Play). Then we just add the resolutions, refresh rates, and manual timings that are required/possible; you would figure these out in the test phase, and I would provide details on how to place them in the EDID override file.
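For anyone who wants a head start on the timing math, here's a rough sketch of how I'd sanity-check a candidate mode before putting it into the override; the blanking values are CVT-reduced-blanking-style placeholders, not the exact timings from my file:
[code]
# Rough sketch: estimate the pixel clock a candidate mode needs.
# Blanking values are CVT-RB-style placeholders, not my actual override timings.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=30):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

for w, h, hz in [(1600, 900, 110), (1600, 900, 112), (1920, 1080, 120)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= 340 else "exceeds"
    print(f"{w}x{h} @ {hz} Hz -> ~{clk:.1f} MHz ({verdict} the 340 MHz HDMI 1.3/1.4 TMDS limit)")
[/code]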
[quote="Paul33993"]Some great work here, CaptainTaco.[/quote]
Thanks, just playing though, I don't consider it "work", heh heh.
EDIT:
Oh! I forgot to mention: I realized while I was sleeping why the VESA sync cable isn't required... duh... VESA sync output from the projector is only needed for frame packing. With HDMI 1.4a/1.4b frame packing, Nvidia isn't doing the syncing between the projector and emitter; it is just packing the frames, and it is up to the projector to sync the IR signal. In frame sequential mode, however (how we set up this experiment, and how I will be setting up the experiment with the EH501), the display adapter does the syncing with the emitter, so it is all processed locally on the machine; therefore, there is no need for communication between the projector and emitter.
CaptainTaco, could you share the inf that you created for the EH501 which allowed for 1600x900 110-112hz 3D? I'd like to experiment with it and compare to the inf I provided.
Gigabyte Gaming 5 Z170X, i7-6700K @ 4.4ghz, Asus GTX 2080 ti Strix OC , 16gb DDR4 Corsair Vengence 2666, LG 60uh8500 and 49ub8500 passive 4K 3D EDID, Dell S2716DG.
[quote="Pirateguybrush"]Out of curiosity, would this be affected by cable length/quality?[/quote]
Only if you are using a cable older than HDMI 1.3, or an HDMI cable longer than 15 feet (without a RedMere chip). The port matters more than the cable, as the port is where the standards are implemented in the microchip (on both the output side, such as the computer's graphics card, and the input side, such as the projector's inputs). The cables, other than RedMere HDMI cables, don't have microchips in them, so an HDMI 1.3 cable will handle everything up to HDMI 1.4b; the bandwidth of the cables (and ports, for that matter) hasn't changed over those revisions. HDMI 1.3a-c, 1.4a, and 1.4b all have the same bandwidth: 340 MHz max TMDS clock, 3.4 Gbit/s per channel, 10.2 Gbit/s total including the 8b/10b overhead, etc. The only things that changed between revisions were display features in the microchips at the port itself, such as "3D over HDMI" (frame packing), the Ethernet channel, ARC, 4K resolution, etc.
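If you want to see where those numbers come from, here's a quick back-of-the-envelope sketch; nothing in it is measured, it's just the published HDMI 1.3/1.4 link math plus an assumed reduced-blanking 1080p120 mode:
[code]
# HDMI 1.3a through 1.4b all use the same TMDS link: 3 data channels,
# 10 bits per channel per pixel clock (8b/10b encoding), 340 MHz max clock.
MAX_TMDS_CLOCK_MHZ = 340
CHANNELS = 3
BITS_PER_CLOCK = 10  # each 8-bit byte travels as a 10-bit symbol

per_channel_gbps = MAX_TMDS_CLOCK_MHZ * BITS_PER_CLOCK / 1000   # 3.4 Gbit/s
raw_total_gbps = per_channel_gbps * CHANNELS                    # 10.2 Gbit/s
video_payload_gbps = raw_total_gbps * 8 / 10                    # 8.16 Gbit/s of actual pixel data

print(f"per channel: {per_channel_gbps} Gbit/s, raw total: {raw_total_gbps} Gbit/s, "
      f"usable payload: {video_payload_gbps:.2f} Gbit/s")

# Which is why even 1080p @ 120 Hz with reduced blanking fits on any "High Speed" cable.
# The 2080 x 1110 totals below are assumed CVT-RB-ish values, not exact timings.
pclk_mhz = 2080 * 1110 * 120 / 1e6
print(f"1080p120 needs ~{pclk_mhz:.0f} MHz, limit is {MAX_TMDS_CLOCK_MHZ} MHz")
[/code]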
[quote="CeeJayII"]CaptainTaco, could you share the inf that you created for the EH501 which allowed for 1600x900 110-112hz 3D? I'd like to experiment with it and compare to the inf I provided. [/quote]
I would be happy to once I have it sort of stable; it is way off from that right now. Give me a few more days? Right now it would look like an absolute mess if you were to compare it, lol.
On a side note, I have been getting together with a friend of mine to look at the possibilities of this projector and the EH501. He thinks he might have a Sony projector capable of handling 1080p @ 120Hz, but he still isn't sure; it is in a room under construction right now, so we can't test it (until power is added to the room and such).
That said, we got into a discussion of existing 1080p 120Hz projectors. It is my understanding that none exist natively (that aren't $20k and up). Does anybody know of a projector around $5k or less that is capable of this? Just curious, not important enough to make a new thread.
Going to play with the analog input tonight and see if I can get any higher refresh.
These are the cables I'm going to be using in my setup (they're in the mail right now).
Will they be a limiting factor in my ability to perform this wizardry you've discovered?
http://www.ebay.com.au/itm/15m-LONG-HDMI-HD-DIGITAL-VIDEO-CABLE-LEAD-fr-TV-PS3-XBOX-360-PC-COMPUTER-MONITOR-/130793575918?ssPageName=ADME:L:OC:AU:3160
http://www.ebay.co.uk/itm/10m-WHITE-V1-4-Long-HDMI-High-Speed-With-Ethernet-Cable-Lead-3D-TV-HDTV-PS3-XBOX-/300921043880?ssPageName=ADME:L:OC:AU:3160
Also, I was planning on passing through a receiver. Will that affect things too?
Sometimes after doing multiple EDID overrides, it's a good idea to go in and clean out some of the overrides that you have performed. Although Windows has a checkbox option to uninstall the driver, it doesn't always clear it out.
Make sure to delete all leftovers from the system:
•Start [Select]
•My Computer [right-click]
•Properties [Select]
•Advanced System Settings [Select]
•Environment Variables [Select]
•New [Select] (Under User variables ...)
•Set field "Variable name" to devmgr_show_nonpresent_devices
•Set field "Variable value" to 1.
•OK [Select]
•OK [Select]
•Now go to Device Manager (Start->Computer->Properties [right click])
•Menu option "View" [Select]
•Menu option "Show hidden devices" [Select]
•Monitors [Select]
•Uninstall any greyed-out display device.
•Tick on the option "Delete all driver files for this device" (if applicable)
As found in this thread by AVS forum member Tulli
http://www.avsforum.com/t/1227161/edid-overrides-to-solve-bitstreaming-issues-for-ati-5xxxs
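If you'd rather not click through the System Properties dialogs, the environment variable step above can also be scripted. A minimal sketch (Windows only, Python standard library; it does the same thing as running "setx devmgr_show_nonpresent_devices 1" from a command prompt):
[code]
# Sketch: set the per-user environment variable without the dialogs (Windows only).
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, "Environment", 0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "devmgr_show_nonpresent_devices", 0, winreg.REG_SZ, "1")

# Programs that were already running (including an open Device Manager) won't see the
# new variable until they are restarted, or until you log off and back on.
[/code]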
Sorry I have been absent, all; I have been extremely busy with both work and... well... cleaning and prepping my house. I swear I didn't forget about you guys. I spent my weekend moving my DLP, removing shelves and such from the wall, repairing holes and cracks in the wall, compounding, priming, and sanding/painting a projector screen onto the wall (using Sherwin-Williams ProClassic Smooth Enamel Satin Extra White, # B20 W 51, which has been shown to be almost identical to the StudioTek 100 flat-gain screen; actually, it outperforms the StudioTek by a little bit). For about $100, you can make a screen worth about $1,500.
Anyway, I didn't get much more testing done, nor was I able to write up a tutorial yet, but I will get to it eventually, I promise.
[quote="Pirateguybrush"]These are the cables I'm going to be using in my setup (they're in the mail right now).
Will they be a limiting factor in my ability to perform this wizardry you've discovered?
http://www.ebay.com.au/itm/15m-LONG-HDMI-HD-DIGITAL-VIDEO-CABLE-LEAD-fr-TV-PS3-XBOX-360-PC-COMPUTER-MONITOR-/130793575918?ssPageName=ADME:L:OC:AU:3160
http://www.ebay.co.uk/itm/10m-WHITE-V1-4-Long-HDMI-High-Speed-With-Ethernet-Cable-Lead-3D-TV-HDTV-PS3-XBOX-/300921043880?ssPageName=ADME:L:OC:AU:3160
Also, I was planning on passing through a receiver. Will that affect things too?[/quote]
Those would be fine. To be honest, as long as a cable is labeled "High Speed HDMI", it complies with the 10.2 Gbit/s standard (HDMI 1.3 and above), which is perfectly fine for anything we are doing. As a matter of fact, the cable quality itself is almost entirely irrelevant; unless the connectors are literally so bad that they fail, a higher-quality cable will gain you no performance increase. You are in Australia, I presume (based on the listings you posted); otherwise I would recommend Monoprice cables, as they are extremely cheap and pretty decent quality (connectors). Not sure if Monoprice ships to Australia, but I am guessing not.
Edit: oops, forgot to comment on the second part of your question.
As for the receiver, this is not an issue (assuming your receiver can pass the signal through natively, with no changes to the video such as upscaling or downscaling).
As a matter of fact, the only way to get your display to fully work through a receiver (especially with 3D Vision) is to use an EDID override; otherwise Windows will read your display capabilities from the receiver, not from the display (the projector).
In fact, most of my testing (other than the first few runs, before I felt like rewiring my setup) was done through my receiver, an Outlaw Audio Model 975. The only thing to mention is that you will want to use Notepad++ (after exporting the MonInfo report for your receiver) to copy the audio capabilities section into the override file; otherwise you will lose recognition of your receiver's audio capabilities (they would be overridden by whatever override file you use).
- I will post this part in the tutorial as well.
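For anyone curious what that audio capabilities section actually represents: it is the list of Short Audio Descriptors in the EDID's CTA-861 extension block. Here's a rough sketch that dumps them from a raw 256-byte EDID binary; the "edid.bin" filename is just for illustration, export the raw data however your tool allows:
[code]
# Sketch: list the audio formats advertised in an EDID's CTA-861 extension block.
# Assumes a raw 256-byte EDID dump saved as "edid.bin" (filename is illustrative).
AUDIO_FORMATS = {1: "LPCM", 2: "AC-3", 7: "DTS", 10: "E-AC-3", 11: "DTS-HD", 12: "TrueHD"}

def short_audio_descriptors(edid: bytes):
    if len(edid) < 256 or edid[126] == 0 or edid[128] != 0x02:
        return []                 # no CTA-861 extension block present
    ext = edid[128:256]
    dtd_start = ext[2]            # data block collection runs from byte 4 up to this offset
    sads, i = [], 4
    while i < dtd_start:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        if tag == 1:              # tag 1 = Audio Data Block, made of 3-byte descriptors
            for j in range(i + 1, i + 1 + length, 3):
                fmt = (ext[j] >> 3) & 0x0F
                channels = (ext[j] & 0x07) + 1
                sads.append((AUDIO_FORMATS.get(fmt, f"format {fmt}"), channels))
        i += 1 + length
    return sads

with open("edid.bin", "rb") as f:
    for name, ch in short_audio_descriptors(f.read()):
        print(f"{name}, up to {ch} channels")
[/code]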
[quote="D-Man11"]Sometimes after doing multiple EDID overrides, it's a good idea to go in and clean out some of the overrides that you have performed. Although windows has a check box option to uninstall the drivr, it doesn't always clear it out.
Make sure to delete all leftovers from the system:
•Start [Select]
•My Computer [right-click]
•Properties [Select]
•Advanced System Settings [Select]
•Environment Variables [Select]
•New [Select] (Under User variables ...)
•Set field "Variable name" to devmgr_show_nonpresent_devices
•Set field "Variable value" to 1.
•OK [Select]
•OK [Select]
•Now go to Device Manager (Start->Computer->Properties [right click])
•Menu option "View" [Select]
•Menu option "Show hidden devices" [Select]
•Monitors [Select]
•Uninstall any greyed-out display device.
•Tick on the option "Delete all driver files for this device" (if applicable)
As found in this thread by AVS forum member Tulli
http://www.avsforum.com/t/1227161/edid-overrides-to-solve-bitstreaming-issues-for-ati-5xxxs[/quote]
Thanks for the post, great point. I clean out old drivers all the time; as such, I didn't even think of posting this, but others are likely not as familiar with the concept. As a side note, Windows 7 has a checkbox to enable showing of non-present devices within Device Manager (at least in the Pro version), and Windows 8 shows these by default (as long as you check "Show hidden devices" while in Device Manager), so there's no need for the environment variable changes.
http://www.epson.com/cgi-bin/Store/jsp/Product.do?sku=V11H561020
The Epson 2030 is a great projector, and it claims to support 1080p 3D at 60Hz. However, I keep getting an HDMI compatibility warning when trying to do it from the HDMI port on my 780 Ti to the first HDMI port on the Epson 2030. When I e-mailed Epson, they said to make sure I was using a 1.4 cable.
I have tried
http://www.bestbuy.com/site/rocketfish-12-in-wall-hdmi-cable/3721001.p?id=1219093275244&skuId=3721001 (Rocketfish 12' hdmi)
as well as a smaller Rocketfish one (6') adapted to another brand, and both of the cheapie ones they sell "for consoles" (the ones that also claim to be "class 2 and 3D ready" and whatnot).
Would I have to use a DisplayPort to HDMI adapter to go from the DisplayPort on the 780 Ti to the HDMI on the Epson 2030?
Gigabyte Gaming 5 Z170X, i7-6700K @ 4.4ghz, Asus GTX 2080 ti Strix OC , 16gb DDR4 Corsair Vengence 2666, LG 60uh8500 and 49ub8500 passive 4K 3D EDID, Dell S2716DG.