
Enthusiasts Debate on Hardware Technology

Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360 EK extreme rad + 360 EK slim, all push; CPU EK suprim, GPU full cover, all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Well, as far as metals go, silver is the best conductor of electricity, around 6% better than copper.

What?? I must have missed the fact that he was being even more irrelevant to future tech. Silver chips, as in innards, would be stupid, since most connections are gold; a silver IHS is even dafter, as it's unimportant once you put a waterblock on a chip :p
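For what it's worth, the conductivity claim can be checked against commonly cited room-temperature resistivities (the values below are reference-book numbers, not from the thread); a quick sketch:

```python
# Room-temperature resistivities in ohm-metres (handbook values).
resistivity = {
    "silver": 1.59e-8,
    "copper": 1.68e-8,
    "gold":   2.44e-8,
}

# Conductivity is the reciprocal of resistivity; normalise to copper.
relative = {m: resistivity["copper"] / r for m, r in resistivity.items()}

for metal in sorted(relative, key=relative.get, reverse=True):
    print(f"{metal:>6}: {100 * relative[metal]:5.1f}% of copper")
# silver ~105.7%, copper 100.0%, gold ~68.9%: silver wins, but only by
# about 6%, and gold is actually a worse conductor than copper.
```

So silver does win among metals, but only by about 6%, and gold plating on contacts is about corrosion resistance, not conductivity.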
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,562 (0.59/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
What?? I must have missed the fact that he was being even more irrelevant to future tech. Silver chips, as in innards, would be stupid, since most connections are gold; a silver IHS is even dafter, as it's unimportant once you put a waterblock on a chip :p

Silver is better than gold for this sort of thing, since it conducts better, and you could push more power through it without needing a waterblock, compared to copper or gold.

And it's the materials that make the technology more efficient in the first place.

I'm not stuck on silver; rather, I'm suggesting an alternative premium tier for enthusiasts. It would cost more money, and hardcore enthusiasts will not have a problem with the price.
 
Joined
Jul 19, 2006
Messages
43,585 (6.74/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Aquatuning sells a silver plated waterblock. :)
 

Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
Researchers develop optical transistor:

Yeah, that's not news to me. What would be news is if they had miniaturised it (the laser) down to the right, or any sort of usable, size, and also made an actual circuit that did something: AND, NAND, anything. We'd then be on the way.

And a silver anything (CPU- or GPU-cooling based) is never going to beat an H100 at cooling said chip for the money, and water's definitely cooler anyway, in both senses of the word :) Silver chip innards? Nah. They use all sorts of weird metals (gallium) now, because they are better than anything else for the purpose... and in cost...

Aquatuning sells a silver plated waterblock.

We've both gotta be happy with that :)
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Yeah, that's not news to me. What would be news is if they had miniaturised it (the laser) down to the right, or any sort of usable, size, and also made an actual circuit that did something: AND, NAND, anything. We'd then be on the way.

And a silver anything (CPU- or GPU-cooling based) is never going to beat an H100 at cooling said chip for the money, and water's definitely cooler anyway, in both senses of the word :) Silver chip innards? Nah. They use all sorts of weird metals (gallium) now, because they are better than anything else for the purpose... and in cost...

We've both gotta be happy with that :)

This is not true, and the proof was shown when they released the 3770K: it doesn't matter what cooling system you have, it will never beat the cooling of a 2600K, because of the simple fact that the materials under the IHS are different. You can overclock higher with a 2600K, with less heat, on lesser cooling ^^. But more to the point, silver doesn't just have to be for CPUs; it could be used to transfer data faster in a PCI-E lane.

I in no way suggested silver heatsinks to beat water cooling; I'm strictly talking about copper vs. silver conductivity, and silver has no rival there among metals.
 
Joined
Dec 5, 2006
Messages
7,704 (1.22/day)
System Name Back to Blue
Processor i9 14900k
Motherboard Asrock Z790 Nova
Cooling Corsair H150i Elite
Memory 64GB Corsair Dominator DDR5-6400 @ 6600
Video Card(s) EVGA RTX 3090 Ultra FTW3
Storage 4TB WD 850x NVME, 4TB WD Black, 10TB Seagate Barracuda Pro
Display(s) 1x Samsung Odyssey G7 Neo and 1x Dell u2518d
Case Lian Li o11 DXL w/custom vented front panel
Audio Device(s) Focusrite Saffire PRO 14 -> DBX DriveRack PA+ -> Mackie MR8 and MR10 / Senn PX38X -> SB AE-5 Plus
Power Supply Corsair RM1000i
Mouse Logitech G502x
Keyboard Corsair K95 Platinum
Software Windows 11 x64 Pro
Benchmark Scores 31k multicore Cinebench - CPU limited 125w
I'll take a dive in here-

XDR memory?
Well, considering it's being used in the PS3, I would say that might say something about it.

Would optical technology surpass copper technology?

A few things here. Copper technology is limited by voltage as much as anything ("speed at which the current flows"); if you made a cable network operate at 10,000 volts, odds are you could get somewhat higher performance, at the cost of extra noise.

On to optical: the issue at hand is that we need to create a source that can not only ping faster, but accurately! We need a sensor to sense it faster, and accurately! Then, as you increase the flash rate (bandwidth requirement), you need a better and better cable to get the signal through noise-free.

Are you speaking about using silver contacts on a board or something? If that's what you mean: simple cost, even in manufacturing, and warranty? Silver doesn't hold up very well, it's very soft, so many issues. And with really what benefit? Not a lot...

Tungsten plated heat sinks?
The plating, I don't think, does anything, at least not much to count for in itself; now if you made the heatsink out of pure tungsten, it would be a different story, cost- and weight-wise as well. It adds to durability, and I think it looks nice (mainly because of all the low-quality copper polishes).
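The plating point can be put in numbers; a minimal sketch, assuming commonly cited bulk thermal conductivities (roughly 401 W/m·K for copper, 173 for tungsten; these figures and the 5 mm / 5 µm geometry are mine, not the poster's):

```python
# Bulk thermal conductivities in W/(m*K), commonly cited handbook values.
k = {"copper": 401.0, "aluminium": 237.0, "tungsten": 173.0}

# For the same slab geometry, thermal resistance scales as 1/k,
# so copper conducts heat about 2.3x better than tungsten.
ratio = k["copper"] / k["tungsten"]
print(f"copper moves heat {ratio:.1f}x better than tungsten for the same slab")

# A thin plating barely changes the stack: a 5-micron tungsten layer on a
# 5-mm copper base adds only ~0.2% to the through-plane thermal resistance.
base, plate = 5e-3, 5e-6  # thicknesses in metres (hypothetical heatsink base)
r_plain = base / k["copper"]
r_plated = r_plain + plate / k["tungsten"]
print(f"added resistance from plating: {100 * (r_plated / r_plain - 1):.2f}%")
```

So a tungsten-plated heatsink is about looks and durability; only a solid tungsten sink would change the thermals, and for the worse versus copper.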

Tri-Gate transistors?
Well, I'm an engineering double major. Yes, I think it's very realistic, and let's be honest: if it isn't, it will soon be brought to life, and Intel will have some pretty serious answering to do, won't they?

Pace of technology?
I think it moves faster than we can keep up with (no big surprise there). There is also some concern that our continuing reliance on technology is making us weak to our natural environment, and that that will be the end of our race... It's a fairly valid argument if you think about it... Scary, eh? Me, I'm just here for the ride :toast:
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I would like to point out why we are not using XDR memory in gpus and would like an enthusiasts opinion on that.
Cost. XDR is Rambus, and Rambus wants money. JEDEC oversees the SDRAM specifications, and there are no unreasonable licensing fees, which promotes competition. Competition means cheaper.


also i would like to know from an enthusiasts opinion what the technology standard should be at if there wasn't such a cost for resources and development.
We're talking about Rambus, yeah? Wishful thinking. DDR SDRAM has worked out pretty well, so I don't see any reason to replace it.


would optical technology surpass copper technology?
It already has (e.g. InfiniBand), but it is cost-prohibitive. I am hopeful that costs will come down over time.


why not use silver technology as a premium for those that can afford it? like a PCI-E 3.1 Silver Edition or (EE) Extreme Edition for conductive throughput.. also SE for silver edition would work could also stand for second edition... what about tungsten plated heat sinks?
Silver is the best conductor of electricity, but gold resists corrosion, which is why some electrical contacts are already gold plated. Everything in the computer would have to be upgraded in order for any component to benefit, and that simply isn't practical.

Graphene is the best material to make a heatsink out of with a diamond/silica TIM to prevent current from reaching the heatsink.


whats your opinion on 3d (tri-gate) transistors? just a marketing scheme or is it really revolutionary?
It is revolutionary and allows the manufacturing process to be further reduced. It also reduces leakage meaning higher clockspeeds are possible.


some people think technology moves fast... but some of us think it doesnt move fast enough.
Look at how much changed in the past 100 years compared to what changed in the 10,000 years before it. Many technologies are evolving at an exponential rate.
 
Joined
Jan 20, 2007
Messages
217 (0.03/day)
Location
Mataram-NTB, Indonesia
System Name Folding System
Processor Intel 4770K Haswell
Motherboard Gigabyte G1 Sniper
Cooling EK Clean CSQ WB + EK CoolStream RAD XTX 480 + EK GTX780Ti Full WB + Swiftech D5 Pump
Memory 4 x 4 GB DDR3 GSkill Ripjaws F3 17000 GBZL | 4 x 4GB DDR3 Corsair Dominator Platinum
Video Card(s) 2 x Inno3D GTX 780Ti 3GB SLI
Storage 2x Patriot Pyro 240GB Raid 0 | 2x WD Green 3TB Raid 0 | 4x Seagate 4TB Raid 0
Display(s) Viewsonic LED VA1938wa 19" / 2 x Dell U2312HM 23" IPS Panel
Case Corsair 900D + 1 Aerocool 14cm Fan, 3 CM Fan 12cm, 4 Scythe Slim Fan 12cm, 4 Aerocool Shark 12cm
Audio Device(s) ASUS Xonar Phoebus + Hippo cri head amp + Sennheiser HD 598 + Razer Megalodon
Power Supply Seasonic P1000 Platinum
Mouse Logitech G502
Keyboard Gigabyte Aivia Osmium Mech Keyboard
Software Windows 10 Home
I have worked in mining-company research for a few years. I'm not sure if this is suitable for this discussion, but technological advance in IT is also connected with the short supply of rare-earth elements (http://en.wikipedia.org/wiki/Rare_earth_element), as 95% of the world supply comes from China. We in the research business have a long history of difficulty obtaining some of them for research purposes; for example, researching micro/nanofracture on a chip requires a large enough quantity of some of the elements.
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
I'll take a dive in here-

XDR memory?
Well, considering it's being used in the PS3, I would say that might say something about it.

Just because it's being used in the PS3 doesn't mean anything; XDR2 is proven to be a LOT faster than DDR3.

XDR1 RAM in a PS3 runs at 230 Gbit/s (28.8 GB/s); DDR3-2400 runs at only 153 Gbit/s (19.2 GB/s).

http://www.rambus.com/us/technology/solutions/xdr2/xdr2_vs_gddr5.html

There's no way around it: XDR pwns DDR and GDDR.

If everyone's excuse is price, then why bother trying to innovate anything in the first place? It's not a valid excuse in any way, shape, or form.
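The DDR3-2400 figure quoted above can be reproduced from transfer rate times bus width; a quick sketch assuming a standard 64-bit DDR3 channel (the XDR figures depend on Rambus's signaling details, so they aren't recomputed here):

```python
# Peak bandwidth of one DDR3-2400 channel:
# 2400 million transfers/second across a 64-bit bus.
transfers_per_s = 2400e6
bus_bits = 64

gbit_per_s = transfers_per_s * bus_bits / 1e9
gbyte_per_s = gbit_per_s / 8

print(f"DDR3-2400: {gbit_per_s:.1f} Gbit/s = {gbyte_per_s:.1f} GB/s")
# -> 153.6 Gbit/s = 19.2 GB/s, matching the figures quoted in the post.
```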
 
Joined
Mar 26, 2012
Messages
231 (0.05/day)
Location
Ohio
System Name Cogito Ergo Switch
Processor i7 3930K @ 4.8Ghz
Motherboard Asus Rampage IV Extreme
Cooling Custom Water (CPU/MB/GPU Blocks, 420+240rads PP)
Memory 16GB G.Skill Ripjaws Z 2133 9-11-10-27 @ 2380 9-10-10-28
Video Card(s) EVGA GTX670 FTW 2GB SLI @ 1344core/7648mem + EVGA 650Ti PhysX
Storage Samsung 830 256GB SSD (Boot) + 1TB WD10EZEX (Games/Apps) + 1TB WD10EALX (Prog) + 2x 320GB WD/HGST
Display(s) Dell U2412HM + U2312HM + 2x P2212Hbe + Viewsonic 21.5" 1680x1050
Case NZXT Switch 810 White - MODDED
Audio Device(s) Creative X-Fi Titanium HD PCI-Express
Power Supply NZXT HALE90 850W - Custom Sleeved Cables (White, No Heatshrink)
Software Windows7 Pro x64
I am in a sub-field of Biochemical Engineering (working on higher-level degree in Psychopharmacology), so I can't come close to what many of you know regarding electronic/mechanical/computer engineering, but I think this is a very interesting discussion.

I personally feel that software is what is driving hardware innovations, but we have 6 core desktop processors that can't even be fully utilized by the vast majority of programs, and there is little motivation to design software for the masses that will utilize such resources when most people are not interested in anything more than simply having a computer that "just works". It's a Catch-22 in a way, especially for us enthusiasts.

I would love to see the industry move to "revolutionary" tech over "evolutionary", but the financial aspects of such a move would be a mess, especially when our economy is in such a poor state.
 

TheMailMan78

Big Member
Joined
Jun 3, 2007
Messages
22,599 (3.68/day)
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agelity 4 SSD , 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
I am in a sub-field of Biochemical Engineering (working on higher-level degree in Psychopharmacology), so I can't come close to what many of you know regarding electronic/mechanical/computer engineering, but I think this is a very interesting discussion.

I am in the sub-field of Astrophysics. I've been working on spewing BS across 16 different planes of existence in 13 galaxies.
 
Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
This is not true, and the proof was shown when they released the 3770K: it doesn't matter what cooling system you have, it will never beat the cooling of a 2600K, because of the simple fact that the materials under the IHS are different. You can overclock higher with a 2600K, with less heat, on lesser cooling ^^. But more to the point, silver doesn't just have to be for CPUs; it could be used to transfer data faster in a PCI-E lane.

That's a pile of BS, mate. A 2600K will not clock higher than a 3770K if the 3770K had been made with a soldered IHS instead of the job-lot silicone TIM Intel used; if the 3770K is relidded with decent TIM, or delidded, it will clock higher than the 2600K (on average, as chips vary).

And "never beat the cooling of a 2600K"? What does that statement even mean, since most will pick their own cooler and add variance there? The 2600K does put out less heat than a 3770K, but that's down to the process used and the tri-gate transistors in the IB 3770K (more surface area per gate = higher heat dissipation).

Get over silver already :shadedshu :p

On to optical: the issue at hand is that we need to create a source that can not only ping faster, but accurately! We need a sensor to sense it faster, and accurately! Then, as you increase the flash rate (bandwidth requirement), you need a better and better cable to get the signal through noise-free.

Are you speaking about using silver contacts on a board or something? If that's what you mean: simple cost, even in manufacturing, and warranty? Silver doesn't hold up very well, it's very soft, so many issues. And with really what benefit? Not a lot...

Tri-Gate transistors?
Well, I'm an engineering double major. Yes, I think it's very realistic, and let's be honest: if it isn't, it will soon be brought to life, and Intel will have some pretty serious answering to do, won't they?

I think he means fully optical chips, not interconnecting hardware (and networking); that's years off.

As I said, they already gold-plate critical connections on most motherboards (gold's harder-wearing than silver).

Tri-gate transistors are merely an evolution, and Intel started the debate; Ivy Bridge uses tri-gate transistors, so they have no answering to do. Other fab companies have been, and are, doing the answering already: 3D stacked chips :D

http://www.tomshardware.com/news/rochester-3d-processor,6369.html

http://www.hpcwire.com/hpcwire/2011-11-30/ibm_will_chip_in_on_micron_s_3d_hybrid_memory_cube.html

http://techon.nikkeibp.co.jp/article/HONSHI/20081031/160571/

I personally feel that software is what is driving hardware innovations

To us, the end users, it's the software, not the hardware, that needs a sodding evolution; on average, software is 3-5 years behind the hardware that's out, a catch-all scenario that's bad for us enthusiasts.

This is in some way proven by the fact that Intel could release its 50-core Knights Ferry chip whenever they want (and did want to also sell it as a render card); however, the software is simply not around that would use it in a consumer PC to any degree beyond a demo, or possibly folding, mmmmm
 
Joined
Apr 2, 2011
Messages
2,645 (0.56/day)
Yes, but Intel has enough to make prototypes, and what I am saying here is it doesn't need to be in mass production: make maybe 12 specialized chips, sell them to developers, and it could lead to mass production. A perfect example of this kind of strategy would be the PS3; the PS3 used superior technology to what PCs were using in 2006, 1 PS3 equaled 12 or 18 PCs... and it also cost more to make than what they sold it for.

Also, Intel is already working on a 50-core coprocessor that runs on PCI-E; an equivalent would be an Nvidia Tesla.

Sir, my apples are much sweeter than your oranges.


Seriously though, prototypes are insanely expensive to produce. Most of what you have suggested bears the same classification. I'm pretty sure nobody would buy a $10,000 NIC, knowing that the internet connection would still be incapable of using 1% of the bandwidth.

Opto-electronic chips exist. They aren't anywhere near consumer grade ready. A lab sample of a dozen interconnects isn't anywhere near a consumer grade billion transistor chip. The fact that the technology currently can't be scaled is where the problem lies.

Silver in chips? That has to be the most foolish idea, short of gold. Silver is roughly two orders of magnitude (copper being $3/lb, silver being $28 per ounce) more expensive than copper. For the slight decrease in resistivity, you spend nearly 150 times (a very round estimate: 28/(3/16)) as much on materials. A low-end silver-based chip would be triple or quadruple the price of an equivalent copper-based chip. That kind of price-to-performance change would not fly well with consumers.
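The "very round estimate" works out like this, using the post's own prices ($3/lb copper, $28/oz silver, 16 oz to the pound):

```python
# Price ratio of silver to copper by weight, using the post's own figures.
copper_per_lb = 3.00   # dollars per pound of copper
silver_per_oz = 28.00  # dollars per ounce of silver
oz_per_lb = 16

copper_per_oz = copper_per_lb / oz_per_lb  # $0.1875 per ounce
ratio = silver_per_oz / copper_per_oz

print(f"silver costs ~{ratio:.0f}x as much as copper by weight")
# -> ~149x, i.e. roughly two orders of magnitude by weight.
```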


Finally, the PS3 is a bizarre little experiment. It cannot be compared to a normal PC, because they are two fundamentally different beasts. The PS3 couldn't outperform 12 PCs unless it was tasked with very specific jobs that favored its talents. A PC could run rings around the PS3 in 90% of computing tasks. Ever wonder why the PS3-based supercomputers didn't last long?



So, I have to echo what other people are saying. The consumer market is driven by economic considerations. A $1,200 luxury device won't sell nearly as well as one that can do 90% of the features for 50% of the price. Additionally, technology news has a bias toward showing awesome new inventions that often need a decade more research before they can be viably manufactured. The real motivator of technology is the money people are willing to spend, which isn't nearly as inspiring as saying that technology is driven by humanity's desires... We can dream...


Edit:
if everyones excuse is price, then why bother trying to innovate anything in the first place, its not a valid excuse in any way shape or form.

Ten years of development separate the lab and mass manufacturing. XDR memory is a nice technology, but excessive pricing and a lack of availability make it impractical.

Do you remember RDRAM? It was faster than the SDRAM we were using at the time (and still use!). Rambus even had it in P4 sockets for a while. People didn't want to pay what amounted to highway robbery to Rambus for their patented technology. The faster technology died, making way for the cheaper (and worse-performing) RAM you're using today. This is the economic influence at its purest.

Imagine how many computers would exist if they still cost as much as a car. Imagine the world without the internet. It's crappy to think about, but decreases in pricing are what make everything available to consumers. Performance is an afterthought, as very few people have the luxury of ranking price below performance.
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Finally, the PS3 is a bizarre little experiment. It cannot be compared to a normal PC, because they are two fundamentally different beasts. The PS3 couldn't outperform 12 PCs unless it was tasked with very specific jobs that favored its talents. A PC could run rings around the PS3 in 90% of computing tasks. Ever wonder why the PS3-based supercomputers didn't last long?

The Air Force still uses the PS3 supercomputer ^^. Back in 2006, no consumer-grade PC at the same $700 price could compute better than the PS3; it's a fact.
 

T4C Fantasy
Sir, my apples are much sweeter than your oranges.


Seriously though, prototypes are insanely expensive to produce, and most of what you have suggested falls into the same category. I'm pretty sure nobody would buy a $10,000 NIC knowing that their internet connection would still be incapable of using even 1% of its bandwidth.

Opto-electronic chips exist, but they are nowhere near consumer-ready. A lab sample with a dozen interconnects is nothing like a consumer-grade billion-transistor chip; the problem is that the technology currently can't be scaled.


People would buy a $10,000 chip... they sell $25,000 14-inch monitors that people buy! lol. If you are rich and money is not a problem, then TRUST me, they will buy it. My 3D LED 24-inch was only $259; I personally wouldn't want to spend $25k on something that looks a TINY bit better, lol, but people will, and they have, and they always do.
 
Joined
Jul 1, 2005
Messages
5,197 (0.76/day)
Location
Kansas City, KS
System Name Dell XPS 15 9560
Processor I7-7700HQ
Memory 32GB DDR4
Video Card(s) GTX 1050/1080 Ti
Storage 1TB SSD
Display(s) 2x Dell P2715Q/4k Internal
Case Razer Core
Audio Device(s) Creative E5/Objective 2 Amp/Senn HD650
Mouse Logitech Proteus Core
Keyboard Logitech G910
Just because it's being used in the PS3 doesn't mean anything; XDR2 is proven to be A LOT faster than DDR3.

XDR RAM in a PS3 runs at 230.4 Gbit/s (28.8 GB/s);
DDR3-2400 runs at only 153.6 Gbit/s (19.2 GB/s).

If everyone's excuse is price, then why bother trying to innovate anything in the first place? It's not a valid excuse in any way, shape, or form.
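For reference, the bandwidth figures in the quote above are straight unit conversions of peak transfer rates; a sketch, assuming the standard 64-bit DDR3 channel (the quoted Gbit/s values were rounded down from 230.4 and 153.6):

```python
# Peak bandwidth sketch: DDR3-2400 on a standard 64-bit channel,
# plus the quoted PS3 XDR figure converted from Gbit/s to GB/s.
def ddr3_peak_gbit_s(transfer_mts, bus_bits=64):
    """Peak bit rate of one DDR3 channel: transfers/s * bus width."""
    return transfer_mts * bus_bits / 1000  # MT/s * bits -> Gbit/s

ddr3_gbit = ddr3_peak_gbit_s(2400)   # 153.6 Gbit/s
ddr3_gbyte = ddr3_gbit / 8           # 19.2 GB/s (1 B = 8 bits)

xdr_gbit = 230.4                     # quoted PS3 XDR peak, Gbit/s
xdr_gbyte = xdr_gbit / 8             # 28.8 GB/s

print(ddr3_gbit, ddr3_gbyte, xdr_gbyte)
```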

Capacity is the limiting factor, not price.

Sure, you can make DDR3 chips that run 250 Gbit/s and smoke XDR, if you use the same piss-poor capacity that a PS3 has.

Computers, OTOH, use 8, 12, 16, 32, 64, 128 GB quantities, not sub-1GB quantities. Your point is invalid. The technology is not "as great as you think it is".

The Air Force still uses the PS3 supercomputer ^^. Back in 2006, no consumer-grade PC at the same $700 price could compute better than the PS3; it's a fact.

By today's standards, a PS3 is a pretty poor performer when it comes to computing. It was handy before because of its extremely low price/performance/power-consumption ratio.

Where the Air Force got screwed in that deal is that the farm is essentially not upgradable in any way. At least with a computer using 20,000 Opterons you can switch to a newer Opteron. :roll:

Now the PS3s are barely worth the cost of the electricity to operate them.
 

T4C Fantasy

XDR2 RAM is not sub-1GB; it comes in the same sizes as DDR3 :p I was only using the PS3 as an example for speed. You cannot get DDR3 to go as fast as XDR2, not even close.

The only way DDR3 would be faster than a STOCK clock of XDR2 is to overclock DDR3 to 3800 MT/s, which is DDR3-3800... which is 1900 MHz.

It's about price.. trust me
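A sketch of the DDR3-3800 arithmetic above: the DDR3-xxxx number is the rated transfer rate in MT/s, the I/O clock is half that (data moves on both clock edges), and peak bandwidth here assumes the standard 64-bit (8-byte) channel:

```python
# DDR3 naming arithmetic: DDR3-3800 -> I/O clock and peak bandwidth.
def ddr3_clock_mhz(transfer_mts):
    """I/O clock in MHz; DDR transfers data on both clock edges."""
    return transfer_mts / 2

def ddr3_peak_gb_s(transfer_mts, bus_bytes=8):
    """Peak bandwidth in GB/s for a 64-bit (8-byte) channel."""
    return transfer_mts * bus_bytes / 1000

print(ddr3_clock_mhz(3800))   # 1900.0 MHz
print(ddr3_peak_gb_s(3800))   # 30.4 GB/s
```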
 

XDR has other limitations, like address length on bursts. 8 GHz? Sure. You're comparing a theoretical technology to a mature, in-market, actively utilized technology. I think DDR3 already won.

Sounds like a wiring nightmare. There's a reason it's intended for GPUs.

Available for licensing today, XDR2 simply has no serious takers.

DDR is significantly cheaper, memory controllers are masterful, and memory isn't really that big a bottleneck (which leads to even less interest in upgrading technology that isn't a bottleneck).

I'm sure Rambus is also seeing $$$ everywhere when they mention $licensing$. A fabless company isn't just going to hand out plans for a memory technology for free.

Screw Rambus. Intel learned the first time.
 

Joined
Jul 10, 2010
Messages
1,229 (0.25/day)
Location
USA, Arizona
System Name SolarwindMobile
Processor AMD FX-9800P RADEON R7, 12 COMPUTE CORES 4C+8G
Motherboard Acer Wasp_BR
Cooling It's Copper.
Memory 2 x 8GB SK Hynix/HMA41GS6AFR8N-TF
Video Card(s) ATI/AMD Radeon R7 Series (Bristol Ridge FP4) [ACER]
Storage TOSHIBA MQ01ABD100 1TB + KINGSTON RBU-SNS8152S3128GG2 128 GB
Display(s) ViewSonic XG2401 SERIES
Case Acer Aspire E5-553G
Audio Device(s) Realtek ALC255
Power Supply PANASONIC AS16A5K
Mouse SteelSeries Rival
Keyboard Ducky Channel Shine 3
Software Windows 10 Home 64-bit (Version 1607, Build 14393.969)
This is from the very first post that I posted: "I would like to point out why we are not using XDR memory in GPUs and would like an enthusiast's opinion on that."
Reason:
No one wants to make it.

If XDR or XDR2 were made for PCs or consoles, it would undermine JEDEC DDR3 and GDDR5.
 

Rambus is an awful company, and no one wants to do business with them. Pretty much end of story. RDRAM (the DRAM for P4s) had the same problems XDR would: weird configurations, and Rambus getting full of themselves.

Considering myself an enthusiast, I'd say the big DRAM companies are making a sound decision, given history.
 
Joined
Jun 27, 2011
Messages
6,680 (1.43/day)
Processor 7800x3d
Motherboard Gigabyte B650 Auros Elite AX
Cooling Custom Water
Memory GSKILL 2x16gb 6000mhz Cas 30 with custom timings
Video Card(s) MSI RX 6750 XT MECH 2X 12G OC
Storage Adata SX8200 1tb with Windows, Samsung 990 Pro 2tb with games
Display(s) HP Omen 27q QHD 165hz
Case ThermalTake P3
Power Supply SuperFlower Leadex Titanium
Software Windows 11 64 Bit
Benchmark Scores CB23: 1811 / 19424 CB24: 1136 / 7687
I think the biggest bottleneck of my PC is my hard drive. It is relatively fast for a mechanical drive, and I still boot in 30 seconds... but SSDs are so much faster. I am still waiting for a good price drop on SSDs before I get one, and I'm happy with how my PC performs now.

Anyone else think mechanical storage is the main bottleneck of PCs?



I think internet speeds lag behind routers and NICs. This varies from area to area: Japan has very, very fast internet, but here in the US we are much slower. I know "some" locations, not mine, have pretty fast internet.

Anyone else think the internet infrastructure is behind, at least in America?
 

It's behind in most areas because, unlike Japan, where covering 1 sq mile nets you tons of customers, covering 1 sq mile in the US reaches only a small fraction of the subscribers.

Compound that with the US having far more land area than Japan, and it costs vastly more to run gigabit fiber from house to house.

Not really a mystery; it's just plain hard to cover everywhere :p

It's the exact same issue with EDGE vs 3G vs LTE currently. You'll notice high-population-density areas always get stuff first.
 
Weren't we supposed to forget cost for this discussion? The internet is a bottleneck.
 