
AMD "Jaguar" Micro-architecture Takes the Fight to Atom with AVX, SSE4, Quad-Core

Joined
Nov 19, 2012
Messages
750 (0.40/day)
Likes
430
System Name Chaos
Processor Intel Core i5 4590K @ 4.0 GHz
Motherboard MSI Z97 MPower MAX AC
Cooling Arctic Cooling Freezer i30 + MX4
Memory 2x4 GB Kingston HyperX Beast 2400 GT/s CL11
Video Card(s) Sapphire HD7950 Vapor X, 800/1400 @ 1.075V/1.45V
Storage 256GB Samsung 840 Pro SSD + 1 TB WD Green (Idle timer off) + 320 GB WD Blue
Display(s) Dell U2515H
Case Fractal Design Define R3
Audio Device(s) Onboard
Power Supply Seasonic SS-380GB
Mouse CM Storm Recon
Keyboard CM Storm Quickfire Pro (MX Red)
#51
I believe that AIDA does round-trip latency, and Ikaruga (love that game, btw) probably claims that the GDDR5 used has a CL of 32 ns. 1600 MT/s CL9 DDR3 has a CL of ~11.25 ns, roughly a third of that.

Still, with some intelligent queues and cache management, this won't be too much of a problem.


## EDIT ##
Have I ever mentioned how I hate it when I get distracted when replying, only to find out I made myself look like an idiot by posting the exact same thing as the person before me? Well, I do.
Sorry Ikaruga.
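That CAS arithmetic is easy to sanity-check; a quick Python sketch (the 32 ns GDDR5 figure is the claim under discussion, not a datasheet value):

```python
# CAS latency in nanoseconds = CL cycles / memory clock.
# DDR transfers twice per clock, so the I/O clock is half the MT/s rate.
def cas_ns(transfer_mts, cl_cycles):
    clock_mhz = transfer_mts / 2
    return cl_cycles / clock_mhz * 1000  # cycles * ns per cycle

print(cas_ns(1600, 9))  # DDR3-1600 CL9 -> 11.25 ns, as stated above
```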
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
10,404 (4.84/day)
Likes
5,483
Location
Concord, NH
System Name Kratos
Processor Intel Core i7 3930k @ 4.2Ghz
Motherboard ASUS P9X79 Deluxe
Cooling Zalman CPNS9900MAX 130mm
Memory G.Skill DDR3-2133, 16gb (4x4gb) @ 9-11-10-28-108-1T 1.65v
Video Card(s) MSI AMD Radeon R9 390 GAMING 8GB @ PCI-E 3.0
Storage 2x120Gb SATA3 Corsair Force GT Raid-0, 4x1Tb RAID-5, 1x500GB
Display(s) 1x LG 27UD69P (4k), 2x Dell S2340M (1080p)
Case Antec 1200
Audio Device(s) Onboard Realtek® ALC898 8-Channel High Definition Audio
Power Supply Seasonic 1000-watt 80 PLUS Platinum
Mouse Logitech G602
Keyboard Rosewill RK-9100
Software Ubuntu 17.10
Benchmark Scores Benchmarks aren't everything.
#52
No, and I don't really understand why I would joke about RAM timings on my favorite enthusiast site. Do you understand that I was citing the actual latency of the chip itself, and not the latency the MC has to deal with when accessing the memory?
For example, a typical DDR3-1600 module has about 12 ns of latency in a modern PC.
You mean the 32 ns refresh? That's not access speed, my friend; that is how often a bit in a DRAM cell is refreshed. All DRAM needs to be refreshed, since the data is stored in capacitors that leak charge when disconnected from active power. Other than that, I see no mention of 32 ns there.

That "32ns" sounds a lot like tRFC on DDR3 chips, not access latency.
 
Joined
Feb 18, 2011
Messages
1,240 (0.50/day)
Likes
503
#53
I believe that AIDA does round-trip latency, and Ikaruga (love that game, btw) probably claims that the GDDR5 used has a CL of 32 ns. 1600 MT/s CL9 DDR3 has a CL of ~11.25 ns, roughly a third of that.

Still, with some intelligent queues and cache management, this won't be too much of a problem.


## EDIT ##
Have I ever mentioned how I hate it when I get distracted when replying, only to find out I made myself look like an idiot by posting the exact same thing as the person before me? Well, I do.
Sorry Ikaruga.
Yes I meant that speed, sorry for my English:shadedshu
 
Joined
Feb 20, 2011
Messages
124 (0.05/day)
Likes
17
Location
UK
#54
So Sony will be accessing it with their own version of LibGCM, along with OCL 1.2, which means awesome, finer control over the hardware - something they can't really do now in the PC world, where the hardware is variable. We could see `on the fly` changes to core usage depending on whether there's a high physics load or a cut-scene movie.
 

Aquinus

#55
claims that the GDDR5 used has a CL of 32 ns. 1600 MT/s CL9 DDR3 has a CL of ~11.25 ns, roughly a third of that.
Isn't that kind of moot, since GDDR5 can run at clocks three times faster than DDR3-1600? It's the same thing that happened moving from DDR to DDR2 and then to DDR3: latency in cycles increased, but access times stayed about the same because the higher memory frequency compensates, while also providing more bandwidth.

Yeah, there might be more latency, it's possible, but I don't think it will make that much of a difference. Also, with more bandwidth you can load more data into cache per transfer than with DDR3, so I think the benefits will far outweigh the costs.
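The "cycles up, nanoseconds flat" point can be illustrated with one typical speed grade per generation (the specific parts here are my own illustrative picks, not an exhaustive list):

```python
def cas_ns(transfer_mts, cl_cycles):
    # CL cycles divided by the I/O clock (half the transfer rate), in ns
    return cl_cycles / (transfer_mts / 2) * 1000

# One typical mainstream part per generation (illustrative):
parts = [("DDR-400 CL3", 400, 3),
         ("DDR2-800 CL6", 800, 6),
         ("DDR3-1600 CL9", 1600, 9)]
for name, mts, cl in parts:
    print(f"{name}: {cas_ns(mts, cl):.2f} ns")
# CL triples from 3 to 9, yet the absolute latency stays around 11-15 ns
```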
 
#56
Isn't that kind of moot, since GDDR5 can run at clocks three times faster than DDR3-1600? It's the same thing that happened moving from DDR to DDR2 and then to DDR3: latency in cycles increased, but access times stayed about the same because the higher memory frequency compensates, while also providing more bandwidth.

Yeah, there might be more latency, it's possible, but I don't think it will make that much of a difference. Also, with more bandwidth you can load more data into cache per transfer than with DDR3, so I think the benefits will far outweigh the costs.
I don't think price is the reason we still don't use GDDR5 as main memory in PCs; after all, they are selling graphics cards for much more than a GDDR5 RAM kit or a supporting chipset/architecture would cost. I haven't really read anything about overcoming the GDDR5 latency issue in the past, so that's what made me curious.

.....and Ikaruga (love that game btw)
:toast:
 
Joined
Dec 1, 2011
Messages
343 (0.16/day)
Likes
34
Location
Ft Stewart
System Name Queen Bee
Processor 3570k @ 4.0GHz
Motherboard Gigabyte UD3 Z77
Cooling Water Loop by EK
Memory 8GB Corsair 1600 DDR3
Video Card(s) MSI GTX 970 Gaming WaterCooled
Storage 1x Western Digital 500GB Black 1x Intel 20GB 311 SSD
Display(s) BenQ XL2420G
Case CoolTek W2
Power Supply Corsair 650Watt
Software Windows 7 Pro
#57
we also know it will have 18 GCN clusters = 1152 GCN cores rated at 800 MHz
and it was actually rated at 1.84 TFLOPS or something
18 GCN clusters! Can that be right? That would mean Jaguar would get 576, which is more than a 7750, and that alone is 40 W of power. Something is amiss here for me.

So 45 + 45 + say another 45 (CPU) is 135 W+! Something has to be amiss.
 

Aquinus

#58
18 GCN clusters! Can that be right? That would mean Jaguar would get 576, which is more than a 7750, and that alone is 40 W of power. Something is amiss here for me.

So 45 + 45 + say another 45 (CPU) is 135 W+! Something has to be amiss.
They already said that the graphics power is going to be similar to a 7870, didn't they?
 
Joined
Feb 13, 2012
Messages
359 (0.17/day)
Likes
61
#59
You mean the 32 ns refresh? That's not access speed, my friend; that is how often a bit in a DRAM cell is refreshed. All DRAM needs to be refreshed, since the data is stored in capacitors that leak charge when disconnected from active power. Other than that, I see no mention of 32 ns there.

That "32ns" sounds a lot like tRFC on DDR3 chips, not access latency.
And that is exactly what latency is, though: the RAM issues data to the CPU, and after 32 ns it refreshes to send the next batch. GPUs are highly parallel, so they aren't as affected by latency; most GPUs just need a certain amount of data to render while the RAM sends the next batch. CPUs are much more random and general-purpose than GPUs: certain calculations get issued from RAM, but to complete the process the CPU may have to wait for a second batch of data, in which case it waits another 32 ns. This is a big issue with CPUs nowadays, but I think it can be easily masked with a large enough L2 cache for the Jaguar cores (I think having 8 of them means 4 MB of cache that can be shared, meaning one core can have all 4 MB if it needs it). Another thing I wonder is whether all 8 GB refreshes at once, or whether Sony will let the RAM work in turns to feed the CPU/GPU more dynamically rather than in big chunks of data. (Note that Bulldozer/Piledriver have relatively large pools of L3 and L2 cache to mask their higher latency, and Steamroller adding a larger L1 cache says something too; not to mention how much the missing L3 cache affects Piledriver in Trinity, which is quite a bit slower than FX Piledriver, while Phenom II vs. Athlon II barely showed any difference thanks to their lower latency.)

18gcn clusters! Can that be right! That would mean Jaguar would get 576 which is more then a 7750, and that alone is 40w of power. Something is a miss here for me.

So 45+45+ say 45 again (cpu) is 135w+!! Something has to be a miss.
Jaguar gets 576 what?
And the highest-end Jaguar APU, with its graphics cores (128 of them?), is rated at 25 W with a much higher clock speed than 1.6 GHz (AMD said in their presentation that Jaguar will clock 10-15% higher than what Bobcat would have clocked at on 28 nm), so you're talking at least over 2 GHz.
And Llano had 400 outdated Radeon cores plus 4 K10.5 cores clocked at least at 1.6 GHz before turbo, so expect Jaguar to be much more efficient on a new node with a power-efficient architecture: say 25 W max for the CPU cores, if not less. That leaves 75-100 W of headroom to work with (think of the HD 7970M, rated at 100 W; that's 1280 GCN cores at 800 MHz, while this would have 1152 GCN cores at 800 MHz, and after a year of optimization it's easily at 75 W), adding up to 100-125 W, which is very reasonable. Since it's an APU, you just need one proper cooler; think of graphics cards rated at 250 W that require only one blower fan and a dual-slot cooler to cool both the GDDR5 chips and the GPU. In other words, the motherboard and chip can be as big as an HD 7970 (though at 100-125 W you only need something the size of an HD 7850, which is rated at 110-130 W), and then of course add the BR drive and other goodies. The main point is that cooling is no problem unless multiple chips are involved, which would require cooling the case in general rather than the chip itself with a graphics-card-style cooler.
They already said that the graphics power is going to be similar to a 7870, didn't they?
More like between the HD 7850 and the HD 7970M. It seems 800 MHz is the sweet spot in terms of performance/efficiency/die size, considering the HD 7970M with 1280 GCN cores at 800 MHz sits at 100 W versus the 110 W measured / 130 W rated of the HD 7850 with 1024 cores at 860 MHz.
Not to mention mobile Pitcairn loses 30-75 W (measured/rated) when clocked at 800 MHz (the advertised TDP of desktop Pitcairn is 175 W, but it measures around 130 W according to the link below).

http://www.guru3d.com/articles_pages/amd_radeon_hd_7850_and_7870_review,6.html
Here is a reference for the measured TDP; AMD's advertised TDP is higher, partly to account for other parts on the board and to allow overclocking headroom.
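The power-budget arithmetic above can be sketched directly; every number here is the poster's estimate (linear scaling with ALU count at a fixed clock is an assumption), not an official spec:

```python
# Scale the quoted HD 7970M figure down to the rumored PS4 ALU count,
# assuming power scales linearly with ALU count at 800 MHz on the same node.
hd7970m_watts, hd7970m_alus = 100, 1280
ps4_alus = 1152
gpu_watts = hd7970m_watts * ps4_alus / hd7970m_alus  # = 90 W
cpu_watts = 25  # the poster's guess for 8 Jaguar cores
print(gpu_watts + cpu_watts)  # ~115 W, inside the 100-125 W envelope discussed
```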
 
Last edited:
Joined
Sep 15, 2011
Messages
4,358 (1.91/day)
Likes
1,074
Processor Intel Core i7 3770k @ 4.3GHz
Motherboard Asus P8Z77-V LK
Memory 16GB(2x8) DDR3@2133MHz 1.5v Patriot
Video Card(s) MSI GeForce GTX 1080 GAMING X 8G
Storage 59.63GB Samsung SSD 830 + 465.76 GB Samsung SSD 840 EVO + 2TB Hitachi + 300GB Velociraptor HDD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Anker
Software Win 10 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
#60
I don't think price is the reason we still don't use GDDR5 as main memory in PCs; after all, they are selling graphics cards for much more than a GDDR5 RAM kit or a supporting chipset/architecture would cost. I haven't really read anything about overcoming the GDDR5 latency issue in the past, so that's what made me curious.

:toast:
Guys, you need to stop the confusion. You CANNOT use GDDR5 in your PC as main memory, because Graphics DDR5 is special RAM meant only for graphics. There is a very big difference between how a GPU and a CPU use RAM. Also, GDDR5 is based on DDR3, so you have already been using it for a long time. I don't know the exact specifics, but you can google it already... ;)
 

Aquinus

#61
Guys, you need to stop the confusion. You CANNOT use GDDR5 in your PC as main memory, because Graphics DDR5 is special RAM meant only for graphics. There is a very big difference between how a GPU and a CPU use RAM. Also, GDDR5 is based on DDR3, so you have already been using it for a long time. I don't know the exact specifics, but you can google it already... ;)
Read the entire thread before you jump to conclusions; this is mainly stemming from the PS4 discussion.

GDDR5 itself can do whatever it wants. There are no packages or CPU IMCs that handle it yet, but that does not mean it cannot be used. The PS4 is lined up to use GDDR5 for both system and graphics memory, and I suspect Sony isn't just saying that for shits and giggles.

Also, it's not all that different: the latencies differ, and performance is optimized (somewhat, not a ton) for bandwidth over latency, but other than that the communication is about the same, sans two control lines for reading and writing. It's a matter of how the data is transmitted, but your statement here is really actually wrong.

Just because devices don't use a particular bit of hardware for something doesn't mean you can't use that hardware for something else. For example, for the longest time low-voltage DDR2 was used in phones and mobile devices, not DDR3. Does that mean DDR3 will never be used in smartphones? Most of us know the answer to that, and it's a solid no; GDDR5 is no different. Just because it works best on video cards doesn't mean it can't be used with a CPU built with a GDDR5 memory controller.
 
#62
Guys, you need to stop the confusion. You CANNOT use GDDR5 in your PC as main memory, because Graphics DDR5 is special RAM meant only for graphics. There is a very big difference between how a GPU and a CPU use RAM. Also, GDDR5 is based on DDR3, so you have already been using it for a long time. I don't know the exact specifics, but you can google it already... ;)
Please consider switching out of "write-only mode" on the forum, and read my comments if you reply to me:

many thanks:toast:
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
34,335 (9.22/day)
Likes
17,427
Location
Hyderabad, India
System Name Long shelf-life potato
Processor Intel Core i7-4770K
Motherboard ASUS Z97-A
Cooling Xigmatek Aegir CPU Cooler
Memory 16GB Kingston HyperX Beast DDR3-1866
Video Card(s) 2x GeForce GTX 970 SLI
Storage ADATA SU800 512GB
Display(s) Samsung U28D590D 28-inch 4K
Case Cooler Master CM690 Window
Audio Device(s) Creative Sound Blaster Recon3D PCIe
Power Supply Corsair HX850W
Mouse Razer Abyssus 2014
Keyboard Microsoft Sidewinder X4
Software Windows 10 Pro Creators Update
#63
Guys, you need to stop the confusion. You CANNOT use GDDR5 in your PC as main memory.
Oh but you can. PS4 uses GDDR5 as system memory.
 
#64
Please consider switching out of "write-only mode" on the forum, and read my comments if you reply to me:

many thanks:toast:
Please don't tell me what to do, or what I am or am not allowed to do.

many thanks:toast:

Oh but you can. PS4 uses GDDR5 as system memory.
The PS4 is NOT a PC... But if what you all say is true, then why has nobody introduced GDDR5 for the PC? It has been on video cards for a long time. And why is it called Graphics DDR then?
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
14,888 (3.45/day)
Likes
5,416
System Name A dancer in your disco of fire
Processor i3 4130 3.4Ghz
Motherboard MSI B85M-E45
Cooling Cooler Master Hyper 212 Evo
Memory 4 x 4GB Crucial Ballistix Sport 1400Mhz
Video Card(s) Asus GTX 760 DCU2OC 2GB
Storage Crucial BX100 120GB | WD Blue 1TB x 2
Display(s) BenQ GL2450HT
Case AeroCool DS Cube White
Power Supply Cooler Master G550M
Mouse Intellimouse Explorer 3.0
Keyboard Dell SK-3205
Software Windows 10 Pro
#65
The PS4 is pretty much a custom PC.

EDIT: With a custom OS.
 
Last edited:

btarunr

#67
The PS4 is NOT a PC... But if what you all say is true, then why has nobody introduced GDDR5 for the PC? It has been on video cards for a long time. And why is it called Graphics DDR then?
The CPU and software are completely oblivious to memory type. The only component that really needs to know how the memory works at the physical level is the integrated memory controller. To every other component, memory type is irrelevant. It's the same "load" "store" "fetch" everywhere else.

Just because GDDR5 isn't a PC memory standard doesn't mean it can't be used as system main memory. It would have comparatively high latency to DDR3, but it still yields high bandwidth. GDDR5 stores data in the same ones and zeroes as DDR3, SDR, and EDO.
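The bandwidth side of that trade-off is simple arithmetic; a sketch, using the announced 256-bit, 5500 MT/s PS4 configuration and a common dual-channel DDR3 setup for comparison:

```python
# Peak bandwidth = bytes per transfer * transfers per second.
def peak_gb_s(bus_bits, transfer_mts):
    return bus_bits / 8 * transfer_mts / 1000  # GB/s

print(peak_gb_s(128, 1600))  # 25.6 GB/s, dual-channel DDR3-1600
print(peak_gb_s(256, 5500))  # 176.0 GB/s, the PS4's GDDR5 setup
```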
 

Aquinus

#68
And why is it called Graphics DDR then?
Because it is optimized for graphics, not exclusively for graphics.

You're just digging yourself into a hole.
 
Joined
Jan 22, 2013
Messages
478 (0.27/day)
Likes
119
System Name Desktop
Processor i5 3570k
Motherboard Asrock Z77
Cooling Corsair H60
Memory G Skill 8gb 1600 mhz X 2
Video Card(s) Sapphire Radeon 7850 X 2
Storage 1 TB Velociraptor, 240GB 840 Samsung
Display(s) 27" Samsung LED X 2
Case Thermaltake V9
Power Supply Seasonic 620 W, CX600M on stand by
Software Win 8.1 64
Benchmark Scores Benches are silly
#69
Strange they do this before Sony's PS4 announcement tomorrow. Both new consoles from MS and Sony are gonna have these new cores in their CPUs; I thought Sony would ask them for all the "flair" they can get. It's also strange that only four cores are allowed on the PC side while there will be more in the consoles (assuming all the leaks are correct, ofc).
Game consoles are not mobile. These cores are designed for mobile devices with a 5-25 W power envelope.

The CPU and software are completely oblivious to memory type. The only component that really needs to know how the memory works at the physical level is the integrated memory controller. To every other component, memory type is irrelevant. It's the same "load" "store" "fetch" everywhere else.

Just because GDDR5 isn't a PC memory standard doesn't mean it can't be used as system main memory. It would have comparatively high latency to DDR3, but it still yields high bandwidth. GDDR5 stores data in the same ones and zeroes as DDR3, SDR, and EDO.
^^tru dat!

Because it is optimized for graphics, not exclusively for graphics.

You're just digging yourself into a hole.
By graphics you probably meant bandwidth, which is correct.

I'm guessing latency is not as much of an issue when it comes to a specific design for a console rather than a broad compatibility design for PC.
 
Last edited:

Mussels

Moderprator
Staff member
Joined
Oct 6, 2004
Messages
46,130 (9.57/day)
Likes
13,562
Location
Australalalalalaia.
System Name Daddy Long Legs
Processor Ryzen R7 1700, 3.9GHz 1.375v
Motherboard MSI X370 Gaming PRO carbon
Cooling Fractal Celsius S24 (Silent fans, meh pump)
Memory 16GB 2133 generic @ 2800
Video Card(s) MSI GTX 1080 Gaming X (BIOS modded to Gaming Z - faster and solved black screen bugs!)
Storage 1TB Intel SSD Pro 6000p (60TB USB3 storage)
Display(s) Samsung 4K 40" HDTV (UA40KU6000WXXY) / 27" Qnix 2K 110Hz
Case Fractal Design R5. So much room, so quiet...
Audio Device(s) Pioneer VSX-519V + Yamaha YHT-270 / sennheiser HD595/518 + bob marley zion's
Power Supply Corsair HX 750i (Platinum, fan off til 300W)
Mouse Logitech G403 + KKmoon desk-sized mousepad
Keyboard Corsair K65 Rapidfire
Software Windows 10 pro x64 (all systems)
Benchmark Scores Laptops: i7-4510U + 840M 2GB (touchscreen) 275GB SSD + 16GB i7-2630QM + GT 540M + 8GB
#70
I'm guessing latency is not as much of an issue when it comes to a specific design for a console rather than a broad compatibility design for PC.
my thoughts as well. these aren't meant to be generic multipurpose machines; they're meant to be gaming consoles with pre-set roles, and there's time to code each game/program to run specifically on them.


this gives game devs the ability to split that 8 GB up at will between CPU and GPU. that could really extend the life of the console, and its capabilities.
 

Aquinus

#71
By graphics you probably meant bandwidth, which is correct.
That goes without saying. GDDR is optimized for graphics, which performs best in high-bandwidth, high(er)-latency situations.
I'm guessing latency is not as much of an issue when it comes to a specific design for a console rather than a broad compatibility design for PC.
I'm not willing to go that far, but I'm sure they will have stuff to mitigate any slowdown it may cause such as intelligent caching and pre-fetching.
 
#72
my thoughts as well. these aren't meant to be generic multipurpose machines; they're meant to be gaming consoles with pre-set roles, and there's time to code each game/program to run specifically on them.
Yes, that's one of the advantages of working on closed systems like consoles. It helps a lot, both in development speed and efficiency, but the development procedure is still the same.

this gives game devs the ability to split that 8 GB up at will between CPU and GPU. that could really extend the life of the console, and its capabilities.
You can't do anything else but split unified memory (this is also the case with APUs and IGPs on the PC, ofc); that's why it's called unified.
The N64 was a console and developers actually released titles on it, but that doesn't change how horrid the memory latency really was on that system, and how much extra effort the programmers had to put in to get around that huge limitation (probably the main reason Nintendo introduced 1T-SRAM in the GameCube, which was basically eDRAM on die).
If you really want to split unified memory into CPU and GPU memory (you can't, btw, but let's assume you could), it's extremely unlikely that developers will use more than 1-2 GB as "video memory" in the PS4, not only because the bandwidth wouldn't be enough to use more, but also because it's simply not needed (OK, the PS4 is extremely powerful on the bandwidth side, and perhaps there will be some new rendering technique in the future that we don't know about yet, but current methods like deferred rendering, voxels, megatexturing, etc. will run just fine using only 1-2 GB for "rendering").
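A split of the unified pool along those lines might look like this; the OS reservation is purely a guess, and real budgets are up to each game:

```python
# Hypothetical budget for the PS4's unified 8 GB pool (illustrative only).
total_gb = 8
video_gb = 2   # upper end of the 1-2 GB "video memory" estimate above
os_gb = 1      # assumed OS reservation, not an official figure
game_gb = total_gb - video_gb - os_gb
print(game_gb)  # 5 GB left for game code and data in this sketch
```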
 
Last edited: