
Intel Core i9-10900K 10-core Processor and Z490 Chipset Arrive April 2020

Joined
Sep 15, 2011
Messages
5,369 (1.72/day)
Processor Intel Core i7 3770k @ 4.3GHz
Motherboard Asus P8Z77-V LK
Memory 16GB(2x8) DDR3@2133MHz 1.5v Patriot
Video Card(s) MSI GeForce GTX 1080 GAMING X 8G
Storage 59.63GB Samsung SSD 830 + 465.76 GB Samsung SSD 840 EVO + 2TB Hitachi + 300GB Velociraptor HDD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Anker
Software Win 10 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
No really, what would be the difference then between the K and the X series?!??
Please don't say the chipset and the number of PCI-E lanes only...
 

viandyka

New Member
Joined
Dec 11, 2019
Messages
1 (0.01/day)
AMD only needed 3 generations of Ryzen,

and Intel just screws around with a 4th "NEW" Intel product xD and soon a 5th Skylake refresh
 
Joined
Oct 2, 2015
Messages
2,498 (1.51/day)
Location
Argentina
System Name Ciel / Yukino
Processor AMD Ryzen R5 3400G @ 3750MHz / Intel Core i3 5005U
Motherboard MSI B350M PRO-VDH / HP 240 G5
Cooling Wraith Spire / Stock
Memory 2x 8GB Corsair Vengeance LPX DDR4 3200MHz / 2x 4GB Hynix + Kingston DDR3L 1600MHz
Video Card(s) Vega 11 / Intel HD 5500
Storage SSD WD Green 240GB M.2 + HDD Toshiba 2TB / SSD Kingston A400 120GB SATA
Display(s) HP w17e 1440x900 @ 75Hz/ Integrated 1366x768 @ 94Hz
Case Generic / Stock
Audio Device(s) Realtek ALC892 / Realtek ALC282
Power Supply Sentey XPP 525W / Power Brick
Mouse Logitech G203 / Elan Touchpad
Keyboard Generic / Stock
Software Windows 10 x64 + Arch Linux
No really, what would be the difference then between the K and the X series?!??
Please don't say the chipset and the number of PCI-E lanes only...
Price.
 
Joined
Mar 16, 2017
Messages
863 (0.77/day)
Location
Tanagra
Processor AMD 2700X
Motherboard Gigabyte B450M DS3H
Cooling Scythe Mugen 5
Memory G.Skill 16GB DDR4 @ 3200
Video Card(s) Asrock RX 5700 XT 8GB
Storage Inland 512GB NVMe
Display(s) LG 27UL500-W
Case NZXT H510
Audio Device(s) My ears
Power Supply EVGA 500W
Software Windows 10
No really, what would be the difference then between the K and the X series?!??
Please don't say the chipset and the number of PCI-E lanes only...
Doesn't the X series have quad-channel memory? It's based on the Xeon line, not the desktop line.
 
Joined
Jul 19, 2016
Messages
238 (0.18/day)
This is sad from Intel.

April 2020, and it'll be a couple of months away from going up against Ryzen 4000, which is two generations ahead in terms of process node and, well, even performance given the rumours.

- Fewer cores
- Slower IPC
- Much less efficient
- Slower multithreading
- Slower single core
- Loses in 720p gaming with 2080 Ti tests (likely, given the +10% IPC and +5% frequency rumours for the 4000 series).

But 5GHz all-core drawing 450W. Yay Intel!
 
Joined
Feb 20, 2019
Messages
488 (1.18/day)
System Name PowerEdge R730 DRS Cluster
Processor 4x Xeon E5-2698 v3
Cooling Many heckin screamy bois
Memory 480GB ECC DDR4-2133
Video Card(s) Matrox G200eR2
Storage SD Card. Yep, really no other local storage.
Display(s) It's probably a couple of boring Dell Ultrasharps and a sacrificial laptop.
Case 39U 6-rack server room with HEVC and 44KVA UPS
Mouse Maybe
Keyboard Yes!
Software ESXi 6.5 U3
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
This is sad from Intel.

April 2020, and it'll be a couple of months away from going up against Ryzen 4000, which is two generations ahead in terms of process node and, well, even performance given the rumours.

- Fewer cores
- Slower IPC
- Much less efficient
- Slower multithreading
- Slower single core
- Loses in 720p gaming with 2080 Ti tests (likely, given the +10% IPC and +5% frequency rumours for the 4000 series).

But 5GHz all-core drawing 450W. Yay Intel!
Don't forget that Skylake rebranded for the 5th time is still a proven high-risk architecture with a fundamentally-exploitable design and new exploits popping up to haunt it faster than they're patched.

On the other side, AMD's frequent architectural jumps mean that by the time hackers glean enough info about Zen 2, 3, 4 to start exploiting it, AMD will have moved on to newer architecture anyway.
 
Joined
Jun 28, 2016
Messages
3,405 (2.47/day)
I dunno, once they drop so far, the Atom-based architecture is able to step in. I think a 2C/2T Core-based architecture is finally being eclipsed by Gemini Lake in most tasks. I had an Apollo Lake quad core that could handle a 4K stream. It’s a low bar, but we’re talking about really small dies and super cheap prices.
Actually it's the opposite. Consumer Atoms aren't developed anymore. Intel replaced them with the Celeron lineup (e.g. this is what you get in ~$300 laptops right now).

Server Atoms (C-series) are still in production, but haven't been updated since 2018.

Ultimately the Atom lineup will be replaced by ARM.
 
Joined
Mar 16, 2017
Messages
863 (0.77/day)
Location
Tanagra
Processor AMD 2700X
Motherboard Gigabyte B450M DS3H
Cooling Scythe Mugen 5
Memory G.Skill 16GB DDR4 @ 3200
Video Card(s) Asrock RX 5700 XT 8GB
Storage Inland 512GB NVMe
Display(s) LG 27UL500-W
Case NZXT H510
Audio Device(s) My ears
Power Supply EVGA 500W
Software Windows 10
Actually it's the opposite. Consumer Atoms aren't developed anymore. Intel replaced them with the Celeron lineup (e.g. this is what you get in ~$300 laptops right now).

Server Atoms (C-series) are still in production, but haven't been updated since 2018.

Ultimately the Atom lineup will be replaced by ARM.
I’m not talking about Atom the brand, but the architecture. Today it is called Gemini Lake. Intel has been gradually adding IPC to the original architecture. I just can’t recall the code name. Something that ends in “mont.”
 
Joined
Jun 28, 2016
Messages
3,405 (2.47/day)
I’m not talking about Atom the brand, but the architecture. Today it is called Gemini Lake. Intel has been gradually adding IPC to the original architecture. I just can’t recall the code name. Something that ends in “mont.”
Atom and the Celeron/Pentium J- and N-series were all based on the same architecture (Goldmont). Super small cores - up to 16C in a package smaller than LGA1151.
The consumer Atom lineup was dropped.

I'm not sure what will happen when Tremont arrives.
I've seen rumors that server chips (and everything with ECC) will be unified under the Xeon brand...
 
Joined
Nov 4, 2005
Messages
10,514 (2.00/day)
System Name MoFo 2
Processor AMD PhenomII 1100T @ 4.2Ghz
Motherboard Asus Crosshair IV
Cooling Swiftec 655 pump, Apogee GT,, MCR360mm Rad, 1/2 loop.
Memory 8GB DDR3-2133 @ 1900 8.9.9.24 1T
Video Card(s) HD7970 1250/1750
Storage Agility 3 SSD 6TB RAID 0 on RAID Card
Display(s) 46" 1080P Toshiba LCD
Case Rosewill R6A34-BK modded (thanks to MKmods)
Audio Device(s) ATI HDMI
Power Supply 750W PC Power & Cooling modded (thanks to MKmods)
Software A lot.
Benchmark Scores Its fast. Enough.
And still faster than AMD in most things (when comparing same core count cpus), but especially in gaming :cool:
Wrong.

Intel is faster at out-of-order operations, due to AMD using a chiplet design.

When gaming at resolutions of 1080p or above, AMD is neck and neck. At 720p or lower Intel wins, but who games at that resolution?

AMD is significantly faster at 80% of other actual work due to more cores, and more cache.


Do you even read reviews?
 
Joined
Jul 19, 2016
Messages
238 (0.18/day)
And still faster than AMD in most things (when comparing same core count cpus), but especially in gaming :cool:
3.8% faster (using TPU's own numbers after chipset update) at 1080p gaming using a 2080 Ti, whilst consuming more power, being far less efficient and losing in multithreaded tasks by way more than 3.8%, right? I'll take the Ryzen thanks ;)
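Putting rough numbers on that trade-off makes the argument concrete. A minimal sketch, where all the fps and wattage figures are made-up placeholders, not TPU's actual measurements:

```python
# Rough value comparison: relative gaming speed vs. power draw.
# All numbers below are illustrative placeholders, not measured results.

def relative_gain(a, b):
    """Percent by which a exceeds b."""
    return (a / b - 1) * 100

intel_fps, intel_watts = 103.8, 150   # hypothetical 1080p average and gaming power
amd_fps, amd_watts = 100.0, 120

fps_gain = relative_gain(intel_fps, amd_fps)   # the "3.8% faster" figure
efficiency_intel = intel_fps / intel_watts     # frames delivered per watt
efficiency_amd = amd_fps / amd_watts

print(f"Intel fps lead: {fps_gain:.1f}%")
print(f"fps per watt - Intel: {efficiency_intel:.2f}, AMD: {efficiency_amd:.2f}")
```

With these placeholder numbers, a small fps lead coexists with a clear loss in fps per watt, which is the poster's point.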
 
Joined
Mar 18, 2015
Messages
2,412 (1.31/day)
Location
Long Island
This is sad from Intel.

April 2020 and it'll be couple months away from going up against Ryzen 4000 which is two generations ahead in terms of process node and well, even performance given the rumours.

- Less cores
- Slower IPC
- Much less efficient
- Slower multithreading
- Slower single core
- Loses in 720p gaming eith 2080 Ti tests (likely given the +10% IPC and +5% frequency rumours of 4000 series).
Just picked a random post that centered on specs, so not addressing you directly, but the mindset that puts specs ahead of performance.

Rumors - not relevant
Cores - Don't care
IPC - Don't care
Efficiency - Don't care
Mutithreading - don't care
Single core - don't care
720p gaming - who plays @ 720p ... and "likely" has no place in real world discussions.

All that is relevant is how fast a CPU runs the apps they use and the games they play. I won't pay any attention to any new release till I see it tested here... and then only in the apps I actually use and the games I actually play. Chest beating and benchmark scores are meaningless; fanbois beating their chests about their favourite chip being faster at tasks they never or rarely do means nothing. When I looked at the 9900KF versus 3900X test results here on TPU, here's what I saw...

3900X kicks tail in rendering, which would be relevant if my user did rendering
3900X kicks tail in game and software development, which would be relevant if my user did those things
3900X shares wins in browser performance, but the differences are too small to observe anyway.
3900X kicks tail in scientific applications, which would be relevant if my user did scientific work
3900X shares wins in office apps, but the differences (0.05 seconds) are too small to affect user experience.
3900X loses in Photoshop by 10 seconds... finally something that matters to my user
Skipping a few more things 99% of us don't ever do
File compression / media encoding... also not on the list
Encoding... the user does an occasional MP3 encode, and the 3900X trailing by 12 secs might be significant if it was more than an occasional thing.
3900X loses in overall 720p game performance by 7%... as he plays at 1440p, it's entirely irrelevant
3900X loses in overall 1440p game performance by 2.5%... not a big deal, but 2.5% is a bigger gap than most of what we have seen so far.
3900X loses all the power consumption comparisons... 29 watts in gaming
3900X runs 22°C hotter
3900X doesn't OC as well
3900X is more expensive.

AMD did good w/ the 3900X... but despite the differences in cores, IPC, die size, whatever... the only thing that matters is performance. There are many things that the 3900X does better than the 9900KF, but most users aren't doing those things. You can look at a football player and see how much he can bench or how fast he runs the 40... but none of those things determine his value to the team, his contribution to the score, or how much he gets paid.
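The "only what you actually run matters" argument can be sketched as a usage-weighted score. All relative scores and weights below are invented for illustration, not review data:

```python
# Weight each benchmark category by how much of the user's time it occupies.
# Scores are relative performance (1.00 = the other CPU); all numbers are made up.

def weighted_score(scores, weights):
    """Sum of per-task relative performance, weighted by usage share."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[task] * weights[task] for task in weights)

# Hypothetical relative scores for one chip vs. another, per task.
scores_3900x = {"gaming_1440p": 0.975, "rendering": 1.35, "office": 1.00}

# A pure gamer and a content creator weight the same chip very differently.
gamer = {"gaming_1440p": 0.9, "rendering": 0.0, "office": 0.1}
creator = {"gaming_1440p": 0.3, "rendering": 0.6, "office": 0.1}

print(f"gamer-weighted:   {weighted_score(scores_3900x, gamer):.3f}")
print(f"creator-weighted: {weighted_score(scores_3900x, creator):.3f}")
```

The same spec sheet yields a sub-1.0 score for the gamer and a clearly above-1.0 score for the creator, which is exactly the post's thesis about specs versus relevant performance.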
 

r9

Joined
Jul 28, 2008
Messages
2,633 (0.62/day)
System Name PC1| PC2|Poweredge r410
Processor i5 6600k| Ryzen 1600| 2 x E5620 @2.4GHz
Memory 16GB DDR4 |16GB DDR4 | 32GB ECC DDR3
Video Card(s) GTX 1070|2 x RX570 |On-Board
Storage 512GB SSD+1TB SSD|512GB SSD+1TB|2x256GBSSD 2x2TBGB
Display(s) 27" Dell + 2 x 24" LCD Setup
Software Windows 10 |Windows 10| Server 2012 r2
Those chiplets are what kicked Intel's ass.
In the reversed positions, AMD offered the Phenom I, the TRUE quad core, vs the two C2Ds glued together into the Q6600.
Guess what: we don't care what's "true" and what's not, it's all about performance, and this time around AMD are the ones who made those chiplets work.
And for Intel it's not just about going to 10/7nm, because I bet they won't be able to hit 5 GHz on those nodes for years to come.
So for them it will be a step back at first, but I'm sure they will bounce back eventually, like they did with the Core 2 Duo by getting "inspired" by AMD's architecture.
And guys, don't forget ARM is coming, the 8CX...
 
Joined
Aug 21, 2013
Messages
497 (0.21/day)
Just picked a random post that centered on specs, so not addressing you directly, but the mindset that puts specs ahead of performance.

Rumors - not relevant
Cores - Don't care
IPC - Don't care
Efficiency - Don't care
Mutithreading - don't care
Single core - don't care
720p gaming - who plays @ 720p ... and "likely" has no place in real world discussions.

All that is relevant is how fast a CPU runs the apps they use and the games they play. I won't pay any attention to any new release till I see it tested here... and then only in the apps I actually use and the games I actually play. Chest beating and benchmark scores are meaningless; fanbois beating their chests about their favourite chip being faster at tasks they never or rarely do means nothing. When I looked at the 9900KF versus 3900X test results here on TPU, here's what I saw...

3900X kicks tail in rendering, which would be relevant if my user did rendering
3900X kicks tail in game and software development, which would be relevant if my user did those things
3900X shares wins in browser performance, but the differences are too small to observe anyway.
3900X kicks tail in scientific applications, which would be relevant if my user did scientific work
3900X shares wins in office apps, but the differences (0.05 seconds) are too small to affect user experience.
3900X loses in Photoshop by 10 seconds... finally something that matters to my user
Skipping a few more things 99% of us don't ever do
File compression / media encoding... also not on the list
Encoding... the user does an occasional MP3 encode, and the 3900X trailing by 12 secs might be significant if it was more than an occasional thing.
3900X loses in overall 720p game performance by 7%... as he plays at 1440p, it's entirely irrelevant
3900X loses in overall 1440p game performance by 2.5%... not a big deal, but 2.5% is a bigger gap than most of what we have seen so far.
3900X loses all the power consumption comparisons... 29 watts in gaming
3900X runs 22°C hotter
3900X doesn't OC as well
3900X is more expensive.

AMD did good w/ the 3900X... but despite the differences in cores, IPC, die size, whatever... the only thing that matters is performance. There are many things that the 3900X does better than the 9900KF, but most users aren't doing those things. You can look at a football player and see how much he can bench or how fast he runs the 40... but none of those things determine his value to the team, his contribution to the score, or how much he gets paid.
Let's put it this way. Would you buy a CPU that does one thing really well (gaming at lower resolutions), or would you buy a CPU that does gaming <10% worse but everything else 10-50% better than the other CPU?

3900X is more well rounded for all tasks. Not just one. And more secure. The difference in price is rather small. The 9900 variants should cost no more than the 3700X, not what they are now.
Heat, I would say, is more of an issue for Intel due to higher power consumption. The 3900X does not need to OC well because out of the box it already boosts to its highest speed, and with higher IPC it can afford to run at lower clocks. People need to let go of the 5GHz-or-bust mentality. Remember, Bulldozer was also 5GHz. I hope no one is missing that.
 
Joined
Jun 28, 2016
Messages
3,405 (2.47/day)
Let's put it this way. Would you buy a CPU that does one thing really well (gaming at lower resolutions), or would you buy a CPU that does gaming <10% worse but everything else 10-50% better than the other CPU?
If I were buying a CPU for gaming, i.e. gaming was the only task where I really wanted my PC to shine?
Of course I would buy the CPU that wins in games - even if it gives just a few fps more. It doesn't matter if the other CPU can render 3D 50% faster. Why would it? Am I really building a PC for gaming, or am I more into screenshots of Cinebench?
This is exactly the problem with many people here. They buy the wrong CPU.

Let's say you're buying a GPU for gaming and you play just the 3 games that TPU - luckily - tests in their reviews.
Would you buy the GPU that wins in these 3 games or the one with better average over 20 titles?

I think this is also why some people didn't understand the low popularity of the 1st-gen EPYC CPUs. They had good value and looked very competitive in many benchmarks.
But they fell significantly behind in databases. So the typical comment on gaming forums was: but it's just one task. Yeah, but it's the task that 95% of real-life systems are bought for (or at least limited by).
3900X is more well rounded for all tasks. Not just one.
Whenever I see an argument like this one, I ask the same question.
I assume you game, right?
What are the 3 other tasks that you frequently do on your PC? But only those where you're really limited by performance.

Because having 56 vs 60 fps in games may not seem a lot, but it's a real effect that could affect your comfort.

Most people can't name a single thing.
Some say crap like: they encode videos. A lot? Nah, a few times a year, just a few GB.
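For scale, converting that 56 vs 60 fps example into frame times shows what the gap amounts to per frame:

```python
# Convert fps to per-frame time to see what a small fps gap costs per frame.

def frame_time_ms(fps):
    """Milliseconds spent rendering each frame at a given framerate."""
    return 1000.0 / fps

slow, fast = 56.0, 60.0
delta = frame_time_ms(slow) - frame_time_ms(fast)

print(f"{slow:.0f} fps = {frame_time_ms(slow):.2f} ms/frame")
print(f"{fast:.0f} fps = {frame_time_ms(fast):.2f} ms/frame")
print(f"difference: {delta:.2f} ms per frame (a {(fast / slow - 1) * 100:.1f}% fps gap)")
```

Roughly a millisecond per frame either way; whether that "affects your comfort" is exactly the judgment call the post describes.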
 
Joined
Oct 2, 2015
Messages
2,498 (1.51/day)
Location
Argentina
System Name Ciel / Yukino
Processor AMD Ryzen R5 3400G @ 3750MHz / Intel Core i3 5005U
Motherboard MSI B350M PRO-VDH / HP 240 G5
Cooling Wraith Spire / Stock
Memory 2x 8GB Corsair Vengeance LPX DDR4 3200MHz / 2x 4GB Hynix + Kingston DDR3L 1600MHz
Video Card(s) Vega 11 / Intel HD 5500
Storage SSD WD Green 240GB M.2 + HDD Toshiba 2TB / SSD Kingston A400 120GB SATA
Display(s) HP w17e 1440x900 @ 75Hz/ Integrated 1366x768 @ 94Hz
Case Generic / Stock
Audio Device(s) Realtek ALC892 / Realtek ALC282
Power Supply Sentey XPP 525W / Power Brick
Mouse Logitech G203 / Elan Touchpad
Keyboard Generic / Stock
Software Windows 10 x64 + Arch Linux
Long live the i3 9350, why bother with anything else then?
I game, in emulators too, and Skylake is starting to become useless there too.
 
Joined
Aug 21, 2013
Messages
497 (0.21/day)
If I were buying a CPU for gaming, i.e. gaming was the only task where I really wanted my PC to shine?
Of course I would buy the CPU that wins in games - even if it gives just a few fps more. It doesn't matter if the other CPU can render 3D 50% faster. Why would it? Am I really building a PC for gaming, or am I more into screenshots of Cinebench?
"PC for gaming" statement is funny to me every time. Like really. You never open a browser? you never unpack anything game related (like mods), you never run any other programs on this PC? Obviously you do unless you run cracked games that do not require launchers.

And this gap between Intel and AMD is greatly exaggerated in reviews due to low resolutions and using the fastest GPU around. I bet most of these people buying an i7 or i9 for "only gaming" also do a bunch of other stuff (even if lightly threaded) and never notice the minuscule performance difference with the naked eye vs AMD. This is not an FX vs Sandy Bridge or Ryzen 1xxx vs 7700K situation any more, where you could easily tell the difference. Since AMD is so close in performance and much better in nearly everything else, people are buying AMD 9 to 1 compared to Intel.

Plus there is the matter of priorities. For gaming, the GPU is always #1. A person with a cheaper R7 3700X and an RTX 2080S will always achieve better performance than the next guy with an i9 9900KS and an RTX 2070S. The only case where getting the i9 for gaming makes any sense is when money is not a problem and the person already owns a 2080 Ti.
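That priorities argument boils down to a bottleneck model: delivered framerate is roughly capped by the slower of the CPU and GPU. A toy sketch, where every fps cap is an invented number rather than a benchmark result:

```python
# Toy bottleneck model: delivered fps is limited by whichever component
# prepares frames more slowly. All fps caps below are invented numbers.

def delivered_fps(cpu_cap, gpu_cap):
    """Framerate actually seen, capped by the slower component."""
    return min(cpu_cap, gpu_cap)

# Hypothetical caps at a GPU-heavy resolution:
build_a = delivered_fps(cpu_cap=160, gpu_cap=120)  # cheaper CPU + faster GPU
build_b = delivered_fps(cpu_cap=175, gpu_cap=95)   # pricier CPU + slower GPU

print(f"cheaper CPU + faster GPU: {build_a} fps")
print(f"faster CPU + slower GPU:  {build_b} fps")
```

Under this simplified model, spending the budget on the GPU wins whenever the game is GPU-bound, which is the post's point about the 3700X + 2080S combo.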

Let's say you're buying a GPU for gaming and you play just the 3 games that TPU - luckily - tests in their reviews.
Would you buy the GPU that wins in these 3 games or the one with better average over 20 titles?
The one that gets the better average. Obviously. I won't be playing those 3 games forever. Case in point: AMD cards that are really fast in some games like Dirt Rally 4 or Forza Horizon. These are outlier results. I can't and won't base my purchase on one-off results that are not representative of overall performance.

Whenever I see an argument like this one, I ask the same question.
I assume you game, right?
What are the 3 other tasks that you frequently do on your PC? But only those where you're really limited by performance.
Because having 56 vs 60 fps in games may not seem a lot, but it's a real effect that could affect your comfort.

Most people can't name a single thing.
Some say crap like: they encode videos. A lot? Nah, a few times a year, just a few GB.
Isn't everything performance limited?
I would say the web browser is very much performance limited. I noticed a massive speed boost when launching Firefox after upgrading from a 2500K to a 3800X. Going to a 3900X or 3950X, I would be able to give Firefox even more threads to work with.
Also, I feel like I'm IO limited and need a faster PCI-E 4.0 NVMe SSD to replace my SATA SSD. Just waiting on Samsung to announce their client drives next year. The current Phison-controller-based drives are just a stopgap and not very compelling.
Also, network speed is becoming a major bottleneck for me. What can I say - VDSL 20/5 just doesn't cut it for me. Ideally I would upgrade to symmetrical 300/300 or 500/500 speeds. 1G/1G sounds nice, but then the web itself would become the bottleneck.
 
Joined
Oct 27, 2009
Messages
747 (0.20/day)
Location
Republic of Texas
System Name [H]arbringer
Processor 4x 61XX ES @3.5Ghz (48cores)
Motherboard SM GL
Cooling 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump.
Memory 16x gskill DDR3 1600 cas6 2gb
Video Card(s) blah bigadv folder no gfx needed
Storage 32GB Sammy SSD
Display(s) headless
Case Xigmatek Elysium (whats left of it)
Audio Device(s) yawn
Power Supply Antec 1200w HCP
Software Ubuntu 10.10
Benchmark Scores http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww
Doesn't the X series have quad-channel memory? It's based on the Xeon line, not the desktop line.
lool, Xeons are not some special silicon... and they span the whole lineup.
There are Xeon-Ws that parallel the HEDT 2066 platform, yes.
There are also Xeon E's on the 1151 socket.
So: desktop-platform Intel has whatever-lake with 2-channel memory,
HEDT refresh-lake with 4-channel memory,
LGA3647 stillnotfrozen-lake with 6-channel memory,
and the BGA abomination Glued-lake with 12-channel memory.
 
Joined
Oct 14, 2017
Messages
157 (0.17/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory corsair dominator platinum 2400C10, 32 giga, DDR3
Video Card(s) 2x780Ti
Storage intel S3700 400GB, samsung 850 pro 120 GB, a cheep intel MLC 120GB, an another even cheeper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a
That is a lot of channels, but do you really need so many?
I've seen many tests done on memory channels, and from 2 to 4 there is a difference in synthetics, but real applications don't really increase in speed.
I think if Intel had an unganged mode it could be an improvement with more channels.
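For reference, the theoretical peak does scale linearly with channel count, whether or not real applications can use it. A quick sketch of the arithmetic, assuming the standard 64-bit (8-byte) bus per DDR channel:

```python
# Theoretical peak DRAM bandwidth: channels x bus width x transfer rate.
# DDR buses are 64 bits (8 bytes) wide per channel.

def peak_bandwidth_gbs(channels, mt_per_s):
    """Peak bandwidth in GB/s for DDR memory at a given transfer rate (MT/s)."""
    bytes_per_transfer = 8  # one 64-bit channel moves 8 bytes per transfer
    return channels * bytes_per_transfer * mt_per_s / 1000

for channels in (2, 4, 6):
    print(f"{channels}-channel DDR4-3200: {peak_bandwidth_gbs(channels, 3200):.1f} GB/s")
```

So dual-channel DDR4-3200 tops out at 51.2 GB/s on paper and quad-channel at 102.4 GB/s; as the post notes, synthetic tests see that gap while most desktop applications don't.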
 
Joined
Feb 18, 2009
Messages
359 (0.09/day)
Processor i7 8700K
Motherboard MSI Z370 Gaming Plus
Cooling Noctua NH-D15S + NF-A12x25 PWM + 4xNF-A14 PWM
Memory 16 GB Adata XPG Dazzle DDR4 3000 MHz CL16
Video Card(s) Gigabyte GTX 1070 Ti Gaming 8G
Storage Samsung 970 EVO Plus, Samsung 850 Evo
Display(s) Samsung C24FG73 144Hz 1080p
Case Fractal Design Meshify C
Audio Device(s) Steelseries Arctis 3
Power Supply Superflower Leadex II Gold 650W
Mouse Steelseries Rival 600
Keyboard Steelseries Apex M750
Software Windows 10 x64 Pro
"Modern Standby" is not new, it was already available in (some?) Z390 boards.

Other than that, big yawn. More 14nm. I assume more 200-300W power consumption with AVX, and it's probably impossible to fully stress-test your OCs unless you hit the super golden sample stuff. It's kind of hilarious to see so many people on the web complaining about how their 9900K throttles at 5GHz MCE/OC if they dare to try Prime95 or LinpackX.

The sad part is that I saved up money for what I assumed would be an 8/16 9700K. When I saw it's now an i9, I kept the cash. Looks like I'll keep the cash even longer, which is fine, until we get a CPU that you can play with and tweak without the caveats of the Ryzen platform (like lower clocks to go with undervolting, no OC headroom) or those of Intel's (super high power consumption, to the point where it's impossible to cool the beast and stress-test OCs properly).
 

Give.me.lanes

New Member
Joined
Dec 16, 2019
Messages
1 (0.01/day)
If there are fewer lanes on the 10900K, I'm just going to stick with the 10900X and overclock that. I don't understand how everyone is jumping on AMD's dick over a loss of PCI-E lanes; I'm pretty sure you aren't getting a chip this big unless you are running NVLink @ 4K or using it as a WORKSTATION. Or do people not know how to build PCs anymore?

And c'mon, my Sandy at 5GHz still rocks. Don't bash Sandy :D
 