Thursday, January 10th 2019

NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

PC World managed to get hold of NVIDIA CEO Jensen Huang and picked his brain on AMD's recently announced Radeon VII. Skipping the usual amicable, politically correct answers, Jensen made his thoughts clear on what the competition is offering against NVIDIA's RTX 2000 series. The answer? Radeon VII is an "underwhelming product", because "The performance is lousy and there's nothing new. [There's] no ray tracing, no AI. It's 7nm with HBM memory that barely keeps up with a 2080. And if we turn on DLSS we'll crush it. And if we turn on ray tracing we'll crush it." Not content with dissing the competition's product, Jensen Huang also quipped about AMD's presentation and product strategy, saying that "It's a weird launch, maybe they thought of it this morning."
Of course, the real market penetration of the technologies Jensen Huang mentions is currently extremely low - only a handful of games support NVIDIA's forward-looking ray tracing technologies. That AMD chose not to significantly invest resources and die space in what is essentially a stop-gap high-performance card to go against NVIDIA's RTX 2080 means its 7 nm, 331 mm² GPU will compete against NVIDIA's 12 nm, 545 mm² die - if performance estimates are correct, of course.
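For context, the density math behind that die-size comparison looks roughly like this (transistor counts are the figures both vendors published at launch; treat the results as approximate):

```python
def density_mtr_per_mm2(transistors_billion: float, die_area_mm2: float) -> float:
    """Logic density in millions of transistors per mm^2."""
    return transistors_billion * 1000 / die_area_mm2

# Vega 20 (Radeon VII): ~13.2B transistors on 331 mm^2 at TSMC 7 nm
print(round(density_mtr_per_mm2(13.2, 331), 1))  # 39.9
# TU104 (RTX 2080): ~13.6B transistors on 545 mm^2 at TSMC 12 nm
print(round(density_mtr_per_mm2(13.6, 545), 1))  # 25.0
```

Roughly a 1.6x density advantage for the 7 nm part, which is why the two chips land on similar transistor budgets despite the large gap in die area.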
The next remarks concerned AMD's FreeSync (essentially a brand name for VESA's Adaptive-Sync), which NVIDIA finally decided to support on its GeForce graphics cards - something the company could have done from the outset, instead of going the proprietary, module-based, cost-adding route of G-Sync. While most see this as a sign that NVIDIA has seen a market slowdown for its price-premium G-Sync monitors and is simply ceding to market demands, Huang sees it another way, saying that "We never competed. [FreeSync] was never proven to work. As you know, we invented the area of adaptive sync. The truth is most of the FreeSync monitors do not work. They do not even work with AMD's graphics cards." In the wake of these words from Jensen, it's hard to explain the overall silence from the users who supposedly have FreeSync monitors that don't work.

Reportedly, NVIDIA found only 12 out of 400 FreeSync-supporting monitors to work with its G-Sync technology automatically in its initial battery of tests, with most panels requiring a manual override to enable the technology. Huang promised that "We will test every single card against every single monitor against every single game and if it doesn't work, we will say it doesn't work. And if it does, we will let it work," adding a snarky punchline with "We believe that you have to test it to promise that it works, and unsurprisingly most of them don't work." Fun times. Source: PC World

270 Comments on NVIDIA CEO Jensen Huang on Radeon VII: "Underwhelming (...) the Performance is Lousy"; "Freesync Doesn't Work"

#226
FordGT90Concept
"I go fast!1!11!1!"
Durvelle27 said:
Someone obviously knows that AMD is about to kick them really hard with Navi
Mum's the word on Navi. I don't think it's imminent. PS5 releases anywhere between Christmas 2019 and 2021. I think Christmas 2019 is probably the soonest we'll see a consumer Navi graphics card. Makes one wonder what AMD is going to release between now and then. Polaris 40 on 7nm?

I don't think Radeon VII really concerns Huang so much as the road ahead for NVIDIA is rough. AI isn't taking off like they banked on it doing. RTX reception was poor. 7nm transition for NVIDIA is going to be very costly. Cryptocurrency mining on GPUs is mostly dead. FreeSync integration is rougher than he expected. The sky looks cloudy to Huang where the sky over Su looks sunny.
Posted on Reply
#227
gamerman
Well, I don't think Huang is scared. No way.

The truth is that AMD's brand-new flagship GPU can't challenge even NVIDIA's third-fastest GPU, not even close. So AMD compares it to the 2080... hehe. AMD is comparing its flagship not to NVIDIA's flagship but to NVIDIA's third-fastest GPU, the 2080, and hyping THAT!! Come on! THINK!!
I remember the same issue when Vega 64 released: AMD compared it to the GTX 1080, NOT the GTX 1080 Ti, and hyped Vega 64 sky-high back then too...

The truth is that without the 7 nm update, Radeon VII would be melting, that's a fact. It needs 3 fans on a REFERENCE card, which has never happened before in GPU history, just like Vega 64 had to use water cooling!


And even though it's made on 7 nm, Radeon VII still eats more than 300 W of power, and I'm sure real power is somewhere around 350 W with peaks of 400 W. That's a terrible score!


I don't think people understand what a big, big handicap Radeon VII has even though it's built on a 7 nm process... THINK!! That's almost half of the RTX 2000 series' 12 nm, so it should eat much less power than it does now... something is very wrong, and it's that AMD can't build an efficient GPU.

For example, the RTX 2080 eats 220 W in average gaming; Radeon VII's score is over 300 W, EVEN though it's made on 7 nm lines!!!!!!!!

It's a lousy GPU like its little brother Vega 64, and I call it Vega 64 II.

The RTX 2060 is 10 times the better GPU. If we're counting the fastest GPUs, that's the RTX 2080, RTX 2080 Ti and of course the RTX Titan.

It looks like Radeon VII only just avoids water cooling; let's see how it overclocks... power draw will probably go absolutely sky-high, even on a 7 nm GPU.

A lousy GPU again, and Lisa knows it and Huang knows it.


We'll see soon enough, or should I say in 12 months, when NVIDIA releases its own 7 nm GPUs... I promise: under 250 W, and performance of 2080 Ti + 40% at least.

Don't lie to yourself; look the truth in the eye. AMD wants everyone to have a 1 kW PSU, or AMD doesn't care. AMD should give thanks and bow deeply to another company, TSMC, for bringing it the 7 nm process; without that help I don't think AMD would even have released Radeon VII.

A lousy GPU. Shame, AMD. People, wake up: AMD doesn't deserve any sympathy. It's a power-eating GPU, slow, and pricey for its performance.
Posted on Reply
#228
xtreemchaos
[USER=173620]gamerman[/USER]
You can get tablets for NVIDIA addiction :) Sounds like you have the same as poor Mr. Huang, matey. Times are a-changing... you know I'm joking, right?
Posted on Reply
#229
Valantar
xtreemchaos said:
[USER=173620]gamerman[/USER]
You can get tablets for NVIDIA addiction :) Sounds like you have the same as poor Mr. Huang, matey. Times are a-changing... you know I'm joking, right?
I believe the tablet for Nvidia addiction is called Nintendo Switch.



... I'll see myself out.
Posted on Reply
#230
xtreemchaos
[USER=171585]Valantar[/USER]
Lol, how did I miss that? Come back in, I like your humor.
Posted on Reply
#231
Slizzo
Nxodus said:
AMD fanbois will be so angry when in a couple of years RTX becomes standard, and AMD will be forced to adopt it, effectively increasing the prices of their hot plastic cards even more.
Sasqui said:
You know, that is a good point, they invested in RTX and maybe or maybe not it'll play out well for them. "Standard" is also a good question, since DX12 introduced the RTX API in Win 10, it's still very young. It's all about the ecosystem, meaning how many developers are going to invest time into implementing it when there's only a handful (or maybe one right now... 2080 ti) that can really handle it.

It's not a very compelling story at the moment.
Guys, you realize that currently the only implementation of RTRT on the desktop is Windows 10 with the 1809 update, using DXR as the library to expose said technology? You know, the library that anyone can use to accelerate ray tracing? It's not an "RTX API". It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.

scevism said:
I've just got a 55'' Samsung TV with FreeSync built in, 120 fps with my R9 290X, only at 1080p, but it works for me. WITH MY LEATHER JACKET ON LOL.....
Loving my Q6 so far. Been a great TV

Anymal said:
No halving, thanks to DLSS
Man, you guys' reliance on DLSS as an end-all solution is bewildering. Personally, I'd never rely on a scaling solution in my gaming; I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60 fps in the only RTX-supported title so far, with cards that are lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.



Beyond all that, the Radeon 7 does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of before, a huge increase in an area that hamstrung the first Vega, and it has faster HBM, with twice as much of it as last time as well.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.
Posted on Reply
#232
Valantar
Slizzo said:
Guys, you realize that currently the only implementation of RTRT on the desktop is Windows 10 with the 1809 update, using DXR as the library to expose said technology? You know, the library that anyone can use to accelerate ray tracing? It's not an "RTX API". It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.



Loving my Q6 so far. Been a great TV



Man, you guys' reliance on DLSS as an end-all solution is bewildering. Personally, I'd never rely on a scaling solution in my gaming; I'd rather just run at native res, for better or worse. Beyond that, we're already getting 1080p 60 fps in the only RTX-supported title so far, with cards that are lower in the stack than the range-topping RTX Titan and RTX 2080 Ti.



Beyond all that, the Radeon 7 does look good. So what if it's just a shrunk and higher-clocked Vega? It has twice the ROPs of before, a huge increase in an area that hamstrung the first Vega, and it has faster HBM, with twice as much of it as last time as well.

It's $700 because NVIDIA allowed them to price it at that; I can hardly blame AMD for taking advantage of NVIDIA's absofuckinglutely insane pricing.
The main difference here: AMD's range tops out at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending on whether you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree (edit: to clarify, not the post quoted above, but the one you've all noticed if you've read the last page of posts). That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 Ti at the same price. Just shows how easily accustomed one gets to insanity.
Posted on Reply
#233
kapone32
I bought a water block for my Sapphire Vega 64 card. I seriously hope the layout is the same as when I ran Fire Strike this morning
Valantar said:
The main difference here: AMD's range tops out at $700. Regardless of relative and absolute performance, that's within reason, I'd say. I paid that much for my Fury X, and I'm happy with it, but it would take a serious improvement to make me pay as much again, at least for a couple of years yet. Nvidia's range, on the other hand, tops out at either $1200 or $2500, depending on whether you include the Titan. I don't think it counts (it's not GeForce), but apparently rabid fanboys such as the above example disagree. That is well beyond "within reason". People were pissed when Nvidia pushed Titan pricing to $1200, yet now they're eating up the 2080 Ti at the same price. Just shows how easily accustomed one gets to insanity.
Exactly. And people talk about DLSS and ray tracing while forgetting PhysX, Hairworks and SLI, which Nvidia took over and hogged for themselves, only to see the technology go unused because of the way they do things, versus AMD with Vulkan (which led to DX12) or FreeSync, which Nvidia is suddenly supporting. Of course, with a comment from Jensen making it seem like FreeSync is only good for Nvidia cards..
Posted on Reply
#234
Casecutter
Like I said, "no bad cards, just bad pricing". Let's hope this spurs more competitiveness!
Posted on Reply
#235
Sasqui
Slizzo said:
It's Microsoft's DXR API, which is part of DirectX. Once AMD implements Radeon Rays, they'll be using the same library.
Yes, yes and yes. It's a component of DX in the latest Windows 10 version that everyone has access to.

The catch is that game developers have to write code to implement it, and the hardware has to be able to process it as well. Currently, these games are in the works to support it, to one degree or another:
  • Assetto Corsa Competizione from Kunos Simulazioni/505 Games.
  • Atomic Heart from Mundfish.
  • Battlefield V from EA/DICE.
  • Control from Remedy Entertainment/505 Games.
  • Enlisted from Gaijin Entertainment/Darkflow Software.
  • Justice from NetEase.
Posted on Reply
#236
mindbomb
theoneandonlymrk said:
What, so screen tearing, which FreeSync was made to fix, wouldn't give the game away if it's not working?
People would notice.
The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.

Vayra86 said:
Actually no, there are in depth Youtube vids about just about everything and when there are doubts, some reddit or YT or Twitch channel will explode because there's a new daily shitstorm to click on.
There have been problems noted. This pcperspective article notices excessive ghosting on early freesync monitors.
https://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-
They released a follow up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.
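The variable-refresh-range point above can be made concrete. Frame doubling (the mechanism behind AMD's Low Framerate Compensation) only works when the panel's maximum refresh is a large enough multiple of its minimum, so that a below-range frame can be repeated and still land inside the supported window. A quick sketch, using the commonly quoted 2x rule of thumb rather than any official constant:

```python
def supports_frame_doubling(vrr_min_hz: float, vrr_max_hz: float,
                            ratio_threshold: float = 2.0) -> bool:
    """True if frames below the VRR range can be doubled back into it."""
    return vrr_max_hz / vrr_min_hz >= ratio_threshold

# A 48-144 Hz panel: a 30 fps frame can be shown as two 60 Hz refreshes
print(supports_frame_doubling(48, 144))  # True
# A narrow 45-75 Hz panel: 75/45 < 2, so no doubling is possible
print(supports_frame_doubling(45, 75))   # False
```

Panels with narrow ranges like the second example are exactly where VRR behaves worst below the floor, which is where many of the reported complaints cluster.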
Posted on Reply
#237
theoneandonlymrk
mindbomb said:
The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This pcperspective article notices excessive ghosting on early freesync monitors.
https://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-

They released a follow up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.

https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community.
So you have 450-ish more examples (of different monitors) to find before Huang is right. Crack on.
He did say they're ALL broken.

And you're here trying to back his words. Have you seen this issue in the flesh??????

Anyone with some sense would expect the odd monitor, or even a line of monitors, to possibly have issues, but to say they're all broken... well.

My Samsung works without artefacts etc., but to be fair it's a 45/75 Hz one, so I can see it's not that model.
Posted on Reply
#238
Razrback16
"Underwhelming. The performance is lousy."

Kinda funny since that's exactly what I thought when reading the Turing reviews.
Posted on Reply
#239
mtcn77
mindbomb said:
The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This pcperspective article notices excessive ghosting on early freesync monitors.
https://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-
They released a follow up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.
There is no need to rationalize all that much after the debate, here you go;
Posted on Reply
#240
Xx Tek Tip xX
Raevenlord said:
"underwhelming product"
Ironically, the 2080 ti is exactly this.
Posted on Reply
#241
Manoa
Smartcom5 said:
Yup, like RTX and DLSS …



Smartcom
loool

Anymal said:

You weren't there when the first T&L GPU came out, eh!
Maybe he wasn't, but it looks like you weren't there when the first shader GPU (R300) came out

Markosz said:
https://www.youtube.com/watch?v=ZcF36_qMd8M - and this was 3 years ago... they only got worse.

AMD doesn't lock features behind hardware, they make open-source platforms.
They don't do it for you, or should I say us; they do it because it helps them save money on software development expenses. It's also the reason they have shit GPU drivers
Posted on Reply
#242
Vayra86
mindbomb said:
The screen tearing is fixed with freesync, people aren't disputing that. What nvidia is claiming is that many freesync displays have dealbreaking image artifacts over their variable refresh range. Obviously, they have an incentive to upsell people to gsync monitors, but I think they might be right anyway. Monitor reviews are sporadic, and often focus on fixed refresh rate performance. It is completely plausible to me that problems have gone unnoticed.



There have been problems noted. This pcperspective article notices excessive ghosting on early freesync monitors.
https://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-
They released a follow up article claiming the issue was fixed, but the ghosting was replaced by arguably more noticeable overshoot artifacts.

PCmonitors.info noted that the Samsung c24fg70 had overshoot artifacts with freesync at 100hz, and flickering on the desktop with freesync on.
https://pcmonitors.info/reviews/samsung-c24fg70/

Monitor reviews aren't as exciting as gpu or cpu reviews, and these things can easily go unnoticed by the community. Those overshoot and ghosting issues would be solved by dynamic overdrive, something nvidia requires for all gsync monitors.
Sorry, but no. There are terrible monitors with G-Sync too. It has nothing to do with VRR and everything to do with the panel and the board behind it.

TFTCentral has it all, in case you are curious. Not once has FreeSync been considered related to ghosting or artifacting. The only gripe is limited ranges.
Posted on Reply
#243
Manoa
Valantar said:

-While I agree that Vega as an architecture is far better suited for professional applications (we've seen this proven plenty of times), apparently AMD feels confident enough in the gaming-related improvements when doubling the memory bandwidth and doubling the number of ROPs to give it a go. I suggest we wait for reviews.
vega IS far better suited for (professional) compute applications
this is not about what AMD feels
memory bandwidth gives nothing to gaming performance on vega
it is not double the ROPs, they are still 64
but guess what? I also suggest we wait for reviews

Fluffmeister said:
Ultimately this is good news, finally Vega can compete with the 1080 Ti. Let's wait for the review and see if TSMC's 7nm is the silver bullet many had hoped for.
it's not really 7 nm, actual size is closer to Intel's 10 nm.

I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he means a monitor not running the same hertz range in FreeSync as it does without it.
Which makes him right about that, since it's a lie sold by the specifications that makes people believe FreeSync is full-range. That says something to the many people here who say AMD is not as dirty as Nvidia...

[USER=21173]Sasqui[/USER] Assetto CC has shit graphics; ray tracing is not going to do it any good
Posted on Reply
#244
Valantar
Manoa said:
vega IS far better suited for (professional) compute applications
Yes, that's what I said. Kind of weird to try to make a point against me by repeating what I said like that. Is a snarky tone supposed to make it mean something different? This is a fact. Doesn't mean that it can't work for gaming. I was very surprised to see Vega 20 arrive as a gaming card, but apparently it performs well for this also. Nothing weird about that.
Manoa said:
this is not about what AMD feels
Considering they're the only ones with any knowledge of how this performs in various workloads, it very much is. Launching it is also 100% AMD's decision, so, again this comes down to how AMD feels about it.
Manoa said:
memory bandwidth gives nothing to gaming performance on vega
Source?
Manoa said:
it is not double the ROPs, they are still 64
That seems to be true, given the coverage of how this was an error in initial reports. My post was made before this was published, though.

Manoa said:
it's not really 7 nm, actual size is closer to Intel's 10 nm.
Node names are largely unrelated to feature size in recent years - there's nothing 10nm in Intel's 10nm either - but this is still a full node shrink from 14/12nm. Intel being slightly more conservative with node naming doesn't change anything about this.

Manoa said:
I think I know why he says FreeSync doesn't work "at all" on "all" monitors: I suspect he means a monitor not running the same hertz range in FreeSync as it does without it.
If that's the case, he should look up the definition of "work". At best, that's stretching the truth significantly, but I'd say it's an outright lie. What you're describing there is not "not working".
Manoa said:
Which makes him right about that, since it's a lie sold by the specifications that makes people believe FreeSync is full-range. That says something to the many people here who say AMD is not as dirty as Nvidia...
Why is that? AMD helped create an open standard. How this is implemented is not up to them, as long as the rules of the standard are adhered to (and even then, it's VESA that owns the standard, not AMD). AMD has no power to police how monitor manufacturers implement Freesync - which is how it should be, and not Nvidia's cost-adding gatekeeping. AMD has never promised more than tear-free gaming (unless you're talking FS2, which is another thing entirely), which they've delivered. AMD even has a very helpful page dedicated to giving accurate information on the VRR range of every single FreeSync monitor.

Also, your standard for "as dirty as" seems... uneven, to say the least. On the one hand, we have "created an open standard and didn't include strict quality requirements or enforce this as a condition for use of the brand name", while on the other we have (for example) "attempted to use their dominant market share to strong-arm production partners out of using their most famous/liked/respected/recognized brand names for products from their main competitor". Do those look equal to you?
Posted on Reply
#245
Manoa
You mean feels as in "it's good enough to go out to the market"?
Yeah, sure it can work in gaming; the first Vegas proved that :). I'm saying it like that because it's the wrong chip for the wrong purpose. Maybe better to augment Polaris?
You're right that AMD is better because they use the open-standards approach; that's not the problem. For me the problem is that they're not doing it for good reasons, for us; they're doing it for themselves...
Thanks man, that's a very elaborate response, you cleared up a few things :)
I didn't know the FreeSync problems are because of the monitors, not AMD or the FreeSync system itself.
Everyone's a liar about the nm today, eh?
Posted on Reply
#246
Aquinus
Resident Wat-man
Valantar said:
Source?
I have a Vega 64 and I assure you that memory bandwidth isn't helping gaming. 20% overclock on HBM yields no tangible performance benefit with mine. These cards are constrained by power consumption, not memory bandwidth.
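For what it's worth, peak HBM bandwidth is simple arithmetic: pins times per-pin rate. A quick sketch using the commonly listed memory clocks (treat the figures as approximate):

```python
def hbm_bandwidth_gbs(memory_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak HBM bandwidth in GB/s: double data rate, so 2 transfers per clock."""
    per_pin_gbps = memory_clock_mhz * 2 / 1000  # Gbit/s per pin
    return per_pin_gbps * bus_width_bits / 8    # bits -> bytes

# Vega 64: 945 MHz HBM2 on a 2048-bit bus
print(round(hbm_bandwidth_gbs(945, 2048), 1))   # 483.8
# Radeon VII: 1000 MHz HBM2 on a 4096-bit bus
print(round(hbm_bandwidth_gbs(1000, 4096), 1))  # 1024.0
```

By these numbers Radeon VII roughly doubles Vega 64's bandwidth, which makes the observation above notable: if a 20% HBM overclock yields nothing on Vega 64, the bottleneck is somewhere other than memory.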
Posted on Reply
#247
HTC
Camm said:
This strikes me more that Huang is somewhat scared of what's coming down the pipeline.

Nvidia got forced out of its lucrative Gsync business because otherwise it wouldn't be able to certify against HDMI 2.1.

Nvidia's again pushed out of the console space (yes, we know there's the Switch, but the Switch won't be defining GPU trends), which will continue to see Nvidia technology only bolted on, rather than core to any game experience. In 6-12 months we will see two new consoles built on AMD Navi GPUs, and corresponding cards for desktop. This is new, as traditionally console GPUs have lagged behind desktop tech, which will invariably benefit AMD for optimisation.

And lastly, with Intel entering the GPU space, Nvidia is left out in the cold with a lack of x86 ability, and only so much of that can be countered with its investment in RISC.

Full disclosure - I own a 2080 Ti, but to me this comes across as a temper tantrum rather than anything meaningful
Have not heard of this: source link, please?
Posted on Reply
#248
Valantar
HTC said:
Have not heard of this: source link, please?
The HDMI 2.1 spec includes VRR support as a requirement, which would prevent Nvidia from certifying its cards against that spec if they still limited VRR support to G-Sync. I guess a "middle of the road" option would be possible by supporting VRR on non-G-Sync displays only if they are HDMI 2.1 displays, but that would just lead to (yet another) consumer uproar against Nvidia. Let alone the fact that (at least according to all reports I've seen) the HDMI 2.1 implementation of VRR is technically very, very close to AMD's existing extension of VESA Adaptive-Sync to FreeSync over HDMI. Nvidia would have no technical excuse whatsoever to not support FreeSync.
Posted on Reply
#249
champsilva
GinoLatino said:
I don't think so! The difference is that nVidia was stable and just dropped, while AMD had a "bubble" and then came back to usual values, and is even higher than "before the bubble".
nVidia : red
AMD : green


According to this, 2018 was the boom for the RTG group

[IMG]https://marketrealist.imgix.net/uploads/2019/01/A5_Semiconductors_AMD_GPU-gaming.png?w=660&fit=max&auto=format[/IMG]
Posted on Reply
#250
FordGT90Concept
"I go fast!1!11!1!"
Discrete Vega did really well by that chart.
Posted on Reply