This simply isn't true. First of all, if you want a real-world test, go search for the blind test on the Linus Tech Tips YouTube channel. They grabbed one of their buddies who had never used 120Hz (or played BF3, the game they tested with) to see if the average gamer could tell the difference. The results were interesting enough that they ran a second test to see whether Linus, as someone who games at 120Hz and knows the game, could tell the difference.
The long and short is this: if you've never used it, you're less likely to tell the difference at a glance. That doesn't mean the difference isn't there; it means you aren't prepared to notice it because you're ignorant of what to look for. People said the same thing about 1080p vs. 720p, or, for older folks, VHS vs. DVD. The fact of the matter is that when you get something of higher quality, you often don't see the difference until you've adjusted to it, and once you have, anything worse looks far more drastic by comparison. Anyone today can see the difference between DVD and VHS, or Blu-ray and DVD, because we've all adjusted to the higher quality level. It's the same thing here.
I used to work in sales, and most customers had a hard time telling 1080i from 1080p until they'd had the set in their homes for a month or two. Same with calibrated TVs: the oversaturated, too-bright screens in a store all look great until you've used a calibrated TV for a few months... then the store models look pixelated and you can see the whites and blacks crushing. Audio is the same way. I'm a second-generation audiophile, and I cringe when most people have me listen to their "killer new home theater system" or their "awesome new car stereo," because it sounds awful. To them, though, it's better than the even worse sound they had before from their TV or factory stereo. Just because you've "used it" doesn't mean you're qualified to call it a placebo effect noticeable only by the best FPS gamers in the world. That's a preposterous statement; it's noticeable by any idiot with a mouse and a copy of ANY shooter ever made, they just need to play on it long enough to adapt.
120Hz is a HUGE difference. It's not subtle, and it's not a "placebo" effect. It is a real, noticeable reduction in motion blur, it halves the frame time, and especially with LightBoost it's the clearest motion you can get short of a CRT. I get that you might think 1440p outweighs 120Hz; I disagree. Most people can't run 1440p at max settings at a reasonable framerate, and plenty can't even manage 1080p at max settings. Given a choice of one or the other, I'll take 120Hz any day. That said, I'm going to overclock a 1440p Korean panel myself, so I'll have both, but that isn't for everyone, and I'm willing to open the thing up and fix any backlight bleed.
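To put real numbers on that: at 60Hz a new frame arrives every 1000/60 ≈ 16.7ms, while at 120Hz it's 1000/120 ≈ 8.3ms. That roughly 8ms saving applies to every single frame you see, which is exactly the kind of difference your hands and eyes adapt to without you consciously noticing.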
Edit: As an aside, I'll make the same argument about "input lag" or lag in general. If all you've ever played on is a shitty gaming setup, then that new IPS panel with 30ms of processing lag probably seems fine to you. That doesn't mean it's awesome for gaming; it means you don't know any better. If you spent time in a house with fiber internet (I'm drooling over Google Fiber right now, wish it were available in my area), a high-speed gaming monitor, and a nice gaming rig with a proper mouse and surface, you'd notice a major difference after a couple of weeks, and you'd never be able to go back. It's both a blessing and a curse to be able to afford nice things. All I'm saying is that everything adds up to your experience. If you're fine with how your things are... AWESOME. CONGRATS. But I'll thank you not to come into a tech forum, where people look for advice and often make purchase decisions, and make subjective claims about a technology you haven't put the time in to really understand.
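For perspective on that 30ms figure: at 60Hz a frame lasts about 16.7ms, so 30ms of processing lag means the monitor is sitting on nearly two full frames before you ever see them, and that's before your mouse, your rig, or your connection add a single millisecond.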