Wait, is 1440x900 even a standard HDTV size?! lol.
It wouldn't be 1440x900; it would be 1600x900. There are 1600x900 panels, but they aren't common. Since 1600x900 is still 16:9, though, it scales pretty cleanly to both 720p and 1080p.
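If anyone wants to check the "scales cleanly" claim, here's a quick sketch of the ratios (just arithmetic, nothing official):

```python
# All three resolutions are 16:9, so a single uniform scale factor
# maps one to another, and the factors are simple fractions.
RES_720P = (1280, 720)
RES_900P = (1600, 900)
RES_1080P = (1920, 1080)

# 900p is exactly 1.25x 720p and exactly 5/6 of 1080p.
print(RES_900P[0] / RES_720P[0], RES_900P[1] / RES_720P[1])    # 1.25 1.25
print(RES_1080P[0] / RES_900P[0], RES_1080P[1] / RES_900P[1])  # 1.2 1.2
```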
Xbone has 32 MiB of ESRAM, which offers about the same bandwidth as the GDDR5 in the PS4. I think that's going to compensate for DDR3's slower speed quite substantially.
What people don't talk about is the fact that the PS4 uses GDDR5 for its main memory. Yes, it has higher bandwidth, but it is also more susceptible to errors and has higher latency. The GDDR5 may be a boon for the PS4's GPU component, but it will cripple the CPU component. Microsoft took a similar approach with the Xbox 360 and it worked well; the same should be expected of the Xbone.
I think it's the shader difference that is translating into higher resolutions. Xbone has nothing to compensate for that difference except (maybe) higher clock speeds, but that isn't clear yet.
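To put rough numbers on that, here's a quick sketch using the widely rumored shader counts (768 on Xbone vs 1152 on PS4; treat those as rumor, not confirmed spec):

```python
# If the rumored shader counts hold, the raw ALU ratio lines up
# roughly with the gap between 900p and 1080p.
XBONE_SHADERS, PS4_SHADERS = 768, 1152

shader_ratio = PS4_SHADERS / XBONE_SHADERS            # 1.5x
pixels_1080_vs_900 = (1920 * 1080) / (1600 * 900)     # 1.44x
pixels_1080_vs_720 = (1920 * 1080) / (1280 * 720)     # 2.25x

print(f"Shader ratio:          {shader_ratio:.2f}x")
print(f"1080p vs 900p pixels:  {pixels_1080_vs_900:.2f}x")
print(f"1080p vs 720p pixels:  {pixels_1080_vs_720:.2f}x")
```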
The 32MB of ESRAM is not going to make that much of a difference. Yes, it is fast, but there isn't enough of it; it is going to act as nothing more than an L3 cache shared between the GPU and CPU. Microsoft designed it hoping it would make up for the shortfalls of using DDR3, but the fact is it doesn't: 32MB just isn't enough to be really useful. And sadly, the inclusion of the ESRAM on the APU die might be the reason they are only using 2/3 the number of shaders. They gambled on DDR3 plus ESRAM at the cost of shaders, and they lost.
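Some back-of-the-envelope math on why 32MB is tight, assuming 4 bytes per pixel per buffer (my assumption for the sake of the sketch, e.g. RGBA8 color or a packed depth/stencil format; actual engine formats vary):

```python
# Rough estimate of how quickly 32 MB of ESRAM fills up at 1080p.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4            # assumed format, e.g. RGBA8
ESRAM_BYTES = 32 * 1024 * 1024

buffer_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
print(f"One 1080p buffer: {buffer_bytes / 2**20:.1f} MiB")          # ~7.9 MiB
print(f"Buffers that fit in ESRAM: {ESRAM_BYTES // buffer_bytes}")  # 4

# A deferred renderer typically wants a depth buffer plus several
# G-buffer render targets, so 32 MB is spoken for almost immediately.
```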
Besides, the GDDR5 in the PS4 provides 170+ GB/s of bandwidth if the rumored 1375 MHz clock speed is correct. The ESRAM is only doing ~200 GB/s according to Microsoft, so it isn't actually that much faster than the GDDR5, and it isn't going to make up for the slow DDR3, which is only capable of ~60 GB/s.
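For anyone curious where those numbers come from, here's the peak-bandwidth arithmetic (the 256-bit bus widths and the DDR3-1866 speed are my assumptions from the same rumor mill, picked to match the ~60 GB/s figure above):

```python
def bandwidth_gb_s(clock_mhz, transfers_per_clock, bus_bits):
    """Peak memory bandwidth in GB/s: transfer rate x bus width in bytes."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

# PS4 GDDR5: 1375 MHz command clock, 4 data transfers per clock
# (data is double-pumped on a doubled write clock), 256-bit bus.
print(f"GDDR5: {bandwidth_gb_s(1375, 4, 256):.0f} GB/s")  # ~176 GB/s

# Xbone DDR3-1866 (assumed): 933 MHz, 2 transfers per clock, 256-bit bus.
print(f"DDR3:  {bandwidth_gb_s(933, 2, 256):.0f} GB/s")   # ~60 GB/s
```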