Redundant question? CGI means computer-generated imagery, so all gaming graphics that aren't pre-rendered are CGI.
Pedantic bent aside, it won't ever happen. The short of it is that games currently use different shading techniques than movies do. Think of it this way: in a basic college-level physics class, bodies are assumed to be rigid and friction is almost never a factor in the calculations, so a night's homework might be 30 problems. At the graduate level we have a more precise understanding of how bodies interact, deform, and lose energy to the system, and the graduate student might get a single problem that those 30 smaller questions merely approximate. Different means of calculating the same thing yield vastly different computational loads.
If you want graphics to be pretty and smooth, you run a solution that hits 90% of the desired goal. That last 10% is what makes things beautiful (even detractors of Pocahontas-with-blue-cats, I mean Avatar, can't deny it was pretty), but accurately calculating it costs several orders of magnitude more work. Once the average consumer has a server farm for rendering even the simplest things, this will be a possibility. Given what Intel and MS are pushing, I don't think consumer computational horsepower is going to be scraping those lofty realms in the next decade or two.
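A quick back-of-envelope calculation makes that gap concrete. The 60 fps target is a typical real-time figure; the 8-hour film render time is an assumed ballpark I'm plugging in for illustration, not a number from any studio:

```python
# Back-of-envelope comparison of per-frame compute budgets,
# real-time game vs. offline film rendering.
# Both figures below are illustrative assumptions, not measurements.

GAME_FPS = 60                       # common real-time target
game_budget_s = 1.0 / GAME_FPS      # ~16.7 ms to do everything in a frame

film_hours_per_frame = 8.0          # assumed ballpark for a final film frame
film_budget_s = film_hours_per_frame * 3600.0

ratio = film_budget_s / game_budget_s
print(f"Game frame budget: {game_budget_s * 1000:.1f} ms")
print(f"Film frame budget: {film_hours_per_frame:.0f} h")
print(f"Film spends roughly {ratio:,.0f}x more compute time per frame")
```

Even before accounting for a render farm running many machines on that single frame, the per-frame time budget alone differs by a factor of over a million.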
To put an even finer point on it: the anti-aliasing hacks and their ilk that games rely on won't even appear in Pixar's pipeline, yet they're a fundamental component of most video games.