I like what somebody said in the article linked:
Somebody that isn't a fucking idiot wrote:The speed of light is a fixed constant, which makes latency a non-negotiable problem for any sort of effort like OnLive, and an army of developers are busily explaining on the internet right now how there is no way to avoid dropping frames from any game due to it, return rates, etc.
c is fixed. Signal propagation through optical fiber runs at roughly two-thirds of c, and the effective speed is lower still once you count packet routing, queuing and the rest.
A server-side "proper" game is a dream. We've seen these sorts of claims before about instant compression and the like, and they always fall apart, because however cool the compression is, it is never a 0 ms operation to unpack the data, interpret it, and send instructions back. It doesn't matter how lean the packets are.
Server-side gaming can never beat the locality of the player's own processor, which is why networking is best suited to multiplayer between machines that each run the game locally. At least in that scenario latency can be corrected and compensated for; it can't be when the whole thing runs server-side.
c is a physical constant. End of argument.
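To put rough numbers on that quoted argument (my own back-of-the-envelope, with a made-up distance, not anything from the article): suppose the data center is 2,000 km away and the signal moves at about two-thirds of c in fiber.

# Back-of-the-envelope propagation delay. The 2,000 km path and the
# two-thirds-of-c fiber speed are illustrative assumptions, not measurements
# of any real streaming setup.
C = 299_792_458              # speed of light in a vacuum, m/s
FIBER_FACTOR = 0.67          # rough signal speed in optical fiber, as a fraction of c
distance_m = 2_000_000       # hypothetical 2,000 km one-way path to the data center

one_way_ms = distance_m / (C * FIBER_FACTOR) * 1000
round_trip_ms = 2 * one_way_ms

print(f"one-way propagation: {one_way_ms:.1f} ms")    # roughly 10 ms
print(f"round trip:          {round_trip_ms:.1f} ms") # roughly 20 ms

That ~20 ms is the floor physics sets before a single router, encoder or GPU has touched anything.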
And another thing, even if our connections can download the entire contents of the Interwebs in .000027 whatever-the-fuck-is-smaller-than-nanoseconds, all it takes is a bit of lag to fuck up the gameplay, because several thousand people are downloading the entire contents of the Interwebs 20 times each, or watching some streaming holovideo on their 1166400p SEHDTVs, or whatever else they do in the 2050s.
Just look at the difference between 56K modems and broadband. Sure, you can download pages at wildly different speeds, but what is still there? Lag. Bandwidth went up; latency didn't go away. You could be watching a movie, and look, something lags and you have to stop doing what you're doing. And that's WITH a buffer. The buffer is what hides all of the hiccups in between.
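Here's a minimal sketch of why that buffer trick works for movies but not for games; the 30 fps frame rate, the 150 ms buffer and the network delays are all numbers I made up for illustration.

import random

# Toy playout-buffer simulation with invented timings. Frames are due every
# 33 ms, the network adds a random delay, and we count how many arrive too
# late (a) if they must be shown immediately and (b) if playback is delayed
# behind a 150 ms buffer. A game is stuck with case (a), because your input
# can't be buffered into the past.
FRAME_INTERVAL_MS = 33
BUFFER_MS = 150
random.seed(1)

late_live = 0
late_buffered = 0
for frame in range(300):
    sent_at = frame * FRAME_INTERVAL_MS
    arrived_at = sent_at + 20 + random.expovariate(1 / 30)  # ~20 ms base + jitter

    if arrived_at > sent_at + FRAME_INTERVAL_MS:   # missed its live display slot
        late_live += 1
    if arrived_at > sent_at + BUFFER_MS:           # missed even the buffered slot
        late_buffered += 1

print(f"late if shown live:          {late_live} of 300")
print(f"late behind a 150 ms buffer: {late_buffered} of 300")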
And if all of that hasn't sunk in, let's just look at the FPS argument. (Frames per second, dammit!) Just to make sure it doesn't look like a jerky piece of shit, you need a 30-60 fps frame rate. At 60 fps, each frame gets 1/60 s, about 16.7 ms. It currently takes my connection 75 fucking milliseconds just to send a 64-byte ping packet on a round trip to Google, the most popular web site on the fucking planet, with more servers than there are E.T. cartridges buried in the New Mexico desert. Even if I doubled the budget to about 33 ms by dropping to 30 fps, that round trip isn't going to cut it.
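Put my numbers side by side (the 75 ms ping is just my connection; plug in your own):

# Frame budget vs. a measured round trip. The 75 ms figure is my own ping
# to Google; swap in whatever your connection gives you.
measured_rtt_ms = 75.0

for fps in (60, 30):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps = {budget_ms:.1f} ms per frame; "
          f"the round trip alone overshoots it by {measured_rtt_ms - budget_ms:.1f} ms")
# 60 fps = 16.7 ms per frame; the round trip alone overshoots it by 58.3 ms
# 30 fps = 33.3 ms per frame; the round trip alone overshoots it by 41.7 ms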
Also, they'd have to split up that huge HDTV frame into 1440-byte chunks. That's a lot of packets just for ONE frame. Take a 1280x768 frame: even at just one byte per pixel that's about 1 MB, which works out to 683 packets per frame (real 24-bit color would triple that). If any of those packets fuck up, there is no re-transmission in UDP, so you have to piece the frame together as best you can and wait for the next one to come along, because asking for stale frames doesn't make sense.
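And the packet math from that paragraph, written out (the 1440-byte payload and the one-byte-per-pixel frame are assumptions for the sake of the example, not anything OnLive has published):

import math

# Packets per frame and packets per second under the assumptions above:
# 1280x768 frame, one byte per pixel, 1440-byte UDP payloads, 60 fps.
width, height = 1280, 768
bytes_per_pixel = 1          # generous; raw 24-bit color would be 3
payload_bytes = 1440
fps = 60

frame_bytes = width * height * bytes_per_pixel
packets_per_frame = math.ceil(frame_bytes / payload_bytes)
packets_per_second = packets_per_frame * fps

print(f"{frame_bytes} bytes per frame")                          # 983040
print(f"{packets_per_frame} packets per frame")                  # 683
print(f"{packets_per_second} packets per second at {fps} fps")   # 40980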
There's a damn good reason why video and UDP don't mix. It's a stupid idea.