OnLive Examined

This year’s Game Developers Conference has seen the announcement of OnLive, the self-proclaimed “future of gaming.” There has since been much interest, particularly in the feasibility of the service. Here is my analysis.

The basic concept
OnLive is a video game-centric take on a general computer architecture concept known as ‘dumb terminals’, ‘thin clients’, or, more in vogue, ‘cloud computing.’ The idea is fairly simple: reduce cost by simplifying the terminals that users interact with and concentrating computing power in centralized servers. The terminal provides basic video display and I/O to the user. Input from the user’s devices is sent across the network to a server running the application; the server performs the computations and sends the results back to the terminal, which displays them to the user. In the case of OnLive, the input is your keyboard/mouse or controller, the network is the Internet, the application is a game, and the return data is a high-definition video stream. The terminal may be either your PC or a (presumably inexpensive) thin client, which OnLive refers to as a ‘MicroConsole’, that you connect to your TV.
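
To make the round trip concrete, here is a minimal sketch of one iteration of such a thin-client loop: the terminal forwards an input event to the remote server and receives a compressed video frame in return. The host name and message format are hypothetical stand-ins, not anything OnLive has published.

```python
import json
import socket
import struct

# Hypothetical endpoint and wire format; OnLive has not published protocol details.
SERVER = ("game-server.example.com", 7000)

def recv_exact(sock, n):
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("server closed the stream")
        buf += chunk
    return buf

def play_one_frame(sock, input_event):
    """Send one input event upstream, receive one compressed video frame back."""
    # Upstream traffic is tiny: just the state of the controller/keyboard.
    payload = json.dumps(input_event).encode("utf-8")
    sock.sendall(struct.pack("!I", len(payload)) + payload)

    # Downstream traffic is the heavy part: a length-prefixed compressed frame.
    (frame_len,) = struct.unpack("!I", recv_exact(sock, 4))
    return recv_exact(sock, frame_len)  # hand off to the local video decoder for display

if __name__ == "__main__":
    with socket.create_connection(SERVER) as sock:
        frame = play_one_frame(sock, {"key": "W", "state": "down"})
        print(f"received {len(frame)} bytes of compressed video for this frame")
```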

The exciting stuff
So why should anyone care? There are really two major advantages to this sort of service: cost and portability. Because your terminal need only process input and decode a video stream from the OnLive service, even cheap, low-power computers could theoretically play the latest games with all of the highest-detail graphical settings enabled. If console games are eventually supported, you could play them without needing to spend hundreds of dollars on a console. Moreover, you could move from your desktop to your laptop to a friend’s computer and always have access to any game you have purchased on the service.

The technical challenges
Much of the response to this announcement has been skepticism as to the technical feasibility of the service. In particular, there are big challenges in terms of latency, network bandwidth, and load balancing, which some people have labeled ‘impossible’. Others have made the counterargument that it must be feasible, because the company has spent seven years working on the challenge, has former big-shots from WebTV and Netscape on its executive board, announced plans to launch the service this winter, and provided hands-on demos to attendees of GDC. However, lest we forget, the last high-profile internet games device, the Infinium Labs Phantom, also had well-known executives, and yet all its demos and announcements seemed to be an empty show to drum up money from venture capitalists. Therefore, it’s not enough to assume feasibility on these grounds without actually examining the technical challenges.

Latency
When playing a game, we’re used to having our mouse movements and button presses responded to near-instantly. Timing-sensitive games such as Dance Dance Revolution, Guitar Hero, or high-speed shooters can be negatively affected by tens of milliseconds of lag between button press and display. OnLive has two sources of latency to deal with: the time it takes to transmit data to and from the server over the network, and the time it takes to encode the game’s raw video into a compressed format for playback on the terminal. The first of these is equivalent to ‘ping’ in online multiplayer games and is a function of the distance and number of hops between your terminal and the server, as well as the congestion of the network. It can be improved by moving the server closer to the terminal. The second is a function of the processor being used to encode the video.

OnLive claims to have a video encoding solution that can compress a 720p video frame with a latency of 1 ms. Comparatively, a typical codec might require hundreds of milliseconds. Even commercial dedicated encoders designed for low latency, such as Haivision’s Mako-HD, only claim as low as 70 ms, which the manufacturer says is the ‘lowest h.264 encoding time ever recorded’. There’s no physical limitation here, and h.264 belongs to a class of ‘embarrassingly parallel’ applications, so it is theoretically possible that a highly parallel ASIC or FPGA-based implementation might achieve this 1 ms figure (though this might be quite expensive to implement for multiple users). Alternatively, they may be using a lower-quality codec or simply measuring the latency in a different (misleading?) way.
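
We do not know what OnLive’s encoder actually does, but the kind of parallelism being invoked can be illustrated with a toy sketch: split each frame into independent horizontal slices and compress them concurrently, so that per-frame encode latency shrinks roughly with the number of workers. The sketch below uses zlib purely as a stand-in for a real video codec to show the structure; it is not a claim about OnLive’s implementation.

```python
import zlib
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT, SLICES = 1280, 720, 8   # one 720p frame cut into 8 horizontal slices

def encode_slice(raw_slice):
    # Stand-in for a real intra-slice encoder (H.264 slices can be coded independently).
    return zlib.compress(raw_slice, 1)

def encode_frame(frame, pool):
    slice_size = len(frame) // SLICES
    raw_slices = [frame[i * slice_size:(i + 1) * slice_size] for i in range(SLICES)]
    # Each slice is compressed on its own core, so latency tracks one slice, not one frame.
    return list(pool.map(encode_slice, raw_slices))

if __name__ == "__main__":
    fake_frame = bytes(WIDTH * HEIGHT * 3)   # one uncompressed 24-bit RGB 720p frame
    with ProcessPoolExecutor(max_workers=SLICES) as pool:
        encoded = encode_frame(fake_frame, pool)
    print(f"{sum(len(s) for s in encoded)} bytes after slice-parallel compression")
```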

Network latency is limited by physical laws, though the largest contribution typically comes from packet processing at network hops. It can be reduced by using multiple servers distributed across the country, with each terminal served by the nearest one. This is the sort of approach used by content-delivery networks and many larger websites. Even so, network latency is often 80-100 ms or higher, and worse under congested conditions. My current ping to Google and Youtube ranges from 80 to 350 ms. This large variation can be attributed to the fact that the Internet is a ‘best-effort’ network: unlike your cable television or telephone service, there are no deterministic guarantees on latency. If you’ve ever used Skype, you know that the latency - even for simple voice communications - can be so high that it makes it difficult to have a conversation. OnLive has stated that its service is usable at distances of 1000 miles (which may be a liberal usage of ‘usable’). Further, they state that they will pursue peering agreements with ISPs, which might allow them to place servers directly at the ISP’s connection point - a strategy which Google is also pursuing and which is used by companies such as Akamai. If the servers can be placed at one of the closest hops, this could reduce the latency to 10-20 ms, which is very tolerable. On the other hand, it would require a very large number of servers.
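
For a rough sense of your own network latency without special tools, you can time a TCP handshake, which costs about one round trip. This is only a crude proxy for ping (a real ICMP ping needs raw sockets), and the hosts below are just examples:

```python
import socket
import time

def tcp_rtt_ms(host, port=443, samples=5):
    """Approximate round-trip time by timing TCP handshakes (roughly one round trip each)."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return min(times)   # best case, ignoring transient congestion

if __name__ == "__main__":
    for host in ("www.google.com", "www.youtube.com"):
        print(f"{host}: ~{tcp_rtt_ms(host):.0f} ms round trip")
```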

The important question is: how much latency is tolerable? Above 90 ms is said to be visually noticeable. For high-speed games, under 50 ms is desirable, but it’s important to remember that any network and encoding lag is added on top of the lag already introduced by the video processing in your LCD screen (unless you’re still using a CRT television). LCD latencies often run about 30-40 ms. It will therefore be very difficult to achieve unnoticeable lag when using OnLive. Some people may be willing to tolerate this lag in order to take advantage of the service.
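
To see how quickly the numbers add up, here is a back-of-the-envelope lag budget using the figures discussed above; the decode estimate is my own guess, and none of these are measurements of the actual service:

```python
# Back-of-the-envelope input-to-display lag budget, in milliseconds (illustrative values).
network_rtt = 30   # optimistic: server peered close to your ISP
encode      = 1    # OnLive's claimed encoding latency
decode      = 10   # rough guess for hardware-assisted video decode on the terminal
display     = 35   # typical LCD processing lag, from the 30-40 ms range above

total = network_rtt + encode + decode + display
print(f"total lag: ~{total} ms")   # ~76 ms, already close to the 90 ms noticeability threshold
```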

Bandwidth
OnLive has stated that a 3-5 Mbps internet connection will be necessary to receive the video stream from the server. This indicates that we can expect some visual degradation due to compression, as a bitrate above 5 Mbps is typically required for ‘visually lossless’ compression. Most of the ISPs in my area offer fast enough connections, but a bigger concern may be the recent announcements that many ISPs have instituted monthly caps on data transfer - generally somewhere around 50-100 GB per month. At 5 Mbps, you are eating up over 2 GB of data transfer per hour. If you’re using OnLive for several hours a day, you’re going to chew through that transfer limit quickly, even without considering the other Youtube/Hulu/Netflix streaming you may be doing. Other than Verizon, almost every major ISP has instituted or announced some sort of cap. I have uncapped internet, but only because I pay for the most expensive plan my ISP offers. Peering at ISP datacenters may again be the solution, as ISPs might agree not to charge for OnLive bandwidth originating from their own site. At that point, OnLive could practically be viewed as a service like digital cable.
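
The transfer-cap arithmetic is easy to check; the cap size and daily play time below are just illustrative values in line with the numbers above:

```python
# How quickly a 5 Mbps stream eats a monthly transfer cap (illustrative numbers).
bitrate_mbps  = 5
cap_gb        = 100      # a generous monthly cap
hours_per_day = 3        # a moderate amount of play time

gb_per_hour = bitrate_mbps * 3600 / 8 / 1000          # 5 Mbps is about 2.25 GB per hour
days_to_cap = cap_gb / (gb_per_hour * hours_per_day)
print(f"~{gb_per_hour:.2f} GB/hour, cap exhausted in ~{days_to_cap:.0f} days")
```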

Load balancing
So far we’ve been hand-waving the architecture of the OnLive servers and merely assuming that they can handle the load of many users per server. However, their computational model is much different from that of, say, Google. As I’ve already mentioned at length, latency is a big issue, so there is no value to be gained from a supercomputer architecture consisting of clusters of cheap processors. Moreover, there aren’t really ‘server-class’ GPUs that offer higher graphics performance than the graphics cards available to consumers, and it is very difficult to support multiple users per GPU. There is GPU virtualization research being done by VMWare and others, but my impression is that it is not production-ready. If this is the case, the servers would need a separate high-end graphics card for every active user. Since the service is targeted only at people in the US (due to the latency issues), it can be expected that many users will be trying to play during evening hours and few during business hours or the middle of the night. This would lead to a large amount of inefficiency in hardware utilization. It would probably be too expensive to provision a GPU for every user, so there may be waiting lines to use the service during peak hours. Furthermore, when high-profile game titles are released (Grand Theft Auto 5, Madden 2009, etc.), how will the service handle the peak in demand?
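
As a rough illustration of the provisioning problem, consider what one dedicated GPU per concurrent player implies. Every number here is a guess for illustration; OnLive has published no subscriber or hardware figures:

```python
# Toy capacity model: one dedicated GPU per concurrent player (all numbers are guesses).
subscribers         = 1_000_000
peak_concurrency    = 0.10     # share of subscribers playing at the evening peak
offpeak_concurrency = 0.01     # share playing mid-morning
gpu_cost_usd        = 500      # one high-end consumer graphics card

peak_gpus    = int(subscribers * peak_concurrency)
offpeak_gpus = int(subscribers * offpeak_concurrency)
print(f"GPUs needed at peak: {peak_gpus:,} (~${peak_gpus * gpu_cost_usd:,} in cards alone)")
print(f"GPUs busy off-peak:  {offpeak_gpus:,} "
      f"({offpeak_gpus / peak_gpus:.0%} utilization of the peak fleet)")
```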

My thoughts
OnLive’s demos have shown that the service is possible in ideal situations (50 miles from the server, no load balancing, no bandwidth concerns), but can it function in the wild? These demos do not prove that the service can handle the challenges I’ve described. While I’m excited about the possibilities, I’m taking a wait-and-see attitude towards this service. I’ve signed up for their beta program, which is scheduled to start in the summer. The best indicator of the realistic chances of this service will be whether things proceed on schedule through the beta towards the launch this winter. For a service that has been in development for seven years, one would expect that they have their timeline in place, and any Phantom-like delays will be telling.

Even if the OnLive system works, it is unlikely to function with complete parity to running games directly on your home PC or game console. Some loss of visual fidelity to video compression seems likely, and increased input lag is a certainty. Hardcore gamers may shrug it off as a waste of time, but maybe that’s not entirely the point. Is the simplicity and convenience of a dumb terminal, compared with the complexity and expense of a gaming PC, worth sacrificing some fidelity? For many people, the answer may be ‘yes’. Youtube offered horrible video quality for most of its history, and yet came out victorious over services like Stage6. I believe there is a sizable number of people who are willing to sacrifice fidelity for convenience. The balance that OnLive strikes between quality of experience, convenience, and cost is what will determine success, failure, and…maybe…the future of gaming.

Michael Says:

Three Thoughts:
1. I agree about the issue with transfer caps. It’s also worth noting that the cap hasn’t always been in place, meaning it’s a response to larger data transfer by users, rather than an archaic restriction that is no longer applicable as the internet grows and changes. As a side note, I think “archaic” when referring to the internet means, like, 5 years ago.

2. I think you touch well on an interesting apparent paradox - bigger, better; smaller, more convenient. People are buying bigger televisions with higher resolution screens, and then paying more for HD content. At the same time, people watch poor-quality YouTube videos on their iPhone and seem to enjoy it very much. The Wii (with comparatively poor graphics) and the DS (obvious handheld restrictions) are the top selling systems. Maybe this cloud thing has a chance, even if it offers reduced graphic quality.

3. Any word on the price? It seems like it could be good if you already have a PC, play a lot of games, and really like multiplayer games. Seems lame if you have to buy hardware, the games aren’t plentiful or cheap, and multiplayer is filled with lag.

t-kun Says:

2. I think the reduced graphical fidelity is less of an issue; even compressed 720p video will look better than a Wii. It is worth noting that PC gamers have the option of rendering at much higher than 720p on their own machines, but 720p is probably perfectly acceptable for most games. The real quality issue, I think, will be input lag. An analogy might be a situation in which all youtube videos have a slight synchronization problem between their video and audio.

3. I don’t know about the subscription cost of the service, but something I didn’t really mention is that you pay an additional cost to buy or rent each game on top of the subscription. The hardware requirement - for now - is simply that your computer can play back 720p video, which just about every computer produced within the last two years can do (the service will also be available on Intel-based Macs as well as their MicroConsole). The nicer aspect of the pricing is that you will not need to upgrade your system to take advantage of newer games. The only feasible increase in requirements they could impose on users would be a move up to 1080p video or possibly a more compute-intensive codec, and video decoding acceleration is already well-supported on modern systems.

colecago Says:

Yeah, I saw some stuff about this as well. I don’t think it’s gonna take off. Just too much data to be transferred back and forth. The ISPs are going to ruin this with caps, throttling, and their other transgressions. For example, I pay for 6 Mb service; I called them up regarding speed and they said “anything between 3-6 Mb is acceptable.”