Intel research: Beautiful theories for cloud streaming - and sobering numbers


The final lecture of the Cloud Gaming Conference 2012 came from Germany, or more precisely from Daniel Pohl, a research scientist at Intel. In his refreshingly technical contribution, Pohl addressed three points that had gone largely unmentioned in the 15 previous sessions.

"Games that are streamed from the cloud, are currently the same as those on the local machine," said Daniel Pohl. "It's more in the cloud computing power, so that they could score with graphically better versions." The applicable proceedings there are ray tracing, voxel rendering and photo-realistic rendering. Voxel rendering scores in a state shown by Pohl example with "millions of trees and billions of grasses", photorealistic rendering passes with Intel Embree kernel technology also in "normal" CPU realm - the results are impressive graphics that are not yet can be calculated in real time.


The situation is different with ray tracing: since 2004, Pohl has been working on ray-traced versions of popular 3D shooters such as Quake 3 and Return to Castle Wolfenstein. The latter runs smoothly and in real time in Intel's labs - on Xeon Phi coprocessor cards equipped with more than 50 cores and more than 8 GB of GDDR5 memory. A cloud server can drive up to eight of these cards, and multiple machines can be combined into clusters. The card, not yet commercially available and code-named Knights Corner, scales linearly with screen size and frame rate: one card can produce two tablet streams, while a high-end PC requires four cards. The hardware cost, however, is anything but low. "Maybe that would be something for premium services, or for a local entertainment center that could set itself apart from PCs and consoles," said Pohl.
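Pohl's linear-scaling rule can be turned into a rough back-of-the-envelope estimate of how many cards a given stream needs. The sketch below is only illustrative: the tablet and PC resolutions and frame rates, and the per-card throughput derived from them, are assumptions rather than figures from the talk.

```python
# Rough card-count estimate for ray-traced cloud streams, assuming the cost
# scales linearly with pixels per second (screen size x frame rate).

# Assumed calibration point (not from the talk): one Knights Corner card
# drives two tablet streams at 1024x600 @ 30 fps each.
PIXELS_PER_CARD_PER_SEC = 2 * (1024 * 600) * 30

def cards_needed(width, height, fps):
    """Number of cards required for one stream, rounded up."""
    pixels_per_sec = width * height * fps
    return -(-pixels_per_sec // PIXELS_PER_CARD_PER_SEC)  # ceiling division

# An assumed high-end PC stream at 1920x1080 @ 60 fps:
print(cards_needed(1920, 1080, 60))  # -> 4, matching the four cards Pohl mentioned
```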

From the future back to the present: latency is one of the biggest problems of cloud streaming. "Most people immediately think of the Internet, but the home PC contains many hidden sources of latency," said Pohl. The USB port, for example, is polled only every eight milliseconds, and wireless keyboards or systems like Microsoft's Kinect motion control make that figure worse: "I measured OpenNI with a latency of 60 ms," said Pohl. On top of that, the local computer has to process the player's input and decompress the video frames arriving from the cloud.
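Before a single packet reaches the Internet, those client-side contributions already add up. The following budget is purely illustrative: only the 8 ms USB polling interval and the 60 ms OpenNI measurement come from the talk, all other values are assumed placeholders.

```python
# Illustrative client-side latency budget for the input path of a cloud game.
# Only the USB polling interval (8 ms) and the OpenNI figure (60 ms) are from
# Pohl's talk; the remaining numbers are assumptions for the example.

keyboard_path_ms = {
    "USB polling (worst case)":   8,   # port is read only every 8 ms
    "wireless keyboard link":    10,   # assumed
    "local input processing":     2,   # assumed
    "decode of incoming frame":  15,   # assumed
}

print("keyboard path:", sum(keyboard_path_ms.values()), "ms before the network")
print("Kinect via OpenNI alone:", 60, "ms of input latency (measured by Pohl)")
```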

In the cloud itself, the data streams are not always optimized for latency either, something Intel is trying to address with its Direct I/O technology. But even with speed-optimized drivers, better data handling and latency-free monitors - conventional screens add up to 45 milliseconds of latency on their own - the total latency, according to Pohl, drops at best from roughly 234 to 115 milliseconds. A higher refresh rate could reduce it further, since it halves the latency caused by frame buffering; in Pohl's example that would leave about 97 milliseconds. But running at 120 instead of 60 Hz increases the required bandwidth - the real problem with cloud streaming.
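The effect of the higher refresh rate can be checked with a little arithmetic: at 60 Hz a frame lasts about 16.7 ms, and doubling the refresh rate halves the share of the budget spent buffering frames. In the sketch below only the 115 ms and roughly 97 ms totals are Pohl's figures; the split into buffering versus other delays is an assumption.

```python
# Effect of doubling the refresh rate on frame-buffering latency.
# The 115 ms starting point and the ~97 ms result are from the talk; the
# assumption that roughly two frames sit in buffers is for illustration.

frame_60hz_ms  = 1000 / 60    # ~16.7 ms per frame at 60 Hz
frame_120hz_ms = 1000 / 120   # ~8.3 ms per frame at 120 Hz

total_60hz_ms     = 115
buffering_60hz_ms = 2 * frame_60hz_ms            # assumed: ~two buffered frames
other_ms          = total_60hz_ms - buffering_60hz_ms

total_120hz_ms = other_ms + 2 * frame_120hz_ms   # buffering share is halved
print(round(total_120hz_ms))                     # ~98 ms, close to Pohl's ~97 ms
```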

"Cloud gaming services like OnLive demand for a resolution of 1280 × 720 pixels at 60 fps at least 5 Mbit / s," calculated Pohl. Translated this means, however, for the user, for example, the U.S. cable giant Comcast for about 110 hours per month, the cloud gaming line closes down. Only: "use this resolution after a Steam survey in July 2012, only a fifth of all PC gamers Almost a third's the big deal with a horizontal resolution of 1920 dots.." And for 2013 to take Intel predictions manufacturer of premium notebooks and desktops, already the 4K mark in the eye, when ten times more pixel data would flow through the lines. At 250 GB / month were just eleven hours Cloud gaming per month left. Sobering numbers, which do not quite fit into the beautiful theories of cloud streaming experts - but many of them had left the meeting before Pohl's already talk again.

