
Intel Focuses on VR at Press Event

Intel held a press event that it described as the most technically complex it had ever staged, having set up high-powered VR systems for everyone in the press who was able to get into the room.

The main speaker was Brian Krzanich, CEO of Intel, who started by saying that Intel is thinking hard about the extension of technology beyond ‘classic consumer electronics’ devices.

Krzanich asked, “What is driving these new technology innovations?” Moore’s Law is alive and well, Intel believes: he showed a system that was said to be running a chip built on a 10nm process (a riposte to Qualcomm, which announced its latest chips earlier in the day – Qualcomm Moves to 10nm with 820/821), and he said that Moore’s Law would hold true for at least the rest of his career. Intel will have 10nm process technology in production before the end of 2017.

VR is a big topic for Intel and, like a lot of technology, the development of VR is being hardware-led. By 2020, Intel believes that every individual will create around 1.4GB of data per day in graphics and communication content, more than double what is produced at the moment. Krzanich showed a demonstration of a 3D video recording that used 38 individual 5K cameras to capture soccer “completely” in high-resolution 3D; that level of capture creates as much as 2 terabytes of data per minute.
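As a rough sanity check of the quoted capture figures (a sketch only, assuming decimal units and an even split of data across the cameras, neither of which Intel stated):

```python
# Back-of-envelope check of the soccer capture figures quoted above.
# Assumes decimal units (1 TB = 1e12 bytes) and an even split across cameras.
TB = 1e12  # bytes

cameras = 38
total_bytes_per_min = 2 * TB  # "2 terabytes per minute" from the presentation

per_camera_bytes_per_s = total_bytes_per_min / cameras / 60
per_camera_gbit_per_s = per_camera_bytes_per_s * 8 / 1e9

print(f"per camera: ~{per_camera_bytes_per_s / 1e9:.2f} GB/s "
      f"(~{per_camera_gbit_per_s:.1f} Gbit/s)")
```

Under those assumptions, each 5K camera would be producing nearly a gigabyte of data per second, around 7 Gbit/s, which gives a sense of why Krzanich dwelt on the data volumes involved.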

Intel’s volumetric soccer video uses 38 cameras and produces 2TB of data per minute

Virtual Travel – An Early VR Application

Krzanich said that he believes there will be three compelling initial applications for immersive video and content. First will be travel, which Intel planned to demonstrate with an Oculus Rift at the CES press event. The second application will be ‘work’, and then will come gaming.

Why do we travel, he asked? We do it for new experiences and adventures. Not many will climb Mt Everest or sky dive (well, maybe a few in the room). Krzanich got those lucky enough to actually get into the event to put on their headsets at that point. The demo was of flying freely above the desert, followed by a parachute jump, all in 360 degree video.

For the last 20 years, video has been basically the same, he continued. We have gone to higher resolution and colour, but it’s still basically just flat video, so Intel is moving towards volumetric video. To help with this, Intel has bought HypeVR, a company that specialises in immersive video, and Krzanich introduced its founder, Ted Schilowitz. The firm is a computer vision company working to advance ‘what is video’ from a flat screen experience to one that allows viewers to interact with live video.

Walking About in Video

The first demo was of a ‘walkabout experience’ at a waterfall in Vietnam. The demo had real perspective; moving the viewpoint showed different content and, for example, parallax and occlusion looked correct. Viewers could look behind objects, or down on them, which is not possible using simple stereo 3D. It was, frankly, quite impressive, even on the main screen display.

This 3D video was captured by HypeVR in Vietnam; each frame of video comprises 3GB of data

Krzanich was keen to point out that what was being shown was not CGI, but video at a high frame rate and very high definition. Each frame needed 3GB of data. There are huge amounts of data (which, of course, suits Intel), but the technology allows viewers to freely move around in a video.

VR Can Help with Work

Then he looked at VR at work. There is a lot of work that is dangerous – how can you reduce the danger, he asked? For example, solar panels need inspection, which is slow, expensive and can even be dangerous if the panels are in the middle of the desert, so drones can be used instead. Krzanich showed a demo of 360 degree video being captured from a live drone in the desert and streamed to the press conference in real time. He said that this kind of technology could also help search and rescue, with drone-mounted 360 degree cameras feeding video back to searchers at a central location.

Intel acquired Voke, which has developed technology that can use 6-8 wide-angle cameras. This allows multiple vantage points for viewing – for example, from different seats at a sports or cultural event, or from behind the scenes. There was a plan to show a live basketball game during the conference, but the timing was out of phase with the game: the players were off the court at half time when Krzanich wanted to show it, so the experiment didn’t quite work out.

Voke VR will be available on the Rift later, but it is available on the Samsung Gear VR now. Concerts and other content, including sports events, will come to Voke later.

Intel Has a Sports Group

In 2016, Intel started a sports group in the company and the firm believes immersive video will ‘truly transform’ the experience of sports. It can be used in different sports and can even help with refereeing, by allowing viewing from any direction. Intel is working with La Liga in Spain to install three different camera positions in some venues to allow immersive viewing.

Krzanich then talked about gaming and mixed reality, for which Intel has been working on Project Alloy (for headsets) and on RealSense for depth sensing. Project Alloy is developing well, he said. At the moment, the hardware setup for VR is complex, with external sensors for tracking the user, hardwired connections and so on, but Intel wants to make it simpler. Using RealSense with Project Alloy, all the technology you need can be put into an untethered headset.

Intel Project Alloy

Next, the back of the stage opened and there was a demo of a ‘regular’ living room, with two players who were untethered and without any sensors. The room had been scanned and input to the system. Using the Project Alloy technology, the environment was completely changed. The walls became sky and furniture became different objects in the game. The players could see each other as they started shooting at flying targets coming in. They could hide in the room behind real objects, but these objects looked different in the game.

Project Alloy will be in real products by Q4 2017, supplied by OEM partners, and Krzanich said that Intel will give the technology to ‘any companies’ that want to develop it.

He then turned to ‘traditional’ gaming which is where gamers want the highest power systems for the best possible experience. He showed a trailer of Arizona Sunshine – which is “a bit bloody”, he said. There were lots of zombies and gore, but the graphics seemed quite good.

Finally, the demo went back to the basketball game, live. Sadly, the content was not available on the streams, only on the VR headsets.

A viewer comments

Steve Sechrist was able to get into the event and use a headset. Here are his comments.

Entering the Intel press day event was a bit of a surreal experience, even for a twenty-year CES veteran (yikes, 20 years!).

The giant room at the Mandalay Bay conference facility was filled with plush leather chairs, which I found more than welcome after a day of feet dragging in endless media event lines. If they had stopped there, it would have been enough to satisfy even the most hard-bitten journalist in the crowd; but Intel then gave us all access to the latest Oculus Rift, hard-wired to one of several Intel-based PC systems. A brief tutorial followed on how to properly adjust the device for comfort and fit as we all waited (some slept) for the show to begin.

The Oculus Rift has come a long way in comfort, and sitting there, one barely noticed the umbilical cord attached to the PC, unless standing up or moving around more than usual. In several of the live demonstrations, we were encouraged to do just that, as the events were showing off several forms of real-time streaming. Intel used this to help sell the point of VR technology’s use in sporting and entertainment events, as well as in B2B inspection and other field use from distant locales.

For the most part, the image resolution still looked a bit low. It was clearly not the full-HD quality seen on a newer LCD or OLED flat screen, but the 360 degree real-time stream seemed robust, with low latency even with rather fast head movements. With each new stream, I tested this with quick head movements, both side-to-side and up and down. At one point we were encouraged to stand and look around objects to see what was behind them, or to notice how the images conformed to our new perspective.

Overall, it was a pretty impressive demo, and most journalists I talked to seemed to be equally impressed. I did mention one suggestion for future demos to the Intel media rep: it would have been nice to have a bit of interaction with the folks in the field at the site of the live streaming. That would help establish and underscore the live aspect of the event and bring a bit of the human touch to the whole demo. Steven Sechrist