Google has published Spotlight Stories on Android and iOS, an app featuring 360-degree films. Now that it is available on iOS, users of Apple devices can also enjoy its featured films, including Windy Day by former Pixar filmmakers. Three other stories ship with the app: Buggy Night, Duet, and Help (a film directed by Fast and Furious director Justin Lin).

However, it is still doubtful whether 360-degree immersive film will succeed. My concerns are as follows:

  1. The fatigue caused by holding a device (iPhone, iPad, or even an Oculus Rift or other head-mounted display) for a long time. Traditional film lets people sit on the sofa, enjoying popcorn and soda in the meantime. A 'heavy' device might frighten away many potential users.
  2. How should we evaluate immersion across phone, tablet, Cardboard, Oculus Rift, and so on? I believe phone < tablet < Cardboard < Rift… but why?
  3. How much human effort goes into these movies? With Google Jump, capturing 360-degree footage is easy. Nevertheless, producing a vivid movie experience still demands much effort in cutting and in stitching adjacent camera views.
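The stitching mentioned in point 3 ultimately means projecting each camera's pixels onto a shared sphere, most commonly stored as an equirectangular panorama. A minimal sketch of that core mapping, assuming a y-up camera frame and a `width × height` output image (function name and conventions are my own, not from any particular stitching pipeline):

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a unit view direction to pixel coordinates in an
    equirectangular (360-degree) panorama.

    Longitude comes from atan2 in the horizontal plane, latitude
    from the vertical component; both are scaled into the image grid.
    """
    lon = math.atan2(x, z)                    # -pi .. pi
    lat = math.asin(max(-1.0, min(1.0, y)))   # -pi/2 .. pi/2
    u = (lon / (2 * math.pi) + 0.5) * (width - 1)
    v = (0.5 - lat / math.pi) * (height - 1)
    return u, v

# Looking straight ahead (+z) lands in the center of the panorama.
u, v = direction_to_equirect(0.0, 0.0, 1.0, 4096, 2048)
```

Real stitchers add per-camera calibration, blending across overlapping views, and seam optimization on top of this projection, which is where most of the manual effort goes.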

Anyway, Google has still taken a promising step toward a future VR world.

This reminds me of a recent SIGGRAPH paper from Microsoft: High-Quality Streamable Free-Viewpoint Video. Collet et al. presented the first end-to-end solution for creating high-quality free-viewpoint video encoded as a compact data stream. Their system records performances using a dense set of RGB and IR video cameras, generates dynamic textured surfaces, and compresses these into a streamable 3D video format. Four technical advances contribute to high fidelity and robustness: multimodal multi-view stereo fusing RGB, IR, and silhouette information; adaptive meshing guided by automatic detection of perceptually salient areas; mesh tracking to create temporally coherent subsequences; and encoding of tracked textured meshes as an MPEG video stream. Quantitative experiments demonstrate geometric accuracy, texture fidelity, and encoding efficiency. They also release several datasets with calibrated inputs and processed results to foster future research.
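The compression idea behind their mesh tracking can be illustrated with a toy: once consecutive meshes share topology, each frame can be stored as small per-vertex deltas against a keyframe instead of a full mesh. This is only a conceptual sketch in plain Python (the paper's actual codec works on tracked textured meshes in an MPEG stream); the function names and the drift threshold are hypothetical:

```python
def encode_sequence(frames, threshold=0.5):
    """Split a sequence of per-frame vertex lists into keyframes plus deltas.

    A frame becomes a new keyframe when its topology changes (different
    vertex count) or its vertices drift too far from the last keyframe;
    otherwise only per-vertex offsets against the keyframe are stored.
    """
    encoded, key = [], None
    for frame in frames:
        if (key is None or len(frame) != len(key)
                or max(abs(a - b) for a, b in zip(frame, key)) > threshold):
            key = frame
            encoded.append(("key", list(frame)))
        else:
            encoded.append(("delta", [a - b for a, b in zip(frame, key)]))
    return encoded

def decode_sequence(encoded):
    """Reconstruct the full vertex sequence from keyframes and deltas."""
    frames, key = [], None
    for kind, data in encoded:
        if kind == "key":
            key = data
            frames.append(list(data))
        else:
            frames.append([k + d for k, d in zip(key, data)])
    return frames
```

Small deltas compress far better than raw coordinates, which is why temporally coherent subsequences matter so much for a streamable format.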