Version: 2017.2
HoloLens
Single-Pass Stereo rendering

Unity Google VR Video Async Reprojection

What is Video Async Reprojection?

Async Reprojection Video is a layer (referred to as an “external surface”) that an application can use to feed video frames directly into the async reprojection system. The main advantages of using the API are:

  1. Without the API, video is sampled once to render it into the app’s color buffer, and the color buffer is then resampled to apply distortion correction. This introduces double-sampling artifacts. The external surface passes video directly to the EDS compositor, so it is sampled only once, improving video quality.

  2. With the external surface API, the video frame rate is decoupled from the app frame rate. The application can take up to a second to render a new frame, and the only consequence is that the user sees black bars when they move their head; the video keeps playing normally. This maintains AV sync while significantly reducing dropped video frames.

  3. The application can indicate that it intends to play DRM video, and the API creates a protected path that displays the protected video while maintaining the async reprojection frame rate.

Known issues:

  1. When using Video Async Reprojection, the camera must start at the origin (0,0,0). Errors may occur if the camera position is not set to 0,0,0.

  2. There is no publicly accessible C# interface for async reprojection. The public API is Java-only.

Enabling Async Video Reprojection

Async Video Reprojection is part of the Daydream VR device settings.

Click the grey arrow next to Daydream, then tick the “Enable Video Surface” checkbox to enable the Async Video Reprojection feature.

Select the “Use Protected Memory” option ONLY if you require memory protection for all of your content, because enabling it keeps protection on for the lifetime of the application.

API documentation

To take advantage of the Google VR API, you need to extend the UnityPlayerActivity. For more information, see the documentation on Extending the UnityPlayerActivity.

Because Java plug-ins cannot directly access objects in your Scene, you need to provide a simple API to your C# code that allows you to pass a transform to the Java side, and to tell your Java code when to start rendering.

Note: This code is not complete. It contains no video player implementation, because that is a client-specific detail. It also has no playback controls; those would be implemented as objects in the Scene, with actions on those objects calling into Java.

For information on using Java in the Unity Editor and on extending the UnityPlayerActivity, see the documentation on Android development in Unity.

For information about the Google Video Async Reprojection system, refer to the Android Developer Network documentation on Video Viewports.

Java sample code:

package com.unity3d.samplevideoplayer;

import com.unity3d.player.GoogleVrVideo;
import com.unity3d.player.GoogleVrApi;
import com.unity3d.player.UnityPlayerActivity;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.view.Surface;

public class GoogleAVRPlayer implements GoogleVrVideo.GoogleVrVideoCallbacks {

    private static final String TAG = GoogleAVRPlayer.class.getSimpleName();

    // Your own video player implementation (client-specific).
    private MyOwnVideoPlayer videoPlayer;
    private boolean canPlayVideo = false;
    private boolean isSceneLoaded = false;

    // API you present to your C# code to handle initialization of your
    // video system.
    public void initVideoPlayer(UnityPlayerActivity activity) {
        // Initialize the video player and any other support you need…
        // Register this instance as the Google VR video listener to get
        // lifetime and control callbacks.
        GoogleVrVideo gvrv = GoogleVrApi.getGoogleVrVideo();
        if (gvrv != null) gvrv.registerGoogleVrVideoListener(this);
    }

    // API you present to your C# code to start your video system
    // playing a video.
    public void play() {
        if (canPlayVideo && videoPlayer != null && videoPlayer.isPaused())
            videoPlayer.play();
    }

    // API you present to your C# code to stop your video system
    // playing a video.
    public void pause() {
        if (canPlayVideo && videoPlayer != null && !videoPlayer.isPaused())
            videoPlayer.pause();
    }

    // Google VR video listener callbacks.
    @Override
    public void onSurfaceAvailable(Surface surface) {
        // Google VR has a surface available for you to render into.
        // Use this surface with your video player as needed.
        if (videoPlayer != null) {
            videoPlayer.setSurface(surface);
            canPlayVideo = true;
            if (isSceneLoaded) {
                videoPlayer.play();
            }
        }
    }

    @Override
    public void onSurfaceUnavailable() {
        // The Google VR video surface is going away. Remove it from
        // anything holding it and stop your video player.
        if (videoPlayer != null) {
            videoPlayer.pause();
            canPlayVideo = false;
        }
    }

    @Override
    public void onFrameAvailable() {
        // Handle the Google VR frame-available callback.
    }
}

Unity C# sample code:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Text;
using UnityEngine;

public class GoogleVRVideo : MonoBehaviour {

    private AndroidJavaObject googleAvrPlayer = null;
    private AndroidJavaObject googleVrVideo = null;

    void Awake()
    {
        if (googleAvrPlayer == null)
        {
            googleAvrPlayer = new AndroidJavaObject("com.unity3d.samplevideoplayer.GoogleAVRPlayer");
        }

        AndroidJavaClass googleVrApi = new AndroidJavaClass("com.unity3d.player.GoogleVrApi");
        if (googleVrApi != null) googleVrVideo = googleVrApi.CallStatic<AndroidJavaObject>("getGoogleVrVideo");
    }

    void Start()
    {
        if (googleVrVideo != null)
        {
            // We need to tell Google VR the location of the video surface
            // in world space. Since there isn't a way to get at that info
            // from Java, we calculate it here and then pass the matrix down
            // to the API we expose on our UnityPlayerActivity subclass.
            Matrix4x4 wm = transform.localToWorldMatrix;
            wm = Camera.main.transform.parent.worldToLocalMatrix * wm;
            wm = wm * Matrix4x4.Scale(new Vector3(0.5f, 0.5f, 1));

            // Convert the 4x4 row-ordered matrix into a 16-element,
            // column-ordered flat array. The transposition puts the matrix
            // in the order that Google uses, and we flatten it to make
            // passing it over the JNI boundary easier. The complication is
            // that you then have to convert it back to a 4x4 matrix on the
            // Java side.
            float[] matrix = new float[16];
            for (int i = 0; i < 4; i++)
            {
                for (int j = 0; j < 4; j++)
                {
                    matrix[i * 4 + j] = wm[j, i];
                }
            }
            googleVrVideo.Call("setVideoLocationTransform", matrix);
        }

        if (googleAvrPlayer != null)
        {
            AndroidJavaClass jc = new AndroidJavaClass("com.unity3d.player.UnityPlayer");
            AndroidJavaObject jo = jc.GetStatic<AndroidJavaObject>("currentActivity");
            googleAvrPlayer.Call("initVideoPlayer", jo);
            googleAvrPlayer.Call("play");
        }
    }
}
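The comments in the C# sample note that the flattened, column-major array has to be rebuilt into a 4x4 matrix once it arrives on the Java side. A minimal sketch of that round trip in plain Java (independent of the Unity and Google VR classes; the `ColumnMajorMatrix` class and its method names are illustrative, not part of any API):

```java
public class ColumnMajorMatrix {
    // Rebuild a 4x4 row-indexed matrix m[row][col] from the flat
    // column-major array produced by the C# loop above, where
    // flat[col * 4 + row] holds the element at (row, col).
    public static float[][] fromColumnMajor(float[] flat) {
        if (flat == null || flat.length != 16) {
            throw new IllegalArgumentException("expected 16 elements");
        }
        float[][] m = new float[4][4];
        for (int col = 0; col < 4; col++) {
            for (int row = 0; row < 4; row++) {
                m[row][col] = flat[col * 4 + row];
            }
        }
        return m;
    }

    // Inverse operation, matching the flattening done on the C# side.
    public static float[] toColumnMajor(float[][] m) {
        float[] flat = new float[16];
        for (int col = 0; col < 4; col++) {
            for (int row = 0; row < 4; row++) {
                flat[col * 4 + row] = m[row][col];
            }
        }
        return flat;
    }
}
```

Keeping both directions next to each other makes it easy to verify that the Java side uses the same element order as the C# loop that fills the array.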


  • 2017-08-03 Page published with limited editorial review
  • Added in Unity 2017.2