Version: Unity 6.0 (6000.0)
Render a camera's output to a Render Texture in URP
Camera render order in URP

Create a render request in URP

To trigger a camera to render to a render texture outside of the Universal Render Pipeline (URP) render loop, use the SubmitRenderRequest API in a C# script.

This example shows how to use render requests, and callbacks to monitor the progress of those requests. You can see the full code sample in the Example code section.

Render a single camera from the camera stack

To render a single camera without taking into account the full camera stack, use the UniversalRenderPipeline.SingleCameraRequest API. Follow these steps:

  1. Create a C# script named SingleCameraRenderRequestExample, and add the using statements shown below.

    using System.Collections;
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.Rendering.Universal;
    
    public class SingleCameraRenderRequestExample : MonoBehaviour
    {
    
    }
    
  2. Create arrays to store the cameras to render from, and the render textures to render into.

    public class SingleCameraRenderRequestExample : MonoBehaviour
    {
        public Camera[] cameras;
        public RenderTexture[] renderTextures;
    }
    
  3. In the Start method, add a check to make sure the cameras and renderTextures arrays are valid and contain the correct data before the rest of the script runs.

    void Start()
    {
        // Make sure all data is valid before you start the component
        if (cameras == null || cameras.Length == 0 || renderTextures == null || cameras.Length != renderTextures.Length)
        {
            Debug.LogError("Invalid setup");
            return;
        }
    }
    
  4. In the SingleCameraRenderRequestExample class, create a method named SendSingleRenderRequests with a return type of void.

  5. In the SendSingleRenderRequests method, add a for loop that iterates over the cameras array, as shown below.

    void SendSingleRenderRequests()
    {
        for (int i = 0; i < cameras.Length; i++)
        {
    
        }
    }
    
  6. Inside the for loop, create a render request of the UniversalRenderPipeline.SingleCameraRequest type in a variable named request. Then check that the active render pipeline supports this type of render request with RenderPipeline.SupportsRenderRequest.

  7. If the active render pipeline supports the render request, set the destination of the camera output to the matching render texture in the renderTextures array. Then submit the render request with RenderPipeline.SubmitRenderRequest.

    void SendSingleRenderRequests()
    {
        for (int i = 0; i < cameras.Length; i++)
        {
            UniversalRenderPipeline.SingleCameraRequest request =
                new UniversalRenderPipeline.SingleCameraRequest();
    
            // Check if the active render pipeline supports the render request
            if (RenderPipeline.SupportsRenderRequest(cameras[i], request))
            {
                // Set the destination of the camera output to the matching RenderTexture
                request.destination = renderTextures[i];
                    
                // Render the camera output to the RenderTexture synchronously
                // When this is complete, the RenderTexture in renderTextures[i] contains the scene rendered from the point
                // of view of the Camera in cameras[i]
                RenderPipeline.SubmitRenderRequest(cameras[i], request);
            }
        }
    }
    
  8. Above the SendSingleRenderRequests method, create a method named RenderSingleRequestNextFrame with a return type of IEnumerator.

  9. In RenderSingleRequestNextFrame, wait for the main camera to finish rendering, then call SendSingleRenderRequests. Wait for the end of the frame, then restart RenderSingleRequestNextFrame as a coroutine with StartCoroutine.

    IEnumerator RenderSingleRequestNextFrame()
    {
        // Wait for the main camera to finish rendering
        yield return new WaitForEndOfFrame();
    
        // Enqueue one render request for each camera
        SendSingleRenderRequests();
    
        // Wait for the end of the frame
        yield return new WaitForEndOfFrame();
    
        // Restart the coroutine
        StartCoroutine(RenderSingleRequestNextFrame());
    }
    
  10. In the Start method, use StartCoroutine to call RenderSingleRequestNextFrame in a coroutine.

    void Start()
    {
        // Make sure all data is valid before you start the component
        if (cameras == null || cameras.Length == 0 || renderTextures == null || cameras.Length != renderTextures.Length)
        {
            Debug.LogError("Invalid setup");
            return;
        }
    
        // Start the asynchronous coroutine
        StartCoroutine(RenderSingleRequestNextFrame());
    }
    
  11. In the Editor, create an empty GameObject in your scene and add SingleCameraRenderRequestExample.cs to it as a component.

  12. In the Inspector window, add the cameras you want to render from to the cameras list, and the render textures to render into to the renderTextures list.

Note: The number of cameras in the cameras list and the number of render textures in the renderTextures list must be the same.
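To catch a count mismatch before entering Play Mode, one option (a sketch, not part of the original walkthrough) is Unity's OnValidate message, which the Editor calls whenever a serialized field changes in the Inspector. This method would go inside the SingleCameraRenderRequestExample class:

```csharp
// Sketch: editor-time validation for the cameras/renderTextures arrays.
// OnValidate is a standard MonoBehaviour message that Unity calls in the Editor
// whenever a value changes in the Inspector.
void OnValidate()
{
    if (cameras != null && renderTextures != null && cameras.Length != renderTextures.Length)
    {
        // Warn early, before Start rejects the setup at runtime
        Debug.LogWarning($"cameras has {cameras.Length} entries but renderTextures has {renderTextures.Length}. The counts must match.");
    }
}
```

This surfaces the problem while you are still editing the component, instead of only when the Start check fails at runtime.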

Now, when you enter Play Mode, the cameras you added render to the render textures you added.
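To confirm the result visually, you can sample one of the render textures somewhere in the scene. A minimal sketch, assuming a hypothetical quadRenderer field that you assign in the Inspector (not part of the walkthrough above):

```csharp
// Sketch: display the first render texture on a quad in the scene.
// quadRenderer is a hypothetical field, assigned in the Inspector.
public Renderer quadRenderer;

void LateUpdate()
{
    if (quadRenderer != null && renderTextures != null && renderTextures.Length > 0)
    {
        // RenderTexture derives from Texture, so it can be assigned
        // directly as a material's main texture
        quadRenderer.material.mainTexture = renderTextures[0];
    }
}
```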

Check when a camera finishes rendering

To check when a camera finishes rendering, use any callback from the RenderPipelineManager API.

The following example uses the RenderPipelineManager.endContextRendering callback.

  1. Add using System.Collections.Generic to the top of the SingleCameraRenderRequestExample.cs file.

  2. At the end of the Start method, subscribe to the endContextRendering callback.

    void Start()
    {
        // Make sure all data is valid before you start the component
        if (cameras == null || cameras.Length == 0 || renderTextures == null || cameras.Length != renderTextures.Length)
        {
            Debug.LogError("Invalid setup");
            return;
        }
    
        // Start the asynchronous coroutine
        StartCoroutine(RenderSingleRequestNextFrame());
            
        // Call a method called OnEndContextRendering when a camera finishes rendering
        RenderPipelineManager.endContextRendering += OnEndContextRendering;
    }
    
  3. Create a method named OnEndContextRendering. Unity runs this method when the endContextRendering callback triggers.

    void OnEndContextRendering(ScriptableRenderContext context, List<Camera> cameras)
    {
        // Create a log to show cameras have finished rendering
        Debug.Log("All cameras have finished rendering.");
    }
    
  4. To unsubscribe the OnEndContextRendering method from the endContextRendering callback, add an OnDestroy method to the SingleCameraRenderRequestExample class.

    void OnDestroy()
    {
        // End the subscription to the callback
        RenderPipelineManager.endContextRendering -= OnEndContextRendering;
    }
    

This script now works as before, but logs a message to the Console window when the cameras finish rendering.
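If you need a notification per camera rather than once per rendering context, RenderPipelineManager also exposes beginCameraRendering and endCameraRendering callbacks, which receive the individual Camera. A sketch of the same subscribe/unsubscribe pattern using endCameraRendering (an alternative to the steps above, not a required part of them):

```csharp
// Sketch: per-camera completion callback using RenderPipelineManager.endCameraRendering.
// Subscribe in OnEnable and unsubscribe in OnDisable so the handler is
// removed whenever the component is disabled.
void OnEnable()
{
    RenderPipelineManager.endCameraRendering += OnEndCameraRendering;
}

void OnDisable()
{
    RenderPipelineManager.endCameraRendering -= OnEndCameraRendering;
}

void OnEndCameraRendering(ScriptableRenderContext context, Camera camera)
{
    // Runs once for each camera, instead of once for the whole context
    Debug.Log($"{camera.name} has finished rendering.");
}
```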

Example code

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class SingleCameraRenderRequestExample : MonoBehaviour
{
    public Camera[] cameras;
    public RenderTexture[] renderTextures;

    void Start()
    {
        // Make sure all data is valid before you start the component
        if (cameras == null || cameras.Length == 0 || renderTextures == null || cameras.Length != renderTextures.Length)
        {
            Debug.LogError("Invalid setup");
            return;
        }

        // Start the asynchronous coroutine
        StartCoroutine(RenderSingleRequestNextFrame());
        
        // Call a method called OnEndContextRendering when a camera finishes rendering
        RenderPipelineManager.endContextRendering += OnEndContextRendering;
    }

    void OnEndContextRendering(ScriptableRenderContext context, List<Camera> cameras)
    {
        // Create a log to show cameras have finished rendering
        Debug.Log("All cameras have finished rendering.");
    }

    void OnDestroy()
    {
        // End the subscription to the callback
        RenderPipelineManager.endContextRendering -= OnEndContextRendering;
    }

    IEnumerator RenderSingleRequestNextFrame()
    {
        // Wait for the main camera to finish rendering
        yield return new WaitForEndOfFrame();

        // Enqueue one render request for each camera
        SendSingleRenderRequests();

        // Wait for the end of the frame
        yield return new WaitForEndOfFrame();

        // Restart the coroutine
        StartCoroutine(RenderSingleRequestNextFrame());
    }

    void SendSingleRenderRequests()
    {
        for (int i = 0; i < cameras.Length; i++)
        {
            UniversalRenderPipeline.SingleCameraRequest request =
                new UniversalRenderPipeline.SingleCameraRequest();

            // Check if the active render pipeline supports the render request
            if (RenderPipeline.SupportsRenderRequest(cameras[i], request))
            {
                // Set the destination of the camera output to the matching RenderTexture
                request.destination = renderTextures[i];
                
                // Render the camera output to the RenderTexture synchronously
                RenderPipeline.SubmitRenderRequest(cameras[i], request);

                // At this point, the RenderTexture in renderTextures[i] contains the scene rendered from the point
                // of view of the Camera in cameras[i]
            }
        }
    }
}