Huawei AREngine draws the camera preview stream with the built-in rendering pipeline by default, which looks a little dated next to the custom (scriptable) rendering pipelines now widely used in the industry. Looking at ARFoundation's implementation, it already supports seamless switching between built-in and URP, so I decided to try drawing the AREngine preview stream with URP.
Preview stream rendering in the built-in pipeline:
First of all, the preview stream travels straight from the camera to the GLSL shader on the GPU; it never passes through the CPU, so work such as the YUV format conversion is handled very efficiently. By contrast, the official C# sample contains a lot of redundant code that is completely unnecessary and can be deleted.
What actually carries the camera image into the shader is the special GLSL sampler type samplerExternalOES:
uniform samplerExternalOES _MainTex;
For details on how Android obtains the preview stream and binds it to a texture, refer to the link.
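For orientation, here is a minimal sketch of the Unity side of that binding: wrapping an already-created GL OES texture so a material can sample it without any copy. AcquireCameraTextureId() is a hypothetical placeholder for whatever the session or native layer exposes, not an AREngine API:

using System;
using UnityEngine;

public class ExternalTextureBinder : MonoBehaviour
{
    // Material whose shader declares "uniform samplerExternalOES _MainTex;"
    public Material backgroundMaterial;

    private Texture2D m_ExternalTexture;

    void Start()
    {
        // Hypothetical: the GL texture id created on the Android side and
        // bound to the camera via SurfaceTexture.
        int oesTextureId = AcquireCameraTextureId();

        // Wrap the existing GL texture so Unity can hand it to the material
        // without copying; width/height/format are nominal placeholders.
        m_ExternalTexture = Texture2D.CreateExternalTexture(
            1920, 1080, TextureFormat.RGBA32, false, false, new IntPtr(oesTextureId));

        backgroundMaterial.mainTexture = m_ExternalTexture;
    }

    private int AcquireCameraTextureId()
    {
        return 0; // placeholder: supplied by the native camera binding
    }
}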
Before the samplerExternalOES texture can be used, the camera has to be turned on. This can be done either with AREngine's SessionComponent or with Unity's WebCamTexture. Once the texture has been obtained, how do we display it on the screen?
AREngine uses a CommandBuffer Blit, consistent with ARCore, and binds it to two camera events: BeforeForwardOpaque and BeforeGBuffer, which correspond to forward and deferred rendering respectively.
m_VideoCommandBuffer = new CommandBuffer();
m_VideoCommandBuffer.Blit(BackGroundMaterial.mainTexture, BuiltinRenderTextureType.CurrentActive, BackGroundMaterial);
m_Camera.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, m_VideoCommandBuffer);
m_Camera.AddCommandBuffer(CameraEvent.BeforeGBuffer, m_VideoCommandBuffer);
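For completeness, the buffer should also be detached when the component is disabled. A minimal sketch, assuming the same m_Camera and m_VideoCommandBuffer fields as above:

void OnDisable()
{
    if (m_Camera != null && m_VideoCommandBuffer != null)
    {
        // Detach from both events that were registered above.
        m_Camera.RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, m_VideoCommandBuffer);
        m_Camera.RemoveCommandBuffer(CameraEvent.BeforeGBuffer, m_VideoCommandBuffer);
        m_VideoCommandBuffer = null;
    }
}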
Preview stream rendering in URP
To implement the above in the URP pipeline, simply mounting BackgroundRenderer.cs in the scene will not work. URP's Renderer Feature lets us add extra render passes to the URP Renderer and exposes their configuration on the Renderer asset, so we can customize the render order, the rendered objects, the materials, and so on. For a detailed introduction to Renderer Features, refer to Unity's official documentation.
Here, create a new Renderer Feature, ARBackgroundRenderPassFeature, and add it to the URP Renderer:
The ARBackground material is passed into the corresponding render pass. Since only forward rendering is used, the renderPassEvent here is set to BeforeRenderingOpaques. A CommandBuffer is built inside the Execute function to draw the preview stream. The concrete implementation:
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class ARBackgroundRenderPassFeature : ScriptableRendererFeature
{
    [System.Serializable]
    public class Settings
    {
        public Material material;
        // Draw before opaques so scene geometry renders on top of the camera image.
        public RenderPassEvent Event = RenderPassEvent.BeforeRenderingOpaques;
    }

    class CustomRenderPass : ScriptableRenderPass
    {
        private Settings _settings;

        public CustomRenderPass(Settings sts)
        {
            _settings = sts;
            renderPassEvent = sts.Event;
        }

        // Here you can implement the rendering logic.
        // Use <c>ScriptableRenderContext</c> to issue drawing commands or execute command buffers
        // https://docs.unity3d.com/ScriptReference/Rendering.ScriptableRenderContext.html
        // You don't have to call ScriptableRenderContext.submit; the render pipeline
        // will call it at specific points in the pipeline.
        public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
        {
            if (_settings.material != null)
            {
                CommandBuffer cmd = CommandBufferPool.Get();
                cmd.Blit(_settings.material.mainTexture, BuiltinRenderTextureType.CurrentActive, _settings.material);
                context.ExecuteCommandBuffer(cmd);
                cmd.Clear();
                CommandBufferPool.Release(cmd);
            }
        }

        // Cleanup any allocated resources that were created during the execution of this render pass.
        public override void OnCameraCleanup(CommandBuffer cmd)
        {
        }
    }

    public Settings settings = new Settings();
    private CustomRenderPass m_ScriptablePass;

    // Called when the feature first loads or its settings change.
    public override void Create()
    {
        m_ScriptablePass = new CustomRenderPass(settings);
    }

    // Injects the pass into the renderer, once per camera.
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_ScriptablePass);
    }
}
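On the scene side, nothing more is required than keeping the material's texture current. A minimal sketch of such a driver script; ARBackgroundUpdater and its fields are illustrative names, not part of the AREngine SDK:

using UnityEngine;

// Keeps the background material (the one assigned in the Renderer Feature's
// Settings) pointed at the external camera texture.
public class ARBackgroundUpdater : MonoBehaviour
{
    public Material backgroundMaterial; // same material as in the feature's Settings
    public Texture cameraTexture;       // the wrapped samplerExternalOES texture

    void Update()
    {
        if (backgroundMaterial != null && cameraTexture != null
            && backgroundMaterial.mainTexture != cameraTexture)
        {
            backgroundMaterial.mainTexture = cameraTexture;
        }
    }
}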
Since the GLSL shader is restricted to the gles3 platform, correct execution is only guaranteed on Android devices; on the PC platform the result is the magenta output of an unsupported shader.
#ifdef SHADER_API_GLES3
#pragma only_renderers gles3
#extension GL_OES_EGL_image_external_essl3 : require
#endif
However, an additional shader can be used to simulate the on-device result, displaying a single static image on the PC in place of the phone's preview stream. The author has uploaded EditorBackground.shader to GitHub; readers can adapt it to their own needs. To stay as close as possible to the on-device behavior, this shader is also written in GLSL rather than the more common CG or HLSL, which means that when running on PC the editor must also be switched to the OpenGL graphics API; this only needs to be set up in the editor's graphics settings as shown in the following figure:
After switching the graphics setting, remember to restart Unity.
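One simple way to wire the two shaders together is to pick the material by build target. A minimal sketch; the class and field names are illustrative, with the two materials corresponding to the samplerExternalOES shader and EditorBackground.shader described above:

using UnityEngine;

// Returns the editor-simulation material in the editor and the real
// samplerExternalOES material in a device build.
public class BackgroundMaterialSelector : MonoBehaviour
{
    public Material deviceMaterial; // GLSL shader with samplerExternalOES (Android)
    public Material editorMaterial; // EditorBackground.shader, shows a static image

    public Material Current
    {
        get
        {
#if UNITY_EDITOR
            return editorMaterial;
#else
            return deviceMaterial;
#endif
        }
    }
}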
Epilogue
Switching an AREngine project completely to the URP rendering pipeline involves more than the preview stream: the other rendering-related parts also need to be changed accordingly. Readers can adjust this to their actual project needs, and the author looks forward to exchanging experiences about the problems encountered, and solved, while using URP.