Platform Specific Rendering Differences
Unity runs on various platforms, and in some cases there are differences in how things behave. Most of the time Unity hides the differences from you, but sometimes you can still bump into them.
Render Texture Coordinates
Vertical texture coordinate conventions differ between Direct3D, OpenGL and OpenGL ES:
- In Direct3D, the coordinate is zero at the top, and increases downwards.
- In OpenGL and OpenGL ES, the coordinate is zero at the bottom, and increases upwards.
Most of the time this does not really matter, except when rendering into a RenderTexture. In that case, Unity internally flips rendering upside down when rendering into a texture on Direct3D, so that the conventions match between the platforms.
One case where this does not happen is when Image Effects and anti-aliasing are used together. In this case, Unity renders to the screen to get anti-aliasing, and then "resolves" the rendering into a RenderTexture for further processing with an Image Effect. The resulting source texture for an image effect is not flipped upside down on Direct3D (unlike all other Render Textures).
If your Image Effect is a simple one (processes one texture at a time), this does not really matter, because Graphics.Blit takes care of that.
However, if you're processing more than one RenderTexture together in your Image Effect, most likely they will come out at different vertical orientations (only in Direct3D, and only when anti-aliasing is used). You need to manually "flip" the screen texture upside down in your vertex shader, like this:
// On D3D when AA is used, the main texture & scene depth texture
// will come out in different vertical orientations.
// So flip sampling of the texture when that is the case (main texture
// texel size will have negative Y).
#if SHADER_API_D3D9
if (_MainTex_TexelSize.y < 0)
        uv.y = 1-uv.y;
#endif
Check out the Edge Detection scene in the Shader Replacement sample project for an example of this. The edge detection effect there uses both the screen texture and the Camera's Depth+Normals texture.
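For reference, here is a short sketch of how that check might sit inside a complete image effect vertex shader. This is an illustrative example, not the sample project's actual code; the v2f struct, property names and the appdata_img input (from UnityCG.cginc) are assumptions made for the sketch.

// Sketch only: an image effect vertex function that outputs one set of UVs
// for the depth+normals texture and a second, possibly flipped, set for the
// screen texture on D3D when AA is used.
#include "UnityCG.cginc"

sampler2D _MainTex;
float4 _MainTex_TexelSize; // filled in by Unity; y is negative for D3D render textures

struct v2f {
    float4 pos : POSITION;
    float2 uv  : TEXCOORD0; // for the depth+normals texture
    float2 uv2 : TEXCOORD1; // for the screen (main) texture
};

v2f vert (appdata_img v)
{
    v2f o;
    o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
    o.uv  = v.texcoord.xy;
    o.uv2 = v.texcoord.xy;
    // On D3D with AA, the screen texture is not flipped like the other
    // render textures, so flip its sampling coordinates to compensate.
    #if SHADER_API_D3D9
    if (_MainTex_TexelSize.y < 0)
        o.uv2.y = 1 - o.uv2.y;
    #endif
    return o;
}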
Using OpenGL Shading Language (GLSL) shaders with OpenGL ES 2.0
OpenGL ES 2.0 provides only limited native support for the OpenGL Shading Language (GLSL); for instance, the OpenGL ES 2.0 layer provides no built-in parameters to the shader.
Unity implements the built-in parameters for you in exactly the same way as desktop OpenGL does; however, the following built-in parameters are missing:
- gl_ClipVertex
- gl_SecondaryColor
- gl_DepthRange
- halfVector property of the gl_LightSourceParameters structure
- gl_FrontFacing
- gl_FrontLightModelProduct
- gl_BackLightModelProduct
- gl_BackMaterial
- gl_Point
- gl_PointSize
- gl_ClipPlane
- gl_EyePlaneR, gl_EyePlaneS, gl_EyePlaneT, gl_EyePlaneQ
- gl_ObjectPlaneR, gl_ObjectPlaneS, gl_ObjectPlaneT, gl_ObjectPlaneQ
- gl_Fog
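As a rough sketch of what this means in practice, a GLSL shader targeting OpenGL ES 2.0 should stick to the built-ins that are provided (such as gl_Vertex, gl_MultiTexCoord0 and the matrix built-ins, none of which appear in the list above) and avoid the missing ones. The shader below is a minimal illustrative example; its name and property are assumptions made for this sketch, not taken from a shipped project.

Shader "Example/MinimalGLSL" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Pass {
            GLSLPROGRAM
            // Default float precision for the GLSL ES compiler.
            #ifdef GL_ES
            precision mediump float;
            #endif

            varying vec2 uv;

            #ifdef VERTEX
            void main()
            {
                // gl_Vertex, gl_MultiTexCoord0 and gl_ModelViewProjectionMatrix
                // are among the built-ins that are provided here.
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
                uv = gl_MultiTexCoord0.xy;
            }
            #endif

            #ifdef FRAGMENT
            uniform sampler2D _MainTex;
            void main()
            {
                gl_FragColor = texture2D(_MainTex, uv);
            }
            #endif
            ENDGLSL
        }
    }
}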
iPad2 and MSAA and alpha-blended geometry
There is a bug in the Apple driver that results in artifacts when MSAA is enabled and alpha-blended geometry is drawn with a non-RGBA colorMask. To prevent these artifacts, Unity forces an RGBA colorMask when this configuration is encountered, even though this makes the built-in Glow effect unusable (it needs DST_ALPHA for its intensity value). Also, please update your shaders if you wrote them yourself (see "Render Setup -> ColorMask" in the Pass documentation).
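If you are unsure where that setting lives, the sketch below shows a hand-written alpha-blended pass that uses a non-RGBA color mask and would therefore be affected by the workaround; the shader name and values are illustrative assumptions only.

// Sketch only: a fixed-function alpha-blended pass with a non-RGBA ColorMask.
Shader "Example/AlphaBlendedColorMask" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,0.5)
    }
    SubShader {
        Tags { "Queue" = "Transparent" }
        Pass {
            Blend SrcAlpha OneMinusSrcAlpha
            // A non-RGBA mask like this is what the forced RGBA colorMask
            // overrides on iPad 2 when MSAA is enabled.
            ColorMask RGB
            Color [_Color]
        }
    }
}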