
In this issue we will share several technical topics related to program development. The recommended reading time is 15 minutes. If you have any unique insights or discoveries, please feel free to contact us or join the discussion.


Texture Streaming

Q1: Our project has been upgraded to Unity 2018.4 and will go online this year.


1. Unity has added Texture Streaming (streaming mipmaps). I want to ask whether loadedMipmapLevel is the mipmap level currently being used by the texture. The official explanation is "which mipmap level is currently loaded by the streaming system". I tried moving the camera closer to and farther from an object, but this value stays fixed. If this is not the right property, how can I get the mipmap level the texture is currently using?


2. In Unity 2018.4, can I use the Job System for texture streaming? The official documentation says it is experimental and may cause crashes. Do you enable this, or streaming mipmaps in general, in your online projects?

This value is accurate on a real device, but there are some loading bugs on the Editor side. We also found this problem during testing: after the project is rebuilt and the assets re-imported, the value is displayed correctly on a real device.
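To check the value on device, you can log a texture's streaming state at runtime. The following is a minimal sketch (the component and field names are hypothetical); it assumes streaming mipmaps are enabled both in Quality Settings and on the texture's import settings:

```csharp
using UnityEngine;

// Hypothetical sketch: log the streaming state of one texture each frame.
public class MipLevelLogger : MonoBehaviour
{
    public Texture2D streamedTexture; // assumed to be assigned in the Inspector

    void Update()
    {
        if (!QualitySettings.streamingMipmapsActive || streamedTexture == null)
            return;

        // desiredMipmapLevel: what the streaming system wants based on camera distance
        // requestedMipmapLevel: what has been requested for loading
        // loadedMipmapLevel: what is actually resident right now
        Debug.Log($"desired: {streamedTexture.desiredMipmapLevel}, " +
                  $"requested: {streamedTexture.requestedMipmapLevel}, " +
                  $"loaded: {streamedTexture.loadedMipmapLevel}");
    }
}
```

Comparing desiredMipmapLevel and loadedMipmapLevel as the camera moves is a quick way to confirm that streaming is actually reacting on your target device.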


At UWA DAY 2019, we explained in detail the problems we encountered while testing Texture Streaming, along with an explanation of the streaming logic. If you are interested, you can refer to it; it can save you a lot of unnecessary testing and exploration time.


Regarding your second question, it is difficult to answer; we have not yet seen it used in an online project. There is an article we recommend to you and to any team using the texture streaming job: TextureStreamingJob crash analysis.

This answer is provided by UWA.


Q2: In Unity, you can get the directional light direction via _WorldSpaceLightPos0. Unity treats this uniform as a built-in, so it passes it in automatically.


I searched Unreal's material nodes for a long time and could not find anything similar to a Light Vector node giving the direction of the directional light. In other words, in Unreal the directional light direction can only be passed in from outside via Blueprint. I know that Blueprint + Material Parameter Collections can meet my requirement, much like SetGlobalVector in Unity.


So I want to ask whether I can get the direction of the directional light directly from a material node, without passing it in from outside.

There is a method. As shown in the image, create a Custom node, input the code shown below into its parameters, and wrap it as a Material Function so it can be reused everywhere.

But personally, I don't recommend it; this is obviously a hack. It relies on the underlying shader implementation, and there is no guarantee it will work on every version (tested on 4.22).


The reason is that Unreal assumes the nodes provided by the material editor can be used in every lighting path (Forward Shading and Deferred Shading). Directly fetching a light direction clearly does not fit a deferred pipeline that supports a very large number of light sources, so Epic does not provide such a node officially.


Also note that this method can only be used with Forward Shading. Unlike Unity, Epic does not guarantee compatibility for shader code written by users: it may work in this version but break in the next one if the engine shaders change (though this snippet should survive across several versions).


Therefore, the approach you mentioned (Blueprint + Material Parameter Collections) is the one officially encouraged.



#ifdef EyeAdaptationStruct
    // Translucent base pass: forward light data lives here
    return TranslucentBasePass.Shared.Forward.DirectionalLightDirection;
#else
    // Opaque base pass
    return OpaqueBasePass.Shared.Forward.DirectionalLightDirection;
#endif
return 0;

Thanks to Zhao Wenyong for providing the answer above.


Q3: When using BakeMesh on a SkinnedMeshRenderer with sub-meshes to create an afterimage (ghost) effect, I find that only the first sub-mesh gets baked out. This character has four sub-meshes. Has anyone met a similar problem? (Unity version: 2017.4.18f1)

The original method was to draw the Mesh directly with Graphics.DrawMesh, but this does not support sub-meshes:

Graphics.DrawMesh(ghostMesh.mesh, ghostMesh.position, ghostMesh.rotation, ghostMesh.material, 0, Camera.main);
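As an aside, Graphics.DrawMesh also has an overload that takes a submeshIndex as its last parameter, so each sub-mesh can be drawn in a loop. This is a sketch only, assuming the same hypothetical ghostMesh fields as in the snippet above (each draw here reuses one material; per-sub-mesh materials would need a materials array):

```csharp
// Sketch: draw every sub-mesh of the baked ghost mesh explicitly.
for (int i = 0; i < ghostMesh.mesh.subMeshCount; i++)
{
    Graphics.DrawMesh(ghostMesh.mesh, ghostMesh.position, ghostMesh.rotation,
                      ghostMesh.material, 0, Camera.main, i);
}
```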

The later method was: instantiate a GameObject and assign as many materials as there are sub-meshes; then it displays normally.

public void InstanceGhost() {
    if (mesh.subMeshCount > 0) {
        ghostGo = new GameObject("ghostGo");
        SkinnedMeshRenderer smr = ghostGo.AddComponent<SkinnedMeshRenderer>();
        smr.sharedMesh = mesh;
        Material[] mats = new Material[smr.sharedMesh.subMeshCount];

        // One material per sub-mesh, so every sub-mesh is rendered
        for (int i = 0; i < mesh.subMeshCount; i++) {
            mats[i] = material;
        }
        smr.materials = mats;

        ghostGo.transform.parent = GetGhostRoot();
        ghostGo.transform.position = position;
        ghostGo.transform.rotation = rotation;
    }
}

Thanks to Fan Shiqing for providing the answer above.


Q4: I would like to ask: from what point do you start post-processing optimization? Can you share your experience?

Unity's post-processing has limited optimization space at the code level; you can strip some features in code according to how the project uses them. Our previous projects did not do much code optimization; we mainly fixed bugs.


In terms of usage, you can plan the post effects by quality tier. The most important effects are Bloom and Color Grading. Bloom has a high performance overhead, so enable it only on the high-quality tier; on the medium tier, use only Color Grading to keep the picture consistent; on low-end devices, turn post-processing off entirely. There are many other effects that enhance the picture, such as Depth of Field, SSAO, anti-aliasing, Chromatic Aberration, and Vignette; they can be designed to toggle on and off depending on the situation, and should be off by default.
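The tiering above can be sketched with the Post-processing Stack v2 API. This is a minimal illustration, not the author's implementation; the class name, tier numbering, and the Inspector-assigned volume field are assumptions:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing; // Post-processing Stack v2

// Hypothetical sketch: enable effects per quality tier as described above.
public class PostFxQualityTier : MonoBehaviour
{
    public PostProcessVolume volume; // assumed to be assigned in the Inspector

    public void ApplyTier(int tier) // 0 = low, 1 = mid, 2 = high
    {
        // Low tier: post-processing off entirely
        volume.enabled = tier > 0;
        if (tier == 0) return;

        if (volume.profile.TryGetSettings(out Bloom bloom))
            bloom.active = tier >= 2;   // Bloom only on the high tier

        if (volume.profile.TryGetSettings(out ColorGrading grading))
            grading.active = true;      // Color Grading on mid and high tiers
    }
}
```

Toggling PostProcessEffectSettings.active per tier avoids swapping profiles at runtime, which keeps the setup simple for a small number of tiers.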

Thanks to Wen Ya for providing the answer above.

If you are using Post-processing Stack v2, you can refer to this video:


It discusses the options that have a large impact on performance.

Thanks to deviljz for providing the answer above.


Q5: We currently need to separate the baked lighting information from the scene; in other words, a prefab needs to show its baked shadows in any scene.


I searched online and found that I only need to record the prefab's lighting data (lightmapIndex and lightmapScaleOffset) and assign the lightmaps to LightmapSettings when loading.


But during testing I found that the setting here (first screenshot) succeeds; it is also set here (second screenshot), but it does not take effect and the entire prefab is still black. However, if I first set this to any default data, then the previous operation takes effect. Why?

You need to manually assign the values after opening the scene or instantiating the prefab; then it will take effect.
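The re-application step can be sketched as follows. It is a minimal illustration under the assumptions that savedIndex and savedScaleOffset were recorded at bake time and that the matching lightmap textures have already been assigned to LightmapSettings.lightmaps; the helper name is hypothetical:

```csharp
using UnityEngine;

public static class LightmapReapply
{
    // Hypothetical sketch: re-apply saved lightmap data to one renderer
    // after the scene is opened or the prefab is instantiated.
    public static void Apply(Renderer r, int savedIndex, Vector4 savedScaleOffset)
    {
        r.lightmapIndex = savedIndex;          // index into LightmapSettings.lightmaps
        r.lightmapScaleOffset = savedScaleOffset; // UV scale/offset into that lightmap
    }
}
```

Calling this on every renderer of the prefab right after Instantiate is what makes the baked shadows show up in an arbitrary scene.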

Thanks to Zheng Xiao for providing the answer above.

This is the 73rd UWA Technology Sharing for Unity development. As we all know, our life has a limit but knowledge has none. These problems are only the tip of the iceberg, and there are many more technical problems during program development that deserve discussion. Welcome to join the UWA Q&A community; let's explore and share knowledge together!



