Merge pull request #2511 from zhuxudong/document/post-process

[Document] post-processing related
This commit is contained in:
鹅叔
2025-01-24 11:53:11 +08:00
committed by GitHub
10 changed files with 562 additions and 207 deletions


@@ -15,11 +15,11 @@ Structurally, each Engine can contain one or more active scenes (currently the e
Right-click in the **[Assets Panel](/en/docs/assets/interface)** (or the + sign at the top right of the assets panel) to create a scene, double-click the scene to switch to it:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/eef870a7-2630-4f74-8c0e-478696a553b0/2024-03-19%25252018.04.02.gif" />
### Properties Panel
<Image src="https://gw.alipayobjects.com/zos/OasisHub/5770f3d1-6840-438b-adf0-d759048fac6b/image-20250124110620126.png" />
### Ambient Light
@@ -33,29 +33,24 @@ For details, please refer to the [Background Tutorial](/en/docs/graphics/backgro
For details, please refer to the [Shadow Tutorial](/en/docs/graphics/light/shadow/).
### Post-Processing
For details, please refer to the [Post-Processing Tutorial](/en/docs/graphics/postProcess/postProcess/).
### Fog
You can add three types of fog to the entire scene: **linear, exponential, and exponential squared**:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/224fbc16-e60c-47ca-845b-5f7c09563c83/2024-03-19%25252018.08.23.gif" />
## Script Usage
| Property Name | Description |
| :---------------------------------------- | :---------- |
| [scenes](/apis/core/#SceneManager-scenes) | Scene list |
| Method Name | Description |
| :-------------------------------------------------- | :----------- |
| [addScene](/apis/core/#SceneManager-addScene) | Add scene |
| [removeScene](/apis/core/#SceneManager-removeScene) | Remove scene |
| [mergeScenes](/apis/core/#SceneManager-mergeScenes) | Merge scenes |
| [loadScene](/apis/core/#SceneManager-loadScene) | Load scene |
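As a rough sketch of how these methods fit together, here is a toy stand-in for the scene manager (the real `Scene` and `SceneManager` classes live in `@galacean/engine`; the classes below only mirror the method names in the table and are not the engine's implementation):

```typescript
// Toy stand-in mirroring the SceneManager methods in the table above.
// This is an illustration only, not the engine's implementation.
class ToyScene {
  rootEntities: string[] = [];
  constructor(public name: string) {}
}

class ToySceneManager {
  scenes: ToyScene[] = [];

  addScene(scene: ToyScene): void {
    if (!this.scenes.includes(scene)) this.scenes.push(scene);
  }

  removeScene(scene: ToyScene): void {
    this.scenes = this.scenes.filter((s) => s !== scene);
  }

  // Move all root entities of `source` into `dest`; `source` is left empty.
  mergeScenes(source: ToyScene, dest: ToyScene): void {
    dest.rootEntities.push(...source.rootEntities);
    source.rootEntities.length = 0;
  }
}

const manager = new ToySceneManager();
const sceneA = new ToyScene("a");
const sceneB = new ToyScene("b");
sceneB.rootEntities.push("light");

manager.addScene(sceneA);
manager.addScene(sceneB);
manager.mergeScenes(sceneB, sceneA); // sceneA now owns the "light" root entity
manager.removeScene(sceneB); // only sceneA stays active
```

Note that multiple scenes can be active at the same time, and merging moves root entities rather than copying them.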
### Loading a Scene
@@ -64,11 +59,9 @@ If you want to load a **Scene** asset as a scene in the application, you can use
```typescript
const sceneUrl = "...";
engine.resourceManager
.load({ type: AssetType.Scene, url: "..." })
.then((scene) => {
engine.sceneManager.addScene(scene);
});
engine.resourceManager.load({ type: AssetType.Scene, url: "..." }).then((scene) => {
  engine.sceneManager.addScene(scene);
});
```
### Getting Scene Objects
@@ -126,12 +119,12 @@ Call `scene.destroy()` to destroy a scene. The destroyed scene will also be auto
### Entity Tree Management
| Method Name | Description |
| :-- | :-- |
| [createRootEntity](/apis/core/#Scene-createRootEntity) | A newly created _scene_ has no root entity by default; one must be created manually |
| [addRootEntity](/apis/core/#Scene-addRootEntity) | Create a new entity directly, or add an existing entity |
| [removeRootEntity](/apis/core/#Scene-removeRootEntity) | Remove a root entity |
| [getRootEntity](/apis/core/#Scene-getRootEntity) | Find root entities: you can get all root entities or a single entity object. Note that the collection of root entities is a read-only array whose length and order cannot be changed |
```typescript
const engine = await WebGLEngine.create({ canvas: "demo" });
const cameraEntity = rootEntity.createChild("camera");
cameraEntity.addComponent(Camera);
```


@@ -81,6 +81,7 @@ The functionality corresponding to each property is as follows:
| | [msaaSamples](/apis/core/#Camera-msaaSamples) | Number of samples for multi-sample anti-aliasing; effective only when an independent canvas is in use, e.g. when `enableHDR`, `enablePostProcess`, or `opaqueTextureEnabled` is enabled. |
| | [enableHDR](/apis/core/#Camera-enableHDR) | Whether to enable HDR rendering, allowing the shader's output color to be stored using floating-point numbers, providing a wider range of values for post-processing and other scenarios. |
| | [enablePostProcess](/apis/core/#Camera-enablePostProcess) | Whether to enable post-processing. For post-processing configuration, see [Post-Processing Tutorial](/en/docs/graphics/postProcess/postProcess). |
| | [postProcessMask](/apis/core/#Camera-postProcessMask) | Post-processing mask, which determines the post-processing components that take effect. For post-processing configuration, see the [Post-Processing Tutorial](/en/docs/graphics/postProcess/postProcess). |
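The mask test itself is plain bitwise logic: a post-processing component takes effect for a camera only when the camera's mask and the component's layer share at least one bit. The sketch below illustrates this semantics (the flag values are illustrative assumptions, not the engine's `Layer` enum):

```typescript
// Illustrative layer flags; the real values come from the engine's Layer enum.
const Layer0 = 0x1;
const Layer1 = 0x2;
const Everything = 0xffffffff;

// A post-processing component is effective for a camera when the camera's
// postProcessMask and the component's layer overlap in at least one bit.
function postProcessTakesEffect(cameraPostProcessMask: number, postProcessLayer: number): boolean {
  return (cameraPostProcessMask & postProcessLayer) !== 0;
}

console.log(postProcessTakesEffect(Everything, Layer1)); // true
console.log(postProcessTakesEffect(Layer0, Layer1)); // false
```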
### Culling Mask


@@ -0,0 +1,177 @@
---
order: 2
title: Custom Post-Processing
---
In the post-processing system, the effect ([Effect](/apis/core/#PostProcessEffect)) maintains the data layer, while the pass ([Pass](/apis/core/#PostProcessPass)) implements the rendering logic. Within a pass, calling the [getBlendEffect](/apis/core/#PostProcessManager-getBlendEffect) method returns the final post-processing data after **global/local** blending.
```mermaid
graph TD;
subgraph Pass
A1[Custom Pass ...]
A2[Uber Pass]
A3[Custom Pass ...]
end
subgraph Effect
B1[Bloom Effect]
B2[Tonemapping Effect]
B3[Custom Effect]
end
A1 --> A2 --> A3
B1 --> B2 --> B3
A1 ---|Blend| B1
```
The engine has a built-in [PostProcessUberPass](/apis/core/#PostProcessUberPass), which consumes [BloomEffect](/apis/core/#BloomEffect) and [TonemappingEffect](/apis/core/#TonemappingEffect) data. To customize a post-processing effect, create a new Pass, and additionally create an Effect if its data needs to be blended.
## A Demo
Here we will implement a simple grayscale post-processing effect:
<Comparison
leftSrc="https://gw.alipayobjects.com/zos/OasisHub/f5d3ea3d-47a4-4618-b3d6-0ea46321e786/image-20250115162952605.png"
leftText="Uber Pass"
rightSrc="https://gw.alipayobjects.com/zos/OasisHub/75ccba72-b70b-49f6-812a-e00305e89201/image-20250115163026345.png"
rightText="Uber Pass + Custom Pass"
/>
### 1. Add script
Let's create a script first. Next, we will write our custom post-processing Shader and Pass in this script file.
<Image src="https://gw.alipayobjects.com/zos/OasisHub/6df5fd1c-a24c-4e32-a62e-841b61fe76c7/image-20250124111456360.png" />
Following [Adding a post-processing component](/en/docs/graphics/postProcess/postProcess/#1-adding-a-post-processing-component), add a global or local post-processing component, and attach the script to that entity:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/62a36df2-fdc5-4c13-9a32-c41f963d18b5/image-20250124112043221.png" />
### 2. Shader Writing
The algorithm itself is nothing special, but note the built-in variable `renderer_BlitTexture`: it holds the post-processing result of the previous Pass, which here is the result after Bloom and Tonemapping. We display this result in grayscale.
```ts showLineNumbers {14} filename="CustomPostProcessPass.ts"
const customShader = Shader.create(
  "Gray Scale Shader",
  `
  attribute vec4 POSITION_UV;
  varying vec2 v_uv;

  void main() {
    gl_Position = vec4(POSITION_UV.xy, 0.0, 1.0);
    v_uv = POSITION_UV.zw;
  }
  `,
  `
  varying vec2 v_uv;
  uniform sampler2D renderer_BlitTexture;

  void main(){
    vec4 color = texture2D(renderer_BlitTexture, v_uv);
    float grayScale = 0.299 * color.r + 0.587 * color.g + 0.114 * color.b;
    gl_FragColor = vec4(vec3(grayScale), 1.0);
  }
  `
);
```
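As an aside, the three luma weights in the fragment shader sum to 1, so a pure white input keeps its full brightness. The same formula in plain TypeScript, for illustration only:

```typescript
// Rec.601 luma weights, identical to the fragment shader above.
function toGrayScale(r: number, g: number, b: number): number {
  return 0.299 * r + 0.587 * g + 0.114 * b;
}

console.log(toGrayScale(1, 1, 1)); // ≈ 1, since the weights sum to 1
console.log(toGrayScale(0, 1, 0) > toGrayScale(1, 0, 0)); // true: green contributes the most
```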
### 3. Create a new Pass
We create a new Pass that blits directly to the screen with [Blitter](/apis/core/#Blitter) in the [onRender hook](/apis/core/#PostProcessUberPass-onRender), and then add this Pass to the engine.
```ts showLineNumbers {10,15} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
  private _blitMaterial: Material;

  constructor(engine: Engine) {
    super(engine);
    this._blitMaterial = new Material(this.engine, customShader);
  }

  onRender(_, srcTexture: Texture2D, dst: RenderTarget): void {
    Blitter.blitTexture(this.engine, srcTexture, dst, undefined, undefined, this._blitMaterial, 0);
  }
}

const customPass = new CustomPass(engine);
engine.addPostProcessPass(customPass);
```
By default, a custom Pass executes after the Uber Pass, i.e. [`PostProcessPassEvent.AfterUber`](/apis/core/#PostProcessPassEvent-AfterUber). You can also change its position in the pipeline manually:
```ts filename="CustomPostProcessPass.ts"
customPass.event = PostProcessPassEvent.BeforeUber;
```
By default, whether a Pass takes effect is determined by its [isActive](/apis/core/#PostProcessUberPass-isActive). You can also override this logic, for example to take effect only when the intensity is greater than 0:
```ts showLineNumbers {2-9} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
  override isValid(postProcessManager: PostProcessManager): boolean {
    if (!this.isActive) {
      return false;
    }

    const customEffectBlend = postProcessManager.getBlendEffect(CustomEffect);
    return customEffectBlend ? customEffectBlend.intensity.value > 0 : false;
  }
}
```
### 4. Blend data
Steps 2 and 3 above are already enough to implement a custom post-processing effect. This section covers a more advanced topic: blending data.
Take `intensity` as an example. We define a `CustomEffect` dedicated to blending the intensity. Defining blendable data is straightforward: the engine encapsulates a series of post-processing parameter types, such as the [floating-point parameter](/apis/core/#PostProcessEffectFloatParameter).
```ts showLineNumbers {2,7} filename="CustomPostProcessPass.ts"
class CustomEffect extends PostProcessEffect {
  intensity = new PostProcessEffectFloatParameter(0.8);
}

// Add this effect to a post-processing component. `postProcess` can be a separate component,
// or the same one that holds Bloom and other effects, depending on your blending requirements.
postProcess.addEffect(CustomEffect);
```
After defining the data, modify the `onRender` hook of the custom Pass to fetch the blended data:
```ts showLineNumbers {3-6} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
  onRender(camera: Camera, srcTexture: Texture2D, dst: RenderTarget): void {
    const postProcessManager = camera.scene.postProcessManager;
    const customEffectBlend = postProcessManager.getBlendEffect(CustomEffect);

    if (customEffectBlend) {
      this._blitMaterial.shaderData.setFloat("u_intensity", customEffectBlend.intensity.value);
    }

    Blitter.blitTexture(this.engine, srcTexture, dst, undefined, undefined, this._blitMaterial, 0);
  }
}
```
As shown above, we keep setting the blended intensity in the `onRender` hook of the custom pass, and then consume this data in the shader:
```ts showLineNumbers {8,12} filename="CustomPostProcessPass.ts"
const customShader = Shader.create(
  "Gray Scale Shader",
  `
  ......
  `,
  `
  ......
  uniform float u_intensity;

  void main(){
    ......
    gl_FragColor = vec4(mix(color.rgb, vec3(grayScale), u_intensity), 1.0);
  }
  `
);
);
```
If your post-processing component is in local mode, you can also use [Blend Distance](/apis/core/#PostProcess-blendDistance) to set how close the camera must get to the collider before blending begins:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/76b084c1-f517-47d7-8067-7f838db002f8/2025-01-15%25252017.38.30.gif" />


@@ -3,95 +3,94 @@ order: 0
title: Post Process Overview
---
The post-processing system can "process" the results rendered by the camera.
<Comparison
leftSrc="https://gw.alipayobjects.com/zos/OasisHub/3a50ed18-c2d4-4b33-a4e6-af79f2c273f8/2024-07-18%25252018.08.30.gif"
leftText="OFF"
rightSrc="https://gw.alipayobjects.com/zos/OasisHub/4bd5f985-1b82-4aca-b6fa-fd521aab8f57/2024-07-18%25252018.15.30.gif"
rightText="ON"
/>
## Use post-processing
### 1. Post-processing configuration
There are two modes for post-processing:
- Global Mode: Affects all cameras in the current scene.
- Local Mode: Only takes effect when the camera is within the collider range of the post-processing entity.
The post-processing component has the following properties to control effects, modes, blending distances, etc.:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/50f6a2aa-0463-4b66-b54e-edff71187077/image-20240718193530098.png" />
| Property | Description |
| :-- | :-- |
| [isGlobal](/apis/core/#PostProcess-isGlobal) | Controls whether this post-processing component is in global or local mode. |
| [Blend Distance](/apis/core/#PostProcess-blendDistance) | In local mode, controls how close the camera must be to the collider before blending effects begins. |
| [Priority](/apis/core/#PostProcess-priority) | When there are multiple post-processing components in the scene, a higher priority means blending/overriding starts later. |
| [Layer](/apis/core/#PostProcess-layer) | Used with the camera's [post-processing mask](/apis/core/#Camera-postProcessMask) to determine which post-processing components are effective. |
| [Add Effect](/apis/core/#PostProcess-addEffect) | Adds post-processing effects. |
## Mixing rules
- In global mode, blending is resolved according to the post-processing component with the highest priority.
- In local mode, blending starts once the camera enters the **blend distance** of the collider, transitioning from the blended value of the previous post-processing component to the value of the current one. If there is no previous component, the effect's default value is used as the starting value.
For example, suppose an intensity parameter defined via [PostProcessEffectFloatParameter](/apis/core/#PostProcessEffectFloatParameter) has a default value of `0`:
- Post-processing component 1: global mode, intensity `0.5`; the blended result is `0.5`
- Post-processing component 2: local mode, intensity `1`, with the camera at half the blend distance from the collider; the blended result is `mix(0.5, 1, 1 - distance / blendDistance)` = `0.75`
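The arithmetic of this example can be checked with a GLSL-style `mix` in plain TypeScript (a sketch of the blending rule, not engine code):

```typescript
// GLSL-style linear interpolation: mix(a, b, t) = a * (1 - t) + b * t.
function mix(a: number, b: number, t: number): number {
  return a * (1 - t) + b * t;
}

const previousBlended = 0.5; // blended result of component 1 (global)
const localIntensity = 1; // intensity of component 2 (local)
const distance = 0.5; // camera is halfway inside the blend distance
const blendDistance = 1;

const result = mix(previousBlended, localIntensity, 1 - distance / blendDistance);
console.log(result); // 0.75
```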
<Callout type="info">
If you want to customize post-processing, please refer to
[Documentation](/en/docs/graphics/postProcess/customPostProcess/#3-blend-data)
</Callout>
## Using Post-Processing
### 1. Adding a Post-Processing Component
In the hierarchy panel, several modes for global and local post-processing are preset; simply select and add to use.
<Image src="https://gw.alipayobjects.com/zos/OasisHub/a6e9a327-1823-4dde-94a8-89bb4bf02e3a/2025-01-15%25252011.59.50.gif" />
Of course, you can also manually add post-processing components. Local mode needs to be used with colliders:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/6eb3b8d5-27d1-419d-8bf6-65371861fb97/2025-01-15%25252014.51.12.gif" />
<Callout type="info">
For specific post-processing effects configuration, please refer to the [Post-Processing Effects
List](/en/docs/graphics/postProcess/effects)
</Callout>
### 2. Camera Switch
The camera preview area is controlled by the **Camera Component**. In the camera component, the following properties will also affect the post-processing effects:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/3232935d-a765-4da4-b08e-021aac61458e/image-20240718210947199.png" />
| Property | Description |
| :-- | :-- |
| [Post-Processing Switch](/apis/core/#Camera-enablePostProcess) | Enables or disables the post-processing effects of the camera. |
| [HDR Switch](/apis/core/#Camera-enableHDR) | In HDR mode, allows output colors to be stored using floating-point numbers, providing a broader range of values for [bloom effects](/en/docs/graphics/postProcess/effects) and other scenarios. |
| [MSAA Configuration](/apis/core/#Camera-msaaSamples) | Adjusts the multi-sampling anti-aliasing settings to improve image quality, such as reducing jagged edges. |
| [Post-Processing Mask](/apis/core/#Camera-postProcessMask) | Works with the post-processing component's [layer](/apis/core/#PostProcess-layer) to determine which post-processing components are effective. |
<Callout type="info">
For more camera configurations, refer to the [Camera Component](/en/docs/graphics/camera/component)
</Callout>
### 3. Viewport Switch
In addition to the camera preview area, the viewport can also display post-processing effects. The camera in the viewport is independent but has post-processing configurations similar to the camera component.
<Image src="https://gw.alipayobjects.com/zos/OasisHub/f9f13d02-931f-4638-af91-4a007007c99f/image-20240718193359413.png" />
<Callout type="warning">
The switch in the viewport only affects the view window and does not impact the actual effect exported in the project.
</Callout>
## Best Practices
<Image src="https://gw.alipayobjects.com/zos/OasisHub/e3c55184-f51c-4a7a-9a12-ad490774dc26/image-20250115151324628-20250115151336116.png" />
- Regarding the `HDR` switch in the camera: if most pixel values in the scene do not exceed 1 (e.g., no HDR textures are used), avoid enabling HDR. When enabled, the engine first renders to an `R11G11B10_UFloat` RenderTarget and then renders to the screen, incurring extra performance overhead.
- Regarding the `MSAA` option in the camera: only raise this value when post-processing is enabled and anti-aliasing quality is strictly required. The higher the value, the greater the performance overhead.
- In the bloom effect, `Down Scale` defaults to `Half`, meaning the initial downsampling resolution is half that of the canvas. If high precision is not required, switch to `Quarter` to start at a quarter of the canvas resolution.
- In the tone mapping effect, although `ACES` offers better color contrast and saturation, it is computationally heavier and may cause severe frame drops on low-end devices. Consider `Neutral` as an alternative.


@@ -3,7 +3,9 @@ title: Shader API【Experimental】
---
<Callout type="warning">
This version is currently experimental and can only be used in the `editor`. If you want to use it in `Pro Code`, you
need to import the `@galacean/engine-shader-shaderlab` package. Please note that the API may change in the next
version, and we will notify you in time.
</Callout>
Similar to functions, classes, and properties in TypeScript, shader code also has its own set of APIs. This article helps you write your own Shader based on these APIs and the `ShaderLab` syntax.
@@ -96,10 +98,6 @@ SubShader "Default" {
2. Modify the lighting model in `DemoPass.glsl`. As a demo, we only demonstrate modifying the direct light part:
<Callout type="warning">
The editor currently does not support displaying the contents of `ForwardPassPBR.glsl` because it is in another repository. However, you can copy the `/Internal/Shader/Advanced/IridescenceForwardPass` file for modification.
</Callout>
```ts showLineNumbers {7-8}
// DemoPass.glsl
#include "Common.glsl"
```
@@ -151,7 +149,7 @@ float f2 = pow2(0.5);
### Common
Provides common macros like `PI`, and general methods like `gammaToLinear`, `pow2`, etc. See [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/Common.glsl) for details.
### Fog
@@ -215,7 +213,7 @@ mat3 getTBNByDerivatives(vec2 uv, vec3 normal, vec3 position, bool isFrontFacing
### Shadow
Provides shadow-related functions. See [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/Shadow.glsl) for details.
```glsl
// Get the cascade level of cascaded shadows (e.g., if the cascade count is set to 4, returns 0-3)
```
@@ -252,23 +250,24 @@ In addition to the general API, PBR also encapsulates a series of APIs such as t
### AttributesPBR
Encapsulates all the Attribute variables needed for PBR. See [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/AttributesPBR.glsl) for details.
### VaryingsPBR
Encapsulates all the Varyings variables needed for PBR. See [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/VaryingsPBR.glsl) for details.
### LightDirectPBR
Encapsulates direct light calculations based on the BRDF lighting model. For more details, see [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/LightDirectPBR.glsl).
Generally, you can call it directly:
```glsl
// Evaluate direct lighting
evaluateDirectRadiance(varyings, surfaceData, brdfData, shadowAttenuation, color.rgb);
```
The following function overload macros are provided to override key calculations of the lighting model:
@@ -277,18 +276,20 @@ The following function overload macros are provided to override key calculations
```glsl
#define FUNCTION_DIFFUSE_LOBE diffuseLobe
#define FUNCTION_SPECULAR_LOBE specularLobe
#define FUNCTION_CLEAR_COAT_LOBE clearCoatLobe
#define FUNCTION_SHEEN_LOBE sheenLobe
void surfaceShading(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 lightColor, inout vec3 totalDiffuseColor, inout vec3 totalSpecularColor);
void diffuseLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 attenuationIrradiance, inout vec3 diffuseColor);
void specularLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 attenuationIrradiance, inout vec3 specularColor);
float clearCoatLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 color, inout vec3 specularColor);
void sheenLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 attenuationIrradiance, inout vec3 diffuseColor, inout vec3 specularColor);
```
<Callout type="info">Refer to the PBR template extension above for the overload method.</Callout>
### LightInDirectPBR
Encapsulates [ambient light](/en/docs/graphics/light/ambient) calculations based on the BRDF lighting model. For more details, see [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/LightIndirectPBR.glsl).
Generally, you can call it directly:
@@ -303,15 +304,17 @@ The following function overload macros are provided to override key calculations
```glsl
#define FUNCTION_DIFFUSE_IBL evaluateDiffuseIBL
#define FUNCTION_SPECULAR_IBL evaluateSpecularIBL
#define FUNCTION_CLEAR_COAT_IBL evaluateClearCoatIBL
#define FUNCTION_SHEEN_IBL evaluateSheenIBL
void evaluateDiffuseIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, inout vec3 diffuseColor);
void evaluateSpecularIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, float radianceAttenuation, inout vec3 specularColor);
float evaluateClearCoatIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, inout vec3 specularColor);
void evaluateSheenIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, float radianceAttenuation, inout vec3 diffuseColor, inout vec3 specularColor);
```
### VertexPBR
Some methods required by the PBR vertex shader, such as obtaining UV coordinates after TilingOffset, obtaining world coordinates, normals, tangents after skeletal and BS operations, etc. For more details, see [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/VertexPBR.glsl).
```glsl showLineNumbers {2, 4}
Varyings varyings;
```
@@ -336,11 +339,15 @@ gl_Position = renderer_MVPMat * vertexInputs.positionOS;
### BRDF
The key file of the PBR lighting model, encapsulating general calculation functions related to BRDF, as well as the `SurfaceData` structure and `BRDFData` structure used for subsequent lighting model calculations. For more details, see [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/BRDF.glsl).
### BTDF
Provides transmission and refraction related functions. See [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/BTDF.glsl) for details.
### FragmentPBR
Contains a large number of variables passed from the CPU, such as metallic, roughness, maps, etc., and initializes the `SurfaceData` structure through `getSurfaceData`. For more details, see [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/FragmentPBR.glsl).
```glsl showLineNumbers
BRDFData brdfData;
@@ -354,4 +361,4 @@ initBRDFData(surfaceData, brdfData);
### Finally
In addition to the functionality and calling methods of key APIs, you can refer to the [ForwardPassPBR](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/ForwardPassPBR.glsl) on the official website for the organization of the entire file.
In addition to the functionality and calling methods of key APIs, you can refer to the [ForwardPassPBR](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/ForwardPassPBR.glsl) on the official website for the organization of the entire file.

View File

@@ -15,11 +15,11 @@ Scene, as a scene unit, makes it easy to manage the entity tree, especially for large
Right-click in the **[Assets Panel](/docs/assets/interface)** (or click the + at the top right of the assets panel) to create a scene; double-click a scene to switch to it:
![scene-switch](https://gw.alipayobjects.com/zos/OasisHub/eef870a7-2630-4f74-8c0e-478696a553b0/2024-03-19%25252018.04.02.gif)
<Image src="https://gw.alipayobjects.com/zos/OasisHub/eef870a7-2630-4f74-8c0e-478696a553b0/2024-03-19%25252018.04.02.gif" />
### Properties Panel
<img src="https://gw.alipayobjects.com/zos/OasisHub/2eaad4b1-d3e3-4c17-ae7f-58b488cd3606/image-20240718190944508.png" alt="image-20240718190944508" style="zoom:50%;" />
<Image src="https://gw.alipayobjects.com/zos/OasisHub/5770f3d1-6840-438b-adf0-d759048fac6b/image-20250124110620126.png" />
### Ambient Light
@@ -33,25 +33,20 @@ Scene, as a scene unit, makes it easy to manage the entity tree, especially for large
For details, see the [Shadow Tutorial](/docs/graphics/light/shadow/).
### Post-Processing
For details, see the [Post-Processing Tutorial](/docs/graphics/postProcess/postProcess/).
### Fog
You can add three types of fog to the entire scene: **linear, exponential, and exponential squared**:
![Fog](https://gw.alipayobjects.com/zos/OasisHub/224fbc16-e60c-47ca-845b-5f7c09563c83/2024-03-19%25252018.08.23.gif)
<Image src="https://gw.alipayobjects.com/zos/OasisHub/224fbc16-e60c-47ca-845b-5f7c09563c83/2024-03-19%25252018.08.23.gif" />
## Script Usage
| Property Name | Description |
| :--------------------------------------- | :------- |
| Property Name | Description |
| :---------------------------------------- | :------- |
| [scenes](/apis/core/#SceneManager-scenes) | Scene list |
| Method Name | Description |
| :------------------------------------------------- | :------- |
| Method Name | Description |
| :-------------------------------------------------- | :------- |
| [addScene](/apis/core/#SceneManager-addScene) | Add a scene |
| [removeScene](/apis/core/#SceneManager-removeScene) | Remove a scene |
| [mergeScenes](/apis/core/#SceneManager-mergeScenes) | Merge scenes |
@@ -64,11 +59,9 @@ Scene, as a scene unit, makes it easy to manage the entity tree, especially for large
```typescript
const sceneUrl = "...";
engine.resourceManager
.load({ type: AssetType.Scene, url: "..." })
.then((scene) => {
engine.sceneManager.addScene(scene);
});
engine.resourceManager.load({ type: AssetType.Scene, url: "..." }).then((scene) => {
engine.sceneManager.addScene(scene);
});
```
### Getting Scene Objects
@@ -126,12 +119,12 @@ engine.sceneManager.addScene(destScene);
### Entity Tree Management
| Method Name | Description |
| :---------------------------------------------------- | :--------------------------------------------------------------------------------------------------- |
| [createRootEntity](/apis/core/#Scene-createRootEntity) | A newly created _scene_ has no root entity by default; one must be created manually |
| [addRootEntity](/apis/core/#Scene-addRootEntity) | Create a new entity directly, or add an existing one |
| [removeRootEntity](/apis/core/#Scene-removeRootEntity) | Remove a root entity |
| [getRootEntity](/apis/core/#Scene-getRootEntity) | Find root entities; you can get all of them or a single entity object. Note that the full list is a read-only array whose length and order cannot be changed |
| Method Name | Description |
| :-- | :-- |
| [createRootEntity](/apis/core/#Scene-createRootEntity) | A newly created _scene_ has no root entity by default; one must be created manually |
| [addRootEntity](/apis/core/#Scene-addRootEntity) | Create a new entity directly, or add an existing one |
| [removeRootEntity](/apis/core/#Scene-removeRootEntity) | Remove a root entity |
| [getRootEntity](/apis/core/#Scene-getRootEntity) | Find root entities; you can get all of them or a single entity object. Note that the full list is a read-only array whose length and order cannot be changed |
```typescript
const engine = await WebGLEngine.create({ canvas: "demo" });

View File

@@ -81,6 +81,7 @@ camera.enableHDR = true;
| | [msaaSamples](/apis/core/#Camera-msaaSamples) | Number of multi-sample anti-aliasing samples; takes effect only when an independent canvas is enabled, e.g. via `enableHDR`, `enablePostProcess`, or `opaqueTextureEnabled`. |
| | [enableHDR](/apis/core/#Camera-enableHDR) | Whether to enable HDR rendering, which lets shader output colors be stored as floating-point numbers, giving a wider range of values for scenarios such as post-processing. |
| | [enablePostProcess](/apis/core/#Camera-enablePostProcess) | Whether to enable post-processing; see the [Post-Processing Tutorial](/docs/graphics/postProcess/postProcess) for configuration details. |
| | [postProcessMask](/apis/core/#Camera-postProcessMask) | Post-process mask, which determines which post-process components take effect; see the [Post-Processing Tutorial](/docs/graphics/postProcess/postProcess) for configuration details. |
### Culling Mask

View File

@@ -0,0 +1,177 @@
---
order: 2
title: Custom Post-Processing
---
In the post-processing system, an effect ([Effect](/apis/core/#PostProcessEffect)) maintains the data layer, while a pass ([Pass](/apis/core/#PostProcessPass)) implements the rendering logic. Calling [getBlendEffect](/apis/core/#PostProcessManager-getBlendEffect) inside a pass returns the final post-processing data after **global/local** blending.
```mermaid
graph TD;
subgraph Pass
A1[Custom Pass ...]
A2[Uber Pass]
A3[Custom Pass ...]
end
subgraph Effect
B1[Bloom Effect]
B2[Tonemapping Effect]
B3[Custom Effect]
end
A1 --> A2 --> A3
B1 --> B2 --> B3
A1 ---|Blend| B1
```
The engine ships with a built-in [PostProcessUberPass](/apis/core/#PostProcessUberPass) that consumes the data of [BloomEffect](/apis/core/#BloomEffect) and [TonemappingEffect](/apis/core/#TonemappingEffect). To customize a post-processing effect, create a new pass, and then create an effect depending on whether you need to blend data.
## A Demo
Here we implement a simple grayscale post-processing effect.
<Comparison
leftSrc="https://gw.alipayobjects.com/zos/OasisHub/f5d3ea3d-47a4-4618-b3d6-0ea46321e786/image-20250115162952605.png"
leftText="Uber Pass"
rightSrc="https://gw.alipayobjects.com/zos/OasisHub/75ccba72-b70b-49f6-812a-e00305e89201/image-20250115163026345.png"
rightText="Uber Pass + Custom Pass"
/>
### 1. Add a Script
First, create a script; we will write our custom post-processing shader and pass in this script file.
<Image src="https://gw.alipayobjects.com/zos/OasisHub/6df5fd1c-a24c-4e32-a62e-841b61fe76c7/image-20250124111456360.png" />
Following [Adding a Post-Process Component](/docs/graphics/postProcess/postProcess/#1添加后处理组件), add a global or local post-process component and attach the script to that entity:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/62a36df2-fdc5-4c13-9a32-c41f963d18b5/image-20250124112043221.png" />
### 2. Writing the Shader
The algorithm is nothing special. Note that the built-in variable `renderer_BlitTexture` holds the post-processing render result of the previous pass, here the result after Bloom and Tonemapping, which we then display in grayscale.
```ts showLineNumbers {14} filename="CustomPostProcessPass.ts"
const customShader = Shader.create(
"Gray Scale Shader",
`
attribute vec4 POSITION_UV;
varying vec2 v_uv;
void main() {
gl_Position = vec4(POSITION_UV.xy, 0.0, 1.0);
v_uv = POSITION_UV.zw;
}
`,
`
varying vec2 v_uv;
uniform sampler2D renderer_BlitTexture;
void main(){
vec4 color = texture2D(renderer_BlitTexture, v_uv);
float grayScale = 0.299 * color.r + 0.587 * color.g + 0.114 * color.b;
gl_FragColor = vec4(vec3(grayScale), 1.0);
}
`
);
```
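As a sanity check of the luma formula in the fragment shader above, here is the same weighting in plain TypeScript (no engine dependency; `toGrayScale` is a hypothetical helper name used only for illustration):

```typescript
// Rec.601 luma weights, matching the fragment shader above.
// The weights 0.299 + 0.587 + 0.114 sum to 1, so pure white maps to 1.
function toGrayScale(r: number, g: number, b: number): number {
  return 0.299 * r + 0.587 * g + 0.114 * b;
}

console.log(toGrayScale(1, 1, 1)); // ≈ 1 (white stays white)
console.log(toGrayScale(0, 1, 0)); // 0.587 (green contributes the most)
```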
### 3. Create a Pass
Create a new pass, use [Blitter](/apis/core/#Blitter) in the [onRender hook](/apis/core/#PostProcessUberPass-onRender) to blit directly to the screen, and then add the pass to the engine.
```ts showLineNumbers {10,15} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
private _blitMaterial: Material;
constructor(engine: Engine) {
super(engine);
this._blitMaterial = new Material(this.engine, customShader);
}
onRender(_, srcTexture: Texture2D, dst: RenderTarget): void {
Blitter.blitTexture(this.engine, srcTexture, dst, undefined, undefined, this._blitMaterial, 0);
}
}
const customPass = new CustomPass(engine);
engine.addPostProcessPass(customPass);
```
By default a pass executes after the Uber Pass, i.e. [`PostProcessPassEvent.AfterUber`](/apis/core/#PostProcessPassEvent-AfterUber); we can also change the pipeline's execution order manually:
```ts filename="CustomPostProcessPass.ts"
customPass.event = PostProcessPassEvent.BeforeUber;
```
Whether a pass takes effect is determined by its [isActive](/apis/core/#PostProcessUberPass-isActive) property by default; we can also customize this logic, for example checking whether the intensity is greater than 0:
```ts showLineNumbers {2-9} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
override isValid(postProcessManager: PostProcessManager): boolean {
if (!this.isActive) {
return false;
}
const customEffectBlend = postProcessManager.getBlendEffect(CustomEffect);
return customEffectBlend?.intensity > 0;
}
}
```
### 4. Blending Data
Steps 2 and 3 above are already enough to customize a post-processing effect; here is a more advanced version that blends data.
Taking `intensity` as an example, we define a `CustomEffect` dedicated to blending intensity. The data to blend is simple, too: the engine already wraps a series of post-process parameter types, such as the [float parameter](/apis/core/#PostProcessEffectFloatParameter).
```ts showLineNumbers {2,7} filename="CustomPostProcessPass.ts"
class CustomEffect extends PostProcessEffect {
intensity = new PostProcessEffectFloatParameter(0.8);
}
// Add this effect to a post-process component. postProcess can be a newly created one,
// or the same one that hosts Bloom and other effects, depending on your blending needs.
postProcess.addEffect(CustomEffect);
```
After defining the data, change the custom pass's `onRender` hook to fetch the blended data:
```ts showLineNumbers {3-6} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
onRender(camera: Camera, srcTexture: Texture2D, dst: RenderTarget): void {
const postProcessManager = camera.scene.postProcessManager;
const customEffectBlend = postProcessManager.getBlendEffect(CustomEffect);
if (customEffectBlend) {
this._blitMaterial.shaderData.setFloat("u_intensity", customEffectBlend.intensity.value);
}
Blitter.blitTexture(this.engine, srcTexture, dst, undefined, undefined, this._blitMaterial, 0);
}
}
```
As you can see, we keep setting the blended intensity in the custom pass's `onRender` hook, and the shader simply consumes this data:
```ts showLineNumbers {8,12} filename="CustomPostProcessPass.ts"
const customShader = Shader.create(
"Gray Scale Shader",
`
......
`,
`
......
uniform float u_intensity;
void main(){
......
gl_FragColor = vec4(mix(color.rgb, vec3(grayScale), u_intensity), 1.0);
}
`
);
```
If your post-process component is in local mode, you can also use [Blend Distance](/apis/core/#PostProcess-blendDistance) to set how close the camera must be to the collider before blending starts:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/76b084c1-f517-47d7-8067-7f838db002f8/2025-01-15%25252017.38.30.gif" />

View File

@@ -3,83 +3,84 @@ order: 0
title: Post-Processing Overview
---
The post-processing system can "process" the results of scene rendering.
The post-processing system can "process" the results of camera rendering.
<Comparison
leftSrc="https://gw.alipayobjects.com/zos/OasisHub/3a50ed18-c2d4-4b33-a4e6-af79f2c273f8/2024-07-18%25252018.08.30.gif"
leftText="Post-processing off"
leftText="OFF"
rightSrc="https://gw.alipayobjects.com/zos/OasisHub/4bd5f985-1b82-4aca-b6fa-fd521aab8f57/2024-07-18%25252018.15.30.gif"
rightText="Post-processing on"
rightText="ON"
/>
## Post-Processing Configuration
Post-processing has two modes:
- Global mode: affects all cameras in the current scene.
- Local mode: takes effect only when the camera is within range of the post-process entity's collider.
The post-process component has the following properties to control effects, mode, blend distance, and more:
| Property | Description |
| :-- | :-- |
| [Is Global](/apis/core/#PostProcess-isGlobal) | Controls whether this post-process component is in global or local mode. |
| [Blend Distance](/apis/core/#PostProcess-blendDistance) | In local mode, controls how close the camera must be to the collider before effects start blending. |
| [Priority](/apis/core/#PostProcess-priority) | When the scene has multiple post-process components, a higher priority means it overrides/blends later. |
| [Layer](/apis/core/#PostProcess-layer) | Used with the camera's [Post-Process Mask](/apis/core/#Camera-postProcessMask) to determine which post-process components take effect. |
| [Add Effect](/apis/core/#PostProcess-addEffect) | Adds a post-processing effect. |
## Blending Rules
- In global mode, the post-process component with the higher priority wins.
- In local mode, blending starts at the **blend distance** between the camera and the collider, transitioning from the previous component's blended value to the current component's value; if there is no previous component, the effect's default value is the starting value.
For example, suppose the intensity's default value is `0`, i.e. the intensity is defined via [PostProcessEffectFloatParameter](/apis/core/#PostProcessEffectFloatParameter):
- Post-process component 1: global mode, intensity `0.5`; blended result is `0.5`
- Post-process component 2: local mode, intensity `1`, with the camera at half the blend distance from the collider; blended result is `mix(0.5, 1, 1 - distance / blendDistance)` = `0.75`
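The arithmetic in this example can be checked with a short sketch (a GLSL-style `mix` rewritten in TypeScript; the concrete distances are made-up values for illustration):

```typescript
// GLSL-style mix: linear interpolation from a to b by weight t.
function mix(a: number, b: number, t: number): number {
  return a * (1 - t) + b * t;
}

const blendDistance = 10;
const distance = 5; // camera at half the blend distance
const previous = 0.5; // blended result of component 1 (global)
const current = 1; // component 2's own intensity (local)

// The blend weight grows as the camera gets closer to the collider.
const blended = mix(previous, current, 1 - distance / blendDistance);
console.log(blended); // 0.75
```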
<Callout type="info">
To customize post-processing, see the [documentation](/docs/graphics/postProcess/customPostProcess/#3-融合数据)
</Callout>
## Using Post-Processing
### 1. Post-Processing Configuration
### 1. Adding a Post-Process Component
Post-processing configuration lives in the [Scene](/docs/core/scene) panel. To avoid wasting performance, the **master switch is off** by default; just turn it on to activate all post-processing effects:
The hierarchy panel offers presets for several global and local post-processing modes; simply select one to add it and use it directly.
<Image src="https://gw.alipayobjects.com/zos/OasisHub/50f6a2aa-0463-4b66-b54e-edff71187077/image-20240718193530098.png" />
<Image src="https://gw.alipayobjects.com/zos/OasisHub/a6e9a327-1823-4dde-94a8-89bb4bf02e3a/2025-01-15%25252011.59.50.gif" />
Of course, you can also add a post-process component manually; local mode must be paired with a collider:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/6eb3b8d5-27d1-419d-8bf6-65371861fb97/2025-01-15%25252014.51.12.gif" />
<Callout type="info">For configuring specific post-processing effects, see the [Post-Processing Effects List](/docs/graphics/postProcess/effects)</Callout>
<Callout type="warning">
As of version 1.3, the engine does not expose a public API (the API may change once post-processing extensions are supported), so we recommend configuring post-processing in the editor. If you want to use the internal experimental interface, you can call the following code:
</Callout>
```typescript
// Get the post-process manager
// @ts-ignore
const postProcessManager = scene._postProcessManager;
// Get the BloomEffect
const bloomEffect = postProcessManager._bloomEffect as BloomEffect;
// Get the TonemappingEffect
const tonemappingEffect = postProcessManager._tonemappingEffect as TonemappingEffect;
// Activate the master switch
postProcessManager.isActive = true;
// Adjust BloomEffect properties
bloomEffect.enabled = true;
bloomEffect.downScale = BloomDownScaleMode.Half;
bloomEffect.threshold = 0.9;
bloomEffect.scatter = 0.7;
bloomEffect.intensity = 1;
bloomEffect.tint.set(1, 1, 1, 1);
// Adjust TonemappingEffect properties
tonemappingEffect.enabled = true;
tonemappingEffect.mode = TonemappingMode.ACES;
```
### 2. Camera Switches
The camera preview area is controlled by the **camera component**; the following camera component properties affect post-processing:
The camera preview area is controlled by the **camera component**; the following camera component properties affect post-processing:
- **Post-processing switch**: turns the camera's post-processing on or off; the master switch and detailed configuration are in the [Scene](/docs/core/scene) panel.
- **HDR switch**: in HDR mode, output colors can be stored as floating-point numbers, giving a wider range of values for scenarios such as the [Bloom effect](/docs/graphics/postProcess/effects).
- **MSAA setting**: adjusts multi-sample anti-aliasing to improve image quality such as jagged edges.
<Image src="https://gw.alipayobjects.com/zos/OasisHub/3232935d-a765-4da4-b08e-021aac61458e/image-20240718210947199.png" />
| Property | Description |
| :-- | :-- |
| [Post-Processing Switch](/apis/core/#Camera-enablePostProcess) | Turns the camera's post-processing on or off. |
| [HDR Switch](/apis/core/#Camera-enableHDR) | In HDR mode, output colors can be stored as floating-point numbers, giving a wider range of values for scenarios such as the [Bloom effect](/docs/graphics/postProcess/effects). |
| [MSAA Setting](/apis/core/#Camera-msaaSamples) | Adjusts multi-sample anti-aliasing to improve image quality such as jagged edges. |
| [Post-Process Mask](/apis/core/#Camera-postProcessMask) | Used with the post-process component's [Layer](/apis/core/#PostProcess-layer) to determine which post-process components take effect. |
<Callout type="info">For more camera configuration, see [Camera Component](/docs/graphics/camera/component)</Callout>
### 3. Viewport Switches
Besides the camera preview area, post-processing is also visible in the viewport. The viewport camera is independent, but like the camera component it has post-processing switches (as above, also mind the switches in the post-processing configuration); the viewport switches only affect the viewport window and do not affect the actual exported project:
Besides the camera preview area, post-processing is also visible in the viewport. The viewport camera is independent, but like the camera component it has post-processing configuration.
<Image src="https://gw.alipayobjects.com/zos/OasisHub/f9f13d02-931f-4638-af91-4a007007c99f/image-20240718193359413.png" />
<Callout type="warning">The viewport switches only affect the viewport window and do not affect the actual exported project</Callout>
<Image src="https://gw.alipayobjects.com/zos/OasisHub/e3c55184-f51c-4a7a-9a12-ad490774dc26/image-20250115151324628-20250115151336116.png" />
## Best Practices
In general, the post-processing settings in the red boxes below can affect performance:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/7e5e272c-fc1e-45cd-92b0-a687c58826c7/image-20240719104328198.png" />
As well as some camera settings:
<Image src="https://gw.alipayobjects.com/zos/OasisHub/5d96cd31-2e12-43eb-8493-f8751e40eb82/image-20240719112101652.png" />
- For the camera's `HDR` switch: if most pixel computations in the scene do not exceed 1 (e.g. no HDR textures are used), avoid enabling HDR; when enabled, the engine first renders to an `R11G11B10_UFloat` RenderTarget and then to the screen, which has a performance cost.
- For the camera's `MSAA` option: only adjust this value when post-processing is enabled and anti-aliasing quality really matters; the higher the value, the higher the performance cost.
- In the Bloom effect, `Down Scale` defaults to `Half`, meaning the initial downsampling resolution is half the canvas; if precision is less critical, switch to `Quarter` to save by working at 1/4 of the canvas.
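For intuition, the initial bloom resolutions under the two modes can be sketched as follows (a TypeScript sketch under the assumption that the divisor applies per canvas dimension; the engine's exact behavior is defined by `BloomDownScaleMode`, and the canvas size is a made-up example):

```typescript
// Initial downsample resolution of the bloom chain.
// "Half" divides each dimension by 2, "Quarter" by 4 (assumed per-dimension).
type DownScaleMode = "Half" | "Quarter";

function bloomInitialSize(width: number, height: number, mode: DownScaleMode) {
  const divisor = mode === "Half" ? 2 : 4;
  return { width: Math.floor(width / divisor), height: Math.floor(height / divisor) };
}

console.log(bloomInitialSize(1920, 1080, "Half")); // { width: 960, height: 540 }
console.log(bloomInitialSize(1920, 1080, "Quarter")); // { width: 480, height: 270 }
```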

View File

@@ -3,8 +3,8 @@ title: Shader API [Experimental]
---
<Callout type="warning">
This version is still experimental and can only be used in the `editor`. If you want to use it in `Pro Code`, you need to import the `@galacean/engine-toolkit`
package. Note that the API may change in the next version; we will notify you when it does.
This version is still experimental and can only be used in the `editor`. If you want to use it in `Pro Code`, you need to import the
`@galacean/engine-shader-shaderlab` package. Note that the API may change in the next version; we will notify you when it does.
</Callout>
Like functions, classes, and properties in TypeScript, shader code has its own set of APIs. This article helps you write your own shaders based on these APIs and the `ShaderLab` syntax.
@@ -97,11 +97,6 @@ SubShader "Default" {
2. Modify the lighting model in `DemoPass.glsl`; as a demo, we only demonstrate modifying the direct-light part:
<Callout type="warning">
The editor cannot yet display the contents of `ForwardPassPBR.glsl` because it lives in another repository. However, you can copy
`/Internal/Shader/Advanced/IridescenceForwardPass` and modify that file first.
</Callout>
```ts showLineNumbers {7-8}
// DemoPass.glsl
#include "Common.glsl"
@@ -153,7 +148,7 @@ float f2 = pow2(0.5);
### Common
Provides common macros such as `PI`, and utility functions such as `gammaToLinear` and `pow2`; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/Common.glsl).
Provides common macros such as `PI`, and utility functions such as `gammaToLinear` and `pow2`; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/Common.glsl).
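As a rough illustration of what these helpers do (TypeScript analogues, assuming the common power-2.2 gamma approximation; the engine's exact GLSL definitions are in the linked source):

```typescript
// Analogue of the shaderlab pow2 helper: squares its input.
function pow2(x: number): number {
  return x * x;
}

// A common gamma-to-linear approximation: raise each channel to the 2.2 power.
// (Illustrative only; the engine's exact curve may differ.)
function gammaToLinear(channel: number): number {
  return Math.pow(channel, 2.2);
}

console.log(pow2(0.5)); // 0.25
console.log(gammaToLinear(1)); // 1 (endpoints are preserved)
```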
### Fog
@@ -217,7 +212,7 @@ mat3 getTBNByDerivatives(vec2 uv, vec3 normal, vec3 position, bool isFrontFacing
### Shadow
Provides shadow-related functions; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/Shadow.glsl).
Provides shadow-related functions; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/Shadow.glsl).
```glsl
// Get the cascade level of cascaded shadows; e.g. with the cascade count set to 4, returns 0-3
@@ -254,15 +249,15 @@ void calculateBlendShape(Attributes attributes, inout vec4 position, inout vec3
### AttributesPBR
Encapsulates all the attribute variables needed by PBR; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/AttributesPBR.glsl).
Encapsulates all the attribute variables needed by PBR; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/AttributesPBR.glsl).
### VaryingsPBR
Encapsulates all the varying variables needed by PBR; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/VaryingsPBR.glsl).
Encapsulates all the varying variables needed by PBR; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/VaryingsPBR.glsl).
### LightDirectPBR
Encapsulates direct-light calculation based on the BRDF lighting model; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/LightDirectPBR.glsl).
Encapsulates direct-light calculation based on the BRDF lighting model; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/LightDirectPBR.glsl).
Generally, you can just call it directly:
@@ -278,18 +273,20 @@ evaluateDirectRadiance(varyings, surfaceData, brdfData, shadowAttenuation, color
#define FUNCTION_DIFFUSE_LOBE diffuseLobe
#define FUNCTION_SPECULAR_LOBE specularLobe
#define FUNCTION_CLEAR_COAT_LOBE clearCoatLobe
#define FUNCTION_SHEEN_LOBE sheenLobe
void surfaceShading(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 lightColor, inout vec3 color);
void surfaceShading(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 lightColor, inout vec3 totalDiffuseColor, inout vec3 totalSpecularColor);
void diffuseLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 attenuationIrradiance, inout vec3 diffuseColor);
void specularLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 attenuationIrradiance, inout vec3 specularColor);
float clearCoatLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 color, inout vec3 specularColor);
void sheenLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 attenuationIrradiance, inout vec3 diffuseColor, inout vec3 specularColor);
```
<Callout type="info">For how to override these, refer to the PBR template extension above.</Callout>
### LightInDirectPBR
Encapsulates [ambient light](/docs/graphics/light/ambient) calculation based on the BRDF lighting model; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/LightInDirectPBR.glsl).
Encapsulates [ambient light](/docs/graphics/light/ambient) calculation based on the BRDF lighting model; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/LightIndirectPBR.glsl).
Generally, you can just call it directly:
@@ -304,15 +301,17 @@ evaluateIBL(varyings, surfaceData, brdfData, color.rgb);
#define FUNCTION_DIFFUSE_IBL evaluateDiffuseIBL
#define FUNCTION_SPECULAR_IBL evaluateSpecularIBL
#define FUNCTION_CLEAR_COAT_IBL evaluateClearCoatIBL
#define FUNCTION_SHEEN_IBL evaluateSheenIBL
void evaluateDiffuseIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, inout vec3 diffuseColor);
void evaluateSpecularIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, float radianceAttenuation, inout vec3 specularColor);
float evaluateClearCoatIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, inout vec3 specularColor);
void evaluateSheenIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, float radianceAttenuation, inout vec3 diffuseColor, inout vec3 specularColor);
```
### VertexPBR
Methods needed by the PBR vertex shader, such as getting UV coordinates after TilingOffset, and world position, normal, and tangent after skeletal and blend-shape (BS) computation; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/VertexPBR.glsl).
Methods needed by the PBR vertex shader, such as getting UV coordinates after TilingOffset, and world position, normal, and tangent after skeletal and blend-shape (BS) computation; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/VertexPBR.glsl).
```glsl showLineNumbers {2, 4}
Varyings varyings;
@@ -337,11 +336,15 @@ gl_Position = renderer_MVPMat * vertexInputs.positionOS;
### BRDF
The key file of the PBR lighting model, encapsulating general BRDF calculation functions as well as the `SurfaceData` and `BRDFData` structs used in subsequent lighting-model calculations; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/BRDF.glsl).
The key file of the PBR lighting model, encapsulating general BRDF calculation functions as well as the `SurfaceData` and `BRDFData` structs used in subsequent lighting-model calculations; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/BRDF.glsl).
### BTDF
Provides transmission- and refraction-related functions; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/BTDF.glsl).
### FragmentPBR
Contains a large number of variables passed from the CPU, such as metallic, roughness, and textures, and initializes the `SurfaceData` struct via `getSurfaceData`; see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/FragmentPBR.glsl).
Contains a large number of variables passed from the CPU, such as metallic, roughness, and textures, and initializes the `SurfaceData` struct via `getSurfaceData`; see the [source code](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/FragmentPBR.glsl).
```glsl showLineNumbers
BRDFData brdfData;
@@ -355,4 +358,4 @@ initBRDFData(surfaceData, brdfData);
### Finally
Besides the functionality and usage of the key APIs, you can refer to [ForwardPassPBR](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/ForwardPassPBR.glsl) on the official site for how the whole file is organized.
Besides the functionality and usage of the key APIs, you can refer to [ForwardPassPBR](https://github.com/galacean/engine/blob/main/packages/shader-shaderlab/src/shaders/shadingPBR/ForwardPassPBR.glsl) on the official site for how the whole file is organized.