Merge pull request #2511 from zhuxudong/document/post-process
[Document] post-processing related
eyworldwide authored Jan 24, 2025
2 parents 6d27069 + 606778b commit 89f3ffd
Showing 10 changed files with 553 additions and 198 deletions.
52 changes: 24 additions & 28 deletions docs/en/core/scene.md → docs/en/core/scene.mdx
Structurally, each Engine can contain one or more active scenes.

Right-click in the **[Assets Panel](/en/docs/assets/interface)** (or the + sign at the top right of the assets panel) to create a scene, double-click the scene to switch to it:

<Image src="https://gw.alipayobjects.com/zos/OasisHub/eef870a7-2630-4f74-8c0e-478696a553b0/2024-03-19%25252018.04.02.gif" />

### Properties Panel

<Image src="https://gw.alipayobjects.com/zos/OasisHub/5770f3d1-6840-438b-adf0-d759048fac6b/image-20250124110620126.png" />

### Ambient Light

For details, please refer to the Background Tutorial.

For details, please refer to the [Shadow Tutorial](/en/docs/graphics/light/shadow/).

### Post-Processing

For details, please refer to the [Post-Processing Tutorial](/en/docs/graphics/postProcess/postProcess/).

### Fog

You can add one of three types of fog to the entire scene: **linear**, **exponential**, or **exponential squared**:


<Image src="https://gw.alipayobjects.com/zos/OasisHub/224fbc16-e60c-47ca-845b-5f7c09563c83/2024-03-19%25252018.08.23.gif" />

## Script Usage

| Property Name | Description |
| :---------------------------------------- | :---------- |
| [scenes](/apis/core/#SceneManager-scenes) | Scene list |

| Method Name | Description |
| :-------------------------------------------------- | :----------- |
| [addScene](/apis/core/#SceneManager-addScene) | Add scene |
| [removeScene](/apis/core/#SceneManager-removeScene) | Remove scene |
| [mergeScenes](/apis/core/#SceneManager-mergeScenes) | Merge scenes |
| [loadScene](/apis/core/#SceneManager-loadScene) | Load scene |

### Loading a Scene

If you want to load a **Scene** asset as a scene in the application, you can use the resource manager to load it:
```typescript
const sceneUrl = "...";

engine.resourceManager.load({ type: AssetType.Scene, url: sceneUrl }).then((scene) => {
  engine.sceneManager.addScene(scene);
});
```

### Getting Scene Objects
Call `scene.destroy()` to destroy a scene. The destroyed scene will also be automatically removed.

### Entity Tree Management

| Method Name | Description |
| :-- | :-- |
| [createRootEntity](/apis/core/#Scene-createRootEntity) | The newly created _scene_ does not have a root entity by default and needs to be created manually |
| [addRootEntity](/apis/core/#Scene-addRootEntity) | You can directly create a new entity or add an existing entity |
| [removeRootEntity](/apis/core/#Scene-removeRootEntity) | Remove the root entity |
| [getRootEntity](/apis/core/#Scene-getRootEntity) | Find a root entity. You can fetch all root entities or a single entity object. Note that the root entity collection is a read-only array whose length and order cannot be changed |

```typescript
const engine = await WebGLEngine.create({ canvas: "demo" });
const scene = engine.sceneManager.scenes[0];

// The newly created scene has no root entity by default, so create one first
const rootEntity = scene.createRootEntity();
const cameraEntity = rootEntity.createChild("camera");

cameraEntity.addComponent(Camera);
```

1 change: 1 addition & 0 deletions docs/en/graphics/camera/component.md
The functionality corresponding to each property is as follows:
| | [msaaSamples](/apis/core/#Camera-msaaSamples) | Number of samples for multi-sample anti-aliasing, effective only when an independent canvas is enabled via options such as `enableHDR`, `enablePostProcess`, or `opaqueTextureEnabled`. |
| | [enableHDR](/apis/core/#Camera-enableHDR) | Whether to enable HDR rendering, allowing the shader's output color to be stored using floating-point numbers, providing a wider range of values for post-processing and other scenarios. |
| | [enablePostProcess](/apis/core/#Camera-enablePostProcess) | Whether to enable post-processing. For post-processing configuration, see [Post-Processing Tutorial](/en/docs/graphics/postProcess/postProcess). |
| | [postProcessMask](/apis/core/#Camera-postProcessMask) | Post-processing mask, which determines the post-processing components that take effect. For post-processing configuration, see the [Post-Processing Tutorial](/en/docs/graphics/postProcess/postProcess). |

### Culling Mask

177 changes: 177 additions & 0 deletions docs/en/graphics/postProcess/customPostProcess.mdx
---
order: 2
title: Custom Post-Processing
---

In the post-processing system, the effect ([Effect](/apis/core/#PostProcessEffect)) maintains the data layer, while the pipeline ([Pass](/apis/core/#PostProcessPass)) implements the rendering logic. Inside a pipeline, calling the [getBlendEffect](/apis/core/#PostProcessManager-getBlendEffect) method returns the final post-processing data after **global/local** blending.

```mermaid
graph TD;
subgraph Pass
A1[Custom Pass ...]
A2[Uber Pass]
A3[Custom Pass ...]
end
subgraph Effect
B1[Bloom Effect]
B2[Tonemapping Effect]
B3[Custom Effect]
end
A1 --> A2 --> A3
B1 --> B2 --> B3
A1 ---|Blend| B1
```

The engine has a built-in [PostProcessUberPass](/apis/core/#PostProcessUberPass), which consumes [BloomEffect](/apis/core/#BloomEffect) and [TonemappingEffect](/apis/core/#TonemappingEffect) data. To customize a post-processing effect, create a new Pass, and additionally create an Effect if its data needs to be blended.

## A Demo

Here we will implement a simple grayscale post-processing effect.

<Comparison
leftSrc="https://gw.alipayobjects.com/zos/OasisHub/f5d3ea3d-47a4-4618-b3d6-0ea46321e786/image-20250115162952605.png"
leftText="Uber Pass"
rightSrc="https://gw.alipayobjects.com/zos/OasisHub/75ccba72-b70b-49f6-812a-e00305e89201/image-20250115163026345.png"
rightText="Uber Pass + Custom Pass"
/>

### 1. Add script

Let's create a script first. Next, we will write our custom post-processing Shader and Pass in this script file.

<Image src="https://gw.alipayobjects.com/zos/OasisHub/6df5fd1c-a24c-4e32-a62e-841b61fe76c7/image-20250124111456360.png" />

Following [Adding a post-processing component](/en/docs/graphics/postProcess/postProcess/#1-adding-a-post-processing-component), add a global or local post-processing component and attach the script to that entity:

<Image src="https://gw.alipayobjects.com/zos/OasisHub/62a36df2-fdc5-4c13-9a32-c41f963d18b5/image-20250124112043221.png" />

### 2. Shader Writing

The algorithm itself is straightforward, but note the built-in variable `renderer_BlitTexture`: it holds the post-processing result of the previous Pass, which here is the output after Bloom and tonemapping. We display this result in grayscale.

```ts showLineNumbers {14} filename="CustomPostProcessPass.ts"
const customShader = Shader.create(
"Gray Scale Shader",
`
attribute vec4 POSITION_UV;
varying vec2 v_uv;
void main() {
gl_Position = vec4(POSITION_UV.xy, 0.0, 1.0);
v_uv = POSITION_UV.zw;
}
`,
`
varying vec2 v_uv;
uniform sampler2D renderer_BlitTexture;
void main(){
vec4 color = texture2D(renderer_BlitTexture, v_uv);
float grayScale = 0.299 * color.r + 0.587 * color.g + 0.114 * color.b;
gl_FragColor = vec4(vec3(grayScale), 1.0);
}
`
);
```
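As a sanity check on the shader's math: the fragment shader uses the standard Rec. 601 luma weights, whose per-pixel computation can be mirrored in plain TypeScript (the helper name below is ours, not engine API):

```typescript
// Mirror of the fragment shader's grayscale computation, using the same
// Rec. 601 luma weights (0.299, 0.587, 0.114). Channels are in [0, 1].
function toGrayscale(r: number, g: number, b: number): number {
  return 0.299 * r + 0.587 * g + 0.114 * b;
}

console.log(toGrayscale(1, 1, 1)); // white maps to ~1 (the weights sum to 1)
console.log(toGrayscale(1, 0, 0)); // a saturated red darkens to ~0.299
```

Because the weights sum to 1, pure white and pure black are preserved by the conversion.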

### 3. Create a new Pass

Next, create a new Pass that uses [Blitter](/apis/core/#Blitter) to blit directly to the screen in the [onRender hook](/apis/core/#PostProcessUberPass-onRender), then add the Pass to the engine.

```ts showLineNumbers {10,15} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
private _blitMaterial: Material;

constructor(engine: Engine) {
super(engine);
this._blitMaterial = new Material(this.engine, customShader);
}

onRender(_, srcTexture: Texture2D, dst: RenderTarget): void {
Blitter.blitTexture(this.engine, srcTexture, dst, undefined, undefined, this._blitMaterial, 0);
}
}

const customPass = new CustomPass(engine);
engine.addPostProcessPass(customPass);
```

By default, a Pass executes after the Uber Pass, i.e. [`PostProcessPassEvent.AfterUber`](/apis/core/#PostProcessPassEvent-AfterUber). You can also change the pipeline's execution order manually:

```ts filename="CustomPostProcessPass.ts"
customPass.event = PostProcessPassEvent.BeforeUber;
```

By default, whether a Pass takes effect is determined by its [isActive](/apis/core/#PostProcessUberPass-isActive) property. You can also override this logic, for example to require an intensity greater than 0:

```ts showLineNumbers {2-9} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
override isValid(postProcessManager: PostProcessManager): boolean {
if (!this.isActive) {
return false;
}

    const customEffectBlend = postProcessManager.getBlendEffect(CustomEffect);
    return customEffectBlend ? customEffectBlend.intensity.value > 0 : false;
}
}
```

### 4. Blend data

Steps 2 and 3 above are already enough for a custom post-processing effect. This step goes further and blends the effect's data.

Take `intensity` as an example. We define a `CustomEffect` dedicated to blending intensity. The data to blend is simple as well: the engine encapsulates a series of post-processing parameter types, such as the [floating-point parameter](/apis/core/#PostProcessEffectFloatParameter).

```ts showLineNumbers {2,7} filename="CustomPostProcessPass.ts"
class CustomEffect extends PostProcessEffect {
intensity = new PostProcessEffectFloatParameter(0.8);
}

// Add this effect to a post-processing component. The component can be created
// separately or be the same one that holds Bloom and other effects,
// depending on your blending requirements.
postProcess.addEffect(CustomEffect);
```
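The engine performs the global/local blend for you when you call `getBlendEffect`. Conceptually, a float parameter interpolates from the global value toward the local volume's value by a blend weight; a minimal sketch of that idea follows (`lerp` and `blendFloat` are illustrative helpers of ours, not engine internals):

```typescript
// Conceptual sketch of global/local float blending: the result moves from
// the global value toward the local value as the weight goes from 0 to 1.
// These helpers are illustrative only, not engine API.
function lerp(from: number, to: number, t: number): number {
  return from + (to - from) * t;
}

function blendFloat(globalValue: number, localValue: number, weight: number): number {
  const w = Math.min(Math.max(weight, 0), 1); // clamp the weight to [0, 1]
  return lerp(globalValue, localValue, w);
}

console.log(blendFloat(0, 0.8, 1));   // fully inside the local volume -> 0.8
console.log(blendFloat(0, 0.8, 0.5)); // halfway through the blend -> 0.4
```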

After defining the data, update the `onRender` hook of the custom Pass to fetch the blended data:

```ts showLineNumbers {3-6} filename="CustomPostProcessPass.ts"
class CustomPass extends PostProcessPass {
onRender(camera: Camera, srcTexture: Texture2D, dst: RenderTarget): void {
const postProcessManager = camera.scene.postProcessManager;
const customEffectBlend = postProcessManager.getBlendEffect(CustomEffect);
if (customEffectBlend) {
this._blitMaterial.shaderData.setFloat("u_intensity", customEffectBlend.intensity.value);
}

Blitter.blitTexture(this.engine, srcTexture, dst, undefined, undefined, this._blitMaterial, 0);
}
}
```

As you can see, the `onRender` hook of the custom pipeline sets the blended intensity every frame, and the shader then consumes this data:

```ts showLineNumbers {8,12} filename="CustomPostProcessPass.ts"
const customShader = Shader.create(
"Gray Scale Shader",
`
......
`,
`
......
uniform float u_intensity;
void main(){
......
gl_FragColor = vec4(mix(color.rgb, vec3(grayScale), u_intensity), 1.0);
}
`
);
```
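The GLSL `mix(color.rgb, vec3(grayScale), u_intensity)` call is a per-channel linear interpolation: intensity 0 leaves the image untouched and 1 makes it fully grayscale. The same computation for a single channel in TypeScript (the helper name is ours):

```typescript
// Per-channel equivalent of GLSL mix(a, b, t) = a * (1 - t) + b * t,
// used here to blend an original channel toward the grayscale value.
function mixChannel(channel: number, gray: number, intensity: number): number {
  return channel * (1 - intensity) + gray * intensity;
}

console.log(mixChannel(1, 0.299, 0)); // intensity 0: original color -> 1
console.log(mixChannel(1, 0.299, 1)); // intensity 1: fully grayscale -> 0.299
```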

If your post-processing component is in local mode, you can also use [Blend Distance](/apis/core/#PostProcess-blendDistance) to set the distance from the collider at which the camera starts blending:

<Image src="https://gw.alipayobjects.com/zos/OasisHub/76b084c1-f517-47d7-8067-7f838db002f8/2025-01-15%25252017.38.30.gif" />
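For intuition about how such a distance-based blend could behave, here is an illustrative weight ramp (our own assumption for demonstration only; the engine computes the actual weight internally from the collider and the blend distance):

```typescript
// Illustrative only: a blend weight that ramps from 0 (camera at or beyond
// the blend distance) up to 1 (camera touching or inside the collider).
// This is NOT engine source code, just a sketch of the described behavior.
function blendWeight(distanceToCollider: number, blendDistance: number): number {
  if (distanceToCollider <= 0) return 1;             // inside the volume
  if (distanceToCollider >= blendDistance) return 0; // out of blending range
  return 1 - distanceToCollider / blendDistance;     // linear ramp in between
}

console.log(blendWeight(0, 10));  // 1
console.log(blendWeight(5, 10));  // 0.5
console.log(blendWeight(20, 10)); // 0
```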
