diff --git a/docs/en/animation/animator.mdx b/docs/en/animation/animator.mdx
new file mode 100644
index 0000000000..8f975d85f3
--- /dev/null
+++ b/docs/en/animation/animator.mdx
@@ -0,0 +1,155 @@
+---
+order: 3
+title: Animation Control Component
+type: Animation
+label: Animation
+---
+
+## Introduction
+The Animation Control Component ([Animator](/en/apis/core/#Animator)) is responsible for reading data from the [Animation Controller](/en/docs/animation/animatorController/) ([AnimatorController](/en/apis/core/#AnimatorController)) and playing its content.
+
+### Parameter Description
+
+| Attribute | Function Description |
+| :----------------- | :----------------------------- |
+| animatorController | Bind `AnimatorController` asset |
+
+## Editor Usage
+
+When you drag a model into the scene, the model is displayed in its initial pose and does not play any animation. At this point, you need to find the Animation Control Component ([Animator](/en/apis/core/#Animator)) on the model entity and bind an [Animation Controller](/en/docs/animation/animatorController/) asset to it.
+
+1. Find or create an Animation Control Component ([Animator](/en/apis/core/#Animator))
+
+
+The Animation Control Component ([Animator](/en/apis/core/#Animator)) of the model is on the root entity of the glTF instance, which is the first child entity under the model entity in the editor.
+
+If the model contains animations, a read-only [Animation Controller](/en/docs/animation/animatorController/) will be automatically bound for you.
+
+
+
+
+If there is no Animation Control Component ([Animator](/en/apis/core/#Animator)), you can create one as shown below.
+
+
+
+2. Create an [Animation Controller](/en/docs/animation/animatorController/) asset and bind it to the model
+
+
+
+
+3. After editing the Animation Controller ([see details](/en/docs/animation/animatorController/)), you can play the animations according to the logic of the [Animation Controller](/en/docs/animation/animatorController/)
+
+## Script Usage
+
+Before using scripts, it is best to read the [Animation System Composition](/en/docs/animation/system) document to help you better understand the operating logic of the animation system.
+
+
+### Play Animation
+
+After loading a glTF model, the engine will automatically add an Animator component to the model, including the animation clips it contains. You can get the Animator component on the root entity of the model and directly play a specified animation.
+
+```typescript
+engine.resourceManager
+ .load(
+ "https://gw.alipayobjects.com/os/bmw-prod/5e3c1e4e-496e-45f8-8e05-f89f2bd5e4a4.glb"
+ )
+ .then((asset) => {
+ const { defaultSceneRoot } = asset;
+ rootEntity.addChild(defaultSceneRoot);
+ const animator = defaultSceneRoot.getComponent(Animator);
+ animator.play("run");
+ });
+```
+
+#### Control Playback Speed
+
+You can control the animation playback speed through the [speed](/en/apis/core/#Animator-speed) property. The default value of `speed` is `1.0`. The larger the value, the faster the playback; the smaller the value, the slower the playback. When the value is negative, it plays in reverse.
+
+```typescript
+animator.speed = 2.0;
+```
+
+#### Stop/Replay
+
+You can stop and replay the animation by setting the [enabled](/en/apis/core/#Animator-enabled) property of the Animator.
+
+```typescript
+// Stop
+animator.enabled = false;
+// Replay
+animator.enabled = true;
+```
+
+#### Pause/Resume Playback
+You can pause and resume playback by setting the [speed](/en/apis/core/#Animator-speed) property of the Animator.
+
+```typescript
+// Pause
+animator.speed = 0;
+// Resume
+animator.speed = 1;
+```
+
+If you only want to pause a specific animation state, you can do so by setting its speed to 0.
+
+```typescript
+const state = animator.findAnimatorState("xxx");
+state.speed = 0;
+```
+
+#### Play a Specific Animation State
+
+You can use the [animator.play](/en/apis/core/#Animator-play) method to play a specific `AnimatorState`. For parameter details, see the [API documentation](/en/apis/core/#Animator-play).
+
+```typescript
+animator.play("run");
+```
+
+If you need to start playing at a specific moment in the animation, you can do so as follows:
+
+```typescript
+const layerIndex = 0;
+const normalizedTimeOffset = 0.5; // Normalized time
+animator.play("run", layerIndex, normalizedTimeOffset);
+```
+
+
+
+#### Transition to a Specific Animation State
+
+You can use the [animator.crossFade](/en/apis/core/#Animator-crossFade) method to transition the animation to a specified `AnimatorState`. For parameter details, see the [API documentation](/en/apis/core/#Animator-crossFade).
+
+```typescript
+animator.crossFade("run", 0.3);
+```
+
+
+
+### Get the Currently Playing Animation State
+
+You can use the [getCurrentAnimatorState](/en/apis/core/#Animator-getCurrentAnimatorState) method to get the currently playing AnimatorState. The parameter is the index of the layer where the animation state is located, `layerIndex`. For details, see the [API documentation](/en/apis/core/#Animator-getCurrentAnimatorState). After obtaining it, you can set the properties of the animation state, such as changing the default loop playback to play once.
+
+```typescript
+const currentState = animator.getCurrentAnimatorState(0);
+// Play once
+currentState.wrapMode = WrapMode.Once;
+// Loop playback
+currentState.wrapMode = WrapMode.Loop;
+```
+
+### Get an Animation State
+
+You can use the [findAnimatorState](/en/apis/core/#Animator-findAnimatorState) method to get an AnimatorState by name. For details, see the [API documentation](/en/apis/core/#Animator-findAnimatorState). After obtaining it, you can set the properties of the animation state, such as changing the default loop playback to play once.
+
+```typescript
+const state = animator.findAnimatorState("xxx");
+// Play once
+state.wrapMode = WrapMode.Once;
+// Loop playback
+state.wrapMode = WrapMode.Loop;
+```
+
+### Animation Culling
+
+You can set the [cullingMode](/en/apis/core/#Animator-cullingMode) of the [Animator](/en/apis/core/#Animator) to determine whether the animation should be calculated when the entity bound to the Animator is not visible. When the animation is culled, it will not be calculated or applied to the entity, but the animation state will still update over time, ensuring it behaves correctly when it becomes visible again.
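+
+A minimal sketch of configuring this from a script, assuming the `AnimatorCullingMode` enum exposed by the engine:
+
+```typescript
+// Skip evaluating the animation while the bound entity is not visible.
+animator.cullingMode = AnimatorCullingMode.Complete;
+// Default behavior: always evaluate the animation, visible or not.
+animator.cullingMode = AnimatorCullingMode.None;
+```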
+```
diff --git a/docs/en/animation/animatorController.mdx b/docs/en/animation/animatorController.mdx
new file mode 100644
index 0000000000..ebf898b4e7
--- /dev/null
+++ b/docs/en/animation/animatorController.mdx
@@ -0,0 +1,62 @@
+---
+order: 2
+title: Animation Controller
+type: Animation
+label: Animation
+---
+
+The Animation Controller ([AnimatorController](/en/apis/core/#AnimatorController)) helps you organize a set of animations for a character or other animated object. Through the [Animation State Machine](/en/docs/animation/state-machine/) ([AnimatorStateMachine](/en/apis/core/#AnimatorStateMachine)), it visualizes the object's animation playback logic like a flowchart, automatically switching animation states ([AnimatorState](/en/apis/core/#AnimatorState)) and playing the referenced [Animation Clips](/en/docs/animation/clip) ([AnimationClip](/en/apis/core/#AnimationClip)) when conditions are met.
+
+## Editor Usage
+
+Through the Animation Controller editor, users can organize the playback logic of [Animation Clips](/en/docs/animation/clip).
+
+1. Prepare the Animation Clips ([Create Animation Clips](/en/docs/animation/clip)).
+
+
+
+2. To organize the playback of these Animation Clips, we need to create an Animation Controller ([AnimatorController](/en/apis/core/#AnimatorController)) asset.
+
+
+
+3. The newly created Animation Controller contains no data. To edit it, double-click the asset and add an Animation State ([AnimatorState](/en/apis/core/#AnimatorState)) to it.
+
+
+4. Click AnimatorState to bind an [Animation Clip](/en/docs/animation/clip) ([AnimationClip](/en/apis/core/#AnimationClip)) to it.
+
+
+5. Bind the Animation Controller ([AnimatorController](/en/apis/core/#AnimatorController)) asset to the [Animation Control Component](/en/docs/animation/animator).
+
+
+6. At this point, you can play the `Idle` animation in the exported project through `animator.play("New State")`.
+
+
+For further usage of the Animation Controller editor, please refer to the [Animation State Machine](/en/docs/animation/state-machine/) documentation.
+
+
+## Script Usage
+
+
+ Before using scripts, it is best to read the [Animation System Composition](/en/docs/animation/system) documentation to help you better understand the operation logic of the animation system.
+
+
+### Binding the Animation Controller
+
+You can bind the Animation Controller to the [Animator](/en/apis/core/#Animator) through the [animator.animatorController](/en/apis/core/#Animator-animatorController) property. If a loaded glTF model contains animation data, the engine will automatically bind a default AnimatorController to each glTF instance.
+
+```typescript
+// A freshly created controller is empty; it still needs a layer, a state
+// machine, and a state named "New State" bound to a clip before `play` can take effect.
+animator.animatorController = new AnimatorController(engine);
+animator.play("New State");
+```
+
+#### Reusing Animation Data
+
+The Animator's [animatorController](/en/apis/core/#AnimatorController) is a data-storage class and contains no runtime state. Thanks to this design, as long as the models bound to the Animator components share the same skeleton hierarchy and node naming, we can reuse one model's animation data on another.
+
+```typescript
+const animator = model1.getComponent(Animator);
+// Get the animation controller from the model that carries the animations
+animator.animatorController = animationGLTF.getComponent(Animator).animatorController;
+// Play the animation from animationGLTF
+animator.play("anim from animationGLTF");
+```
diff --git a/docs/en/animation/clip-for-artist.md b/docs/en/animation/clip-for-artist.md
index 4e452b4246..28e7851aa2 100644
--- a/docs/en/animation/clip-for-artist.md
+++ b/docs/en/animation/clip-for-artist.md
@@ -1,23 +1,23 @@
---
-order: 7
-title: Art Animation Slicing
+order: 8
+title: Animation Slicing
type: Animation
label: Animation
---
-An **AnimationClip** is a **combination of animations on a timeline**, which can include animations for multiple objects such as rotations, translations, scaling, and weights, like **walking, running, jumping** that can be exported as separate animation clips; Galacean engine can choose which animation clip to play, provided that the FBX or glTF exported from the modeling software contains multiple animation clips.
+Animation Clip (**AnimationClip**) is a **combination of animations on a timeline**, which can include rotations, translations, scaling, and weight animations of multiple objects. For example, **walking, running, and jumping** can be exported as 3 separate animation clips. The Galacean engine can choose which animation clip to play, provided that the FBX or glTF exported from the modeling software contains multiple animation clips.
-To reduce communication costs, this article lists several common methods for animation slicing, exporting to glTF for direct use in the Galacean engine, and also verifying functionality through the [glTF Viewer](https://galacean.antgroup.com/#/gltf-viewer) page.
+To reduce communication costs, this article lists several common methods for animation slicing and exporting glTF for direct use by the Galacean engine. You can also perform functional verification through the [glTF Viewer](https://galacean.antgroup.com/engine/gltf-viewer) page.
-Blender's animation editing interface is very user-friendly, displaying nodes affected by animations clearly and showing keyframes on the timeline, making it recommended for animation slicing.
+Blender's animation editing interface is very user-friendly, clearly visualizing the nodes affected by the animation and displaying keyframes on the timeline. Therefore, it is recommended to use Blender for animation slicing.
### Blender
-1. Open Blender, import a model in a format supported by Blender, and switch to the **Animation Editing** window:
+1. Open Blender, import a model format supported by Blender, and then switch to the **Animation Editing** window:

-2. By using the "New Animation Clip" button in the image above, you can quickly duplicate the current animation clip and then perform unique operations. If the button is not displayed, make sure the **Action Editor** is open:
+2. Using the "New Animation Clip" button shown above, you can quickly copy the current animation clip and then perform unique operations. If the button is not displayed, make sure the "**Action Editor**" is open:

@@ -29,26 +29,25 @@ For example, a new animation clip named **animation_copy** is created, and only
-The exported clip timelines must be consistent, which can be configured in two places: the bottom right corner or the output properties:
+The exported clip timeline must be consistent, which can be configured in two places: the bottom right corner or the output properties:
-3. Since the timelines must be consistent, the animation clips just sliced need to be moved to the starting frame by dragging:
+3. Since the timeline must be consistent, you need to move the animation clips just sliced to the starting frame by dragging them:
-
-4. At this point, the animation slices are ready. Export as glTF or FBX, and integrate with the Galacean engine:
+4. At this point, the animation slices are ready. Export them as glTF or FBX and integrate them into the Galacean engine:
-Unity can also export animation slices, but the efficiency is relatively low.
+Unity can also export animation slices, but it is less efficient.
### Unity
@@ -58,49 +57,46 @@ Plugin: [AntG-Unity-Plugin.unitypackage.zip](https://www.yuque.com/attachments/y
2. Open Unity.
-3. Double-click the plugin, **Import** with the default selected options:
+3. Double-click the plugin and **Import** the default selected options:
-If the installation is successful, you will see the **AntG** option added to the menu bar:
+If the installation is successful, you will see an **AntG** option in the menu bar:
-4. Drag the FBX file that needs to be sliced into the resource panel:
+4. Drag the FBX file that needs slicing into the asset panel:
-5. Click on the resource to bring up the animation debugging preview box. Add animation slices and adjust the timeline **start** and **end** for each slice as needed. If the preview animation effect is abnormal, make sure the **Resample Curves** default option is not checked. After slicing, remember to click the **Apply** confirmation button in the bottom right corner.
-
+5. Click on the asset to bring up the animation debugging preview box. Add animation slices and adjust the timeline **start** and **end** for each slice as needed. If the animation preview appears abnormal, ensure that the **Resample Curves** option, which is enabled by default, is unchecked. After slicing is complete, remember to click the **Apply** button in the lower right corner to confirm.
-6. At this point, the animation slice resources have been created. Next, let's discuss how to export them. First, drag this resource into the node tree:
+6. At this point, the animation slice resource has been created. Next, we will introduce how to export it. First, drag the resource into the node tree:
-7. Then add an **Animator** Component to this node:
+7. Then add the **Animator** Component to the node:
-8. As you can see, the Animator component needs to be associated with an Animator Controller resource. Therefore, we need to create a new Animator Controller resource in the resource panel:
+8. You can see that the Animator component needs to be bound to an Animator Controller resource, so we need to create a new Animator Controller resource in the resource bar:
-9. Double-click on this controller resource and drag our previous animation slice into it:
+9. Double-click the controller resource and drag our previously created animation slice into it:
-10. The Animator Controller resource is now ready. Bind it to the Component we just created:
+10. The Animator Controller resource is now complete and can be bound to the Component we just created:
-11. Right-click on this node, select Export AntG:
+11. Right-click the node and select Export AntG:
-12. With that, the exported glTF file of the created animation slice is complete. You can visit Galacean's [glTF Viewer](https://galacean.antgroup.com/#/gltf-viewer) for functional verification.
-
-
+12. At this point, the created animation slice glTF file has been exported. You can visit Galacean's [glTF Viewer](https://galacean.antgroup.com/engine/gltf-viewer) for functionality verification.
diff --git a/docs/en/animation/clip.mdx b/docs/en/animation/clip.mdx
new file mode 100644
index 0000000000..f20549fad6
--- /dev/null
+++ b/docs/en/animation/clip.mdx
@@ -0,0 +1,227 @@
+---
+order: 1
+title: Animation Clip
+type: Animation
+label: Animation
+---
+
+**Animation Clip** is one of the core elements of the Galacean animation system. Galacean supports importing model animations designed with external software: each animation in a model exported by the designer is parsed into an individual **animation clip** asset in Galacean. We can also create additional animations through the animation clip editor.
+
+There are two common ways to obtain animation clips:
+
+1. Import models with animations created using third-party tools (such as Autodesk® 3ds Max®, Autodesk® Maya®, Blender). See [Creating Animation Clips for Artists](/en/docs/animation/clip-for-artist)
+
+
+
+2. Create animation clips in Galacean (the editor and script creation methods will be introduced below).
+
+## Animation Panel Introduction
+
+The animation clip editor currently supports editing all interpolable properties except for physics-related components. If you need to edit other properties, you need to add the corresponding components to the entity.
+
+
+
+> Note: The Galacean animation editor defaults to 60FPS. The time `0:30` shown in the figure above is at the 30th frame. If the timeline scale is `1:30`, it is at the 90th frame.
+
+## Basic Operations
+
+### Transform Component Example
+
+1. In the **[Assets Panel](/en/docs/assets/interface)**, right-click (or click the **+** button) to create an `Animation Clip` asset.
+
+
+
+2. Double-click the `Animation Clip` asset and select an entity as the editing object for the `Animation Clip`.
+
+
+
+Clicking the `Create` button will automatically add the [Animation Control Component](/en/docs/animation/animator) to your entity and add the animation clip to the [Animation Controller](/en/docs/animation/animatorController/).
+
+
+
+
+3. Add the properties you want to animate (two are added here).
+
+
+
+
+4. Add keyframes to the properties.
+
+
+When we click the add keyframe button, the keyframe stores the current value of the specified property. So if we haven't changed anything, the keyframe stores the `position` value of the entity at that moment. We want the cube to move to (3, 0, 0) after 60 frames, so first set its position to (3, 0, 0) in the property panel and then add a keyframe.
+
+Similarly, we also add keyframes for the `rotation` property.
+
+
+#### Recording Mode
+
+We provide a recording mode to facilitate developers in quickly adding keyframes. When recording mode is enabled, keyframes are automatically added when the corresponding properties are modified.
+
+
+
+### Text Animation Example
+
+First, you need an entity with a TextRender component.
+
+
+
+When we add properties at this point, we can see that the list of keyframable properties now also includes the interpolable properties of the `TextRenderer` component.
+
+
+
+We add keyframes using recording mode as described above.
+
+
+
+### Frame Animation Example
+
+In addition to numerical types, Galacean also supports animation curves of reference types. You can read [Frame Animation](/en/docs/animation/sprite-sheet/) to learn how to create frame animations.
+
+### Material Animation Example
+
+Galacean also supports animation editing of asset properties within components. If there are material assets in the component, there will be additional asset property editing in the `Inspector`.
+
+
+
+It should be noted that the default [material](/en/docs/graphics/material/material/) in the editor is not editable.
+
+
+
+So, if we want to animate the material of this cube, we need to create a new material and replace it. Then, just like above, enable recording mode and directly modify the properties.
+
+
+
+
+## More Operations
+
+### Operating Keyframes
+
+#### Modify Keyframe Time
+
+Select the keyframe and drag it.
+
+
+
+You can zoom the timeline by scrolling the `mouse wheel`.
+
+
+
+#### Modify Keyframe Value
+
+Enable recording mode, move the timeline to the specified keyframe, and then re-enter the value. If recording mode is not enabled, you need to click the add keyframe button again.
+
+
+
+#### Delete Keyframe
+
+Select the keyframe, right-click and choose delete, or press the delete/backspace key on the keyboard.
+
+
+
+### Editing Child Entities
+
+`Animation clips` can not only be applied to entities with the `Animator` component but also to their child entities.
+
+1. We add a child entity to the cube we just created.
+
+
+
+2. We can see that the properties of the child entity can be added by clicking "Add Property" again.
+
+
+
+3. Enable recording mode, edit the child entity and add keyframes.
+
+
+
+
+### Edit Animation Curves
+
+The `Animation Clip Editor` supports animation curve editing. Click the curve icon in the upper right corner to switch.
+
+
+
+The vertical axis in curve mode represents the value of the property.
+
+
+
+
+You can adjust the vertical axis scale by pressing `shift + mouse wheel`.
+
+
+
+The color of the curve corresponding to the property is the same as the color of the button.
+
+
+
+Selecting a keyframe will show two control points. Adjusting the control points will adjust the curve.
+
+
+
+You can also set built-in preset values directly by right-clicking the keyframe.
+
+
+
+Selecting a property in the property panel allows you to edit the curve of the specified property only.
+
+
+
+## Using Scripts
+
+Before using scripts, it is best to read the [Animation System Composition](/en/docs/animation/system) document to help you better understand the operating logic of the animation system.
+
+
+You can create an [AnimationClip](/en/apis/core/#AnimationClip) yourself and bind an [AnimationCurve](/en/apis/core/#AnimationCurve) to it through [addCurveBinding](/en/apis/core/#AnimationClip-addCurveBinding).
+
+```typescript
+// Custom rotate clip
+const rotateClip = new AnimationClip("rotate");
+const rotateState =
+ animator.animatorController.layers[0].stateMachine.addState("rotate");
+rotateState.clip = rotateClip;
+
+const rotateCurve = new AnimationVector3Curve();
+const key1 = new Keyframe();
+key1.time = 0;
+key1.value = new Vector3(0, 0, 0);
+const key2 = new Keyframe();
+key2.time = 10;
+key2.value = new Vector3(0, 360, 0);
+rotateCurve.addKey(key1);
+rotateCurve.addKey(key2);
+rotateClip.addCurveBinding("", Transform, "rotation", rotateCurve);
+
+// Custom color clip
+const colorClip = new AnimationClip("color");
+const colorState =
+ animator.animatorController.layers[0].stateMachine.addState("color");
+colorState.clip = colorClip;
+
+const colorCurve = new AnimationFloatCurve();
+// Use distinct names: `key1`/`key2` are already declared for the rotate curve above.
+const colorKey1 = new Keyframe();
+colorKey1.time = 0;
+colorKey1.value = 0;
+const colorKey2 = new Keyframe();
+colorKey2.time = 10;
+colorKey2.value = 1;
+colorCurve.addKey(colorKey1);
+colorCurve.addKey(colorKey2);
+colorClip.addCurveBinding("/light", DirectLight, "color.r", colorCurve);
+```
+
+You can also bind an AnimationCurve to your child entity `/light`, as shown in the code example above. The third parameter of `addCurveBinding` is not limited to the properties of the component; it is a path that can index the value.
+
+
+
+### Animation Events
+
+You can use [AnimationEvent](/en/apis/core/#AnimationEvent) to add events to the AnimationClip. The animation event will call the specified callback function of the component bound to the same entity at the specified time.
+
+```typescript
+const event = new AnimationEvent();
+event.functionName = "test";
+event.time = 0.5;
+clip.addEvent(event);
+```
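+
+For the callback above to fire, a component on the same entity as the Animator needs a method whose name matches `functionName`. A minimal sketch (the `EventReceiver` name is illustrative):
+
+```typescript
+class EventReceiver extends Script {
+  // Called when playback passes the 0.5s mark of the clip.
+  test(): void {
+    console.log("animation event triggered");
+  }
+}
+
+// Attach it to the same entity that holds the Animator.
+entity.addComponent(EventReceiver);
+```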
+
+
diff --git a/docs/en/animation/layer.mdx b/docs/en/animation/layer.mdx
new file mode 100644
index 0000000000..ec13b1f315
--- /dev/null
+++ b/docs/en/animation/layer.mdx
@@ -0,0 +1,63 @@
+---
+order: 5
+title: Animation Layer
+type: Animation
+label: Animation
+---
+
+## Introduction
+Animation layers ([AnimatorControllerLayer](/en/apis/core/#AnimatorControllerLayer)) are used to organize and manage animation state machines within an animation controller ([AnimatorController](/en/apis/core/#AnimatorController)). By using animation layers, developers can combine different animation state machines, making complex animation logic easier to manage and control. Each layer has its own animation state machine, weight, and blending mode.
+
+| Property | Description |
+| :------- | :--------------------------------------------------------------------------------- |
+| Name | The name of the layer. |
+| Weight | The blending weight of the layer, default value is 1.0. |
+| Blending | The blending mode of the layer, `Additive` for additive mode, `Override` for override mode, default is `Override`. |
+
+### Additive Mode
+In additive mode, the animation of the higher layer will be added on top of the animation of the lower layer. This mode calculates the differences in animation keyframes and applies these differences to the lower layer animation, achieving an additive effect. It is often used to add details or adjustments to basic actions. For example, a character can use the `Additive` mode to add a breathing animation while walking or add a facial expression change while attacking.
+
+### Override Mode
+In override mode, the animation of the higher layer will completely override the animation of the lower layer. This means that in the final animation output, the higher layer animation will take precedence and replace the lower layer animation effect. It is often used for layered control of animations. For example, you might need to control the actions of different body parts separately. The `Override` mode can control each part separately, such as adjusting the posture or actions of the upper body while running.
+
+The blending mode of the first layer is always override mode.
+
+
+### Blending Weight
+Blending weight is used to control the influence of a specific animation layer on the final animation result. It is a floating-point value between 0 and 1 that determines the blending ratio of the layer's animation in the final output. For example, if a layer in additive mode makes the character's head turn 90 degrees, and the blending weight of that layer is 0.5, the character will only turn 45 degrees.
+
+The weight of the first layer is always 1.0.
+
+
+
+
+
+## Editor Usage
+### Additive Mode
+We add an animation layer in the editor and set `Blending` to `Additive`.
+
+You can see that a `head shaking` animation has been added on top of the character's original animation.
+
+### Override Mode
+Set `Blending` to `Override`.
+
+You can see that this layer's animation completely overrides the animation of the first layer.
+
+### Blending Weight
+
+You can see that the higher the weight of the animation layer, the greater the impact on the animation effect.
+
+
+## Script Usage
+
+```typescript
+const layers = animator.animatorController.layers;
+const baseLayer = layers[0];
+const additiveLayer = layers[1];
+// Additive mode
+additiveLayer.blendingMode = AnimatorLayerBlendingMode.Additive;
+// Override mode
+additiveLayer.blendingMode = AnimatorLayerBlendingMode.Override;
+additiveLayer.weight = 0.5;
+```
+
diff --git a/docs/en/animation/overview.md b/docs/en/animation/overview.md
index 2e355e0d9c..c6a8648597 100644
--- a/docs/en/animation/overview.md
+++ b/docs/en/animation/overview.md
@@ -1,18 +1,18 @@
---
order: 0
-title: Animation System Overview
+title: Overview of the Animation System
type: Animation
label: Animation
---
-The animation system of Galacean has the following features:
+Galacean's animation system has the following features:
- Parse animations from GLTF/FBX models and convert them into AnimationClip objects in Galacean
- Add animations to all components and their properties in Galacean
- Set the start/end time of animations to trim them
-- Set transitions between animations to layer multiple animations
-- Apply animations from one model to another
-- Add animation events and scripts for animation lifecycle
+- Set transitions between animations and overlay multiple animations
+- Apply animations from one model to another model
+- Add animation events and scripts for the animation lifecycle
## Animation Workflow
@@ -20,21 +20,21 @@ The overall workflow for creating interactive projects using the editor:
```mermaid
flowchart LR
- Add animation clips --> Create animator controller and import animation clips --> Add animator component to play animations
+ Add Animation Clip --> Create Animation Controller and Import Animation Clip --> Add Animation Control Component to Play Animation
```
-### 1. Add Animation Clips
+### 1. Add Animation Clip
-The animation system of Galacean is based on the concept of animation clips, which contain information on how certain objects should change their position, rotation, or other properties over time. Each animation clip can be seen as a single linear recording.
+Galacean's animation system is based on the concept of animation clips, which contain information on how certain objects should change their position, rotation, or other properties over time. Each animation clip can be considered a single linear recording.
-You can create animation clips in the editor, see [Creating Animation Clips](/en/docs/animation/clip), or import models with animations created using third-party tools (such as Autodesk® 3ds Max®, Autodesk® Maya®, Blender), see [Creating Animation Clips for Artists](/en/docs/animation/clip-for-artist), or from motion capture studios or other sources.
+You can create animation clips in the editor, see [Creating Animation Clips](/en/docs/animation/clip) for details. You can also import models with animations created using third-party tools (such as Autodesk® 3ds Max®, Autodesk® Maya®, Blender), see [Creating Animation Clips for Artists](/en/docs/animation/clip-for-artist) for details, or from motion capture studios or other sources.
-### 2. Create Animator Controller and Import Animation Clips
+### 2. Create Animation Controller and Import Animation Clip
-The animator controller is a structured system similar to a flowchart that acts as a state machine in the animation system, responsible for tracking which clip should be played and when animations should change or blend together.
+The animation controller is a structured system similar to a flowchart that acts as a state machine in the animation system, responsible for tracking which clip should be played currently and when animations should change or blend together.
-You can learn how to use it in this [Animator Controller](/en/docs/animation/animatorController/).
+You can learn how to use it in this [Animation Controller](/en/docs/animation/animatorController/).
-### 3. Add Animator Component
+### 3. Add Animation Control Component
-After creating the animator controller, we need to add the [Animator Component](/en/docs/animation/animator) to the entity and bind the animator controller asset to play animations.
+After editing the animation controller, we need to add the [Animation Control Component](/en/docs/animation/animator) to the entity and bind the animation controller asset to play the animation.
diff --git a/docs/en/animation/sprite-sheet.md b/docs/en/animation/sprite-sheet.md
index 4bd2437ac0..38da51acb4 100644
--- a/docs/en/animation/sprite-sheet.md
+++ b/docs/en/animation/sprite-sheet.md
@@ -1,32 +1,32 @@
---
-order: 4
+order: 7
title: Frame Animation
type: Animation
label: Animation
---
-Galacean supports referencing type animation curves, you can add keyframes of type assets such as (sprites). The following is the process of creating sprite animations:
+Galacean supports reference-type animation curves. You can add keyframes whose values are assets, such as sprites. The following shows the process of creating a sprite animation:
1. Add the `SpriteRenderer` component to the node

-2. Add a `sprite`, you can refer to [Sprite](/en/docs/graphics/2D/sprite)
+2. Add `Sprite`, you can refer to [Sprite](/en/docs/graphics/2D/sprite)

-3. Create an [animation clip](/en/docs/animation/clip) in the **[Asset Panel](/en/docs/assets/interface)**
+3. Create [Animation Clip](/en/docs/animation/clip) in the **[Assets Panel](/en/docs/assets/interface)**

-4. Enable recording mode, click on the corresponding frame in the editor, and add a `Sprite` in the `SpriteRenderer` to automatically add keyframes
+4. Enable recording mode, click on the corresponding frame number in the editor, and add `Sprite` in `SpriteRenderer` to automatically add keyframes

### Script Implementation
-Starting from version 1.1, the engine supports referencing type animation curves ([AnimationRefCurve](/apis/core/#AnimationRefCurve)), where the values of keyframes can be assets such as (sprites, materials). You can create reference type animation curves to achieve capabilities like frame animation:
+Starting from version 1.1, the engine supports reference-type animation curves ([AnimationRefCurve](/en/apis/core/#AnimationRefCurve)), where keyframe values can be assets such as sprites and materials. You can create reference-type animation curves to achieve capabilities such as frame animation:
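+
+Below is a minimal sketch of building such a curve in script; `sprite0`/`sprite1` are assumed to be loaded `Sprite` assets and `clip` the target `AnimationClip`:
+
+```typescript
+const spriteCurve = new AnimationRefCurve();
+
+const frame0 = new Keyframe();
+frame0.time = 0;
+frame0.value = sprite0; // the keyframe value is an asset reference
+const frame1 = new Keyframe();
+frame1.time = 0.5;
+frame1.value = sprite1;
+spriteCurve.addKey(frame0);
+spriteCurve.addKey(frame1);
+
+// Bind the curve to the `sprite` property of the SpriteRenderer.
+clip.addCurveBinding("", SpriteRenderer, "sprite", spriteCurve);
+```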
diff --git a/docs/en/animation/state-machine.mdx b/docs/en/animation/state-machine.mdx
new file mode 100644
index 0000000000..2dee6d11b6
--- /dev/null
+++ b/docs/en/animation/state-machine.mdx
@@ -0,0 +1,269 @@
+---
+order: 4
+title: Animation State Machine
+type: Animation
+label: Animation
+---
+## Introduction
+
+The Animation State Machine ([AnimatorStateMachine](/en/apis/core/#AnimatorStateMachine)) is a tool used to control and manage animation transitions. You can use it to add various animation states and their transition rules, allowing characters or animated objects to switch naturally between different actions.
+
+### Main Components
+
+#### Animation State ([AnimatorState](/en/apis/core/#AnimatorState))
+Represents a single state within the animation state machine, i.e., the animation being played at a particular moment in the animation system, such as "standing," "running," or "jumping." Each state has a corresponding animation clip.
+
+| Attribute | Description |
+| :------- | :------------------------------------------------------------------- |
+| Name | Modify the name of the `AnimatorState`. The name must be **unique** within its layer. |
+| AnimationClip | Used to bind the `AnimationClip` asset, which stores the animation data of the model. |
+| WrapMode | Specifies whether the `AnimatorState` is looped or played once. The default is `Once`, meaning it plays once. |
+| Speed | The playback speed of the `AnimatorState`. The default value is 1.0. The higher the value, the faster the animation. |
+| StartTime | The time from which the `AnimatorState` starts playing the `AnimationClip`. The time is normalized relative to the duration of the `AnimationClip`. The default value is 0, meaning it starts from the beginning. For example, a value of 1.0 means the last frame of the `AnimationClip`. |
+| EndTime | The time at which the `AnimatorState` stops playing the `AnimationClip`. The time is normalized relative to the duration of the `AnimationClip`. The default value is 1.0, meaning it plays to the end. |
+| [StateMachineScripts](/en/apis/core/#StateMachineScript)| Allows developers to write custom script logic to execute specific code during different events of the animation state machine (such as state entry, exit, update, etc.). It is similar to Script but specifically for the animation state machine.|
+
+##### There are three special animation states in the editor
+`entry`: Used to represent the entry point of the animation state machine. When entering the animation state machine, it always first enters `entry` and then jumps to other states based on defined transition conditions. `entry` itself does not play an animation; it is mainly used to connect the starting point of the state machine to the initial animation state. Typically, you should connect it to the default state of the character or animated object, such as the character's `Idle` state.
+
+`any`: Allows any state in the animation state machine to transition to the target state under specific conditions. This is useful for handling global events or emergency animations (such as being injured or dying).
+
+The `any` state has the highest priority, so use `any` with caution as it can disrupt the normal flow of the current animation, potentially leading to unnatural animation transitions. Developers need to ensure that `any` is used for transitions only under clear conditions.
+
+`exit`: Represents the exit point of the animation state machine. When the state machine enters `exit`, it usually means the state machine has ended. The state machine will re-enter the `entry` state.
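+
+For the `any` state specifically, a hedged script sketch (assuming the state machine exposes `addAnyStateTransition`, that `dieClip` is a loaded clip, and that an `hp` parameter exists on the controller):
+
+```typescript
+const dieState = animatorStateMachine.addState('die');
+dieState.clip = dieClip;
+// From any state, switch to `die` once the `hp` parameter drops to 0.
+const anyTransition = animatorStateMachine.addAnyStateTransition(dieState);
+anyTransition.addCondition(AnimatorConditionMode.Equals, 'hp', 0);
+```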
+
+#### Animation Transition ([AnimatorStateTransition](/en/apis/core/#AnimatorStateTransition))
+Used to define the transition rules and conditions between two states in the animation state machine. It determines when and how to switch from one AnimatorState to another AnimatorState.
+
+| Attribute | Description |
+| :------- | :---------------------------------------------------------|
+| Duration | Transition duration, normalized relative to the target state. The default value is 0.25 |
+| Offset | Forward offset time of the target state, normalized relative to the target state. The default value is 0 |
+| ExitTime | The start time of the transition from the initial state, normalized relative to the initial state. The default value is 0.75 |
+| Solo | Makes the selected animation transition the only active state. Other animation transitions will be ignored. |
+| Mute | Disables the selected animation transition |
+
+Solo and Mute are usually used for debugging, helping developers test and debug the animation state machine more efficiently.
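+
+The same switches can be flipped from a script; a sketch assuming `AnimatorStateTransition` exposes matching `solo`/`mute` flags at runtime:
+
+```typescript
+// Make this transition the only candidate considered from its source state.
+transition.solo = true;
+// Or temporarily disable it while testing other paths.
+transition.mute = true;
+```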
+
+## Using the Animation State Machine
+### Default Playback
+
+#### Editor Usage
+Connect the animation state ([AnimatorState](/en/apis/core/#AnimatorState)) to `entry`, and the animation on it will automatically play when you run the exported project, without needing to call `animator.play`. Click on the entity bound to the animation control component ([Animator](/en/docs/animation/animator)) to preview the animation.
+
+
+
+#### Script Usage
+
+You can connect the animation state to the `entry` of the animation state machine using the [animatorStateMachine.addEntryStateTransition](/en/apis/core/#AnimatorStateMachine-addEntryStateTransition) method.
+
+```typescript
+const animatorStateMachine = animator.animatorController.layers[0].stateMachine;
+animatorStateMachine.addEntryStateTransition('Idle');
+```
+
+### Animation Transition
+
+#### Editor Usage
+Connect the two `AnimatorState`s you want to transition between to achieve the animation transition effect. Click the line between the two animations to modify the animation transition parameters and adjust the effect.
+
+
+
+Click on the line to set the parameters of the animation transition ([AnimatorStateTransition](/en/apis/core/#AnimatorStateTransition)), such as adding conditions:
+
+
+
+
+ If multiple conditions are added, the transition will only start when all conditions are met.
+
+
+#### Script Usage
+
+You can achieve transitions between animation states by adding `AnimatorTransition` to `AnimatorState`.
+
+```typescript
+const walkState = animatorStateMachine.addState('walk');
+walkState.clip = walkClip;
+const runState = animatorStateMachine.addState('run');
+runState.clip = runClip;
+const transition = new AnimatorStateTransition();
+transition.duration = 1;
+transition.offset = 0;
+transition.exitTime = 0.5;
+transition.destinationState = runState;
+walkState.addTransition(transition);
+animator.play("walk");
+```
+In this way, every time the `walk` animation is played on the layer where the animation state machine is located, it will start transitioning to the `run` animation halfway through the playback.
+
+##### Adding Conditions to Animation Transitions
+You can add conditions to animation transitions using the [animatorStateTransition.addCondition](/en/apis/core/#AnimatorStateTransition-addCondition) method.
+
+```typescript
+const idleState = animatorStateMachine.addState('idle');
+idleState.clip = idleClip;
+const walkState = animatorStateMachine.findStateByName('walk');
+const transition = new AnimatorStateTransition();
+
+transition.addCondition(AnimatorConditionMode.Greater, 'speed', 0);
+
+transition.destinationState = walkState;
+idleState.addTransition(transition);
+```
+In this way, every time the `idle` animation is played on the layer where the animation state machine is located and the `speed` parameter is greater than `0`, the animation will transition to the `walk` animation.
+
+##### Adding/Removing Parameters
+You can add parameters using the [animatorController.addParameter](/en/apis/core/#AnimatorController-addParameter) method and remove parameters using the [animatorController.removeParameter](/en/apis/core/#AnimatorController-removeParameter) method.
+
+```typescript
+const animatorController = animator.animatorController;
+animatorController.addParameter('speed', 0);
+// Remove the parameter
+animatorController.removeParameter('speed');
+```
+
+##### Setting Parameter Values
+You can set parameter values using the [animator.setParameterValue](/en/apis/core/#Animator-setParameterValue) method.
+
+```typescript
+class AnimationScriptExample extends Script {
+ animator: Animator;
+ onStart() {
+ this.animator = this.entity.getComponent(Animator);
+ }
+
+ onUpdate(deltaTime: number) {
+ const inputManager = this.engine.inputManager;
+
+    // If the user presses the W key, set the speed parameter to 1; the condition for switching to walking is met and the animation switches to walking
+ if (inputManager.isKeyHeldDown(Keys.KeyW)) {
+ this.animator.setParameterValue('speed', 1);
+ }
+    // If the user releases the W key, set the speed parameter to 0; the condition for switching to standing is met and the animation switches to standing
+ if (inputManager.isKeyUp(Keys.KeyW)) {
+ this.animator.setParameterValue('speed', 0);
+ }
+ }
+}
+```
+
+##### Getting Parameter Values
+You can get parameter values using the [animator.getParameterValue](/en/apis/core/#Animator-getParameterValue) method:
+```typescript
+class AnimationScriptExample extends Script {
+ animator: Animator;
+
+ onStart() {
+ this.animator = this.entity.getComponent(Animator);
+ }
+
+ onUpdate(deltaTime: number) {
+ const speed = this.animator.getParameterValue('speed');
+ if (speed === 1) {
+ console.log("The player is walking")
+ }
+ }
+}
+```
+
+##### Get Parameter Object
+You can get the parameter object through the [animator.getParameter](/en/apis/core/#Animator-getParameter) method:
+
+```typescript
+class AnimationScriptExample extends Script {
+ animator: Animator;
+
+ onStart() {
+ this.animator = this.entity.getComponent(Animator);
+ const speedParameter = this.animator.getParameter('speed');
+    // Set the default value of speed
+ speedParameter.defaultValue = 0;
+ }
+}
+```
+
+### Animation State Machine Animation Loop
+Connect the animation state to the `exit` state, and the state machine will exit and re-enter `entry`, making the overall process loop.
+
+
+
+#### Script Usage
+
+You can connect the animation state to the state machine's `exit` through the [state.addExitTransition](/en/apis/core/#AnimatorState-addExitTransition) method.
+
+```typescript
+const runState = animatorStateMachine.addState('run');
+runState.addExitTransition();
+```
+In this way, every time you play the `run` animation on the layer where the animation state machine is located, it will enter the `exit` state after finishing and then start again from `entry`.
+
+
+
+### State Machine Script
+You can add state machine scripts to each animation state, allowing you to receive callbacks on different events of the animation state machine (such as state entry, exit, and update); [see details](/en/apis/core/#StateMachineScript).
+
+
+
+
+
+#### Script Usage
+The state machine script provides three animation state lifecycle callbacks:
+
+- `onStateEnter`: Callback when the animation state starts playing.
+- `onStateUpdate`: Callback when the animation state updates.
+- `onStateExit`: Callback when the animation state ends.
+
+```typescript
+class theScript extends StateMachineScript {
+ /**
+ * onStateEnter is called when a transition starts and the state machine starts to evaluate this state.
+ * @param animator - The animator
+   * @param animatorState - The state being evaluated
+ * @param layerIndex - The index of the layer where the state is located
+ */
+ onStateEnter(animator: Animator, animatorState: AnimatorState, layerIndex: number): void {
+ console.log(`Enter ${animatorState.name}`)
+ }
+
+ /**
+ * onStateUpdate is called on each Update frame between onStateEnter and onStateExit callbacks.
+ * @param animator - The animator
+   * @param animatorState - The state being evaluated
+ * @param layerIndex - The index of the layer where the state is located
+ */
+ onStateUpdate(animator: Animator, animatorState: AnimatorState, layerIndex: number): void {
+ console.log(`Update ${animatorState.name}`)
+ }
+
+ /**
+ * onStateExit is called when a transition ends and the state machine finishes evaluating this state.
+ * @param animator - The animator
+   * @param animatorState - The state being evaluated
+ * @param layerIndex - The index of the layer where the state is located
+ */
+ onStateExit(animator: Animator, animatorState: AnimatorState, layerIndex: number): void {
+ console.log(`Exit ${animatorState.name}`)
+ }
+}
+
+animatorState.addStateMachineScript(theScript)
+```
+
+If your script does not need to be reused, you can also write it like this:
+
+```typescript
+state.addStateMachineScript(
+ class extends StateMachineScript {
+ onStateEnter(
+ animator: Animator,
+ animatorState: AnimatorState,
+ layerIndex: number
+ ): void {
+ console.log("onStateEnter: ", animatorState);
+ }
+ }
+);
+```
+
+## Multi-layer Animation State Machine Blending
+Each animation layer ([AnimatorControllerLayer](/en/apis/core/#AnimatorControllerLayer)) has a state machine, and the final animation performance is blended from multiple layers. For more details on how to use animation layers (AnimatorControllerLayer), see [here](/en/docs/animation/layer).
diff --git a/docs/en/art/bake-c4d.md b/docs/en/art/bake-c4d.md
index 3c991a3dfb..f1ee105a50 100644
--- a/docs/en/art/bake-c4d.md
+++ b/docs/en/art/bake-c4d.md
@@ -5,52 +5,54 @@ type: Art
label: Art
---
-Using C4D-OC renderer baking (windows) as an example.
+Using C4D-OC renderer baking (Windows) as an example.
### What is Baking
-Baking is to express all rendered material color information in the form of a texture.
+Baking is the process of expressing all the rendered material color information in the form of a texture map.
-Baking requires two sets of models: a high poly model and a low poly model. The high poly model is used to bake textures with higher detail, while the low poly model is used in the engine with the texture. When creating them, ensure the UVs are consistent. First, layout the UVs for the low poly model, then refine it to create the high poly model. The high poly model can have more details to bake a texture with richer details.
+Baking requires two sets of models: a high-poly model and a low-poly model. The high-poly model is used to bake more detailed textures, while the low-poly model is used in the engine with the textures. Both models must have consistent UVs, so the low-poly model is created first with UVs laid out, then detailed to create the high-poly model. The high-poly model can have more details to bake richer textures.

-The left is the low poly model, and the right is the high poly model. From the wireframe information, it can be seen that the high poly model has more fine details.
+The left is the low-poly model, and the right is the high-poly model. The wiring information shows that the high-poly model has more details.

-There may be differences in wireframe details, but the visible parts of both models should be consistent. The obscured parts are not considered, so the high poly model must be derived from the low poly model to ensure consistent UVs.
+Although the wiring details differ, the visible parts of both models must be consistent. The covered parts are not considered, so the high-poly model must be derived from the low-poly model to ensure UV consistency.
### Specific Baking Process
-1. Adjust the prepared high poly model in C4D to render the desired effects. For textures used on faces, they also need to be drawn according to the overall UV layout. After adjusting the materials, you can prepare for baking.
+1. Adjust the high-poly model in C4D to render the desired effect. Textures used on the face should also be drawn according to the overall UV layout. Once the materials are adjusted, you can prepare for baking.
-2. An important point in baking is to select the camera mode and specify the tags for the cameras that need to be output, adding the camera tags unique to the OC renderer.
+2. An important point in baking is to select the camera mode. Specify the tag for the camera to be output, and add the unique camera tag of the OC renderer.
-3. Click on the added camera tag to enter the tag properties. There are many options for camera types, one of which is baking. Select baking.
+3. Click the added camera tag to enter the tag properties. There are many options for the camera type, one of which is baking. Select baking.
-4. In the baking menu, set the baking group ID to a number other than 1, here set to 2.
+4. In the baking menu, set the baking group ID to a number other than 1, here it is set to 2.
-5. Then group the models that need to be baked together, as shown in the image below, group all the required models and add the OC object tag.
+5. Then group the models to be baked. As shown below, group all the required models and add the OC object tag.
-6. Click on the tag to show the tag properties, select the object layer, then set the baking ID inside to the same value as the baking group ID, which is 2 here. Then click render, and you can bake the required images.
+6. Click the tag to display the tag properties, select the object layer, and set the baking ID inside to the same value as the baking group ID, which is 2 here. Then click render to bake the required image.
-If you are not completely satisfied with the baking results, both C4D and Substance Painter can be used to brush and modify textures. Photorealistic rendering is not the only choice; brushed textures can also be used to recreate some special styles.
+If you are not very satisfied with the baking results, both C4D and Substance Painter can be used to paint and modify the textures. Realistic rendering is not the only option; painting textures can also be used to restore some special styles.
diff --git a/docs/en/art/lottie.md b/docs/en/art/lottie.md
index 1b76fb613b..ddd13a6770 100644
--- a/docs/en/art/lottie.md
+++ b/docs/en/art/lottie.md
@@ -1,39 +1,38 @@
---
order: 2
-title: Exporting Lottie Animations
+title: Export Lottie Animation
type: Art
label: Art
---
## What is Bodymovin
-- Bodymovin is an AE plugin that can directly output animations as code for programmers to use on various platforms.
+- Bodymovin is an AE plugin that can directly output animations as code, which programmers can then use on various platforms.
- You can find the latest version of Bodymovin on GitHub.
-- The version of Bodymovin is equivalent to the version of the output JSON file.
+- The version of Bodymovin is equal to the version of the output JSON file.
-## How to Use Bodymovin
+## How to use Bodymovin
-- Go to the GitHub page of Bodymovin ([link: airbnb/lottie-web](https://github.com/airbnb/lottie-web)) to clone the project locally or download the .zip package.
+- Clone the project to your local machine from the Bodymovin GitHub homepage ([airbnb/lottie-web](https://github.com/airbnb/lottie-web)), or download the .zip package.

-- In the "/build/extension" directory of the project directory, find the "bodymovin.zxp" file, which is the plugin package.
+- Find the "bodymovin.zxp" file in the "/build/extension" directory of the project directory, which is the plugin package.
- Download and install ZXP Installer.
- ZXP Installer link: [https://aescripts.com/learn/zxp-installer](https://aescripts.com/learn/zxp-installer)
+ ZXP Installer address: [https://aescripts.com/learn/zxp-installer](https://aescripts.com/learn/zxp-installer)

-- Open AE, click on "Edit" > "Preferences" > "General" menu item, check "Allow Scripts to Write Files and Access Network," and click OK.
+- Open AE, click "Edit" > "Preferences" > "General" menu item, check "Allow Scripts to Write Files and Access Network", and click OK.

-- Click on "Window" > "Extensions" > "Bodymovin" menu item to open the Bodymovin interface and use the plugin.
+- Click "Window" > "Extensions" > "Bodymovin" menu item to open the Bodymovin interface and use the plugin.

-- Open the Bodymovin plugin window, and you will see the project name in the list below. Select the name, set the output location for the JSON file, and click "Render."
+- Open the Bodymovin plugin window, and you will find the name of the project appearing in the list below. Select the name, set the JSON file output location, and click "Render".
-- Click on Settings in the image above to configure the exported JSON:
+- Click Settings in the image above to configure the exported JSON:
-
diff --git a/docs/en/assets/build.md b/docs/en/assets/build.md
index ddbe55d792..ded85214cc 100644
--- a/docs/en/assets/build.md
+++ b/docs/en/assets/build.md
@@ -1,46 +1,46 @@
---
order: 2
-title: Exporting Projects
+title: Project Export
type: Asset Workflow
label: Resource
---
## HTML5 Project
-The Galacean Editor project export feature allows you to download the current editor project as a frontend project to your local machine. You can configure export parameters in the editor, such as asset export settings, rendering export settings, physics export settings, etc. Based on these configurations, the editor will generate the necessary code, assets, create the corresponding `package.json`, and finally package it into a zip file for you to download.
+The Galacean Editor project export feature allows you to download the current editor project as a frontend project to your local machine. You can configure the project export parameters in the editor, such as asset export configuration, rendering export configuration, physics export configuration, etc. Based on these configurations, the editor will generate the necessary code and assets for the project, create the corresponding `package.json`, and finally package it into a zip file for you to download.
-### Export Settings
+### Export Configuration
-#### Asset Export Settings
+#### Asset Export Configuration
-Asset export settings can be used to control parameters such as resource types and quality for export. In asset export settings, you can choose the types of resources to export, such as models, textures, HDR, etc., and select parameters like export quality and format for each type. When exporting models, you can choose whether to export mesh information, skeleton information, animation information, etc.
+The asset export configuration can be used to control the types and quality of resources to be exported. In the asset export configuration, you can select the types of resources to be exported, such as models, textures, HDR, etc., and choose the export quality and format parameters for each type. When exporting models, you can choose whether to export the model's mesh information, skeleton information, animation information, etc.
-| Configuration | Description |
-| -------------- | -------------------------------------------------------------------------------------------------------------------------------- |
-| glTF Quantize | glTF compression algorithm, see [here](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Khronos/KHR_mesh_quantization/README.md) |
-| glTF Meshopt | glTF compression algorithm, see [here](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/EXT_meshopt_compression/README.md) |
-| Texture Type | Check [KTX2](https://www.khronos.org/ktx/) to enable [texture compression](/en/docs/graphics-texture-compression) optimization options |
-| Texture Format | Visible after checking [KTX2](https://www.khronos.org/ktx/), different compression formats will affect texture size and rendering quality |
-| Texture Quality| Visible after checking [KTX2](https://www.khronos.org/ktx/), can adjust texture size and rendering quality to some extent |
-| Main Scene | Choose a scene from the **[Asset Panel](/en/docs/assets/interface)** to be the main scene when the project is loaded |
+| Configuration | Description |
+| --------------- | ---------------------------------------------------------------------------------------------------------------------------------------- |
+| glTF Quantize | glTF compression algorithm, see [here](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Khronos/KHR_mesh_quantization/README.md) |
+| glTF Meshopt | glTF compression algorithm, see [here](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/EXT_meshopt_compression/README.md) |
+| Texture Type | Check [KTX2](https://www.khronos.org/ktx/) to enable [texture compression](/en/docs/graphics/texture/compression/) optimization options |
+| Texture Format | Visible after checking [KTX2](https://www.khronos.org/ktx/), different compression formats will affect the texture size and rendering quality |
+| Texture Quality | Visible after checking [KTX2](https://www.khronos.org/ktx/), can adjust the texture size and rendering quality to a certain extent |
+| Main Scene | Select a scene from the **[Asset Panel](/en/docs/assets/interface)** as the main scene after the project loads |
-#### Rendering Export Settings
+#### Rendering Export Configuration
-Rendering export settings can be used to control parameters related to the project's rendering effects and performance.
+The rendering export configuration can be used to control the rendering effects and performance parameters of the project.
-| Configuration | Description |
-| ------------------------------------------------------------------------------------------------------ | ----------------------------------------------------------------------------|
-| WebGL Mode | WebGL version, `Auto` means automatically select WebGL version based on device capabilities |
-| WebGL [Context](https://developer.mozilla.org/en-US/en/docs/Web/API/HTMLCanvasElement/getContext) settings | Anti-Alias, Alpha, Preserve Drawing Buffer, etc. |
-| Device Pixel Ratio | [Device pixel ratio](/en/docs/core/canvas) used to control canvas size |
+| Configuration | Description |
+| ---------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------- |
+| WebGL Mode | The version of WebGL, `Auto` means automatically selecting the WebGL version based on device capabilities |
+| WebGL [Context](https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/getContext) Configuration | Anti-Alias, Alpha, Preserve Drawing Buffer, etc. |
+| Device Pixel Ratio | [Device pixel ratio](/en/docs/core/canvas), used to control the canvas size |
-### Project Start
+### Project Startup
-After clicking the download button in the export panel, you will receive a compressed package of the project. After decompression and entering the folder, the directory structure (using React project as an example) is as follows:
+After clicking the download button in the export panel, you will get a compressed package of the project. After decompressing and entering the folder, the directory structure (taking a React project as an example) is as follows:
```shell
├── example # 📁 示例目录
@@ -59,9 +59,9 @@ After clicking the download button in the export panel, you will receive a compr
└── ... # 其他
```
-### Project Debug
+### Project Debugging
-Next, you can debug and preview the project locally. Run the following commands in the terminal in the folder directory one by one to see if the local effect matches the effect in the editor:
+Next, you can debug and preview the project locally. Run the following commands in the Terminal in the folder directory to see if the local effect is consistent with the effect in the editor:
```bash
npm install
@@ -72,7 +72,7 @@ npm run dev
### Project Build and Deployment
-After all preparations are done, it's time to build and deploy the project. Run the following command in the terminal in the folder directory:
+Once everything is ready, build and deploy the project. Run the following commands in the Terminal in the folder directory:
```bash
npm run build
@@ -80,7 +80,7 @@ npm run build
-You will notice that after the `build` is completed, a `dist` folder is added to the file directory (top left corner), which contains all the necessary code and resources for running. Next, you just need to upload all the contents of this folder to a CDN.
+You will find that after the `build` is completed, a `dist` folder appears in the file directory (top left), which contains all the code and resources needed for running. Next, you just need to upload all the contents of this folder to the CDN.
@@ -88,250 +88,8 @@ Then visit the corresponding address:
-> Export the project as a Vite project. For more deployment solutions, refer to the [Vite official website](https://vitejs.dev/guide/)
+> The exported project is a Vite project. For more deployment solutions, refer to the [Vite official website](https://vitejs.dev/guide/).
## Mini Program Project
-Currently, Galacean has been adapted to Alipay and Taobao Mini Programs. This tutorial assumes that developers already have a certain level of Mini Program development skills. If not, please read the following tutorials and download the Mini Program development tools and apply for an AppId:
-
-- [Alipay Mini Program](https://opendocs.alipay.com/mini/developer)
-- [Taobao Mini Program](https://miniapp.open.taobao.com/docV3.htm?docId=119114&docType=1&tag=dev)
-
-Mini Program Project Publishing:
-
-- [Alipay Mini Program Publishing](https://opendocs.alipay.com/mini/introduce/release)
-- [Taobao Mini Program Publishing](https://developer.alibaba.com/en/docs/doc.htm?spm=a219a.7629140.0.0.258775fexQgSFj&treeId=635&articleId=117321&docType=1)
-
-### Project Export
-
-The export feature of Galacean editor for Alipay Mini Program is still under development, and the interaction method and template project may change in the future.
-
-
-
-### Project Start
-
-After clicking download, a zip file will be downloaded. The directory structure after unzipping is as follows:
-
-```shell
-.
-├── mini # 📁 小程序执行目录
-│ ├── dist # 📁 代码构建结果
-│ ├── pages # 📁 小程序页面
-│ ├── app.json # ⚙️ 项目配置文件
-│ ├── app.js # 代码入口
-├── public # 📁 公共资源目录
-│ ├── scene.json # 场景文件
-│ └── ... # 其他
-├── src # 📁 源代码目录
-├── mini.project.json # ⚙️ 工程配置文件
-├── project.json # ⚙️ 编辑器导出工程配置
-└── ... # 其他
-```
-
-Next, you can install dependencies and start the project:
-
-```shell
-npm install
-npm run dev
-```
-
-When opened in the Mini Program IDE, you will see:
-
-
-
-### Local Resource Handling {/examples}
-
-#### Ant Group Internal Users {/examples}
-
-Simply use 'Upload to CDN' (refer to the export panel options in the figure above) and use the default CDN of the group. If you want to use a custom CDN, refer to Non-Ant Group Internal Users.
-
-#### Non-Ant Group Internal Users {/examples}
-
-1. Upload public files to CDN by yourself.
-2. Modify the scene.json file or configure the baseUrl.
-
-### Package File Loading (WIP) {/examples}
-
-Currently, local file loading for mini-programs is not supported.
-
-### Known Issues {/examples}
-
-- Mini-programs do not support WebAssembly, so PhysX cannot be used as a physics backend at the moment.
-- Local file loading is not supported currently; files need to be manually uploaded to CDN.
-
-## Notes
-
-When using the editor project export feature, you need to consider the following:
-
-1. The exported project needs to run in an environment that supports WebGL.
-2. The exported project may contain a large number of resource files, so you need to optimize and compress the project to improve performance and loading speed.
-3. The exported project may contain sensitive information and data, so you need to assess and protect the security of the project to prevent information leakage and data loss.
-
----
-
-## Supplementary Information for Mini-Programs {/examples}
-
-### Using OrbitControl in Mini-Program Projects
-
-1. Import the third-party library
-
-```bash
-npm install @galacean/engine-toolkit-controls -S
-```
-
-```typescript
-import { OrbitControl } from "@galacean/engine-toolkit-controls/dist/miniprogram";
-```
-
-2. Add the component
-
-The `OrbitControl` component needs to be added to the camera node.
-
-```typescript
-cameraEntity.addComponent(OrbitControl);
-```
-
-3. Event Simulation Dispatch
-
-Since mini-programs do not support adding event listeners with `addEventListener`, you need to manually add event simulation. Also, there is a bug in multi-touch on the canvas of mini-programs, so add a view layer with the same size and position as the canvas to dispatch touch events:
-
-```html
-
-
-
-
-
-```
-
-```typescript
-import { dispatchPointerUp, dispatchPointerDown, dispatchPointerMove, dispatchPointerLeave, dispatchPointerCancel } from "@galacean/engine-miniprogram-adapter";
-
-Page({
- ...
- onTouchEnd(e) {
- dispatchPointerUp(e);
- dispatchPointerLeave(e);
- },
- onTouchStart(e) {
- dispatchPointerDown(e);
- },
- onTouchMove(e) {
- dispatchPointerMove(e);
- },
- onTouchCancel(e) {
- dispatchPointerCancel(e);
- }
-})
-```
-
-### Creating Galacean Mini-Program Projects with Pro Code
-
-> Requires Node.js version >=12.0.0.
-
-Create using yarn
-
-```bash
-yarn create @galacean/galacean-app --template miniprogram
-```
-
-Create using npm **6.x** version
-
-```
-npm init @galacean/galacean-app --template miniprogram
-```
-
-Create using npm **7.x** version
-
-```she
-npm init @galacean/galacean-app -- --template miniprogram
-```
-
-After completing the subsequent steps as prompted, you can open the project using the mini-program development tool:
-
-
-
-Select the corresponding directory, and if everything goes smoothly, you should see:
-
-
-
-### Using Galacean in Existing Projects with Pro Code
-
-This tutorial assumes you already have some development skills. If you are not familiar with mini-program development, please read the [mini-program development documentation](https://opendocs.alipay.com/mini/developer) in detail.
-
-1. Open the `Terminal` in the project directory and install dependencies:
-
-```bash
-# 使用 npm
-npm install @galacean/engine --save
-npm install @galacean/engine-miniprogram-adapter --save
-# 使用 yarn
-yarn add @galacean/engine
-yarn add @galacean/engine-miniprogram-adapter
-```
-
-2. Add the following configuration in the app.json file of the mini-program project:
-
-```json
-{
- ...
- "window": {
- ...
- "v8WorkerPlugins": "gcanvas_runtime",
- "v8Worker": 1,
- "enableSkia": "true"
- }
-}
-```
-
-3. Add a canvas tag to the axml page where interaction is needed
-
-```html
-
-```
-
-Configure the `onReady` to initialize the `canvas`. Set the id of the `canvas` for later use.
-
-4. Add a callback function in the `.js` code file of the page, use `my._createCanvas` to create the required canvas context, and then you can use galacean in the `success` callback.
-
-Note:
-
-1. Import the small program dependency using `import * as GALACEAN from "@galacean/engine/dist/miniprogram"`.
-2. Need to use `registerCanvas` from '@galacean/engine-miniprogram-adapter' to register the `canvas`.
-
-For more details, refer to the code below:
-
-```js
-import * as GALACEAN from "@galacean/engine/dist/miniprogram";
-import { registerCanvas } from "@galacean/engine-miniprogram-adapter";
-
-Page({
- onCanvasReady() {
- my._createCanvas({
- id: "canvas",
- success: (canvas) => {
- // 注册 canvas
- registerCanvas(canvas);
- // 适配 canvas 大小
- const info = my.getSystemInfoSync();
- const { windowWidth, windowHeight, pixelRatio, titleBarHeight } = info;
- canvas.width = windowWidth * pixelRatio;
- canvas.height = (windowHeight - titleBarHeight) * pixelRatio;
-
- // 创建引擎
- const engine = new GALACEAN.WebGLEngine(canvas);
- // 剩余代码和 Galacean Web 版本一致
- ...
- },
- });
- }
-})
-```
+Please refer to the [Mini Program Project](/en/docs/miniProgram/miniProgame/) documentation.
diff --git a/docs/en/assets/custom.md b/docs/en/assets/custom.md
new file mode 100644
index 0000000000..5c436c5a59
--- /dev/null
+++ b/docs/en/assets/custom.md
@@ -0,0 +1,28 @@
+---
+order: 4
+title: Custom Loader
+type: Asset Workflow
+label: Resource
+---
+
+Users can also create custom loaders to load custom resources:
+
+```typescript
+@resourceLoader(FBX, ["fbx"])
+export class FBXLoader extends Loader {
+ load(item: LoadItem, resourceManager: ResourceManager): AssetPromise {
+ return new AssetPromise((resolve, reject)=> {
+ ...
+ })
+ }
+}
+```
+
+1. Use the [@resourceLoader](/en/apis/core/#resourceLoader) decorator to mark it as a _ResourceLoader_, passing in the type enum and the resource suffix to be parsed. In the example above, `FBX` is the type enum, and `["fbx"]` is the suffix of the resource to be parsed.
+2. Override the [load](/en/apis/core/#ResourceManager-load) method. The `load` method will receive `loadItem` and `resourceManager`. `loadItem` contains the basic information of the load, and `resourceManager` can help load other referenced resources.
+3. Return an [AssetPromise](/en/apis/core/#AssetPromise) object. `resolve` the parsed resource result, for example, FBX returns a specific `FBXResource`.
+4. If there is an error, `reject` the error.
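+
+For reference, a more complete sketch is shown below. This is a minimal illustration under assumptions: `FBXResource` and the parsing step are hypothetical, `"FBX"` is a custom type string, and the raw file is fetched with the `request` helper inherited from `Loader`:
+
+```typescript
+import {
+  AssetPromise,
+  Loader,
+  LoadItem,
+  resourceLoader,
+  ResourceManager
+} from "@galacean/engine";
+
+// Hypothetical container for the parsed FBX data.
+export class FBXResource {
+  constructor(public buffer: ArrayBuffer) {}
+}
+
+@resourceLoader("FBX", ["fbx"])
+export class FBXLoader extends Loader<FBXResource> {
+  load(item: LoadItem, resourceManager: ResourceManager): AssetPromise<FBXResource> {
+    return new AssetPromise((resolve, reject) => {
+      // Fetch the raw file, parse it, then resolve the result.
+      this.request<ArrayBuffer>(item.url, { type: "arraybuffer" })
+        .then((buffer) => resolve(new FBXResource(buffer)))
+        .catch(reject);
+    });
+  }
+}
+```
+
+Once registered, the loader is matched by extension, so loading any `.fbx` URL through `resourceManager.load` should route through `FBXLoader`.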
+
+## Reference
+
+
diff --git a/docs/en/assets/gc.md b/docs/en/assets/gc.md
index 231ad3f023..ea8bdf4d0c 100644
--- a/docs/en/assets/gc.md
+++ b/docs/en/assets/gc.md
@@ -1,39 +1,37 @@
---
order: 4
-title: Resource Release
-type: Resource Workflow
+title: Releasing Assets
+type: Asset Workflow
label: Resource
---
-To avoid reloading resources repeatedly, once a resource is loaded, it will be cached in the _ResourceManager_. The cache itself consumes memory and video memory. When developers no longer need the cached content, they need to manually release the cached content.
+To avoid loading resources repeatedly, once a resource is loaded, it is cached in the _ResourceManager_. The cache itself occupies memory and video memory, so when developers no longer need the cached content, they need to manually release the cached content.
> Note: Resources are interdependent.
-For example, the entity shown in the diagram below contains a [MeshRenderer](/apis/core/#MeshRenderer) component, which depends on a [Material](/apis/core/#Material). _Material_ may be referenced by multiple _MeshRenderers_. If _Material_ is released, other _MeshRenderers_ that reference it will not be able to find the _Material_ and will throw an error.
+For example, the entity shown in the figure below contains a [MeshRenderer](/en/apis/core/#MeshRenderer) component, which depends on [Material](/en/apis/core/#Material). _Material_ may be referenced by multiple _MeshRenderers_. If _Material_ is released, other _MeshRenderers_ that reference it will not be able to find the _Material_ and will report an error.

-> Note: JavaScript cannot track object references. In weakly-typed languages like JavaScript, developers are not provided with memory management capabilities. All object memory is managed through garbage collection, and you cannot determine when an object will be released. Therefore, there is no [destructor](https://en.wikipedia.org/wiki/Destructor) to call for releasing referenced resources.
+> Note: JavaScript cannot track object references. Generally, in weakly typed languages like JavaScript, memory management functions are not provided to developers. All object memory is managed through garbage collection mechanisms, and you cannot determine when an object will be released, so there is no [destructor](https://zh.wikipedia.org/wiki/%E8%A7%A3%E6%A7%8B%E5%AD%90) to call for releasing referenced resources.
-`ResourceManager` provides a reference-counting-based resource release system, and developers need to manually call [gc](/apis/core/#ResourceManager-gc):
+`ResourceManager` provides a reference-counting-based resource release mechanism, which requires developers to manually call [gc](/en/apis/core/#ResourceManager-gc):
```typescript
engine.resourceManager.gc();
```
-## Verify Resource Release
+## Verifying Asset Release
-If you need to verify whether the resources have been released successfully, follow these steps and open the following example on a blank page:
+If you need to verify whether assets have been successfully released, you can follow the steps below and open the following example on a blank page:
-In this example, when initialized, a `Texture2D` and a `Sprite` are created to render a 2D sprite. After clicking the GC button in the top right corner, the `root` node is destroyed, and the reference counts of the texture and sprite assets are cleared. At this point, these assets will be truly destroyed. Taking memory snapshots before and after `gc` can help you better understand this process:
+In this example, `Texture2D` and `Sprite` are created to render 2D sprites during initialization. When you click the GC button in the upper right corner, the `root` node is destroyed, and the reference counts of the texture and sprite assets are cleared. At this point, these assets will be truly destroyed. Taking memory snapshots before and after `gc` can give a more intuitive understanding of this process.
-1. Before `gc`: **Developer Tools** -> **Memory** -> **Take Heap Snapshot**
-2. After `gc`: **Developer Tools** -> **Memory** -> **Take Heap Snapshot** -> **Compare** -> **Select the snapshot before gc**
+1. Before gc: **Developer Tools** -> **Memory** -> **Take Heap Snapshot**
+2. After gc: **Developer Tools** -> **Memory** -> **Take Heap Snapshot** -> **Compare** -> **Select Snapshot Before gc**
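+
+A minimal sketch of the destroy-then-gc flow described above (assuming `root` is the root entity created during initialization):
+
+```typescript
+// Destroying the root entity clears the reference counts of the
+// texture and sprite assets it used.
+root.destroy();
+// Release all unreferenced assets cached in the ResourceManager.
+engine.resourceManager.gc();
+```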
-
-
-{ /*examples*/ }
+
diff --git a/docs/en/assets/interface.md b/docs/en/assets/interface.md
index 7cecc89d10..556bb668b3 100644
--- a/docs/en/assets/interface.md
+++ b/docs/en/assets/interface.md
@@ -7,83 +7,89 @@ label: Resource
-The asset panel is an important panel in the editor that helps you manage all the assets used in the scene. In the asset panel, you can view and manage all the assets used in the scene, such as materials, textures, models, and more. Through the asset panel, you can add or remove assets, as well as categorize and organize assets for better asset management.
-
-Currently, the editor supports uploading or creating the following assets (**+** indicates composite files):
-
-| Supported Assets | Description | Exchange Format | Creation |
-| ------------------------------------------------- | ------------------------------------------------------------- | ---------------------------------------------------- | --------- |
-| Folder | Similar to an operating system folder, files can be dragged into the folder | | Create |
-| Scene | Used for entity tree management | | Create |
-| Model | 3D model files | `.gltf`+`.bin`+`.jpg`, `.glb`+`.jpg`, .`fbx`+`.jpg` | Upload |
-| Mesh | Cannot be added, can only use internal meshes and meshes in models | | - |
-| Material | Used to adjust rendering effects | | Create |
-| Texture | Upload image files to create 2D textures | `.png`,`.jpg`,` .webp` | Upload |
-| TextureCube | Used for scene sky and ambient light | `.hdr` | Upload |
-| Sprite | Can directly upload image files to create sprites (skipping the step of creating sprites and then binding textures) | `.png`,`.jpg`,` .webp` | Create/Upload |
-| SpriteAtlas | Pack multiple sprites into an atlas for optimizing 2D assets | | Create |
-| Font | Used to create 2D text | `.ttf`, `.otf`, `.woff` | Upload |
-| Script | Used to write business logic | `.ts` | Create |
-| Animation Controller | Used to organize animation clips and control animation states | | Create |
-| Animation Clip | Pre-made continuous animation data containing keyframe change information over a period of time | `.ts` | Create |
-| Animation State Machine Script | Program script used to control and manage animation state machine behavior | | Create |
-| Lottie | Supports uploading lottie files | `.json`(+`.jpg`), images support base64 embedded and standalone images | Upload |
-| Spine | Supports uploading spine files | `.json` + `.atlas` + `.jpg` | Upload |
-
-
-
-### Add Assets
-
-To add assets to your scene, you can click the add button on the asset panel or use the add option in the right-click menu of the asset panel to add new assets. After adding assets, you can edit the properties of the assets in the **[Inspector Panel](/en/docs/interface/inspector)**. The asset panel offers a wide variety of asset types, such as materials, textures, models, fonts, and more. Refer to the table above for specific details.
+The asset panel is an important panel in the editor that helps you manage all the assets used in the scene. In the asset panel, you can view and manage all the assets used in the scene, such as materials, textures, models, etc. Through the asset panel, you can add or delete assets, as well as categorize them for better organization.
+
+Currently, the editor supports uploading or creating the following assets (**+** indicates composite files):
+
+| Supported Assets | Description | Exchange Format | Creation Method |
+| ------------------------------------------------ | -------------------------------------------------------------- | ----------------------------------------------------- | ---------------- |
+| Folder | Similar to operating system folders, you can drag files into the folder | | Create |
+| Scene | Used for entity tree management | | Create |
+| Model | 3D model files | `.gltf`+`.bin`+`.jpg`, `.glb`+`.jpg`, `.fbx`+`.jpg` | Upload |
+| Mesh | Cannot be added, only internal meshes and meshes in models can be used | | - |
+| Material | Used to adjust rendering effects | | Create |
+| Texture                                          | Upload image files to create 2D textures                        | `.png`, `.jpg`, `.webp`                                | Upload           |
+| Cube Texture (TextureCube)                       | Used for scene sky and ambient light                            | `.hdr`                                                 | Upload           |
+| Sprite                                           | You can directly upload image files to create sprites (skipping the step of creating a sprite and then binding a texture) | `.png`, `.jpg`, `.webp` | Create/Upload    |
+| Sprite Atlas | Packs multiple sprites into an atlas for optimizing 2D assets | | Create |
+| Font | Used to create 2D text | `.ttf`, `.otf`, `.woff` | Upload |
+| Script | Used to write business logic | `.ts` | Create |
+| Animation Controller | Used to organize animation clips and control animation states | | Create |
+| Animation Clip | Pre-made, continuous animation data containing keyframe changes over a period of time | `.ts` | Create |
+| Animation State Machine Script | Program script used to control and manage animation state machine behavior | | Create |
+| Lottie | Supports lottie file uploads | `.json`(+`.jpg`), images support base64 embedded and standalone images | Upload |
+| Spine | Supports spine file uploads | `.json` + `.atlas` + `.jpg` | Upload |
+
+### Adding Assets
+
+To add assets to the scene, you can click the add button on the asset panel or use the add option in the right-click menu of the asset panel to add new assets. After adding assets, you can edit the properties of the assets in the **[Inspector Panel](/en/docs/interface/inspector)**. The asset types in the asset panel are very rich, such as materials, textures, models, fonts, etc. Refer to the table above for details.
-You can also drag files into the asset panel to add assets. Simply select multiple files and drag them into the asset panel to add them.
+You can also drag files into the asset panel to add assets. For multiple files, you can select and drag them into the asset panel together.
-### Organize Assets
+### Organizing Assets
-Assets in the asset panel can be organized by categories for better asset management. You can create folders in the asset panel and move assets into the corresponding folders (you can also move them into folders in the left directory) to categorize them. Folders in the asset panel can be nested, allowing you to create multiple levels of folders for better asset organization.
+Assets in the asset panel can be managed by categorization for better organization. You can create folders in the asset panel and move assets into the corresponding folders (you can also move them into folders in the left directory) to achieve categorized management. Folders in the asset panel can be nested, allowing you to create multi-level folders for better organization of assets.
-The asset panel provides a user-friendly toolbar for browsing assets, helping you quickly find a specific asset or category of assets. You can also customize the browsing mode, sorting method, and thumbnail size of assets based on your preferences.
+The asset panel provides a user-friendly toolbar for browsing assets, helping you quickly find a specific asset or type of asset. You can also modify the browsing mode, sorting method, and thumbnail size of assets according to your usage habits.
-After organizing assets, each asset has a **relative path**, and you can right-click on an asset to copy its path.
+After organizing the assets, each asset has a **relative path**. You can right-click an asset to copy its path.
-This is crucial for project development because there are often cases where assets need to be asynchronously loaded in a project, meaning that certain assets (or even scenes) do not need to be loaded during initialization and can be controlled to load through scripts. For specific syntax, refer to the [Assets](/en/docs/assets-load) and [Scenes](/en/docs/core/scene) loading usage. For loading a scene, for example:
+This is important for project development because projects often need to load assets asynchronously, meaning that certain assets (or even scenes) do not need to be loaded during initialization. You can control the loading of an asset through scripts. For specific syntax, refer to the usage of [Assets](/en/docs/assets/load) and [Scenes](/en/docs/core/scene). For example, to load a scene:
```typescript
this.engine.resourceManager.load({ url: "...", type: AssetType.Scene });
```
-### Delete Assets
+### Deleting Assets
-You can delete an asset by selecting it and clicking the delete button on the asset panel or using the delete option in the right-click menu. When deleting assets, make sure to consider whether the deleted asset will affect the relationships of other nodes in the scene.
+You can delete an asset by selecting it and clicking the delete button on the asset panel or using the delete option in the right-click menu. When deleting assets, you need to be aware of whether the deleted asset will affect the association of other nodes in the scene.
-### Preview Assets
+### Copying and Pasting Assets
-After selecting an asset, the properties that can be configured for that asset will be displayed in the **Inspector Panel** on the right. Different assets have different configurable options. For example, a glTF asset will show a model preview window, while a material asset will display detailed material configuration options.
+You can right-click an asset in the asset panel to copy it, then paste it into the currently open folder:
+
+
+
+You can also use the `⌘` + `C` and `⌘` + `V` shortcuts.
+
+### Previewing Assets
+
+After selecting an asset, the **[Inspector Panel](/en/docs/interface/inspector)** on the right will display the configurable properties of the asset. The configurable items corresponding to different assets are different. For example, glTF assets will display a model preview window, and material assets will display detailed material configuration options.
### Using Assets
-Some assets (such as glTF assets) support dragging into the scene or node tree.
+Some assets (such as glTF assets) support dragging into the scene or the node tree.
-### Keyboard Shortcuts
-
-| Shortcut | Function |
-| -------------- | ------------ |
-| `⌫` / `Delete` | Delete Resource |
-| `⌘` + `D` | Copy Resource |
-| `⌘`+ `F` | Search Resource |
+### Shortcuts
-Please paste the Markdown content you would like me to translate.
+| Shortcut        | Function        |
+| --------------- | --------------- |
+| `⌫` / `Delete`  | Delete asset    |
+| `⌘` + `D`       | Duplicate asset |
+| `⌘` + `F`       | Search asset    |
+| `⌘` + `C`       | Copy asset      |
+| `⌘` + `V`       | Paste asset     |
diff --git a/docs/en/assets/load.md b/docs/en/assets/load.md
index 4a7cdf9ce5..80ee190cba 100644
--- a/docs/en/assets/load.md
+++ b/docs/en/assets/load.md
@@ -1,17 +1,17 @@
---
order: 3
-title: Loading Resources
-type: Resource Workflow
+title: Asset Loading
+type: Asset Workflow
label: Resource
---
-In Galacean, loading resources is generally divided into three scenarios based on their usage:
+In Galacean, asset loading is generally divided into three situations based on its usage:
-- Resources are imported into the editor and used in a scene
-- Resources are imported into the editor but not used in any scene
-- Resources are not imported into the editor
+- The asset is imported into the editor and used in a scene
+- The asset is imported into the editor but not used in a scene
+- The asset is not imported into the editor
-> When loading a project using the project loader, only the resources used in the **main scene** will be loaded, other resources in the editor will not be loaded.
+> If the project loader is used to load the project, only the resources used in the **main scene** will be loaded, and other resources in the editor will not be loaded.
```typescript
await engine.resourceManager.load({
@@ -20,7 +20,7 @@ await engine.resourceManager.load({
});
```
-> Similarly, when loading a specific scene using the scene loader, only the resources used in **that scene** will be loaded, other resources will not be loaded by default.
+> Correspondingly, if the scene loader is used to load a scene, the scene loader will only load the resources used in **that scene**, and other resources will not be loaded by default.
```typescript
const scene = await engine.resourceManager.load({
@@ -30,7 +30,7 @@ const scene = await engine.resourceManager.load({
engine.sceneManager.activeScene = scene;
```
-> For resources that are not used in any scene, you can load them using the [resourceManager.load](/apis/core/#Engine-resourceManager#load) method mounted on the Engine instance.
+> As for those assets that are not used in the scene, you can use [resourceManager.load](/en/apis/core/#Engine-resourceManager#load) mounted on the Engine instance to load the resources.
```typescript
// 若只传入 URL ,引擎会依据后缀推断加载的资产类型,如 `.png` 对应纹理, `.gltf` 则对应模型
@@ -52,20 +52,20 @@ const [texture2D, glTFResource] = await this.engine.resourceManager.load([
]);
```
-The following will specifically introduce loading resources at runtime:
+The following will specifically introduce how to load resources at runtime:
-- Resource Paths
-- Loading Progress
-- Canceling Loading
-- Retrieving Loaded Assets
+- Resource path
+- Loading progress
+- Cancel loading
+- Get loaded assets
-## Resource Paths
+## Resource Path
-Resource URL paths support **relative paths**, **absolute paths**, and **virtual paths**:
+The URL path of the resource supports **relative paths**, **absolute paths**, and **virtual paths**:
-- Relative paths are relative to the runtime root path. If there is an error in the path, adjustments can be made based on the loading error information in the developer tools.
+- Relative paths are relative to the runtime root path. If the path is incorrect, you can adjust it based on the loading error information in the developer tools.
- Absolute paths specify the complete file location, such as `https://xxxx.png`, and also include `blob` and `base64`.
-- Virtual paths are paths in the editor's asset files, usually in the format `Assets/sprite.png`.
+- Virtual paths are the paths in the asset files of the editor, generally `Assets/sprite.png`.
```typescript
// 加载相对路径下的资源
@@ -84,27 +84,29 @@ this.engine.resourceManager.load({
this.engine.resourceManager.load("Assets/texture.png");
```
-> In the editor, you can quickly copy the relative path of an asset by going to **[Asset Panel](/en/docs/assets/interface)** -> **Right-click Asset** -> **Copy relative path**.
+> In the editor, you can quickly copy the relative path of the asset through **[Asset Panel](/en/docs/assets/interface)** -> **Right-click asset** -> **Copy relative path**.
+
+
### baseUrl
-The `ResourceManger` now also supports setting a `baseUrl`:
+`ResourceManager` currently also supports setting a `baseUrl`:
```typescript
engine.resourceManager.baseUrl = "https://cdn.galacean.com";
```
-If a `baseUrl` is set, the relative paths loaded will be combined with the `baseUrl`:
+If `baseUrl` is set, the relative path loaded will be combined with `baseUrl`:
```typescript
engine.resourceManager.load("img/2d.png");
```
-The actual loading path from the above code snippets would be `https://cdn.galacean.com/img/2d.png`.
+The actual loading path from the above two lines of code will be `https://cdn.galacean.com/img/2d.png`.
## Loading Progress
-By calling the load queue, you can obtain an [AssetPromise](/apis/core/#AssetPromise) object, and use [onProgress](/apis/core/#AssetPromise-onProgress) to get the loading progress.
+Calling the loading queue can get an [AssetPromise](/en/apis/core/#AssetPromise) object, and you can use [onProgress](/en/apis/core/#AssetPromise-onProgress) to get the loading progress.
```typescript
this.engine.resourceManager
@@ -114,9 +116,9 @@ this.engine.resourceManager
});
```
-## Canceling Loading
+## Cancel Loading
-The _ResourceManager_ object has a method [cancelNotLoaded](/apis/core/#ResourceManager-cancelNotLoaded) that can be used to cancel the loading of unfinished resources by calling this method. Providing a URL will cancel the loading of a specific resource.
+The _ResourceManager_ object has a [cancelNotLoaded](/en/apis/core/#ResourceManager-cancelNotLoaded) method, which can be used to cancel resources that have not been loaded yet. Passing in a URL will cancel the resource loading for that specific URL.
```typescript
// 取消所有未加载完的资源。
@@ -127,17 +129,19 @@ this.engine.resourceManager.cancelNotLoaded("test.gltf");
> Note: Currently, canceling the loading of unfinished resources will throw an exception.
-## Retrieving Loaded Assets
+## Get Loaded Assets
-Currently loaded assets are cached in the _ResourceManager_. To retrieve loaded assets, you can use the more secure `load` method, an **asynchronous method**, which will reload the corresponding resource even if it is not in the cache.
+Currently, loaded assets are cached in the _ResourceManager_. If you need to get loaded assets, you can use the more reliable `load` **asynchronous method**. **Even if the asset is not in the cache**, this interface will reload the corresponding resource.
```typescript
const asset = await this.engine.resourceManager.load(assetItem);
```
-If you are certain that the resource is currently in the cache, you can also use the `getFromCache` method, a **synchronous method**:
+If you know for sure that this resource is now in the cache, you can also call the `getFromCache` **synchronous method**:
```typescript
-// Get the asset corresponding to the URL provided
+// Get the asset corresponding to the passed URL
const asset = this.engine.resourceManager.getFromCache(url);
```
diff --git a/docs/en/assets/overall.md b/docs/en/assets/overall.md
index 4b0c341f10..84373558de 100644
--- a/docs/en/assets/overall.md
+++ b/docs/en/assets/overall.md
@@ -5,21 +5,21 @@ type: Asset Workflow
label: Resource
---
-In Galacean, grids, materials, textures, sprites, atlases, animation clips, animation controllers, and so on are all considered as assets.
+In Galacean, meshes, materials, textures, sprites, atlases, animation clips, animation controllers, etc., are all considered assets.
## Asset Workflow
-In Galacean, the typical workflow for assets is as follows:
+In Galacean, the asset workflow typically follows these steps:
```mermaid
flowchart LR
- Import Assets --> Edit Assets --> Build Export --> Distribute --> Load
+ Import Assets --> Edit Assets --> Build and Export --> Distribute --> Load
```
This chapter will mainly cover:
-- How to [customize asset loaders](./custom)
-- [CRUD operations on assets](./interface): while in edit mode
-- How assets are [exported and deployed](./interface) after building the project
-- How to [load assets](./load) at runtime
-- [Garbage collection](./gc) at runtime
+- How to [customize asset loaders](/en/docs/assets/custom)
+- [CRUD operations on assets](/en/docs/assets/interface) in edit mode
+- How to [export and deploy assets](/en/docs/assets/build) after building the project
+- How to [load assets](/en/docs/assets/load) at runtime
+- How to [perform garbage collection](/en/docs/assets/gc) at runtime
diff --git a/docs/en/basics/_meta.json b/docs/en/basics/_meta.json
new file mode 100644
index 0000000000..d5fb0db513
--- /dev/null
+++ b/docs/en/basics/_meta.json
@@ -0,0 +1,8 @@
+{
+ "overview": {
+ "title": "Overview"
+ },
+ "quickStart": {
+ "title": "Quick Start"
+ }
+}
diff --git a/docs/en/basics/benchmark.mdx b/docs/en/basics/benchmark.mdx
new file mode 100644
index 0000000000..be3a83ee04
--- /dev/null
+++ b/docs/en/basics/benchmark.mdx
@@ -0,0 +1,46 @@
+---
+order: 4
+title: Benchmarking
+---
+
+## Test Preparation
+
+We have designed a set of test plans to compare various game engines across multiple performance factors. All tests are conducted in the following environment:
+
+- MacBook M2 Pro
+- Memory 16GB
+- Sonoma 14.4.1
+
+Like the engine, our benchmarking code is also open source. You can get the source code from the [benchmark](https://github.com/galacean/benchmark) repository. The tests cover the Galacean Engine, Babylon.js, and Three.js engines, plus PixiJS for the 2D rendering test.
+
+### Basic Rendering
+
+In this benchmark, we tested the rendering performance of the three engines after loading 100 glTF models and configuring 10 point lights.
+
+
+
+### Animation
+
+In this benchmark, we tested the performance of the three engines when loading 225 glTF models and simultaneously playing 255 glTF animations.
+
+
+
+### Particle System
+
+In this benchmark, we tested the rendering performance of the three engines using 500 particle systems with similar parameters on an opaque background.
+
+
+
+> Three.js does not have its own particle system, nor a currently maintained, widely preferred third-party library; developers generally implement similar effects themselves. Therefore, we used `three-nebula` to represent the Three.js particle system in this test.
+
+### 2D Rendering
+
+In the 2D test, we tested the performance of placing and rotating 7920 2D sprites simultaneously.
+
+
+
+## Summary
+
+We chose Babylon.js, Three.js, and PixiJS for testing because they are currently the most popular game engines, and during development, they have always been our benchmarks to surpass. We hope to provide the community with transparent and comparable performance information through these benchmarks.
+
+If you want us to include your game engine or have suggestions for improving these tests, feel free to open an [issue](https://github.com/galacean/benchmark/issues).
diff --git a/docs/en/basics/overview.md b/docs/en/basics/overview.md
new file mode 100644
index 0000000000..b3ca61890f
--- /dev/null
+++ b/docs/en/basics/overview.md
@@ -0,0 +1,96 @@
+---
+order: 0
+title: Overview
+label: Basics
+---
+
+**Galacean Engine** is a web-first, mobile-first, open-source real-time interactive solution, built with a component-based architecture and written in [TypeScript](https://www.typescriptlang.org/). It includes features such as [Rendering](/en/docs/graphics/renderer/renderer), [Physics](/en/docs/physics/overall), [Animation](/en/docs/animation/overview), [Interaction](/en/docs/input/input), and [XR](/en/docs/xr/overall). It also provides a comprehensive visual online editor with a complete workflow to help you create stunning 2D/3D interactive applications in the browser. It mainly consists of two parts:
+
+- Editor: An online web interactive creation platform [Editor](https://galacean.antgroup.com/editor)
+- Runtime: A web-first, mobile-first, high-performance interactive runtime [Runtime](https://github.com/galacean/runtime), a [Toolkit](https://github.com/galacean/runtime-toolkit) that provides rich non-core features and business-logic customization utilities, and a series of secondary ecosystem packages.
+
+## Editor
+
+[Galacean Editor](https://galacean.antgroup.com/editor/projects) is an online web interactive creation platform. It helps you quickly create, edit, and export an interactive project. You can quickly upload assets, edit materials, adjust lighting, and create entities through the Galacean Editor to create complex scenes.
+
+The overall process of creating an interactive project using the editor:
+
+```mermaid
+flowchart LR
+ Create Project --> Create Assets --> Build Scene --> Write Scripts --> Export
+```
+
+The editor allows better collaboration between technical and artistic team members. You can quickly start the development of your first project through business templates on the [editor homepage](https://galacean.antgroup.com/editor).
+
+## Runtime
+
+Core functionalities are provided by [Galacean Runtime](https://www.npmjs.com/package/@galacean/runtime), while advanced non-core and business logic customization features are provided by [Galacean Toolkit](https://github.com/galacean/runtime-toolkit). You can browse various engine [examples](/en/examples/latest/background) online through the browser.
+
+### Core Packages
+
+Includes the following sub-packages:
+
+| Package | Description | Related Documentation |
+| :-- | :-- | --- |
+| [@galacean/engine](https://www.npmjs.com/package/@galacean/engine) | Core architecture logic and core functionalities | [API](/en/apis/galacean) |
+| [@galacean/engine-physics-lite](https://www.npmjs.com/package/@galacean/engine-physics-lite) | Lightweight physics engine | [Doc](/en/docs/physics/overall) |
+| [@galacean/engine-physics-physx](https://www.npmjs.com/package/@galacean/engine-physics-physx) | Full-featured physics engine | [Doc](/en/docs/physics/overall) |
+| [@galacean/engine-shader-lab](https://www.npmjs.com/package/@galacean/engine-shader-lab) | Galacean Shader compiler | [Doc](/en/docs/graphics/shader/lab) |
+| [@galacean/engine-xr](https://www.npmjs.com/package/@galacean/engine-xr) | XR logic package | [Doc](/en/docs/xr/overall) |
+| [@galacean/engine-xr-webxr](https://www.npmjs.com/package/@galacean/engine-xr-webxr) | WebXR backend | [Doc](/en/docs/xr/overall) |
+
+You can install it via [NPM](https://docs.npmjs.com/):
+
+```bash
+npm install --save @galacean/engine
+```
+
+Then import and use it in your project:
+
+```typescript
+import { WebGLEngine, Camera } from "@galacean/engine";
+```
+
+### Toolkit
+
+Non-core functionalities and business logic customization features are provided by the galacean-toolkit package (for a complete list of features, please check [engine-toolkit](https://github.com/galacean/engine-toolkit/tree/main)):
+
+| Package | Description | API |
+| :-- | :-- | :-- |
+| [@galacean/engine-toolkit-controls](https://www.npmjs.com/package/@galacean/engine-toolkit-controls) | Controllers | [Doc](/en/docs/graphics/camera/control/) |
+| [@galacean/engine-toolkit-framebuffer-picker](https://www.npmjs.com/package/@galacean/engine-toolkit-framebuffer-picker) | Framebuffer Picker | [Doc](/en/docs/input/framebuffer-picker/) |
+| [@galacean/engine-toolkit-stats](https://www.npmjs.com/package/@galacean/engine-toolkit-stats) | Engine Stats Panel | [Doc](/en/docs/performance/stats/) |
+| ...... | | |
+
+> In the same project, please ensure that the core engine package version is consistent and the major version of the toolkit is consistent. For example, with version 1.3.x of the engine, you need to use version 1.3.y of the toolkit.
+
+### Secondary Ecosystem Packages
+
+There are also some secondary ecosystem packages, which are introduced and used in the same way as the engine toolkit:
+
+| Package | Description | API |
+| :-- | :-- | :-- |
+| [@galacean/engine-spine](https://www.npmjs.com/package/@galacean/engine-spine) | Spine Animation | [Doc](/en/docs/graphics/2D/spine/overview/) |
+| [@galacean/engine-lottie](https://www.npmjs.com/package/@galacean/engine-lottie) | Lottie Animation | [Doc](/en/docs/graphics/2D/lottie/) |
+
+> For version dependencies of secondary ecosystem packages, please refer to the corresponding documentation.
+
+> [Click to learn more about Galacean's version management](/en/docs/basics/version/)
+
+## Compatibility
+
+Galacean Runtime runs in environments that support WebGL. So far, all major mobile and desktop browsers support this standard. You can check the compatibility of the runtime environment on [CanIUse](https://caniuse.com/?search=webgl).
+
+In addition, **Galacean Runtime** also supports running in [Alipay/Taobao Mini Programs](/en/docs/assets/build), and developers in the community have contributed [WeChat Mini Programs/Game adaptation solutions](https://github.com/deepkolos/platformize). For some functional modules that require additional compatibility considerations, the current adaptation solutions are as follows:
+
+| Module | Compatibility Considerations | Specific Documentation |
+| :-- | :-- | :-- |
+| [Mouse and Touch](/en/docs/input) | [PointerEvent](https://caniuse.com/?search=PointerEvent) | For compatibility, please refer to [polyfill-pointer-event](https://github.com/galacean/polyfill-pointer-event) |
+| [PhysX](/en/docs/physics/overall) | [WebAssembly](https://caniuse.com/?search=wasm) | In environments that do not support WebAssembly, it will degrade to the JS version, with slightly lower performance than the WebAssembly version |
+
+## Open Source Collaboration
+
+**Galacean** is eager to collaborate with you to build an interactive engine. All development processes, including [planning](https://github.com/galacean/engine/projects?query=is%3Aopen), [milestones](https://github.com/galacean/engine/milestones), and [architecture design](https://github.com/galacean/engine/wiki/Physical-system-design), are publicly available in GitHub's project management. You can participate in the construction of the engine by [creating an issue](https://docs.github.com/zh/issues/tracking-your-work-with-issues/creating-an-issue) and [submitting a PR](https://docs.github.com/zh/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request-from-a-fork). If you have any questions or need help, you can join the DingTalk group or seek help in the [discussion area](https://github.com/orgs/galacean/discussions).
diff --git a/docs/en/basics/quickStart/quick-start.md b/docs/en/basics/quickStart/quick-start.md
new file mode 100644
index 0000000000..f3aa0ea84a
--- /dev/null
+++ b/docs/en/basics/quickStart/quick-start.md
@@ -0,0 +1,99 @@
+---
+order: 1
+title: Quick Start
+label: Basics
+---
+
+We will get to know the engine through a "rotating duck" example.
+
+## Create a Project
+
+After logging in, the first thing you see is the editor's homepage, which displays all the projects you have created. Use the button in the upper right corner to create a project. After clicking, you can choose the type of project to create, either 2D or 3D. We choose 3D Project.
+
+
+
+At this point, a new blank project will open, with a camera and a directional light built into the scene.
+
+
+
+## Build the Scene
+
+Before we start, let's explain some basic concepts in game engines:
+
+| Concept | Explanation |
+| ----------- | ------------------------------------------------------------ |
+| Scene | An environment containing all 2D/3D elements |
+| Entity | The basic unit that makes up the scene, representing any object in the scene with independent significance |
+| Component | The specific implementation of the entity's function, each component is responsible for handling a specific function of the entity |
+| Script Component | A special type of component that gives the entity dynamic behavior and logic control capabilities |
+| Asset | A general term for reusable resources used to build scenes, such as 3D models, materials, etc. |
+| 3D Model | A digital representation created by computer software that can represent the shape and appearance of objects in three-dimensional space. It includes the three-dimensional geometric shapes of characters, environmental objects (such as buildings, vegetation), props (weapons, furniture), usually with texture and material definitions |
+
+### Place the Duck
+
+First, click this [link](https://gw.alipayobjects.com/os/bmw-prod/6cb8f543-285c-491a-8cfd-57a1160dc9ab.glb) to download a 3D model of a duck. Drag the downloaded model to the asset panel, and after a while, you will see that the model has been uploaded to the editor:
+
+
+
+Next, drag the model from the asset panel to the scene view, and you can render this 3D model in the scene. At this point, a new entity has been added to the scene's node tree.
+
+
+
+### Adjust the Duck's Transform
+
+First, to better preview the final effect on mobile devices, we can select the **camera** entity. You can use the positioning button to locate the current preview camera in the scene, simulate a real device preview by selecting different mobile device sizes, and lock the preview window so that it does not disappear when you select other entities.
+
+
+
+Next, we select the duck and use the **move**, **rotate**, **scale**, and other [transform](/en/docs/core/transform) operations from the top toolbar. Switching between transform types also switches the operation handles on the duck. These handles are similar to those in most 3D software; if you are using them for the first time, don't worry, just move the mouse over a handle and play around a bit, and you will quickly get the hang of it. Through simple transform operations, we can adjust the duck's position, angle, and size to achieve the expected effect. The camera preview in the upper left corner displays your adjustments in real time.
+
+
+
+### Adjusting the Light
+
+At this point, the duck is a bit dark. Select the `DirectLight` light entity in the node tree, and in the inspector panel on the right, drag the slider to appropriately adjust the intensity of the light, making the scene brighter.
+
+
+
+### Making the Duck Rotate
+
+First, in the asset panel, *right-click → Create → New Empty Script* to create a Script asset.
+
+
+
+After creation, you can see a new Script asset in the asset panel.
+
+
+
+Next, select the duck entity, and in the inspector panel on the right, click **Add Component** to add a [Script](/en/docs/script/class) component.
+
+
+
+Click **Select asset** to choose the Script you just created. This binds the script to the entity, meaning the script's lifecycle functions will apply to this entity.
+
+
+
+After creating the script, you can **double-click it** to jump to the code editor page.
+
+
+
+In the code editor, add a line of code in the `onUpdate` function to make the duck rotate along the Y-axis. After writing the code, save it (`⌘+s`), and you can see the effect in real-time in the preview area on the right.
+
+```ts
+// Script.ts
+import { Script } from "@galacean/engine";
+
+export default class extends Script {
+ onUpdate(deltaTime: number) {
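+    // Rotate the entity by 1 degree around the Y axis every frame.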
+ this.entity.transform.rotate(0, 1, 0);
+ }
+}
+```
+
+## Exporting the Project
+
+We have completed the development work in the editor. Next, let's export this project to the local machine. Click the **Download** button on the left toolbar to bring up the export interface. Here, rename the project to "duck", then click the `Download` button. The editor will package the project into a `duck.zip` file for download.
+
+
+
+After the project is packaged, open the unzipped project in VS Code and run `npm install` and `npm run dev`; you will see the project running normally.
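+
+That is, from the unzipped project directory:
+
+```bash
+npm install
+npm run dev
+```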
diff --git a/docs/en/basics/version.md b/docs/en/basics/version.md
new file mode 100644
index 0000000000..d7bbc13105
--- /dev/null
+++ b/docs/en/basics/version.md
@@ -0,0 +1,71 @@
+---
+order: 0
+title: Version Management
+label: Basics
+---
+
+Galacean has a mature version management scheme. This article will take `@galacean/engine` as an example to introduce Galacean's management tools, naming conventions, release strategies, and dependency management.
+
+## Version Management Tools
+
+The version management tool used by Galacean is [Git](https://git-scm.com/). The code is hosted on [GitHub](https://github.com/galacean/), and all development processes, including [planning](https://github.com/galacean/engine/projects?query=is%3Aopen), [milestones](https://github.com/galacean/engine/milestones), and [architecture design](https://github.com/galacean/engine/wiki/Physical-system-design), are publicly available in GitHub's project management. You can participate in the construction of the engine by [creating an issue](https://docs.github.com/zh/issues/tracking-your-work-with-issues/creating-an-issue) and [submitting a PR](https://docs.github.com/zh/pull-requests/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request-from-a-fork).
+
+## Package Management Tools
+
+The package management tool used by Galacean is [NPM](https://www.npmjs.com/). You can install [@galacean/engine](https://www.npmjs.com/package/@galacean/engine?activeTab=versions) using the npm command:
+
+```bash
+npm install --save @galacean/engine
+```
+
+Then import it in your project:
+
+```typescript
+import { WebGLEngine, Camera } from "@galacean/engine";
+```
+
+## Version Naming Conventions and Release Strategy
+
+Galacean version numbers follow the format `MAJOR.MINOR.PATCH-TAG.X`. `MAJOR.MINOR` represents milestone versions, usually accompanied by significant feature updates. `PATCH` versions indicate backward-compatible bug fixes, and the `TAG` label marks the purpose of the release version.
+
+| TAG | Meaning |
+| :-- | :-- |
+| **alpha** | Internal test version, used for early feature development, includes new features within the milestone but is less stable, e.g., [1.3.0-alpha.3](https://www.npmjs.com/package/@galacean/engine/v/1.3.0-alpha.3) |
+| **beta** | Public test version, internal testing is mostly complete, more stable but may still have minor issues and defects, e.g., [1.2.0-beta.7](https://www.npmjs.com/package/@galacean/engine/v/1.2.0-beta.7) |
+| **latest** | Official stable version, thoroughly tested and verified, no major defects, recommended for production use, e.g., [1.1.3](https://www.npmjs.com/package/@galacean/engine/v/1.1.3) |
+| **custom** | Released internally for testing specific features, e.g., [0.0.0-experimental-1.3-xr.9](https://www.npmjs.com/package/@galacean/engine/v/0.0.0-experimental-1.3-xr.9) |
+
+> You can view all available versions on [GitHub](https://github.com/galacean/engine/releases) or [NPM](https://www.npmjs.com/package/@galacean/engine?activeTab=versions).
+
+## Version Upgrade
+
+Each milestone version update is accompanied by a [version upgrade guide](https://github.com/galacean/engine/wiki/Migration-Guide), which covers the contents of the update and any breaking changes. You can refer to it when upgrading versions.
+
+## Version Dependencies
+
+| Situation | Rule |
+| :-- | :-- |
+| **Core Package** | Ensure consistent versions among core packages |
+| **Tool Package** depends on **Core Package** | Ensure the tool package version is consistent with the major version of the core engine package. For example, the 1.3.x version of the tool package depends on the 1.3.y version of the core package |
+| **Secondary Package** depends on **Core Package** | For the engine version that a secondary ecosystem package depends on, refer to its documentation, such as [Lottie](/en/docs/graphics/2D/lottie/#lottie-使用版本说明) |
+
+> The basic rules are as above. If a package provides special instructions, follow those when choosing dependencies.
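+
+For example, a `package.json` dependency block that satisfies these rules might look like the following (version numbers are illustrative only):
+
+```json
+{
+  "dependencies": {
+    "@galacean/engine": "1.3.5",
+    "@galacean/engine-physics-physx": "1.3.5",
+    "@galacean/engine-toolkit-controls": "1.3.2"
+  }
+}
+```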
+
+## Others
+
+### Editor Upgrade Engine Version
+
+In [Project Settings](/en/docs/interface/menu/#项目设置), you can control the runtime engine version.
+
+### Runtime Output Version Information
+
+Most Galacean packages will output version information in the `Console` at runtime.
+
+
+
+This is usually used to determine if there are issues with package version dependencies:
+
+- Do the version dependencies violate the rules above?
+- Are there multiple different versions of the same dependency?
+
+If you encounter the above issues, please check the project and resolve the dependency issues.
diff --git a/docs/en/core/canvas.md b/docs/en/core/canvas.md
index d058757b9a..d408572681 100644
--- a/docs/en/core/canvas.md
+++ b/docs/en/core/canvas.md
@@ -1,11 +1,11 @@
---
order: 1
title: Canvas
-group: Basic
+group: Basics
label: Core
---
-The Galacean Engine encapsulates canvases for different platforms. For example, [WebCanvas](/apis/rhi-webgl/WebCanvas) supports controlling [HTMLCanvasElement](https://developer.mozilla.org/en-US/en/docs/Web/API/HTMLCanvasElement) or [OffscreenCanvas](https://developer.mozilla.org/en-US/en/docs/Web/API/OffscreenCanvas) using [Engine](/apis/core/#Engine).
+The Galacean Engine encapsulates canvases for different platforms, such as [WebCanvas](/en/apis/rhi-webgl/WebCanvas), supporting control of [HTMLCanvasElement](https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement) or [OffscreenCanvas](https://developer.mozilla.org/en-US/docs/Web/API/OffscreenCanvas) using [Engine](/en/apis/core/#Engine).
@@ -21,9 +21,9 @@ Insert a `<canvas>` tag in HTML and specify an id:
```
-> Developers should ensure to check the height and width of the canvas to avoid rendering issues caused by a height or width value of **0**.
+> Developers should check the height and width of the canvas to avoid rendering issues caused by a height or width value of **0**.
-When creating a WebGLEngine instance, a WebCanvas instance is automatically created:
+When creating an instance of WebGLEngine, a WebCanvas instance is automatically created. The parameter `canvas` is the `id` of the _Canvas_ element.
```typescript
const engine = await WebGLEngine.create({ canvas: "canvas" });
@@ -33,7 +33,7 @@ console.log(engine.canvas); // => WebCanvas instance
### Basic Adaptation
-The canvas size is generally controlled by the **device pixel ratio**, using [WebCanvas](/apis/rhi-webgl/WebCanvas) as an example:
+The canvas size is generally controlled by the **device pixel ratio**. Taking [WebCanvas](/en/apis/rhi-webgl/WebCanvas) as an example:
```mermaid
flowchart TD
@@ -41,11 +41,11 @@ flowchart TD
C[HtmlCanvas.clientHeight] -->|pixelRatio| D[WebCanvas.height]
```
-If developing by exporting an **NPM package** through an editor, controlling the **device pixel ratio** in the export configuration at [Project Export](/en/docs/assets-build) is sufficient.
+If you develop by exporting an **NPM package** from the editor, you only need to set the **device pixel ratio** in the rendering configuration of the [project export](/en/docs/assets/build).
-Alternatively, adapt the canvas by actively calling `resizeByClientSize` in the code.
+Or actively call `resizeByClientSize` in the code to adapt the canvas.
```typescript
+// Resize the canvas using the device pixel ratio (window.devicePixelRatio),
@@ -54,25 +54,25 @@ engine.canvas.resizeByClientSize();
engine.canvas.resizeByClientSize(1.5);
```
-> When the display size of the canvas changes (such as when the browser window changes), the image may stretch or compress. You can restore it to normal by calling `resizeByClientSize`. In general, this line of code should meet adaptation needs. If you have more complex adaptation requirements, please refer to the "Advanced Usage" section.
+> When the display size of the canvas changes (such as when the browser window changes), the image may appear stretched or compressed. You can call `resizeByClientSize` to restore it to normal. Generally, this line of code can meet the adaptation needs. If you have more complex adaptation requirements, please read the "Advanced Usage" section.
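+
+For example, a minimal sketch of re-adapting when the browser window is resized (assuming `engine` was created as above):
+
+```typescript
+// Keep the canvas drawing buffer in sync with its display size.
+window.addEventListener("resize", () => {
+  engine.canvas.resizeByClientSize();
+});
+```
+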
## Advanced Usage
-Regarding adaptation, the key point to note is the **device pixel ratio**. For example, on an iPhoneX, the device pixel ratio `window.devicePixelRatio` is _3_, the window width `window.innerWidth` is _375_, and the physical screen pixel width is: 375 * 3 = *1125\*.
+Regarding adaptation, the core point to note is the **device pixel ratio**. Taking iPhoneX as an example, the device pixel ratio `window.devicePixelRatio` is _3_, the window width `window.innerWidth` is _375_, and the physical pixel width of the screen is 375 * 3 = _1125_.
-Rendering load and physical screen pixel height and width are directly proportional. The larger the physical pixels, the greater the rendering load and power consumption. It is recommended to set the height and width of the canvas using the API exposed by [WebCanvas](/apis/rhi-webgl/WebCanvas), and not to use native canvas APIs such as `canvas.width` or `canvas.style.width`.
+Rendering pressure is proportional to the physical pixel height and width of the screen. The larger the physical pixels, the greater the rendering pressure, and the more power it consumes. It is recommended to set the height and width of the canvas through the API exposed by [WebCanvas](/en/apis/rhi-webgl/WebCanvas), rather than using the native canvas API, such as `canvas.width` or `canvas.style.width`.
-> ⚠️ **Note**: Some front-end scaffolding tools may insert the following tag to modify the page's scaling factor:
+> ⚠️ **Note**: Some front-end scaffolds will insert the following tag to modify the page zoom ratio:
>
> `<meta name="viewport" content="width=device-width, initial-scale=0.3333333333333333" />`
>
-> This line of code changes the value of `window.innerWidth` from 375 to 1125.
+> This line of code will change the value of `window.innerWidth` from 375 to 1125.
-In addition to automatic adaptation with `resizeByClientSize`, it is recommended to use the following two modes:
+In addition to the automatic adaptation provided by `resizeByClientSize`, the following two modes are recommended:
-### Energy Saving Mode
+### Energy-Saving Mode
-Considering that mobile devices have high-definition screens (device pixel ratio is high), but the actual graphics card performance cannot meet the performance requirements of high-definition real-time rendering well (**the rendering area ratio of 3x screen and 2x screen is 9:4, and the 3x screen is more likely to cause the phone to heat up**), in this mode, the engine achieves adaptation by scaling and stretching the canvas. The code is as follows:
+Considering that mobile devices have high-definition screens (high device pixel ratio) while their GPUs often cannot satisfy the performance requirements of high-definition real-time rendering (**the rendering area ratio between a 3x screen and a 2x screen is 9:4, and a 3x screen is much more likely to make the phone overheat**), in this mode the engine adapts by scaling and stretching the canvas. The code is as follows:
```typescript
const canvas = document.getElementById("canvas");
@@ -88,7 +88,7 @@ webcanvas.height = (window.innerHeight * pixelRatio) / scale;
webcanvas.setScale(scale, scale); // stretch the canvas
```
-If the canvas width and height have already been set through CSS (such as `width: 100vw; height: 100vh;`), you can achieve canvas scaling by passing parameters to `resizeByClientSize`:
+If the canvas width and height have already been set via CSS (e.g. `width: 100vw; height: 100vh;`), you can scale the canvas by passing a parameter to `resizeByClientSize`:
```typescript
const canvas = document.getElementById("canvas");
@@ -98,9 +98,9 @@ const scale = 2 / 3; // compute the canvas size for a 3x HD screen as if it were a 2x screen
webcanvas.resizeByClientSize(scale); // stretch the canvas
```
-### Fixed Width Mode
+### Fixed-Width Mode
-In some cases, such as when the design draft has a fixed width of 750, developers may choose to hardcode the canvas width to reduce adaptation costs. The code is as follows:
+In some cases, for example when the design draft has a fixed width of 750, developers may hard-code the canvas width to reduce adaptation costs. The code is as follows:
```typescript
import { WebCanvas } from "@galacean/engine";
diff --git a/docs/en/core/clone.md b/docs/en/core/clone.md
index 306508e399..ff83602424 100644
--- a/docs/en/core/clone.md
+++ b/docs/en/core/clone.md
@@ -5,17 +5,16 @@ type: Core
label: Core
---
-
-Node cloning is a common feature at runtime, and cloning a node will also clone the components bound to it. For example, during initialization, create a certain number of identical entities dynamically based on configuration, and then place them in different positions in the scene according to logical rules. Here, we will provide a detailed explanation of the cloning details of scripts.
+Node cloning is a common runtime feature; cloning a node also clones the components bound to it. For example, during the initialization phase, you may dynamically create a certain number of identical entities based on configuration and then place them at different positions in the scene according to logical rules. Here, the details of script cloning are explained.
## Entity Cloning
-It's very simple, just call the [clone()](/apis/design/#IClone-clone) method of the entity to complete the cloning of the entity and its attached components.
+Simply call the entity's [clone()](/en/apis/design/#IClone-clone) method to clone the entity together with its attached components.
```typescript
const cloneEntity = entity.clone();
```
## Script Cloning
-Since scripts are essentially components, when we call the [clone()](/apis/design/#IClone-clone) function of an entity, the engine will not only clone the built-in components but also clone custom scripts. The cloning rules for built-in components have been customized by the official, and similarly, we have opened up the cloning capability and rules for scripts to developers. The default cloning method for scripts is shallow copy. For example, if we modify the field values of a script and then clone it, the cloned script will retain the modified values without the need for any additional coding. Here is an example of cloning a custom script:
+Scripts are essentially components, so when we call an entity's [clone()](/en/apis/design/#IClone-clone) function, the engine clones not only the built-in components but also the custom scripts. The cloning rules for built-in components are predefined by the engine, and the cloning capabilities and rules for scripts are likewise open to developers. The default cloning method for script fields is shallow copy: for example, if we modify a script's field values and then clone it, the cloned script retains the modified values without any additional coding. Below is an example of custom script cloning:
```typescript
// define a custom script
class CustomScript extends Script{
@@ -44,16 +43,16 @@ console.log(cloneScript.b); // output is 2.
console.log(cloneScript.c); // output is (1,1,1).
```
### Clone Decorators
-In addition to the default cloning method, the engine also provides "clone decorators" to customize the cloning method of script fields. There are four built-in clone decorators:
+In addition to the default cloning method, the engine also provides "clone decorators" to customize the cloning method for script fields. The engine has four built-in clone decorators:
| Decorator Name | Decorator Description |
| :--- | :--- |
-| [ignoreClone](/apis/core/#ignoreClone) | Ignore the field during cloning. |
-| [assignmentClone](/apis/core/#assignmentClone) | (Default, equivalent to not adding any clone decorators) Assign the field during cloning. If it is a primitive type, it will copy the value; if it is a reference type, it will copy the reference address. |
-| [shallowClone](/apis/core/#shallowClone) | Perform shallow cloning of the field during cloning. After cloning, it will maintain its own reference independently and clone all its internal fields using the assignment method (copying the value for primitive types and copying the reference address for reference types). |
-| [deepClone](/apis/core/#deepClone) | Perform deep cloning of the field during cloning. After cloning, it will maintain its own reference independently, and all its deep internal fields will remain completely independent. |
+| [ignoreClone](/en/apis/core/#ignoreClone) | Ignore the field during cloning. |
+| [assignmentClone](/en/apis/core/#assignmentClone) | (Default value, equivalent to not adding any clone decorator) Assign the field during cloning. If it is a basic type, the value will be copied; if it is a reference type, the reference address will be copied. |
+| [shallowClone](/en/apis/core/#shallowClone) | Shallow clone the field during cloning. After cloning, it will maintain its own independent reference and clone all its internal fields by assignment (if the internal field is a basic type, the value will be copied; if the internal field is a reference type, the reference address will be copied). |
+| [deepClone](/en/apis/core/#deepClone) | Deep clone the field during cloning. After cloning, it will maintain its own independent reference, and all its internal deep fields will remain completely independent. |
-We slightly modified the above example and added different "clone decorators" to the four fields in `CustomScript`. Since `shallowClone` and `deepClone` are more complex, we added additional print outputs for fields `c` and `d` for further explanation.
+We slightly modify the above example and add different "clone decorators" to the four fields in `CustomScript`. Since `shallowClone` and `deepClone` are more complex, we add additional print output to the fields `c` and `d` for further explanation.
```typescript
// define a custom script
class CustomScript extends Script{
@@ -99,8 +98,6 @@ console.log(script.d[0]); // output is (1,1,1). because deepClone let d[0] use t
- Note:
- `shallowClone` and `deepClone` are usually used for *Object*, *Array*, and *Class* types.
- - `shallowClone` maintains its own reference independently after cloning and clones all its internal fields using the assignment method (copying the value for primitive types and copying the reference address for reference types).
- - `deepClone` performs deep cloning, recursively cloning properties, and how the sub-properties of properties are cloned depends on the decorators of the sub-properties.
- - If the clone decorators do not meet the requirements, you can append custom cloning by implementing the [_cloneTo()](/apis/design/#IClone-cloneTo) method.
-
-
+ - `shallowClone` will maintain its own independent reference after cloning and clone all its internal fields by assignment (if the internal field is a basic type, the value will be copied; if the internal field is a reference type, the reference address will be copied).
+ - `deepClone` is a deep clone that will recursively clone the properties deeply. How the sub-properties of the properties are cloned depends on the decorators of the sub-properties.
+ - If the clone decorators do not meet the requirements, you can implement the [_cloneTo()](/en/apis/design/#IClone-cloneTo) method to add custom cloning (see the sketch below).
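+
+As an illustration of the last point, here is a minimal sketch of appending custom clone logic. The field `weights` and the single-argument `_cloneTo(target)` signature are assumptions for illustration only — check the [IClone](/en/apis/design/#IClone-cloneTo) API reference for the exact signature:
+
+```typescript
+import { Script, ignoreClone } from "@galacean/engine";
+
+class CustomScript extends Script {
+  // Excluded from the default clone pipeline; copied manually below.
+  @ignoreClone
+  weights: Float32Array = new Float32Array(4);
+
+  // Hypothetical sketch: give the clone an independent buffer
+  // instead of sharing a reference with the source script.
+  _cloneTo(target: CustomScript): void {
+    target.weights = this.weights.slice();
+  }
+}
+```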
diff --git a/docs/en/core/component.md b/docs/en/core/component.md
index da5c8f1bd9..2fb51df536 100644
--- a/docs/en/core/component.md
+++ b/docs/en/core/component.md
@@ -5,32 +5,32 @@ type: Core
label: Core
---
-In the Galacean engine, the [Entity](/apis/core/#Entity) does not have actual functionalities like rendering models, which are achieved by loading component classes such as [Component](/apis/core/#Component). For example, if you want to turn an _Entity_ into a camera, you just need to add the camera component [Camera](/apis/core/#Camera}) to that _Entity_. This component-based approach to functionality extension emphasizes encapsulating programs independently by functionality, allowing them to be combined and added as needed, which is very beneficial for reducing program coupling and increasing code reusability.
+In the Galacean engine, [Entity](/en/apis/core/#Entity) does not have actual functionalities such as rendering models. These functionalities are implemented by attaching [Component](/en/apis/core/#Component) classes. For example, if you want to turn an _Entity_ into a camera, you just need to add the camera component [Camera](/en/apis/core/#Camera) to that _Entity_. This component-based approach encapsulates functionality into independent units that can be combined and added as needed, which greatly reduces program coupling and improves code reusability.
Common components:
-| Name | Description |
-| :---------------------------------------------------- | :---------------- |
-| [Camera](/apis/core/#Camera) | Camera |
-| [MeshRenderer](/apis/core/#MeshRenderer) | Static Model Renderer |
-| [SkinnedMeshRenderer](/apis/core/#SkinnedMeshRenderer) | Skeletal Model Renderer |
-| [Animator](/apis/core/#Animator) | Animation Controller |
-| [DirectLight](/apis/core/#DirectLight) | Directional Light |
-| [PointLight](/apis/core/#PointLight) | Point Light |
-| [SpotLight](/apis/core/#SpotLight) | Spot Light |
-| [ParticleRenderer](/apis/core/#ParticleRenderer) | Particle System |
-| [BoxCollider](/apis/core/#BoxCollider) | Box Collider |
-| [SphereCollider](/apis/core/#SphereCollider) | Sphere Collider |
-| [PlaneCollider](/apis/core/#PlaneCollider) | Plane Collider |
-| [Script](/apis/core/#Script) | Script |
+| Name | Description |
+| :---------------------------------------------------- | :------------- |
+| [Camera](/en/apis/core/#Camera) | Camera |
+| [MeshRenderer](/en/apis/core/#MeshRenderer) | Static Model Renderer |
+| [SkinnedMeshRenderer](/en/apis/core/#SkinnedMeshRenderer) | Skinned Model Renderer |
+| [Animator](/en/apis/core/#Animator) | Animation Control Component |
+| [DirectLight](/en/apis/core/#DirectLight) | Directional Light |
+| [PointLight](/en/apis/core/#PointLight) | Point Light |
+| [SpotLight](/en/apis/core/#SpotLight) | Spotlight |
+| [ParticleRenderer](/en/apis/core/#ParticleRenderer) | Particle System |
+| [BoxCollider](/en/apis/core/#BoxCollider) | Box Collider |
+| [SphereCollider](/en/apis/core/#SphereCollider) | Sphere Collider |
+| [PlaneCollider](/en/apis/core/#PlaneCollider) | Plane Collider |
+| [Script](/en/apis/core/#Script) | Script |
## Editor Usage
-After selecting an entity from the **[Hierarchy Panel](/en/docs/interface/hierarchy)** or the scene, the Inspector will display all the components attached to the currently selected node, with the component names shown in the top left corner
+After selecting an entity from the **[Hierarchy Panel](/en/docs/interface/hierarchy)** or the scene, the inspector will display all the components mounted on the currently selected node, with the component names displayed in the upper left corner.
-You can control whether it is enabled in the Inspector
+You can control whether it is enabled in the inspector
@@ -44,14 +44,13 @@ Or edit its various properties
If it is an empty node, you can click the `Add Component` button to add new components to the current entity.
-
## Script Usage
-### Add Component
+### Adding Components
-We use [addComponent(Component)](/apis/core/#Entity-addComponent) to add components. For example, adding a "Direct Light" component ([DirectLight](/apis/core/#DirectLight})) to an `Entity`:
+We use [addComponent(Component)](/en/apis/core/#Entity-addComponent) to add components, for example, adding a "Direct Light" component ([DirectLight](/en/apis/core/#DirectLight)) to an `Entity`:
```typescript
const lightEntity = rootEntity.createChild("light");
@@ -60,31 +59,31 @@ directLight.color = new Color(0.3, 0.3, 1);
directLight.intensity = 1;
```
-### Find Component on Entity
+### Finding Components on an Entity
-When we need to access a component on an entity, the [getComponent](/apis/core/#Entity-getComponent) API helps you find the target component.
+When we need to get a component on an entity, the [getComponent](/en/apis/core/#Entity-getComponent) API will help you find the target component.
```typescript
const component = newEntity.getComponent(Animator);
```
-Sometimes there may be multiple components of the same type, and the above method will only return the first found component. If you need to find all components, you can use [getComponents](/apis/core/#Entity-getComponents):
+Sometimes there may be multiple components of the same type, and the above method will only return the first found component. If you need to find all components, you can use [getComponents](/en/apis/core/#Entity-getComponents):
```typescript
const components = [];
newEntity.getComponents(Animator, components);
```
-In entities obtained from assets like glTF, where we may not know which entity the target component is on, you can use [getComponentsIncludeChildren](/apis/core/#Entity-getComponentsIncludeChildren) to search.
+In assets like glTF, we might not know which entity the target component is on. In this case, you can use [getComponentsIncludeChildren](/en/apis/core/#Entity-getComponentsIncludeChildren) to search.
```typescript
const components = [];
newEntity.getComponentsIncludeChildren(Animator, components);
```
-### Get Entity of a Component
+### Getting the Entity of a Component
-Continuing from the example of adding a component at the beginning, you can directly get the entity of the component:
+Continuing with the component-adding example from the beginning, you can directly get the component's entity:
```typescript
const entity = directLight.entity;
@@ -92,10 +91,8 @@ const entity = directLight.entity;
### State
-When not using a component temporarily, you can actively call the [enabled](/apis/core/#Component-enabled) property of the component.
+When a component is temporarily not needed, you can actively set the component's [enabled](/en/apis/core/#Component-enabled) property:
```typescript
directLight.enabled = false;
```
-
-{ /*examples*/ }
diff --git a/docs/en/core/engine.md b/docs/en/core/engine.md
index 02145fc85d..2e7758777f 100644
--- a/docs/en/core/engine.md
+++ b/docs/en/core/engine.md
@@ -5,72 +5,86 @@ type: Core
label: Core
---
-`Engine` plays the role of the main controller in Galacean Engine, mainly including three major functions: **canvas**, **render control**, and **engine subsystem management**:
+`Engine` plays the role of the main controller in the Galacean Engine, mainly including functions such as **canvas**, **render control**, and **engine subsystem management**:
- **[Canvas](/en/docs/core/canvas)**: Operations related to the main canvas, such as modifying the canvas width and height.
-- **Render Control**: Controls rendering execution/pause/resume, vertical synchronization, and other functions.
+- **Render Control**: Controls the execution/pause/resume of rendering, vertical synchronization, and other functions.
- **Engine Subsystem Management**:
- - [Scene Management](/en/docs/core/scene})
- - [Resource Management](/en/docs/assets-overall})
- - [Physics System](/en/docs/physics-overall})
- - [Interaction System](/en/docs/input})
- - [XR System](/en/docs/xr-overall})
-- **Context Management for Execution Environment**: Controls the context management of WebGL and other execution environments.
+ - [Scene Management](/en/docs/core/scene)
+ - [Resource Management](/en/docs/assets/overall)
+ - [Physics System](/en/docs/physics/overall)
+ - [Interaction System](/en/docs/input)
+ - [XR System](/en/docs/xr/overall)
+- **Execution Environment Context Management**: Controls the context management of execution environments such as WebGL.
## Initialization
-To facilitate users in creating a web-based engine directly, Galacean provides [WebGLEngine](/apis/rhi-webgl/WebGLEngine):
+To make it easy to create a web engine directly, Galacean provides [WebGLEngine](/en/apis/rhi-webgl/WebGLEngine):
```typescript
const engine = await WebGLEngine.create({ canvas: "canvas" });
```
-> `WebGLEngine.create` not only instantiates the engine but also handles rendering context configuration and initialization of certain subsystems.
+`WebGLEngine` supports WebGL1.0 and WebGL2.0. It can control all behaviors of the canvas, as well as resource management, scene management, execution/pause/resume, vertical synchronization, and other functions. The engine is created via `WebGLEngine.create`; below is the type description of the configuration passed in when creating the engine:
-### Rendering Context
+```mermaid
+---
+title: WebGLEngineConfiguration Interface
+---
+classDiagram
+ EngineConfiguration <|-- WebGLEngineConfiguration
+ class EngineConfiguration {
+    <<interface>>
+ +IPhysics physics
+ +IXRDevice xrDevice
+ +ColorSpace colorSpace
+ +IShaderLab shaderLab
+ +IInputOptions input
+ }
+
+ class WebGLEngineConfiguration{
+    <<interface>>
+ +HTMLCanvasElement | OffscreenCanvas | string canvas
+ +WebGLGraphicDeviceOptions graphicDeviceOptions
+ }
+```
-Developers can set the context's rendering configuration in the [Export Interface](/en/docs/assets-build).
+Projects exported from the editor automatically apply the options configured there. For example, developers can set the rendering configuration of the context in the [export interface](/en/docs/assets/build):
-You can also manage this by setting the third parameter [WebGLGraphicDeviceOptions](/apis/rhi-webgl/WebGLGraphicDeviceOptions) of [WebGLEngine](/apis/rhi-webgl/WebGLEngine}), for example, to manage **canvas transparency**, the engine by default enables the canvas's alpha channel, meaning the canvas will blend with the elements behind it. If you need to disable transparency, you can do so like this:
+Or select the physics backend and XR backend in the project settings interface of the editor:
+
+
+
+You can also modify the code to change the engine configuration. Take **canvas transparency** as an example. By default, the engine enables the transparency channel of the canvas, which means the canvas will blend with the elements behind it. If you need to disable transparency, you can set it like this:
```typescript
const engine = await WebGLEngine.create({
canvas: htmlCanvas,
- graphicDeviceOptions: { alpha: false },
+ graphicDeviceOptions: { alpha: false }
});
```
-Similarly, you can control WebGL1/2 with `webGLMode`, and attributes other than `webGLMode` will be passed to the context. For more details, refer to the [getContext parameter interpretation](https://developer.mozilla.org/en-US/en/docs/Web/API/HTMLCanvasElement/getContext#parameters).
-
-### Physics System
-
-Refer to the [Physics System](/en/docs/physics-overall) documentation.
-
-### Interaction System
-
-Refer to the [Interaction System](/en/docs/input) documentation.
-
-### XR System
+Similarly, you can use `webGLMode` to control WebGL1/2. Properties other than `webGLMode` will be passed to the context; for details, refer to the [getContext parameter explanation](https://developer.mozilla.org/en-US/docs/Web/API/HTMLCanvasElement/getContext#parameters).
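+
+For instance, a minimal sketch of forcing a WebGL 1.0 context via `webGLMode` (assuming the `WebGLMode` enum is exported alongside `WebGLEngine`; `antialias` is a standard `getContext` attribute):
+
+```typescript
+import { WebGLEngine, WebGLMode } from "@galacean/engine";
+
+const engine = await WebGLEngine.create({
+  canvas: "canvas",
+  // webGLMode selects the context type; the remaining options are
+  // forwarded to getContext().
+  graphicDeviceOptions: { webGLMode: WebGLMode.WebGL1, antialias: false },
+});
+```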
-Refer to the [XR System](/en/docs/xr-overall) documentation.
+For more related configuration information, refer to [Physics System](/en/docs/physics/overall), [Interaction System](/en/docs/input), [XR System](/en/docs/xr/overall).
## Properties
-| Property Name | Property Description |
-| ----------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| [time](/apis/core/#Engine-time) | Information related to engine time. |
-| [vSyncCount](/apis/core/#Engine-vSyncCount) | By default, the engine enables [vertical synchronization](https://en.wikipedia.org/wiki/Screen_tearing) with a refresh rate `vSyncCount` of `1`, meaning it synchronizes with the screen refresh rate. If `vSyncCount` is set to `2`, the engine updates every 2 frames. |
-| [resourceManager](/apis/core/#Engine-resourceManager) | Resource management. |
-| [sceneManager](/apis/core/#Engine-sceneManager) | Scene management. _Engine_ serves as the main controller, _Scene_ acts as a scene unit for easy entity management in large scenes; _Camera_ is a component attached to a specific entity in _Scene_, similar to a real camera, allowing you to capture any entity in the _Scene_ and render it to a screen area or off-screen rendering. |
-| [inputManager](/apis/core/#Engine-inputManager) | Interaction management. |
+| Property Name | Property Description |
+| --- | --- |
+| [time](/en/apis/core/#Engine-time) | Engine time-related information. For details, refer to [Time](/en/docs/core/time/) |
+| [vSyncCount](/en/apis/core/#Engine-vSyncCount) | Vertical synchronization refresh rate. The engine enables [vertical synchronization](https://en.wikipedia.org/wiki/Screen_tearing) by default, and the refresh rate `vSyncCount` is `1` (consistent with the screen refresh rate). If `vSyncCount` is set to `2`, the engine updates once every 2 screen refresh frames. |
+| [resourceManager](/en/apis/core/#Engine-resourceManager) | Resource manager, generally used for [loading](/en/docs/assets/load/) and [releasing](/en/docs/assets/gc/) assets |
+| [sceneManager](/en/apis/core/#Engine-sceneManager) | Scene manager. Galacean supports rendering multiple scenes simultaneously. The scene manager can be used to conveniently manage the addition, deletion, modification, and query of the current scene. For details, refer to [Scene](/en/docs/core/scene/) |
+| [inputManager](/en/apis/core/#Engine-inputManager) | Interaction manager, generally used to obtain keyboard, touch, and scroll information. For details, refer to [Interaction](/en/docs/input/input/) |
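+
+A minimal sketch of accessing these subsystems from an engine instance (assuming `engine` was created as above):
+
+```typescript
+// Engine subsystems are exposed as properties on the engine instance.
+const { resourceManager, sceneManager, inputManager } = engine;
+// For example, the scene manager holds the currently active scene:
+const scene = sceneManager.activeScene;
+```
+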
### Refresh Rate
-By default, the engine uses vertical synchronization mode and [vSyncCount](/apis/core/#Engine-vSyncCount) to control the rendering refresh rate. In this mode, the rendering frame waits for the vertical synchronization signal of the screen. [vSyncCount](/apis/core/#Engine-vSyncCount) represents the expected number of screen synchronization signals between rendering frames, with a default value of 1. The value of this property must be an integer. For example, if we want to render 30 frames per second on a device with a screen refresh rate of 60 frames, we can set this value to 2.
+By default, the engine uses vertical synchronization mode and controls the rendering refresh rate with [vSyncCount](/en/apis/core/#Engine-vSyncCount). In this mode, the rendered frame will wait for the screen's vertical sync signal. [vSyncCount](/en/apis/core/#Engine-vSyncCount) represents the desired number of screen sync signals between rendered frames. The default value is 1, and this property must be an integer. For example, if we want to render 30 frames per second on a device with a screen refresh rate of 60 frames, we can set this value to 2.
-Users can also disable vertical synchronization by setting [vSyncCount](/apis/core/#Engine-vSyncCount) to 0 and then setting [targetFrameRate](/apis/core/#Engine-targetFrameRate) to the desired frame rate value. In this mode, rendering does not consider vertical synchronization signals. For example, setting it to 120 means expecting 120 frames per second.
+Additionally, users can disable vertical synchronization by setting [vSyncCount](/en/apis/core/#Engine-vSyncCount) to 0 and then setting [targetFrameRate](/en/apis/core/#Engine-targetFrameRate) to the desired frame rate. In this mode, rendering ignores the vertical sync signal; for example, a value of 120 means the engine expects to render 120 frames per second.
```typescript
// vertical sync
@@ -86,9 +100,9 @@ engine.targetFrameRate = 120;
## Methods
-| Method Name | Method Description |
-| ----------------------------------- | ----------------------- |
-| [run](/apis/core/#Engine-run) | Execute engine frame loop |
-| [pause](/apis/core/#Engine-pause) | Pause engine frame loop |
-| [resume](/apis/core/#Engine-resume) | Resume engine rendering loop |
-| [destroy](/apis/core/#Engine-destroy)| Destroy the engine |
+| Method Name | Description |
+| ------------------------------------- | ------------------ |
+| [run](/en/apis/core/#Engine-run) | Execute engine rendering frame loop |
+| [pause](/en/apis/core/#Engine-pause) | Pause engine rendering frame loop |
+| [resume](/en/apis/core/#Engine-resume)| Resume engine rendering loop |
+| [destroy](/en/apis/core/#Engine-destroy)| Destroy the engine |
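+
+For example, a minimal sketch of driving the rendering loop with these methods (assuming an `engine` created as shown earlier):
+
+```typescript
+engine.run();    // start the rendering frame loop
+engine.pause();  // pause the rendering frame loop
+engine.resume(); // resume rendering after a pause
+```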
diff --git a/docs/en/core/entity.mdx b/docs/en/core/entity.mdx
new file mode 100644
index 0000000000..ef73f05abb
--- /dev/null
+++ b/docs/en/core/entity.mdx
@@ -0,0 +1,146 @@
+---
+order: 3
+title: Entity
+type: Core
+label: Core
+---
+
+## Editor Usage
+
+**[Hierarchy Panel](/en/docs/interface/hierarchy)** is located on the far left side of the editor. It displays all nodes in the current scene in a tree structure. The scene node is the parent node of all other nodes, including cameras, lights, meshes, etc. There is a search box at the top of the node panel that allows you to search for nodes in the scene to quickly locate them. Through the **[Hierarchy Panel](/en/docs/interface/hierarchy)**, you can add or delete nodes and organize nodes better by dragging and dropping to sort them.
+
+
+Nodes in the editor are called `entities` in the engine. To avoid confusion, the term `entity` will be used uniformly below.
+
+
+
+
+### Adding a New Entity
+
+To add a new entity, you can click the add button in the **[Hierarchy Panel](/en/docs/interface/hierarchy)** or right-click an entity and select Add Child Entity. After adding, you can edit the properties of the new entity in the **[Inspector Panel](/en/docs/interface/inspector)**. If you use the add entity button, you can also quickly create basic models such as cubes/spheres.
+
+### Editing an Entity
+
+Click on an entity to edit it. In the **[Inspector Panel](/en/docs/interface/inspector)** on the right, you can edit its name.
+
+
+
+Whether it is active
+
+
+
+Transform
+
+
+
+And add or remove components for it
+
+
+
+### Deleting an Entity
+
+After selecting an entity, you can click the delete button in the **[Hierarchy Panel](/en/docs/interface/hierarchy)** or use the delete option in the right-click menu to delete the entity. Deleting an entity will delete the entity and all its child entities. Therefore, when deleting an entity, you need to be aware of whether the deleted entity will affect other entities in the scene.
+
+### Sorting Entities
+
+To better organize entities, you can sort entities by dragging and dropping. After selecting an entity, you can change its position in the hierarchy tree by dragging it with the left mouse button.
+
+### Searching for Entities
+
+There is a search box at the top of the **[Hierarchy Panel](/en/docs/interface/hierarchy)** where users can enter the name of an entity to search for entities in the scene. The search box supports fuzzy search, allowing you to enter part of the entity's name to find it.
+
+### Hiding Entities
+
+Each entity has an eye button on the right side. Clicking it toggles the entity's visibility in the scene. Note that this adjustment to the entity's visibility only affects the workspace and not the `isActive` property in the **[Inspector Panel](/en/docs/interface/inspector)**.
+
+## Script Usage
+
+### Creating a New Entity
+
+In the [Scene](/en/docs/core/scene), we have already introduced how to get the active scene. In a new scene, we usually add a root entity first:
+
+```typescript
+const scene = engine.sceneManager.activeScene;
+const rootEntity = scene.createRootEntity();
+```
+
+Generally, a new entity is created by adding a child entity:
+
+```typescript
+const newEntity = rootEntity.createChild("firstEntity");
+```
+
+Of course, you can also create an entity directly. However, such an entity is detached and will not appear in the scene until it is associated with the hierarchy tree.
+
+```typescript
+const newEntity = new Entity(engine, "firstEntity");
+rootEntity.addChild(newEntity);
+```
+
+### Delete Entity
+
+When an entity is no longer needed in the scene, we can delete it:
+
+```typescript
+rootEntity.removeChild(newEntity);
+```
+
+It is worth noting that this method only removes the entity from the hierarchy tree, so it is no longer displayed in the scene. To destroy it completely, you also need to:
+
+```typescript
+newEntity.destroy();
+```
+
+### Find Child Entity
+
+When the parent entity is known, we usually obtain the child entity through the parent entity:
+
+```typescript
+const childrenEntity = newEntity.children;
+```
+
+If you know the child's _index_ in the parent entity, you can directly use [getChild](/en/apis/core/#Entity-getChild):
+
+```typescript
+newEntity.getChild(0);
+```
+
+If you don't know the child's index, you can use [findByName](/en/apis/core/#Entity-findByName) to search by name. `findByName` will search not only child entities but also grandchild entities.
+
+```typescript
+newEntity.findByName("model");
+```
+
+If there are entities with the same name, you can use [findByPath](/en/apis/core/#Entity-findByPath) to pass in the path for step-by-step search. Using this API will also improve search efficiency to some extent.
+
+```typescript
+newEntity.findByPath("parent/child/grandson");
+```
+
+### State
+
+When an entity is temporarily not in use, you can deactivate it by setting the entity's [isActive](/en/apis/core/#Entity-isActive) property to `false`. The components on this entity will then be passively disabled (`component.enabled = false`).
+
+```typescript
+newEntity.isActive = false;
+```
diff --git a/docs/en/core/math.md b/docs/en/core/math.md
index 72c5a49cbb..f6c638b339 100644
--- a/docs/en/core/math.md
+++ b/docs/en/core/math.md
@@ -1,133 +1,132 @@
---
order: 8
-title: Mathematics Library
+title: Math Library
type: Core
label: Core
---
-In a rendering scene, we often perform operations such as translation, rotation, scaling on objects (these operations are collectively referred to as [transform](/en/docs/core/transform)), in order to achieve the interactive effects we desire. The calculations for these transformations are typically implemented using vectors, quaternions, matrices, etc. Therefore, we provide a mathematics library to handle operations related to *vectors*, *quaternions*, *matrices*, and more. Additionally, the mathematics library offers a variety of classes to help us describe *points*, *lines*, *planes*, *geometric shapes* in space, as well as determine their intersections and spatial relationships in three-dimensional space.
+In a rendering scene, we often perform operations such as translation, rotation, scaling, etc. on objects (we collectively refer to these operations as [transform](/en/docs/core/transform)) to achieve the interactive effects we want. These transformations are generally achieved through vectors, quaternions, matrices, etc. For this purpose, we provide a math library to perform operations related to *vectors*, *quaternions*, *matrices*, etc. In addition, the math library also offers more classes to help us describe *points*, *lines*, *planes*, and *geometric bodies* in space, as well as determine their intersections and positional relationships in three-dimensional space. It can even describe color values.
| Type | Description |
| :--- | :--- |
-| [BoundingBox](/apis/math/#BoundingBox) | Axis-Aligned Bounding Box (AABB) |
-| [BoundingFrustum](/apis/math/#BoundingFrustum) | View Frustum |
-| [BoundingSphere](/apis/math/#BoundingSphere) | Bounding Sphere |
-| [CollisionUtil](/apis/math/#CollisionUtil) | Provides many static methods to determine intersections and spatial relationships between objects in space |
-| [Color](/apis/math/#Color) | Color class, described using RGBA |
-| [MathUtil](/apis/math/#MathUtil) | Utility class, provides common calculations such as comparisons, angle-radian conversions, etc. |
-| [Matrix](/apis/math/#Matrix) | Default 4x4 matrix, offers basic matrix operations and transformation-related operations |
-| [Matrix3x3](/apis/math/#Matrix3x3) | 3x3 matrix, provides basic matrix operations and transformation-related operations |
-| [Plane](/apis/math/#Plane) | Plane class, used to describe planes in three-dimensional space |
-| [Quaternion](/apis/math/#Quaternion) | Quaternion, contains x, y, z, w components, responsible for rotation-related operations |
-| [Ray](/apis/math/#Ray) | Ray class, used to describe rays in three-dimensional space |
-| [Vector2](/apis/math/#Vector2) | Two-dimensional vector, contains x, y components |
-| [Vector3](/apis/math/#Vector3) | Three-dimensional vector, contains x, y, z components |
-| [Vector4](/apis/math/#Vector4) | Four-dimensional vector, contains x, y, z, w components |
+| [BoundingBox](/en/apis/math/#BoundingBox) | AABB Bounding Box |
+| [BoundingFrustum](/en/apis/math/#BoundingFrustum) | View Frustum |
+| [BoundingSphere](/en/apis/math/#BoundingSphere) | Bounding Sphere |
+| [CollisionUtil](/en/apis/math/#CollisionUtil) | Provides many static methods to determine intersections and positional relationships between objects in space (see the sketch after this table) |
+| [Color](/en/apis/math/#Color) | Color class, described using RGBA |
+| [MathUtil](/en/apis/math/#MathUtil) | Utility class, provides common calculations such as comparisons, angle-radian conversions, etc. |
+| [Matrix](/en/apis/math/#Matrix) | Default 4x4 matrix, provides basic matrix operations and transformation-related operations |
+| [Matrix3x3](/en/apis/math/#Matrix3x3) | 3x3 matrix, provides basic matrix operations and transformation-related operations |
+| [Plane](/en/apis/math/#Plane) | Plane class, used to describe planes in three-dimensional space |
+| [Quaternion](/en/apis/math/#Quaternion) | Quaternion, contains x, y, z, w components, responsible for rotation-related operations |
+| [Ray](/en/apis/math/#Ray) | Ray class, used to describe rays in three-dimensional space |
+| [Vector2](/en/apis/math/#Vector2) | Two-dimensional vector, contains x, y components |
+| [Vector3](/en/apis/math/#Vector3) | Three-dimensional vector, contains x, y, z components |
+| [Vector4](/en/apis/math/#Vector4) | Four-dimensional vector, contains x, y, z, w components |
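+
+For example, a minimal sketch of testing two bounding boxes against each other with `CollisionUtil`; the method name `intersectsBoxAndBox` is an assumption for illustration — verify it against the [CollisionUtil](/en/apis/math/#CollisionUtil) API reference:
+
+```typescript
+import { BoundingBox, CollisionUtil, Vector3 } from "@galacean/engine-math";
+
+const boxA = new BoundingBox(new Vector3(-1, -1, -1), new Vector3(1, 1, 1));
+const boxB = new BoundingBox(new Vector3(0, 0, 0), new Vector3(2, 2, 2));
+// Assumed API: returns true when the two AABBs overlap.
+const overlapping: boolean = CollisionUtil.intersectsBoxAndBox(boxA, boxB);
+```
+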
## Vectors
-The most basic definition of a vector is a direction. More formally, a vector has a direction (Direction) and magnitude (Magnitude, also known as strength or length). You can think of a vector as instructions on a treasure map: "Take 10 steps to the left, 3 steps north, then 5 steps to the right"; "Left" is the direction, and "10 steps" is the length of the vector. So, this treasure map has a total of 3 vectors. Vectors can exist in any dimension, but we typically use 2 to 4 dimensions. If a vector has 2 dimensions, it represents a direction in a plane (imagine a 2D image), and when it has 3 dimensions, it can express a direction in a 3D world.
+The most basic definition of a vector is a direction. More formally, a vector has a direction and a magnitude (also called strength or length). You can think of a vector as an instruction on a treasure map: "Walk 10 steps to the left, 3 steps north, then 5 steps to the right"; "left" is the direction, "10 steps" is the length of the vector. So, the instructions on this treasure map consist of 3 vectors. Vectors can exist in any dimension, but we usually use 2 to 4 dimensions. If a vector has 2 dimensions, it represents a direction on a plane (imagine a 2D image). When it has 3 dimensions, it can express a direction in a 3D world.
In the Galacean engine, vectors are used to represent object coordinates (position), rotation, scaling, and color.
```typescript
import { Vector3 } from '@galacean/engine-math';
-// Create Vector3, the x,y,z is 0.
+// Create a default three-dimensional vector, i.e., the x, y, z components are all 0
const v1 = new Vector3();
-// Create a Vector3 and initialize the x, y, and z components with the given values.
+// Create a three-dimensional vector and initialize the x, y, z components with the given values
const v2 = new Vector3(1, 2, 3);
-// Set the specified value.
+// Set the specified values
v1.set(1, 2, 2);
-// Get x, y, and z components.
+// Get each component
const x = v1.x;
const y = v1.y;
const z = v1.z;
-// Vector addition, static method.
+// Vector addition, static method
const out1 = new Vector3();
Vector3.add(v1, v2, out1);
-// Vector addition, instance method.
+// Vector addition, instance method
const out2 = v1.add(v2);
-// The length of Vector3.
+// The scalar length of the vector
const len: number = v1.length();
-// Normalized Vector3.
+// Normalize the vector
v1.normalize();
-// Clone Vector3.
+// Clone a vector
const c1 = v1.clone();
-// Clone the values of the Vector3 to another Vector3.
+// Clone the vector's values into another vector
const c2 = new Vector3();
v1.cloneTo(c2);
```
-
## Quaternions
-Quaternions are simple hypercomplex numbers, and in graphics engines, quaternions are mainly used for three-dimensional rotations ([Relationship between quaternions and three-dimensional rotations](https://krasjet.github.io/quaternion/quaternion.pdf)), which can represent rotations not only with quaternions but also with Euler angles, axis-angle, matrices, etc. The reason for choosing quaternions is mainly due to the following advantages:
+Quaternions are simple hypercomplex numbers, and in graphics engines, quaternions are mainly used for three-dimensional rotations ([relationship between quaternions and three-dimensional rotations](https://krasjet.github.io/quaternion/quaternion.pdf)). Besides quaternions, Euler angles, axis angles, matrices, etc., can also represent rotations. The main advantages of choosing quaternions are:
-- Solves the problem of gimbal lock
-- Requires storing only 4 floating-point numbers, making it lighter compared to matrices
-- More efficient for operations like inversion, concatenation, etc., compared to matrices
+- Solves the gimbal lock problem
+- Only requires storing 4 floating-point numbers, making it lighter compared to matrices
+- Operations such as inversion and concatenation are more efficient compared to matrices
-In the Galacean engine, quaternions are also used for rotation-related operations and provide APIs for converting Euler angles, matrices, etc., to quaternions.
+In the Galacean engine, quaternions are also used for rotation-related operations, and APIs are provided for converting Euler angles and matrices to quaternions.
```typescript
import { Vector3, Quaternion, MathUtil } from '@galacean/engine-math';
-// Create Quaternion, the x,y,z is 0, and w is 1.
+// Create a default quaternion, i.e., the x, y, z components are all 0 and the w component is 1
const q1 = new Quaternion();
-// Create a Quaternion and initialize the x, y, z and w components with the given values.
+// Create a quaternion and initialize the x, y, z, w components with the given values
const q2 = new Quaternion(1, 2, 3, 4);
-// Set the specified value.
+// Set the specified values
q1.set(1, 2, 3, 4);
-// Check if the values of two quaternions are equal.
+// Check whether the values of two quaternions are equal
const isEqual: boolean = Quaternion.equals(q1, q2);
const xRad = Math.PI * 0.2;
const yRad = Math.PI * 0.5;
const zRad = Math.PI * 0.3;
-// Generate a quaternion based on yaw (Y), pitch (X), and roll (Z).
+// Generate a quaternion from yaw (Y), pitch (X), and roll (Z)
const out1 = new Quaternion();
Quaternion.rotationYawPitchRoll(yRad, xRad, zRad, out1);
-// Generate a quaternion from rotation Euler angles (in radians) around the x, y, and z axes.
+// Generate a quaternion from the rotation Euler angles (in radians) around the x, y, z axes
const out2 = new Quaternion();
-// Equivalent to Quaternion.rotationYawPitchRoll(yRad, xRad, zRad, out2)
+// Equivalent to Quaternion.rotationYawPitchRoll(yRad, xRad, zRad, out2)
Quaternion.rotationEuler(xRad, yRad, zRad, out2);
-// Generating quaternions for rotations around the X, Y, and Z axes. Let's take rotating around the X axis as an example.
+// Generate a quaternion by rotating around the X, Y, or Z axis, taking the X axis as an example
const out3 = new Quaternion();
Quaternion.rotationX(xRad, out3);
-// The current quaternion rotates successively around the X, Y, and Z axes.
+// Rotate the current quaternion around the X, Y, and Z axes in sequence
const q3 = new Quaternion();
q3.rotateX(xRad).rotateY(yRad).rotateZ(zRad);
-// Retrieve the Euler angles (in radians) from the current quaternion.
+// Get the Euler angles (in radians) of the current quaternion
const eulerV = new Vector3();
q3.toEuler(eulerV);
-// Convert radians to degrees.
-eulerV.scale(MathUtil.radToDegreeFactor);
+// Convert radians to degrees
+eulerV.scale(MathUtil.radToDegreeFactor);
```
## Matrices
-In 3D graphics engines, calculations can be performed in multiple different Cartesian coordinate spaces, and transforming from one coordinate space to another requires the use of transformation matrices, which is the purpose of the Matrix module in our mathematics library.
+In a 3D graphics engine, calculations can be performed in multiple different Cartesian coordinate spaces. Transforming from one coordinate space to another requires the use of transformation matrices, and the Matrix module in our math library exists to provide this capability.
-In Galacean, matrices use the same column-major format as the WebGL standard. For example, a 4x4 matrix with 16 elements is stored in an array as follows:
+In Galacean, matrices are column-major, just like the WebGL standard. For a 4x4 matrix, the 16 elements are stored in an array as follows:
```typescript
const elements: Float32Array = new Float32Array(16);
@@ -142,92 +141,93 @@ elements[3] & elements[7] & elements[11] & elements[15]
\end{bmatrix}
$$
-In the Galacean engine, there are local coordinates, global coordinates, view coordinates, clip coordinates, etc., and the transformation of objects between these coordinates is achieved through transformation matrices.
+In the Galacean engine, there are local coordinates, global coordinates, view coordinates, clip coordinates, etc. The transformation of objects between these coordinates is accomplished through transformation matrices.
-Matrix multiplication order is from right to left. For example, to calculate the MV matrix using the model matrix and the view matrix, the notation is as follows:
+The order of matrix multiplication is from right to left. For example, if we want to calculate the MV matrix using the model matrix and view matrix, the code is as follows:
```typescript
Matrix.multiply(viewMatrix, modelMatrix, mvMatrix);
```
-Below are some commonly used functionalities in matrices:
+Below are some commonly used functions in matrices:
```typescript
import { Vector3, Matrix3x3, Matrix } from '@galacean/engine-math';
-// Create a default 4x4 matrix, initialized as the identity matrix.
+// Create a default 4x4 matrix, initialized as the identity matrix
const m1 = new Matrix();
-// Create a 4x4 matrix and initialize it with the given values.
+// Create a 4x4 matrix and initialize it with the given values
const m2 = new Matrix(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16);
-// Set m2 to the identity matrix.
+// Set m2 to the identity matrix
m2.identity();
-// Check if two matrices have equal values, returning true.
+// Check whether the values of the two matrices are equal: true
const isEqual1: boolean = Matrix.equals(m1, m2);
-// Matrix multiplication, static method.
+// Matrix multiplication, static method
const m3 = new Matrix(1, 2, 3.3, 4, 5, 6, 7, 8, 9, 10.9, 11, 12, 13, 14, 15, 16);
const m4 = new Matrix(16, 15, 14, 13, 12, 11, 10, 9, 8.88, 7, 6, 5, 4, 3, 2, 1);
const out1 = new Matrix();
Matrix.multiply(m3, m4, out1);
-// Matrix multiplication, instance method.
+// Matrix multiplication, instance method
const out2 = m3.multiply(m4);
-// Check if two matrices have equal values, returning true.
+// Check whether the values of the two matrices are equal: true
const isEqual2: boolean = Matrix.equals(out1, out2);
-// Calculate the determinant of a matrix.
+// Compute the determinant of the matrix
const m5 = new Matrix(1, 2, 3, 4, 5, 6, 7, 8, 9, 10.9, 11, 12, 13, 14, 15, 16);
const det: number = m5.determinant();
-// Convert a 4x4 matrix to a 3x3 matrix.
+// Convert a 4x4 matrix to a 3x3 matrix
const m6 = new Matrix3x3();
m6.setValueByMatrix(m5);
-// Create a 4x4 matrix and initialize it with the given values.
+// Create a 4x4 matrix and initialize it with the given values
const m7 = new Matrix(1, 2, 3, 4, 5, 6, 7, 8, 9, 10.9, 11, 12, 13, 14, 15, 16);
-// Compute the transpose of a matrix, using a static method.
+// Compute the transpose of the matrix, static method
Matrix.transpose(m7, m7);
-// Compute the transpose of a matrix, using a instance method.
+// Compute the transpose of the matrix, instance method
m7.transpose();
-// Generate a 4x4 matrix for rotation around the Y-axis.
+// Generate a 4x4 matrix by rotating around the Y axis
const axis = new Vector3(0, 1, 0);
const out4 = new Matrix();
Matrix.rotationAxisAngle(axis, Math.PI * 0.25, out4);
-// Extract rotation, scaling, and translation from a matrix.
+// Extract the rotation, scale, and translation from a matrix
const m8 = new Matrix(4.440892098500626e-16, 2, 0, 0, -2, 4.440892098500626e-16, 0, 0, 0, 0, 2, 0, 0, 10, 10, 1);
-// Storage for translation.
+// Used to store the translation
const translate = new Vector3();
-// Storage for scale.
+// Used to store the scale
const scale = new Vector3();
-// Storage for rotation.
+// Used to store the rotation
const qua = new Quaternion();
m8.decompose(translate, qua, scale);
const rotation = new Vector3();
-// Retrieve the rotation angle in radians for each axis from the acquired quaternion.
+// Get the rotation in radians around each axis from the resulting quaternion
qua.toEuler(rotation);
-// Generate a rotation matrix from a quaternion.
+// Generate a rotation matrix from a quaternion
const m9 = new Matrix();
Matrix.rotationQuaternion(qua, m9);
-// Generate a rotation matrix from rotation angles.
+// Generate a rotation matrix from a rotation angle
const m10 = new Matrix();
Matrix.rotationAxisAngle(new Vector3(0, 0, 1), Math.PI * 0.5, m10);
-// Generate a scaling matrix from scaling factors.
+// Generate a scaling matrix from a scale
const m11 = new Matrix();
Matrix.scaling(scale, m11);
-// Generate a translation matrix from translation values.
+// Generate a translation matrix from a translation
const m12 = new Matrix();
Matrix.translation(translate, m12);
-// Generate a matrix from rotation, scaling, and translation.
+// Generate a matrix from rotation, scale, and translation
const m13 = new Matrix();
-Matrix.affineTransformation(scale, qua, translate, m13);d
+Matrix.affineTransformation(scale, qua, translate, m13);
+
```
## Color
@@ -235,7 +235,7 @@ Matrix.affineTransformation(scale, qua, translate, m13);d
```typescript
import { Color } from "@galacean/engine-math";
-// Create Color.
+// Create a Color object
const color1 = new Color(1, 0.5, 0.5, 1);
const color2 = new Color();
color2.r = 1;
@@ -243,58 +243,56 @@ color2.g = 0.5;
color2.b = 0.5;
color2.a = 1;
-// Convert linear space to gamma space.
+// Convert from linear space to gamma space
const gammaColor = new Color();
color1.toGamma(gammaColor);
-// Convert gamma space to linear space.
+// Convert from gamma space to linear space
const linearColor = new Color();
color2.toLinear(linearColor);
```
## Plane
-We can determine a plane through a vector (normal) and a distance (distance). Normal means the direction of the plane based on the origin of the coordinate system. The plane is perpendicular to normal. Distance means the distance of the plane from the origin of the coordinate system along the normal direction. We take a plane perpendicular to the Y axis with a distance of 5 as an example, as shown below:
+We can define a plane using a vector (normal) and a distance. The normal represents the direction of the plane based on the coordinate origin, and the plane is perpendicular to the normal. The distance represents the distance of the plane from the coordinate origin along the normal direction. For example, a plane perpendicular to the Y-axis with a distance of 5 is illustrated as follows:

-
-The creation method is as follows:
-
+The code to create it is as follows:
```typescript
const normal = new Vector3(0, 1, 0);
const distance = 5;
const plane = new Plane(normal, distance);
```
-Other usage:
+Other usages:
```typescript
import { Plane, Vector3 } from "@galacean/engine-math";
-// Create a plane using the three vertices of a triangle.
+// Create a plane from the three vertices of a triangle
const point1 = new Vector3(0, 1, 0);
const point2 = new Vector3(0, 1, 1);
const point3 = new Vector3(1, 1, 0);
const plane1 = new Plane();
Plane.fromPoints(point1, point2, point3, plane1);
-// Create a plane using the plane's normal and the distance from the origin.
+// Create a plane from its normal and the distance from the origin along the normal
const plane2 = new Plane(new Vector3(0, 1, 0), -1);
```
## Bounding Box
-In Galacean, BoundingBox represents an AABB (Axis-Aligned Bounding Box), which is a simple and efficient type of bounding box commonly used in computer graphics and collision detection. It is defined by a minimum point and a maximum point, forming a rectangle or cuboid (in 3D space) aligned with the coordinate axes.
+In Galacean, BoundingBox represents an AABB (Axis-Aligned Bounding Box), which is a simple and efficient type of bounding box commonly used in computer graphics and collision detection. It is defined by a minimum point and a maximum point, forming a rectangle or cuboid aligned with the coordinate axes (in 3D space).
```typescript
import { BoundingBox, BoundingSphere, Matrix, Vector3 } from "@galacean/engine-math";
-// Create the same bounding box using different methods.
+// Create the same bounding box in different ways
const box1 = new BoundingBox();
const box2 = new BoundingBox();
const box3 = new BoundingBox();
-// Create using the center point and box extent.
+// Create from the center point and the box extent
BoundingBox.fromCenterAndExtent(new Vector3(0, 0, 0), new Vector3(1, 1, 1), box1);
-// Create using multiple points.
+// Create from a set of points
const points = [
new Vector3(0, 0, 0),
new Vector3(-1, 0, 0),
@@ -309,11 +307,11 @@ const points = [
];
BoundingBox.fromPoints(points, box2);
-// Create using a bounding sphere.
+// Create from a bounding sphere
const sphere = new BoundingSphere(new Vector3(0, 0, 0), 1);
BoundingBox.fromSphere(sphere, box3);
-// Transform the bounding box using a matrix.
+// Transform the bounding box with a matrix
const box = new BoundingBox(new Vector3(-1, -1, -1), new Vector3(1, 1, 1));
const matrix = new Matrix(
2, 0, 0, 0,
@@ -324,16 +322,16 @@ const matrix = new Matrix(
const newBox = new BoundingBox();
BoundingBox.transform(box, matrix, newBox);
-// Merge two bounding boxes, box1 and box2, into a new bounding box box.
+// Merge the two bounding boxes box1 and box2 into a new bounding box box
BoundingBox.merge(box1, box2, box);
-// Get the center point and dimensions of the bounding box.
+// Get the center point and extent of the bounding box
const center = new Vector3();
box.getCenter(center);
const extent = new Vector3();
box.getExtent(extent);
-// Get the all vertices of the bounding box.
+// Get the 8 corner vertices of the bounding box
const corners = [
new Vector3(), new Vector3(), new Vector3(), new Vector3(),
new Vector3(), new Vector3(), new Vector3(), new Vector3()
@@ -342,15 +340,14 @@ box.getCorners(corners);
```
## Bounding Sphere
-
```typescript
import { BoundingBox, BoundingSphere, Vector3 } from "@galacean/engine-math";
-// Create a bounding sphere using different methods.
+// Create bounding spheres in different ways
const sphere1 = new BoundingSphere();
const sphere2 = new BoundingSphere();
-// Create a bounding sphere using multiple points.
+// Create from a set of points
const points = [
new Vector3(0, 0, 0),
new Vector3(-1, 0, 0),
@@ -365,30 +362,29 @@ const points = [
];
BoundingSphere.fromPoints(points, sphere1);
-// Create a bounding sphere from a bounding box.
+// Create from a bounding box
const box = new BoundingBox(new Vector3(-1, -1, -1), new Vector3(1, 1, 1));
BoundingSphere.fromBox(box, sphere2);
```
## Frustum
-
```typescript
import { BoundingBox, BoundingSphere, BoundingFrustum,Matrix, Vector3 } from "@galacean/engine-math";
-// Create a frustum based on the View-Projection (VP) matrix. In practical projects, the view matrix and projection matrix are typically obtained from the camera.
+// Create a frustum from the view-projection (VP) matrix; in real projects, the view matrix and projection matrix are usually obtained from the camera
const viewMatrix = new Matrix(1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, -20, 1);
const projectionMatrix = new Matrix(0.03954802080988884, 0, 0, 0, 0, 0.10000000149011612, 0, 0, 0, 0, -0.0200200192630291, 0, -0, -0, -1.0020020008087158, 1);
const vpMatrix = new Matrix();
Matrix.multiply(projectionMatrix, viewMatrix, vpMatrix);
const frustum = new BoundingFrustum(vpMatrix);
-// Check for intersection with an AABB (Axis-Aligned Bounding Box).
+// Check whether the frustum intersects an AABB bounding box
const box1 = new BoundingBox(new Vector3(-2, -2, -2), new Vector3(2, 2, 2));
const isIntersect1 = frustum.intersectsBox(box1);
const box2 = new BoundingBox(new Vector3(-32, -2, -2), new Vector3(-28, 2, 2));
const isIntersect2 = frustum.intersectsBox(box2);
-// Check for intersection with a bounding sphere.
+// Check whether the frustum intersects a bounding sphere
const sphere1 = new BoundingSphere();
BoundingSphere.fromBox(box1, sphere1);
const isIntersect3 = frustum.intersectsSphere(sphere1);
@@ -396,40 +392,36 @@ const sphere2 = new BoundingSphere();
BoundingSphere.fromBox(box2, sphere2);
const isIntersect4 = frustum.intersectsSphere(sphere2);
```
-
## Ray
-A ray is represented as a line extending infinitely in a specified direction (direction) from a point (origin), as follows:
+A ray represents a line that starts from a point (origin) and extends infinitely in a specified direction, as shown below:

-The supported detection types for rays are as follows:
-| type | note |
+The types of detection supported by rays are as follows:
+| Type | Description |
| :--- | :--- |
-| [Plane](/apis/math/#Plane) | Detecting the distance from a ray to a plane; if -1, the ray does not intersect with the plane |
-| [BoundingSphere](/apis/math/#BoundingSphere) | Detecting the distance from a ray to a sphere; if -1, the ray does not intersect with the sphere |
-| [BoundingBox](/apis/math/#BoundingBox) | Detecting the distance from a ray to a box; if -1, the ray does not intersect with the box |
+| [Plane](/en/apis/math/#Plane) | Detects the distance from the ray to the plane. If -1, the ray and the plane do not intersect. |
+| [BoundingSphere](/en/apis/math/#BoundingSphere) | Detects the distance from the ray to the sphere. If -1, the ray and the sphere do not intersect. |
+| [BoundingBox](/en/apis/math/#BoundingBox) | Detects the distance from the ray to the box. If -1, the ray and the box do not intersect. |
```typescript
import { BoundingBox, BoundingSphere, Plane, Ray, Vector3 } from "@galacean/engine-math";
-// Create Ray.
+// Create a ray
const ray = new Ray(new Vector3(0, 0, 0), new Vector3(0, 1, 0));
const plane = new Plane(new Vector3(0, 1, 0), -3);
-// To determine if a ray intersects with a plane:
-// if they intersect, distance represents the distance from the ray's origin to the plane, otherwise, distance is -1.
+// Check whether the ray intersects the plane; if so, distance is the distance from the ray origin to the plane, otherwise distance is -1
let distance = ray.intersectPlane(plane);
const sphere = new BoundingSphere(new Vector3(0, 5, 0), 1);
-// To determine if a ray intersects with a sphere:
-// if they intersect, distance represents the distance from the ray's origin to the sphere, otherwise, distance is -1.
+// Check whether the ray intersects the bounding sphere; if so, distance is the distance from the ray origin to the sphere, otherwise distance is -1
distance = ray.intersectSphere(sphere);
const box = new BoundingBox();
BoundingBox.fromCenterAndExtent(new Vector3(0, 20, 0), new Vector3(5, 5, 5), box);
-// To determine if a ray intersects with a box:
-// if they intersect, distance represents the distance from the ray's origin to the box, otherwise, distance is -1.
+// Check whether the ray intersects the bounding box (AABB); if so, distance is the distance from the ray origin to the box, otherwise distance is -1
distance = ray.intersectBox(box);
-// Point at a specified distance from the ray's origin.
+// Get the point at a specified distance from the ray origin
const out = new Vector3();
ray.getPoint(10, out);
@@ -437,45 +429,45 @@ ray.getPoint(10, out);
## Rand
-The math library has added a random number generator `Rand`, which is based on the `xorshift128+` algorithm (also used in V8, Safari, and Firefox), providing a fast, high-quality, and fully-periodic pseudo-random number generation algorithm.
+The math library has added a random number generator `Rand`, which is based on the `xorshift128+` algorithm (also used in V8, Safari, and Firefox). It is a fast, high-quality, and full-period pseudo-random number generation algorithm.
```typescript
-// Initialize a random number generator instance.
+// Initialize a random number generator instance
const rand = new Rand(0, 0xf3857f6f);
-// Generate a random integer within the range [0, 0xffffffff].
+// Generate random integers in the range [0, 0xffffffff)
const num1 = rand.randomInt32();
const num2 = rand.randomInt32();
const num3 = rand.randomInt32();
-// Generate a random number in the range [0, 1).
+// Generate random numbers in the range [0, 1)
const num4 = rand.random();
const num5 = rand.random();
const num6 = rand.random();
-// Reset the seed.
+// Reset the seed
rand.reset(0, 0x96aa4de3);
```
## CollisionUtil
-CollisionUtil provides a wide range of functions for collision and intersection detection, including:
-| function | note |
+CollisionUtil provides a large number of functions for collision and intersection detection, as follows:
+| Function | Description |
| :--- | :--- |
-| intersectionPointThreePlanes | Calculate the point where three planes intersect |
-| distancePlaneAndPoint | Calculate the distance from a point to a plane |
-| intersectsPlaneAndPoint | Detect the spatial relationship between a point and a plane: in front of the plane (in the direction of the normal), behind the plane, or on the plane |
-| intersectsPlaneAndBox | Detect the spatial relationship between an AABB bounding box and a plane: in front of the plane (in the direction of the normal), behind the plane, or intersecting the plane |
-| intersectsPlaneAndSphere | Detect the spatial relationship between an bounding sphere and a plane: in front of the plane (in the direction of the normal), behind the plane, or intersecting the plane |
-| intersectsRayAndPlane | Check the distance between the plane and the ray. If they do not intersect, return -1.|
-| intersectsRayAndBox | Check the distance between the AABB bounding box and the ray. If they do not intersect, return -1 |
-| intersectsRayAndSphere | Check the distance between the sphere and the ray. If they do not intersect, return -1. |
-| intersectsBoxAndBox | Check if two AABB bounding boxes intersect |
-| intersectsSphereAndSphere | Check if two spheres intersect |
-| intersectsSphereAndBox | Check if the sphere and AABB bounding box intersect |
-| intersectsFrustumAndBox | Check if the view frustum and AABB bounding box intersect |
-| frustumContainsPoint | The spatial relationship between the detection point and the viewing cone: inside the viewing cone, intersecting the viewing cone, outside the viewing cone |
-| frustumContainsBox | Detect the spatial position relationship between the AABB bounding box and the view frustum: inside the view frustum, intersecting with the view frustum, outside the view frustum |
-| frustumContainsSphere | Detect the spatial position relationship between the sphere and the cone: inside the cone, intersecting the cone, outside the cone |
+| intersectionPointThreePlanes | Calculates the intersection point of 3 planes |
+| distancePlaneAndPoint | Calculates the distance from a point to a plane |
+| intersectsPlaneAndPoint | Detects the spatial relationship between a point and a plane: in front of the plane (the side the normal points to), behind the plane, or on the plane |
+| intersectsPlaneAndBox | Detects the spatial relationship between an AABB bounding box and a plane: in front of the plane (the side the normal points to), behind the plane, or intersecting the plane |
+| intersectsPlaneAndSphere | Detects the spatial relationship between a sphere and a plane: in front of the plane (the side the normal points to), behind the plane, or intersecting the plane |
+| intersectsRayAndPlane | Detects the distance between a plane and a ray. If they do not intersect, returns -1 |
+| intersectsRayAndBox | Detects the distance between an AABB bounding box and a ray. If they do not intersect, returns -1 |
+| intersectsRayAndSphere | Detects the distance between a sphere and a ray. If they do not intersect, returns -1 |
+| intersectsBoxAndBox | Detects whether two AABB bounding boxes intersect |
+| intersectsSphereAndSphere | Detects whether two spheres intersect |
+| intersectsSphereAndBox | Detects whether a sphere and an AABB bounding box intersect |
+| intersectsFrustumAndBox | Detects whether a frustum and an AABB bounding box intersect |
+| frustumContainsPoint | Detects the spatial relationship between a point and a frustum: inside the frustum, intersects the frustum, outside the frustum |
+| frustumContainsBox | Detects the spatial relationship between an AABB bounding box and a frustum: inside the frustum, intersects the frustum, outside the frustum |
+| frustumContainsSphere | Detects the spatial relationship between a sphere and a frustum: inside the frustum, intersects the frustum, outside the frustum |
```typescript
import {
@@ -496,11 +488,11 @@ const vpMatrix = new Matrix();
Matrix.multiply(projectionMatrix, viewMatrix, vpMatrix);
const frustum = new BoundingFrustum(vpMatrix);
-// Distance between points and faces.
+// Distance between a point and a plane
const point = new Vector3(0, 10, 0);
let distance = CollisionUtil.distancePlaneAndPoint(plane, point);
-// Determine the spatial relationship between points and surfaces.
+// Determine the spatial relationship between points and the plane
const point1 = new Vector3(0, 10, 0);
const point2 = new Vector3(2, 5, -9);
const point3 = new Vector3(0, 3, 0);
@@ -508,7 +500,7 @@ const intersection1 = CollisionUtil.intersectsPlaneAndPoint(plane, point1);
const intersection2 = CollisionUtil.intersectsPlaneAndPoint(plane, point2);
const intersection3 = CollisionUtil.intersectsPlaneAndPoint(plane, point3);
-// Determine the spatial relationship between the face and the bounding box.
+// Determine the spatial relationship between the plane and bounding boxes
const box1 = new BoundingBox(new Vector3(-1, 6, -2), new Vector3(1, 10, 3));
const box2 = new BoundingBox(new Vector3(-1, 5, -2), new Vector3(1, 10, 3));
const box3 = new BoundingBox(new Vector3(-1, 4, -2), new Vector3(1, 5, 3));
@@ -518,13 +510,13 @@ const intersection22 = CollisionUtil.intersectsPlaneAndBox(plane, box2);
const intersection33 = CollisionUtil.intersectsPlaneAndBox(plane, box3);
const intersection44 = CollisionUtil.intersectsPlaneAndBox(plane, box4);
-// Determine the spatial relationship between rays and planes.
+// Determine the spatial relationship between rays and the plane
const ray1 = new Ray(new Vector3(0, 0, 0), new Vector3(0, 1, 0));
const ray2 = new Ray(new Vector3(0, 0, 0), new Vector3(0, -1, 0));
const distance1 = CollisionUtil.intersectsRayAndPlane(ray1, plane);
const distance2 = CollisionUtil.intersectsRayAndPlane(ray2, plane);
-// Determine the spatial relationship between the view frustum and the bounding box.
+// Determine the spatial relationship between the frustum and bounding boxes
const contain1 = CollisionUtil.frustumContainsBox(frustum, box1);
const contain2 = CollisionUtil.frustumContainsBox(frustum, box2);
const contain3 = CollisionUtil.frustumContainsBox(frustum, box3);
diff --git a/docs/en/core/prefab.md b/docs/en/core/prefab.md
new file mode 100644
index 0000000000..c0d60e9947
--- /dev/null
+++ b/docs/en/core/prefab.md
@@ -0,0 +1,102 @@
+---
+order: 10
+title: Prefab
+type: Core Concept
+label: Core
+---
+
+## Introduction
+
+A prefab is an asset that can be seen as a template storing **[entity](/en/docs/core/entity)** data (including its child entities and components). Prefabs allow developers to create and update this template, then instantiate it into the scene as needed.
+
+If you want to reuse an entity configured in a particular way in multiple places within a scene, or across multiple scenes in a project, such as non-player characters (NPCs), props, or scenery, you should convert this entity into a prefab. This approach is better than simply copying and pasting entities, because prefabs keep all copies synchronized.
+
+### Main features of prefabs:
+
+#### Reusability:
+The same prefab can be used in multiple scenes, and any modifications to the prefab can be automatically applied to all instances.
+
+#### Easy management:
+Prefabs can package complex entities (including their child entities and components) into a single asset, making them easier to manage and organize.
+
+#### Consistency:
+By using prefabs, you can ensure that objects used in different scenes or parts remain consistent.
+
+#### Increased efficiency:
+Extensive use of prefabs can reduce repetitive work, making the development process more efficient.
+
+## Editor Usage
+
+### Creating a Prefab
+
+There are two ways to create a prefab
+
+1. Right-click the entity to create a prefab using this entity as a template
+
+2. Drag the entity from the **[Hierarchy Panel](/en/docs/interface/hierarchy)** to the **[Assets Panel](/en/docs/assets/interface)** to create a prefab using this entity as a template
+
+
+### Updating a Prefab
+
+There are two ways to update a prefab
+
+#### Global Replacement
+
+Drag the entity from the **Hierarchy Panel** to the existing prefab asset in the **Assets Panel**, and the prefab asset will be updated with the new content. All instances generated from this prefab will also be updated with the latest content.
+
+
+The entity can also be an instance of this prefab
+
+
+#### Apply Instance Modifications
+
+Instances of a prefab are generated based on the prefab template, and all instances will change with the template. However, each instance may have its own modifications: adding, deleting, or updating child entities and their components. To make the instance's modifications part of the template, you can do the following:
+
+1. Add child nodes
+
+
+2. Delete child nodes
+
+
+3. Modify child nodes
+
+
+4. Add components
+
+
+5. Update components
+
+
+6. Delete components
+
+
+### Unpack Prefab Instance
+
+If you want a prefab instance to disconnect from the prefab so that it does not change with the prefab, you can unpack the prefab instance
+
+
+### Delete Prefab
+
+A prefab is an asset and is deleted in the same way as other assets. It is worth noting that after a prefab is deleted, there are two ways to handle its instances:
+
+1. All prefab instances are deleted
+
+2. All prefab instances are unpacked
+
+
+
+## Script Usage
+
+### Load Prefab
+The engine object for the prefab asset is [PrefabResource](/en/apis/loader/#PrefabResource).
+After loading (see [Asset Loading](/en/docs/assets/load)), you can instantiate the prefab using the [instantiate](/en/apis/loader/#PrefabResource-instantiate) method.
+
+```typescript
+engine.resourceManager
+ .load({ url: "Prefab's URL", type: AssetType.Prefab })
+ .then((prefab: PrefabResource) => {
+ const prefabInstanceEntity = prefab.instantiate();
+ scene.addRootEntity(prefabInstanceEntity);
+ });
+
+```
diff --git a/docs/en/core/scene.md b/docs/en/core/scene.md
index ab1cdb1778..6cc3dea588 100644
--- a/docs/en/core/scene.md
+++ b/docs/en/core/scene.md
@@ -1,64 +1,65 @@
---
order: 2
title: Scene
-type: Core Concepts
+type: Core Concept
label: Core
---
-As a unit of scene, Scene can easily manage entity trees, especially in large game scenes. For example, **scene1** and **scene2** are two different scenes that can independently manage their own **Entity** trees. Therefore, the lighting components, rendering components, and physics components under the scene are also isolated from each other and do not affect each other. We can render one or more Scenes at the same time, and dynamically switch the current Scene based on project logic at specific times.
+A Scene, as a unit of scene content, makes it easy to manage entity trees, which is especially useful in large game scenes. For example, **scene1** and **scene2**, as two different scenes, can independently manage their own **Entity** trees; the lighting, rendering, and physics components under each scene are therefore isolated from one another and do not interfere. We can render one or more Scenes simultaneously, or dynamically switch the current Scene according to project logic at specific times.
-Structurally, each Engine can contain one or more active scenes (the editor currently does not support multiple scenes). Each Scene can have multiple root entities.
+Structurally, each Engine can contain one or more active scenes (the editor currently does not support multiple scenes). Each Scene can contain multiple root entities.
## Editor Usage
### Creation and Switching
-Right-click in the **[Asset Panel](/en/docs/assets/interface)** (or click the + icon in the upper right corner of the asset panel) to create a scene. Double-click on a scene to switch to it:
+Right-click in the **[Assets Panel](/en/docs/assets/interface)** (or click the + icon at the top right of the assets panel) to create a scene, then double-click a scene to switch to it:

-### Property Panel
+### Properties Panel
### Ambient Light
-For more details, please refer to the [Ambient Light Tutorial](/en/docs/graphics-light-ambient) and [Baking Tutorial](/en/docs/graphics-light-bake).
+For details, please refer to the [Ambient Light Tutorial](/en/docs/graphics/light/ambient/) and [Baking Tutorial](/en/docs/graphics/light/bake/).
### Background
-For more details, please refer to the [Background Tutorial](/en/docs/graphics-background).
+For details, please refer to the [Background Tutorial](/en/docs/graphics/background/background/).
-### Shadows
+### Shadow
-For more details, please refer to the [Shadows Tutorial](/en/docs/graphics-light-shadow).
+For details, please refer to the [Shadow Tutorial](/en/docs/graphics/light/shadow/).
-### Post Process
+### Post-Processing
-For more details, please refer to the [PostProcess Tutorial](/en/docs/graphics-postProcess).
+For details, please refer to the [Post-Processing Tutorial](/en/docs/graphics/postProcess/postProcess/).
### Fog
-You can add **linear, exponential, and exponential squared** fog to the entire scene:
+You can add **linear**, **exponential**, or **exponential squared** fog to the entire scene:

+
## Script Usage
-| Property Name | Description |
-| :------------------------------------------------ | :---------- |
-| [scenes](/apis/core/#SceneManager-scenes) | List of scenes |
+| Property Name | Description |
+| :-------------------------------------------- | :---------- |
+| [scenes](/en/apis/core/#SceneManager-scenes) | Scene list |
-| Method Name | Description |
-| :------------------------------------------------ | :---------- |
-| [addScene](/apis/core/#SceneManager-addScene) | Add a scene |
-| [removeScene](/apis/core/#SceneManager-removeScene) | Remove a scene |
-| [mergeScenes](/apis/core/#SceneManager-mergeScenes) | Merge scenes |
-| [loadScene](/apis/core/#SceneManager-loadScene) | Load a scene |
+| Method Name | Description |
+| :--------------------------------------------------- | :---------- |
+| [addScene](/en/apis/core/#SceneManager-addScene) | Add scene |
+| [removeScene](/en/apis/core/#SceneManager-removeScene) | Remove scene|
+| [mergeScenes](/en/apis/core/#SceneManager-mergeScenes) | Merge scenes|
+| [loadScene](/en/apis/core/#SceneManager-loadScene) | Load scene |
### Loading a Scene
-To load a **Scene** asset as a scene in the application, you can use `engine.resourceManager.load` and pass in the URL.
+If you want to load a **Scene** asset as a scene in the application, you can use `engine.resourceManager.load` and pass in the URL.
```typescript
const sceneUrl = "...";
@@ -72,7 +73,7 @@ engine.resourceManager
### Getting Scene Objects
-By calling `engine.sceneManager.scenes`, you can get all the scenes currently active in the engine runtime. You can also use `entity.scene` to get the `scene` to which the corresponding `entity` belongs.
+By calling `engine.sceneManager.scenes`, you can get all scenes currently active in the engine runtime. You can also get the `scene` a given `entity` belongs to through `entity.scene`.
```typescript
// Get all currently active scenes
@@ -82,9 +83,9 @@ const scenes = engine.sceneManager.scenes;
const scene = entity.scene;
```
-### Adding/Removing Scenes
+### Adding/Removing Scenes
-`engine.sceneManager.scenes` is read-only. If you need to add or remove **Scenes**, you need to call `engine.sceneManager.addScene()` or `engine.sceneManager.removeScene()`. **The engine supports rendering multiple scenes simultaneously**.
+`engine.sceneManager.scenes` is read-only. If you need to add or remove a **Scene**, you need to call `engine.sceneManager.addScene()` or `engine.sceneManager.removeScene()`. **The engine supports rendering multiple scenes simultaneously**.
```typescript
// Assume there are already two scenes
@@ -100,13 +101,13 @@ engine.sceneManager.addScene(scene1);
engine.sceneManager.removeScene(scene2);
```
-Rendering in multiple scenarios is shown below:
+An example of multi-scene rendering is shown below:
-### Merge Scenes
+### Merging Scenes
-You can use `engine.sceneManager.mergeScenes` to merge two scenes into one scene.
+You can use `engine.sceneManager.mergeScenes` to merge two scenes into one.
```typescript
// Assume there are already two inactive scenes
@@ -119,18 +120,18 @@ engine.sceneManager.mergeScenes(sourceScene, destScene);
engine.sceneManager.addScene(destScene);
```
-### Destroy Scene
+### Destroying Scenes
-Call `scene.destroy()` to destroy the scene. The destroyed scene will also be automatically removed from the active scene list.
+Call `scene.destroy()` to destroy a scene. The destroyed scene will also be automatically removed from the active scene list.
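+
+A minimal sketch, taking a scene from the active scene list:
+
+```typescript
+const scene = engine.sceneManager.scenes[0];
+// Destroys the scene and removes it from the active scene list automatically
+scene.destroy();
+```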
### Entity Tree Management
-| Method Name | Explanation |
-| :------------------------------------------------------- | :--------------------------------------------------------------------------------------------------- |
-| [createRootEntity](/apis/core/#Scene-createRootEntity) | By default, a newly created _scene_ does not have a root entity and needs to be created manually |
-| [addRootEntity](/apis/core/#Scene-addRootEntity) | You can create a new entity directly or add an existing entity |
-| [removeRootEntity](/apis/core/#Scene-removeRootEntity) | Remove the root entity |
-| [getRootEntity](/apis/core/#Scene-getRootEntity) | Find the root entity, you can get all root entities or a single entity object. Note that all entities are read-only arrays and cannot change length or order |
+| Method Name | Description |
+| :------------------------------------------------------- | :---------------------------------------------------------------------------------------------------------- |
+| [createRootEntity](/en/apis/core/#Scene-createRootEntity) | A newly created _scene_ has no root entity by default; one needs to be created manually |
+| [addRootEntity](/en/apis/core/#Scene-addRootEntity) | You can directly create a new entity or add an existing entity |
+| [removeRootEntity](/en/apis/core/#Scene-removeRootEntity) | Remove the root entity |
+| [getRootEntity](/en/apis/core/#Scene-getRootEntity) | Find root entities; you can get all root entities or a single entity object. Note that the collection of all root entities is a read-only array whose length and order cannot be changed |
```typescript
const engine = await WebGLEngine.create({ canvas: "demo" });
@@ -153,10 +154,11 @@ const entity2 = scene.getRootEntity(2);
### Others
-It is important to note that after becoming familiar with [Engine](/apis/core/#Engine) and [Scene](/apis/core/#Scene), if you want to output the rendering to the screen or perform off-screen rendering, you must ensure that a [Camera](/apis/core/#Camera) is attached to the entity tree of the current _scene_. The method to attach a camera is as follows:
+Note that once you are familiar with [Engine](/en/apis/core/#Engine) and [Scene](/en/apis/core/#Scene), if you want to output the rendered image to the screen or perform off-screen rendering, you must also make sure a [Camera](/en/apis/core/#Camera) is mounted on the entity tree of the current _scene_. The camera is mounted as follows:
```typescript
const cameraEntity = rootEntity.createChild("camera");
cameraEntity.addComponent(Camera);
```
diff --git a/docs/en/core/space.md b/docs/en/core/space.md
index fd97f35d97..793f45d665 100644
--- a/docs/en/core/space.md
+++ b/docs/en/core/space.md
@@ -5,59 +5,60 @@ type: Core
label: Core
---
-The coordinate system plays a crucial role in the rendering engine, ensuring the accuracy of rendering results and interactions. By reading this document, you can understand most of the coordinate systems involved in Galacean. It is important to note that the definitions of various spaces differ among different rendering engines. This document only discusses the spatial standards in Galacean.
+The coordinate system plays a very important role in the rendering engine, ensuring the accuracy of rendering results and interactions. By reading this document, you can understand most of the coordinate systems involved in Galacean. It is important to note that the definitions of various spaces differ among different rendering engines. This article only discusses the space standards in Galacean.
-This document will horizontally compare various coordinate spaces based on aspects such as `space definition` and `coordinate system type`. The term `coordinate system type` specifically refers to `left-handed coordinate system` and `right-handed coordinate system`, as shown in the following image:
+This article will horizontally compare various coordinate spaces according to the `definition of space` and `types of coordinate systems`. The `types of coordinate systems` specifically refer to the `left-handed coordinate system` and the `right-handed coordinate system`, as shown in the figure below:
-Being defined as a `left-handed coordinate system` or a `right-handed coordinate system` will affect the orientation of `forward` and the direction of rotation (counterclockwise or clockwise). For the definition of orientation, you can imagine aligning your right hand with `+X` and your head with `+Y`, where the direction your face is pointing is considered `forward`. You can easily compare the differences between Galacean and Unity:
+Whether a space is defined as a `left-handed coordinate system` or a `right-handed coordinate system` affects the direction of `forward` and the direction of rotation (clockwise or counterclockwise). For the definition of direction, imagine aligning your right hand with `+X` and the top of your head with `+Y`; the direction your face points toward is `forward`. You can easily compare the differences between Galacean and Unity:
-- In Unity, both the local and world coordinate systems are `left-handed coordinate systems`. When performing pose transformations, rotations are done in a clockwise direction, and the `forward` direction corresponds to `+Z`. Therefore, the camera's orientation (viewing direction) is in the `+Z` direction.
+- Unity's local coordinates and world coordinate system are both `left-handed coordinate systems`. The pose transformation rotates in a clockwise direction, and the corresponding `forward` direction is `+Z`. Therefore, the camera's direction (viewing direction) is `+Z`.
-- In Galacean, both the local and world coordinate systems are `right-handed coordinate systems`. When performing pose transformations, rotations are done in a counterclockwise direction, and the `forward` direction corresponds to `-Z`. Therefore, the camera's orientation (viewing direction) is in the `-Z` direction.
+- Galacean's local coordinates and world coordinate system are both `right-handed coordinate systems`. The pose transformation rotates in a counterclockwise direction, and the corresponding `forward` direction is `-Z`. Therefore, the camera's direction (viewing direction) is `-Z`.
## Local Space
-Local space is relative, using the object's own position as the reference coordinate system. Therefore, when describing it, it is usually expressed as "a point in Node A's local space." Local space is a `right-handed coordinate system`, and the `Transform` component automatically calculates the positions of various points in world space according to the following formula.
+Local space is relative, using the object's own position as the reference coordinate system. Therefore, it is usually described as: "a point in the local space of node A". The local space is a `right-handed coordinate system`. The `Transform` component will automatically calculate the position of each point in the world space according to the following formula.
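+
+A standard formulation of this composition (a sketch; M denotes a transform matrix and P a point):
+
+```math
+M_{world} = M_{parentWorld} \cdot M_{local}, \qquad P_{world} = M_{world} \cdot P_{local}
+```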
## World Space
-World space is absolute, with the root node placed in `world space`, and its child nodes inherit its spatial relationships. Similar to `local space`, `world space` is also a `right-handed coordinate system`. When two nodes are not in the same `local space`, they can be transformed to world space to compare their relative positions.
+World space is absolute. The root node is placed in `world space`, and its child nodes inherit its spatial relationship. Like `local space`, `world space` is also a `right-handed coordinate system`. When two nodes are not in the same `local space`, they can be converted to world space to compare their relative positions.
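+
+For example, a sketch comparing two entities that live in different local spaces (assuming entities `entityA` and `entityB`):
+
+```typescript
+// Both positions are expressed in world space, so they can be compared directly
+const distance = Vector3.distance(
+  entityA.transform.worldPosition,
+  entityB.transform.worldPosition
+);
+```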
## Editor Usage
-Determining the gizmo's posture in the scene
+Determines the orientation of the gizmo in the scene
-| Icon | Option | Description |
-| :-------------------------------------------------------------------------------------------------------------------------------- | :--------- | :------------------------------------------------ |
-| | `Local Coordinates` | Maintains the Gizmo's rotation relative to the selected entity |
-| | `Global Coordinates` | Fixes the Gizmo in the direction of world space. It aligns with the grid direction in the scene |
+| Icon | Option | Content |
+| :-------------------------------------------------------------------------------------------------------------------------------- | :----------- | :--------------------------------------------------- |
+| | `Local Space` | Keep the Gizmo's rotation relative to the selected entity |
+| | `Global Space` | Fix the Gizmo to the world space direction, consistent with the grid direction in the scene |
## View Space
-`View space` refers to the camera's local space. Taking a perspective camera as an example:
+`View space` is the local space of the camera. Taking a perspective camera as an example:
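+
+In script, the view matrix can be read from the camera component (a sketch; `camera` is a `Camera`):
+
+```typescript
+// The view matrix transforms world-space points into the camera's view space;
+// it is the inverse of the camera entity's world matrix
+const viewMatrix = camera.viewMatrix;
+```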
-## Screen Space {/examples}
+## Screen Space
-The definition of screen space is consistent with the front-end specifications, which is a two-dimensional coordinate system with the canvas's top-left corner as the origin. The value range within the space is consistent with the canvas size and is often used in interactions and screen space conversions.
+The definition of screen space is consistent with front-end specifications. It is a two-dimensional coordinate system with the origin at the top-left corner of the canvas. The value range within this space is consistent with the dimensions of the canvas. It is often used in interactions and screen space transformations.
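+
+A minimal sketch of the convention (plain math, not engine API): a pixel position maps to normalized coordinates by dividing by the canvas size:
+
+```typescript
+// Screen space: origin at the canvas's top-left corner, units in pixels
+function toNormalized(x: number, y: number, canvas: { width: number; height: number }) {
+  return { u: x / canvas.width, v: y / canvas.height };
+}
+```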
-## Viewport Space {/examples}
+## Viewport Space
-The definition of viewport space is consistent with the front-end specifications. By setting the camera's viewport, you can control the target rendering area.
+The definition of viewport space is consistent with front-end specifications. By setting the camera's viewport, you can control the target area for rendering.
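+
+For example, a sketch assuming the camera's `viewport` is a normalized `Vector4` of (x, y, width, height):
+
+```typescript
+// Render only to the lower-right quarter of the canvas
+camera.viewport = new Vector4(0.5, 0.5, 0.5, 0.5);
+```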
-## 2D Sprites {/examples}
+## 2D Sprites
-When rendering sprites or masks and other 2D elements, they are placed on the XoY plane in the local coordinate system by default.
+When rendering sprites, masks, or other 2D elements, they are placed by default on the XoY plane of the local coordinate system.
+
diff --git a/docs/en/core/time.md b/docs/en/core/time.md
index 0f0e081077..4bea46b69e 100644
--- a/docs/en/core/time.md
+++ b/docs/en/core/time.md
@@ -5,16 +5,17 @@ type: Core
label: Core
---
-`Time` contains information related to the engine's time:
+`Time` contains information related to engine time:
## Properties
-| Name | Description |
+| Name | Description |
| ------------------------------------------------------ | -------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| [timeScale](/apis/core/#Time-timeScale) | The time scale |
-| [maximumDeltaTime](/apis/core/#Time-maximumDeltaTime) | The maximum interval, in case of low frame rate or lag |
-| [frameCount](/apis/core/#Time-frameCount) | The total number of frames accumulated since the engine started |
-| [deltaTime](/apis/core/#Time-deltaTime) | The incremental time from the previous frame to the current frame, in seconds, will not exceed [maximumDeltaTime](/apis/core/#Time-maximumDeltaTime) \* [timeScale](/apis/core/#Time-timeScale) |
-| [actualDeltaTime](/apis/core/#Time-actualDeltaTime) | The actual incremental time from the previous frame to the current frame, in seconds, and ignores the impact of [timeScale](/apis/core/#Time-timeScale) and [maximumDeltaTime](/apis/core/#Time-maximumDeltaTime) |
-| [elapsedTime](/apis/core/#Time-elapsedTime) | The total elapsed time since the engine started, in seconds |
-| [actualElapsedTime](/apis/core/#Time-actualElapsedTime) | The total elapsed time since the engine started, in seconds |
+| [timeScale](/en/apis/core/#Time-timeScale) | Time scaling |
+| [maximumDeltaTime](/en/apis/core/#Time-maximumDeltaTime) | Maximum interval, in case of low frame rate or stuttering |
+| [frameCount](/en/apis/core/#Time-frameCount) | The cumulative number of frames since the engine started |
+| [deltaTime](/en/apis/core/#Time-deltaTime) | The incremental time from the previous frame to the current frame, in seconds, not exceeding [maximumDeltaTime](/en/apis/core/#Time-maximumDeltaTime) \* [timeScale](/en/apis/core/#Time-timeScale) |
+| [actualDeltaTime](/en/apis/core/#Time-actualDeltaTime) | The actual incremental time from the previous frame to the current frame, in seconds, ignoring the effects of [timeScale](/en/apis/core/#Time-timeScale) and [maximumDeltaTime](/en/apis/core/#Time-maximumDeltaTime) |
+| [elapsedTime](/en/apis/core/#Time-elapsedTime) | The cumulative elapsed time since the engine started, in seconds |
+| [actualElapsedTime](/en/apis/core/#Time-actualElapsedTime) | The actual cumulative elapsed time since the engine started, in seconds, ignoring the effect of [timeScale](/en/apis/core/#Time-timeScale) |
+
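+For example, a sketch of slowing the whole simulation with `timeScale`:
+
+```typescript
+// All deltaTime-driven logic now advances at half speed
+engine.time.timeScale = 0.5;
+```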
diff --git a/docs/en/core/transform.md b/docs/en/core/transform.md
index 51a414fc55..ca77bc1ea3 100644
--- a/docs/en/core/transform.md
+++ b/docs/en/core/transform.md
@@ -7,7 +7,7 @@ label: Core
## Basic Concepts
-`Transform` is a basic component that comes with `Entity`, allowing developers to manage the position, rotation, and scale of the `Entity` in both **local space** and **world space**.
+`Transform` is a fundamental component that comes with `Entity`. Developers can use it to manage the position, rotation, and scale of `Entity` in both **local space** and **world space**.
> Combining with Galacean's **[coordinate system](/en/docs/core/space)** will provide a deeper understanding.
@@ -17,30 +17,30 @@ label: Core
-Modify the visual transformation component of the selected entity by directly manipulating the auxiliary axis icons with the mouse.
+Change the transform component of the selected entity visually by manipulating the auxiliary axis icons directly with the mouse.
- Translation
+ Move
| Icon | Operation | Shortcut |
| :-------------------------------------------------------------------------------------------------------------------------------- | :---------------------- | :------- |
-| | `Switch to Gizmo Move Mode` | W |
+| | `Switch to Gizmo Move Mode` | W |
-Click on the auxiliary axis to drag the selected entity in a single direction. Click on the auxiliary plane to drag the selected entity on a single plane.
+Click an auxiliary axis to drag the selected entity along a single direction. Click an auxiliary plane to drag the selected entity within a single plane.
- Rotation
+ Rotate
| Icon | Operation | Shortcut |
| :-------------------------------------------------------------------------------------------------------------------------------- | :---------------------- | :------- |
-| | `Switch to Gizmo Select Mode` | E |
+| | `Switch to Gizmo Rotate Mode` | E |
Click and drag to change the rotation of the selected entity.
-Red represents rotation around the X-axis, green represents rotation around the Y-axis, and blue represents rotation around the Z-axis.
+Red represents rotation around the X axis, green represents rotation around the Y axis, and blue represents rotation around the Z axis.
- Scale
+ Scale
| Icon | Operation | Shortcut |
| :-------------------------------------------------------------------------------------------------------------------------------- | :---------------------- | :------- |
-| | `Switch to Gizmo Scale Mode` | R |
+| | `Switch to Gizmo Scale Mode` | R |
Click the center cube to scale the selected entity uniformly on all axes. Click an auxiliary axis to scale the selected entity in a single direction.
@@ -81,37 +81,36 @@ cubeEntity.transform.rotate(new Vector3(45, 0, 0), true);
| Property Name | Description |
| :---------------------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------- |
-| [position](/apis/core/#Transform-position) | 局部位移 |
-| [rotation](/apis/core/#Transform-rotation) | 局部旋转 - 欧拉角 |
-| [rotationQuaternion](/apis/core/#Transform-rotationquaternion) | 局部旋转 - 四元数 |
-| [scale](/apis/core/#Transform-scale) | 局部缩放 |
-| [worldPosition](/apis/core/#Transform-worldPosition) | 世界位移 |
-| [worldRotation](/apis/core/#Transform-worldRotation) | 世界旋转 - 欧拉角 |
-| [worldRotationQuaternion](/apis/core/#Transform-worldRotationQuaternion) | 世界旋转 - 四元数 |
-| [lossyWorldScale](/apis/core/#Transform-lossyWorldScale) | 世界有损缩放 - 当父节点有缩放,子节点有旋转时,缩放会倾斜,无法使用 Vector3 正确表示,必须使用 Matrix3x3 矩阵才能正确表示 |
-| [localMatrix](/apis/core/#Transform-localMatrix) | 局部矩阵 |
-| [worldMatrix](/apis/core/#Transform-worldMatrix) | 世界矩阵 |
-| [worldForward](/apis/core/#Transform-worldMatrix) | forward 向量(世界空间中的单位矩阵) |
-| [worldRight](/apis/core/#Transform-worldMatrix) | right 向量(世界空间中的单位矩阵) |
-| [worldUp](/apis/core/#Transform-worldMatrix) | up 向量(世界空间中的单位矩阵) |
-
+| [position](/en/apis/core/#Transform-position) | Local position |
+| [rotation](/en/apis/core/#Transform-rotation) | Local rotation - Euler angles |
+| [rotationQuaternion](/en/apis/core/#Transform-rotationquaternion) | Local rotation - quaternion |
+| [scale](/en/apis/core/#Transform-scale) | Local scale |
+| [worldPosition](/en/apis/core/#Transform-worldPosition) | World position |
+| [worldRotation](/en/apis/core/#Transform-worldRotation) | World rotation - Euler angles |
+| [worldRotationQuaternion](/en/apis/core/#Transform-worldRotationQuaternion) | World rotation - quaternion |
+| [lossyWorldScale](/en/apis/core/#Transform-lossyWorldScale) | Lossy world scale - when the parent node is scaled and the child node is rotated, the scale becomes skewed and cannot be represented correctly by a Vector3; a Matrix3x3 is required to represent it correctly |
+| [localMatrix](/en/apis/core/#Transform-localMatrix) | Local matrix |
+| [worldMatrix](/en/apis/core/#Transform-worldMatrix) | World matrix |
+| [worldForward](/en/apis/core/#Transform-worldMatrix) | Forward vector (unit vector in world space) |
+| [worldRight](/en/apis/core/#Transform-worldMatrix) | Right vector (unit vector in world space) |
+| [worldUp](/en/apis/core/#Transform-worldMatrix) | Up vector (unit vector in world space) |
## Component Methods
-| Method Name | Method Description |
-| -------------------------------------------------------------------- | --------------------------------------------- |
-| [getWorldUp](/apis/core/#Transform-getWorldUp) | Get the world matrix up vector |
-| [getWorldRight](/apis/core/#Transform-getWorldRight) | Get the world matrix right vector |
-| [getWorldForward](/apis/core/#Transform-getWorldForward) | Get the world matrix forward vector |
-| [lookAt](/apis/core/#Transform-lookAt) | Rotate and ensure the world forward vector points to the target world position |
-| [registerWorldChangeFlag](/apis/core/#Transform-registerWorldChangeFlag) | Register a flag for world transformation changes |
-| [rotate](/apis/core/#Transform-rotate) | Rotate based on specified Euler angles |
-| [rotateByAxis](/apis/core/#Transform-rotateByAxis) | Rotate around a specified axis by a specified angle |
-| [translate](/apis/core/#Transform-translate) | Translate based on specified direction and distance |
+| Method Name | Method Description |
+| ----------------------------------------------------------------------- | -------------------------------------- |
+| [getWorldUp](/en/apis/core/#Transform-getWorldUp) | Get the world matrix up vector |
+| [getWorldRight](/en/apis/core/#Transform-getWorldRight) | Get the world matrix right vector |
+| [getWorldForward](/en/apis/core/#Transform-getWorldForward) | Get the world matrix forward vector |
+| [lookAt](/en/apis/core/#Transform-lookAt) | Rotate and ensure the world forward vector points to the target world position |
+| [registerWorldChangeFlag](/en/apis/core/#Transform-registerWorldChangeFlag) | Register world transformation change flag |
+| [rotate](/en/apis/core/#Transform-rotate) | Rotate according to the specified Euler angles |
+| [rotateByAxis](/en/apis/core/#Transform-rotateByAxis) | Rotate around the specified axis by the specified angle |
+| [translate](/en/apis/core/#Transform-translate) | Translate according to the specified direction and distance |
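+
+A small sketch of the methods above (reusing the `cubeEntity` from the earlier snippet):
+
+```typescript
+// Move one unit along the entity's local -Z axis (its forward direction)
+cubeEntity.transform.translate(new Vector3(0, 0, -1), true);
+// Rotate so that the entity's forward vector points at the world origin
+cubeEntity.transform.lookAt(new Vector3(0, 0, 0));
+```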
-### Purpose of `registerWorldChangeFlag`
+### The Role of `registerWorldChangeFlag`
-The `transform` component internally optimizes calculations using dirty flags. Since the `worldMatrix` property of `transform` also optimizes using dirty flags, if external components need to track whether the current `transform`'s `worldMatrix` has changed, they need to access the state of its dirty flag. The `transform` component provides the `registerWorldChangeFlag` method: this method will return an update flag, which will be triggered when the `worldMatrix` of the current `transform` is modified. For specific usage, refer to the camera component:
+The `transform` component uses dirty flags internally for extensive computational optimization. Since the `worldMatrix` property of `transform` is also optimized with a dirty flag, external components that need to track whether the current `transform`'s `worldMatrix` has changed must read the state of that dirty flag. The `transform` component provides the `registerWorldChangeFlag` method: it returns an update flag that is triggered whenever the `worldMatrix` of the current `transform` is modified. For concrete usage, see the camera component:
```typescript
class Camera {
diff --git a/docs/en/device/restore.md b/docs/en/device/restore.md
index 3233941145..bd7aceddda 100644
--- a/docs/en/device/restore.md
+++ b/docs/en/device/restore.md
@@ -5,17 +5,17 @@ type: Resource
label: Device
---
-Since the GPU is a shared resource, in some cases, the GPU may revoke control, causing your program to lose the GPU device. The following situations may lead to device loss:
+Since the GPU is a shared resource, there are situations where the GPU might reclaim control, causing your program's GPU device to be lost. For example, device loss might occur in the following scenarios:
- A page is stuck for too long
-- Multiple pages consume too many GPU resources, causing all pages to lose context and only restoring the foreground page
-- PC device switches graphics cards or updates graphics card drivers
+- Multiple pages occupy too many GPU resources, causing all pages to lose context and only the foreground page to recover
+- PC switches graphics cards or updates graphics card drivers
-After the device is lost, the engine will automatically restore all program content at the appropriate time. Users usually do not need to worry about it. When necessary, users can handle device loss and recovery logic through the following mechanisms.
+After device loss, the engine will automatically recover all content at an appropriate time. Users usually do not need to worry about it. If necessary, users can handle device loss and recovery logic through the following mechanisms.
-### Handling Loss and Recovery
+### Loss and Recovery Handling
-When the GPU device is lost, the `Engine` will dispatch a `devicelost` event, allowing users to implement logic such as user prompts or saving configurations:
+When the GPU device is lost, the `Engine` dispatches a `devicelost` event, in which users can implement logic such as showing a prompt or saving configuration:
```typescript
engine.on("devicelost", () => {
@@ -24,7 +24,7 @@ engine.on("devicelost", () => {
});
```
-The engine supports automatic GPU device recovery. When the program can be restored, the `Engine` will dispatch a `devicerestored` event. The engine will automatically rebuild low-level GPU resources such as textures, buffers, and shaders, and attempt to restore their data content automatically. Resources created through methods provided by the engine, such as Loader and PrimitiveMesh, can be fully restored automatically without any manual intervention. Manual handling is only required when developers modify resource content themselves, such as manually modifying the pixel content of a texture.
+The engine supports automatic GPU device recovery. When the program can recover, the `Engine` dispatches a `devicerestored` event. The engine automatically rebuilds textures, buffers, shaders, and other low-level GPU resources, and attempts to restore their data content automatically. Resources created through the engine's Loader and PrimitiveMesh methods can usually be fully restored automatically, with no action required from developers. Manual handling is only needed when developers modify resource content themselves, such as manually changing a texture's pixel content.
```typescript
engine.on("devicerestored", () => {
@@ -34,9 +34,9 @@ engine.on("devicerestored", () => {
});
```
-### Custom Recovery
+### Custom Restorer
-Another scenario is when resources are entirely created by developers, such as custom [Loader](/en/docs/assets-type) or procedurally generated resources. In addition to handling in the `devicerestored` event as mentioned above, custom content recovery can also be achieved by implementing a custom recoveryer. The following example registers a custom recoveryer for a texture created by the user and registers it with the `ResourceManager`. When the device needs to be restored, the `restoreContent` method will automatically trigger and restore its content.
+Another situation is when resources are created entirely by the developer, such as a custom [Loader](/en/docs/assets/custom) or procedurally generated resources. Besides handling this in the `devicerestored` event as described above, you can also implement a custom content restorer. The following example creates a custom restorer for a user-created texture and registers it with the `ResourceManager`. When the device needs to be restored, the `restoreContent` method is triggered automatically and restores the texture's content.
```typescript
// Step 1: Define content restorer
@@ -69,18 +69,18 @@ resourceManager.addContentRestorer(
);
```
-> Note: It is not recommended for recoveryer implementations to rely on or consume a large amount of CPU memory.
+> Note: It is not recommended for the restorer implementation to rely on or hold large amounts of CPU memory.
### Simulating Device Loss and Recovery
-In actual projects, the probability of triggering device loss and recovery is small. To facilitate developers in testing the program's performance and logic handling after device loss and recovery, the `Engine` provides built-in methods to simulate device loss and recovery.
+In actual projects, the probability of triggering device loss and recovery is relatively low. To facilitate developers in testing the program's performance and logic handling after device loss and recovery, the `Engine` provides built-in methods to simulate device loss and recovery.
-| Method | Description |
-| ---------------------------------------------------------- | ----------------- |
-| [forceLoseDevice](/apis/core/#Engine-forceLoseDevice) | Force device loss |
-| [forceRestoreDevice](/apis/core/#Engine-forceRestoreDevice) | Force device recovery |
+| Method | Description |
+| ----------------------------------------------------------- | ------------ |
+| [forceLoseDevice](/en/apis/core/#Engine-forceLoseDevice) | Force device loss |
+| [forceRestoreDevice](/en/apis/core/#Engine-forceRestoreDevice) | Force device recovery |
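+
+For example, a sketch of exercising both handlers during development:
+
+```typescript
+// Triggers the `devicelost` event path
+engine.forceLoseDevice();
+// ...later, triggers the `devicerestored` event path
+engine.forceRestoreDevice();
+```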
### References
-- "Handling WebGL Context Lost": https://www.khronos.org/webgl/wiki/HandlingContextLost
+- "WebGL Handling Context Lost": https://www.khronos.org/webgl/wiki/HandlingContextLost
diff --git a/docs/en/graphics/2D/2d.md b/docs/en/graphics/2D/2d.md
index 33f4315bbc..5871667d7d 100644
--- a/docs/en/graphics/2D/2d.md
+++ b/docs/en/graphics/2D/2d.md
@@ -6,24 +6,25 @@ group: 2D
label: Graphics/2D
---
-Galacean is a 3D/2D interactive solution. You can quickly experience 2D interactive development by navigating to **Menu View** on the **Editor Homepage** and selecting **Templates** -> **Pixel Bird**.
+Galacean is a 3D/2D interactive solution. You can quickly experience 2D interactive development by **clicking Template** -> **Flappy Bird** in the **Menu View** on the **Editor Homepage**.
-You can also create a blank 2D project by going to **Project View** on the **Editor Homepage** and selecting **New Project** -> **2D Project**.
+You can also create a blank 2D project by **clicking New Project** -> **2D Project** in the **Project View** on the **Editor Homepage**.
-In the editor, there isn't much difference between 2D and 3D projects, except for the change in perspective from 3D to 2D and the default camera being set to **Orthographic**. You can quickly create a 2D child node by selecting a node in the **Hierarchy Panel** and right-clicking -> **2D Object**.
+In the editor, there is not much difference between 2D and 3D projects, except that the perspective switches from 3D to 2D and the default camera is set to **orthographic**. In the **Hierarchy Panel**, you can **select a node and right-click** -> **2D Object** to quickly create a 2D child node.
+In a 2D project, you can attach a **Sprite Renderer** and set **Sprite Assets** to render images, use a **Text Renderer** to render 2D text, use **SpriteMask** to achieve masking effects for 2D elements, and use Lottie or Spine (2D skeletal animation) to display 2D effects. For performance optimization, you can also pack **sprites** into a **sprite atlas** to improve request and rendering performance.
-In a 2D project, you can attach a **Sprite Renderer** and set a **Sprite Asset** to render images, use a **Text Renderer** to render 2D text, implement masking effects for 2D elements with **SpriteMask**, and showcase 2D effects using Lottie or Spine (2D skeletal animation). For performance optimization, you can pack **Sprites** into **Sprite Atlases** to improve request and rendering performance.
+Next, let's dive into the following topics:
+
+- [Sprite](/en/docs/graphics/2D/sprite/)
+- [Sprite Renderer](/en/docs/graphics/2D/spriteRenderer/)
+- [Sprite Mask](/en/docs/graphics/2D/spriteMask/)
+- [Text Renderer](/en/docs/graphics/2D/text/)
+- [Sprite Atlas](/en/docs/graphics/2D/spriteAtlas/)
+- [Lottie](/en/docs/graphics/2D/lottie/)
+- [Spine](/en/docs/graphics/2D/spine/overview)
-Let's delve deeper into the following topics:
-- [Sprite](/en/docs/graphics/2D/sprite)
-- [Sprite Renderer](/en/docs/graphics/2D/spriteRenderer)
-- [Sprite Mask](/en/docs/graphics/2D/spriteMask)
-- [Text Renderer](/en/docs/graphics/2D/text)
-- [Sprite Atlas](/en/docs/graphics/2D/spriteAtlas)
-- [Lottie](/en/docs/graphics/2D/lottie)
-- [Spine](/en/docs/graphics/2D/spine/overview/})
diff --git a/docs/en/graphics/2D/lottie.md b/docs/en/graphics/2D/lottie.md
index a500878114..ea52ea830b 100644
--- a/docs/en/graphics/2D/lottie.md
+++ b/docs/en/graphics/2D/lottie.md
@@ -6,63 +6,63 @@ group: 2D
label: Graphics/2D
---
-[lottie](https://airbnb.io/lottie/) is a cross-platform animation solution released by Airbnb around 2017, which can be used on iOS, Android, React Native, and web. It parses animations from [AE](https://www.adobe.com/products/aftereffects.html) using the Bodymovin plugin and exports json files that can render animations on mobile and web. Designers create animations in AE, export the corresponding json files using Bodymovin, and provide them to front-end developers. Front-end developers can use this json file to directly generate animations that are 100% faithful to the original.
+[lottie](https://airbnb.io/lottie/) is a cross-platform animation solution released by Airbnb around 2017. It can be used on iOS, Android, React Native, and the web. It parses [AE](https://www.adobe.com/products/aftereffects.html) animations through the Bodymovin plugin and exports json files that can render the animations on mobile and the web. Designers create animations in AE, export the corresponding json files with Bodymovin, and hand them to front-end developers, who can use these json files to generate animations that are 100% faithful to the original.
Users can easily handle Lottie assets and add components in Galacean.
### Resource Upload
-It is recommended that designers encode images in base64 format when exporting Lottie files in AE and write them into the json file.
+It is recommended that designers use base64 format for images when exporting Lottie files in AE, embedding them into the Lottie json file.
-After developers receive the `.json` file, they need to upload the file to the Galacean Editor. Select the "lottie" asset from the asset panel by clicking the upload button, choose a local [lottie json](https://github.com/galacean/galacean.github.io/files/14106485/_Lottie.3.json) file, and then upload it:
+After obtaining the `.json` file, developers need to upload it to the Galacean Editor. Click the upload button in the asset panel, select the "lottie" asset type, choose a local [lottie json](https://github.com/galacean/galacean.github.io/files/14106485/_Lottie.3.json) file, and upload it:
### Add Component
-Select an entity, add a Lottie component, choose the resource uploaded in the previous step, and the Lottie effect will be displayed and played:
+Select an entity, add the Lottie component, and choose the resource as the asset uploaded in the previous step to display and play the Lottie effect:

-Developers can adjust various parameters in the property panel to configure Lottie:
+Developers can adjust various parameters in the properties panel to configure the Lottie:

-
| Property | Description |
| :--- | :--- |
| `resource` | Select Lottie asset |
-| `autoPlay` | Whether to autoplay, default is true |
-| `isLooping` | Whether to loop, default is true |
-| `speed` | Playback speed, `1` is normal speed, larger values play faster |
-| `priority` | Rendering priority, the smaller the value, the higher the priority, and the earlier it is rendered |
+| `autoPlay` | Whether to play automatically; default is true |
+| `isLooping` | Whether to loop playback; default is true |
+| `speed` | Playback speed; `1` is the original speed, and larger values play faster |
+| `priority` | Rendering priority; the smaller the value, the higher the priority |
-Sometimes developers may need to dynamically configure Lottie during runtime. Add the following code in the script component:
+Sometimes developers may need to dynamically set Lottie at runtime. Add the following code in the script component:
```typescript
-// First find the entity where Lottie is located (lottieEntity), then get the LottieAnimation.
+// First find the entity (lottieEntity) that the Lottie component is attached to, then get the LottieAnimation component
const lottie = lottieEntity.getComponent(LottieAnimation);
-// Set lottie's property
+// Set properties on the Lottie component
lottie.speed = 2;
```
-Sometimes developers only upload Lottie assets in the editor and dynamically create the LottieAnimation when needed. The usage is as follows:
+Sometimes developers only upload Lottie resources in the editor and dynamically create Lottie components when needed. Use the following method:
```typescript
-// Dynamically load Lottie assets from the editor.
+// Dynamically load the Lottie asset from the editor
const lottieResource = await engine.resourceManager.load({url: '/光球.json', type: 'EditorLottie'});
-// Add LottieAnimation.
+// Add the LottieAnimation component to an entity
const lottie = entity.addComponent(LottieAnimation);
-// Set lottie's resource.
+// Set the Lottie resource on the component
lottie.resource = lottieResource;
```
-Additionally, the Lottie component provides 2 APIs to control animation playback and pause:
+
+Additionally, the Lottie component provides two APIs to control playing and pausing the animation:
| Method | Description |
| :--- | :--- |
-| `play` | Play animation, passing in the animation segment name will play a specific animation segment |
+| `play` | Plays the animation; passing an animation segment name plays that specific segment |
| `pause` | Pause animation |
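+
+For example, a minimal sketch of driving playback with these two APIs (assuming `lottieEntity` already carries a `LottieAnimation` component):
+
+```typescript
+const lottie = lottieEntity.getComponent(LottieAnimation);
+// Play the whole animation (pass a segment name to play a slice instead)
+lottie.play();
+// Pause it later, e.g. in response to user input
+lottie.pause();
+```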
-### Listen for Playback End
+### Listen for Playback End
-Often, there is a need to listen for the end of Lottie animation playback, such as running some business logic when the animation ends. The `play` method of `LottieAnimation` returns a `Promise`, making it easy to listen for the end of the animation:
+Often, we need to listen for the end of a Lottie animation to run some business logic. The `play` method of `LottieAnimation` returns a `Promise`, making it easy to listen for the end of the animation:
```typescript
const lottie = lottieEntity.getComponent(LottieAnimation);
@@ -70,15 +70,15 @@ await lottie.play();
// do something next..
```
-### Slicing Functionality
+### Slicing Feature
-The editor provides a feature to slice animations, dividing the entire segment provided by the designer into multiple segments. Each segment needs to define three fields: segment name, start frame, and end frame.
+The editor provides an animation slicing feature that lets you cut the full animation provided by the designer into multiple segments. Each segment needs three fields: segment name, start frame, and end frame.
-This operation will add a `lolitaAnimations` field in the Lottie protocol to implement animation slicing.
+This operation adds a `lolitaAnimations` field to the Lottie protocol to implement animation slicing:
```json
"lolitaAnimations": [
@@ -95,10 +95,9 @@ This operation will add a `lolitaAnimations` field in the Lottie protocol to imp
]
```
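+
+Once the slices are defined, a segment can be played by passing its name to `play` (a sketch; `clip_a` is a hypothetical segment name):
+
+```typescript
+// Play only the frames belonging to the "clip_a" slice
+lottie.play("clip_a");
+```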
+### Install Dependencies
-### Installing Dependencies
-
-@galacean/engine-lottie is a second-party package of Galacean Engine. When using Lottie in your project, make sure to install this package in your project:
+@galacean/engine-lottie is a second-party package of Galacean Engine. When using Lottie in your project, make sure this package is installed:
```bash
npm i @galacean/engine-lottie --save
@@ -106,7 +105,7 @@ npm i @galacean/engine-lottie --save
### Pro Code Development Mode
-When developing in `Pro Code` mode, you need a `json` file and an `atlas` file to implement the `lottie` animation. Usually, when artists export from After Effects (AE), they only provide the `json` file to developers. In this case, you need to use the [tools-atlas-lottie](https://www.npmjs.com/package/@galacean/tools-atlas-lottie) `CLI` tool to generate the `atlas` file.
+When developing in `Pro Code` mode, you need a `json` file and an `atlas` file to implement `lottie` animations. Usually, the art team exports only the `json` file through `AE`. In this case, you need to use the [tools-atlas-lottie](https://www.npmjs.com/package/@galacean/tools-atlas-lottie) `CLI` tool to generate the `atlas` file.
```typescript
import { LottieAnimation } from "@galacean/engine-lottie";
@@ -132,40 +131,39 @@ engine.resourceManager.load({
+### 3D Transform
-### 3D Transformation
-
-There is often a need for 3D transformations in business scenarios, such as entry animations for pop-ups. Taking rotation as an example, traditional lottie-web solutions can only rotate along the **Z-axis** (i.e., perpendicular to the screen normal direction). Even if we achieve rotation along the **X-axis** or **Y-axis** in AE, it will be ignored when played using lottie-web.
+In business scenarios, there is often a need for 3D transformations, such as entrance animations for pop-ups. Take rotation as an example: traditional lottie-web solutions can only rotate around the **Z-axis** (that is, the axis perpendicular to the screen). Even if we create a rotation around the **X-axis** or **Y-axis** in AE, it is ignored when played with lottie-web.
-Thanks to the unified architecture of Galacean Engine's 2D/3D engine, 3D transformation can be easily implemented.
+Thanks to Galacean Engine's unified 2D/3D architecture, such 3D transformations can be implemented with ease.
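+
+Since the `LottieAnimation` component lives on an ordinary entity, a minimal sketch of such a 3D transform is simply rotating that entity (assuming `lottieEntity` holds the Lottie component):
+
+```typescript
+// Rotate the Lottie animation 60 degrees around the Y-axis in 3D space
+lottieEntity.transform.setRotation(0, 60, 0);
+```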
-## Performance Recommendations
-
-- Simplify animations. When creating animations, always remember to keep the json file concise, for example, avoid using path keyframe animations that occupy the most space. Techniques like auto-trace drawing and jitter can make the json file very large and performance-intensive.
-- If there are looping frames, do not loop them within the animation file. Count the number of frames and let developers control the loop of this animation segment, which can save space for the same layers and animations.
-- Create shape layers. Convert resources like AI, EPS, SVG, and PDF into shape layers; otherwise, they cannot be used normally in lottie. After conversion, remember to delete the resource to prevent it from being exported to the json file.
-- Set dimensions. In AE, you can set the composition size to any size, but make sure the export size matches the resource size.
-- Trim paths appropriately for performance impact.
-- When animating with lottie, it layers according to AE's design, so try to minimize the number of layers.
-- If path animations are not necessary, replace vector shapes with PNG images and animate using the transform property.
-- Consider reducing animation frame rate or keyframe count based on actual conditions, which will reduce the number of drawings per second.
-- Shorten animation duration. For looping actions, avoid duplicating them on the timeline; each read of a keyframe consumes performance. Try to avoid a sequence where action a ends and action b begins; overlapping actions can reduce animation length.
-- Merge similar items. Some elements are similar or identical but used in different places. Pre-compose these elements for reuse, adjusting the animation properties of this pre-composition to achieve the desired animation effect.
-- Minimize the number of layers. Each layer will be exported as corresponding json data, so reducing layers can significantly reduce json size.
-- Draw all layers in AE rather than importing from other software. Importing from other software may result in a large json section describing this graphic.
-- When creating, fill the animation elements **across** the entire canvas to avoid waste and facilitate size adjustments by the front end.
-- If vector shapes are exported from AI, delete unnecessary "groups" and other elements that have no practical use.
-- Remove closed and unused properties.
-- Export only 1x images.
-- To avoid compatibility issues with lottie exports, try to use the English version of AE, keep the layers concise, and name them clearly.
-- Avoid large areas of vector parts and large particle effects.
-
-### Lottie Version
+### Version Dependencies
| Engine Version | Lottie Version |
| :--- | :--- |
| 1.2.x | 1.1.0-beta.0 |
| 1.3.x | 1.1.0-beta.0 |
+
+## Performance Recommendations
+
+- Simplify animations. Always keep the json file streamlined when creating animations, for example, by avoiding the use of path keyframe animations that take up the most space. Techniques like auto-tracing and wiggling can make the json file very large and performance-intensive.
+- If there are looping frames, do not bake the loop into the animation file. Count the frames and let the developer control the looping of that segment, which saves the size of duplicated layers and animations.
+- Create shape layers. Convert resources like AI, EPS, SVG, and PDF into shape layers; otherwise, they cannot be used properly in lottie. After conversion, be sure to delete the resource to prevent it from being exported to the json file.
+- Set dimensions. In AE, you can set the composition size to any size, but make sure the composition size and resource size are consistent when exporting.
+- Use Trim Paths only as needed to achieve the effect, as it has a significant performance impact.
+- Lottie will layer according to AE's design during animation, so try to reduce the number of layers.
+- If path animation is not necessary, replace vector graphics with png images and use the transform attribute to complete the animation.
+- Depending on the actual situation, consider lowering the animation frame rate or reducing the number of keyframes, which will reduce the number of drawings per second.
+- Shorten the animation duration. For actions that can loop, do not lay them out twice on the timeline; every keyframe read consumes performance. Avoid arrangements where action A ends before action B begins; overlapping the actions reduces the overall animation length.
+- Merge similar items. If some elements are similar or used in different places, pre-compose them and reuse the pre-composition; the desired animation effect can be achieved by adjusting the pre-composition's animation properties.
+- Minimize the number of layers. Each layer will be exported as corresponding json data, and reducing layers can significantly reduce the json size.
+- Try to ensure all layers are drawn in AE rather than imported from other software. If imported from other software, the json part describing this graphic may become very large.
+- When creating, make sure the animation elements **cover** the entire canvas to avoid waste and facilitate front-end size adjustments.
+- If vector graphics are exported from AI, delete unnecessary "groups" and other elements that have no practical use.
+- Delete closed and unused properties.
+- Only export 1x images.
+- To prevent compatibility issues with lottie exports, try to use the English version of AE, keep layers simple, and name them clearly.
+- Avoid large vector parts and large particle effects.
diff --git a/docs/en/graphics/2D/spine/editor.md b/docs/en/graphics/2D/spine/editor.md
new file mode 100644
index 0000000000..5fd452e207
--- /dev/null
+++ b/docs/en/graphics/2D/spine/editor.md
@@ -0,0 +1,130 @@
+---
+order: 1
+title: Using in the Editor
+type: Graphics
+group: Spine
+label: Graphics/2D/Spine/editor
+---
+
+The Galacean editor has built-in support for Spine animations; no additional downloads or configuration are needed, which makes the development process much simpler. This chapter introduces how to use Spine animations in the Galacean editor.
+
+> For editor version dependencies, please refer to: [Version/Performance Chapter](/en/docs/graphics/2D/spine/other)
+
+## 1. Export Assets from Spine Editor
+The first step is to export your Spine animation assets from the Spine editor. You can find the complete steps in the [Spine User Guide](https://zh.esotericsoftware.com/spine-user-guide), which explains how to:
+
+1. [Export skeleton and animation data](https://zh.esotericsoftware.com/spine-export)
+2. [Export texture atlases containing skeleton images](https://zh.esotericsoftware.com/spine-texture-packer)
+
+Below is a brief process of exporting assets from Spine:
+
+1. After completing the animation, click `Spine Menu` > `Export` to open the export window
+
+
+
+2. Select **Binary** in the upper left corner of the export window (binary is recommended: exporting in binary format instead of JSON makes the file smaller and faster to load)
+
+
+
+3. Check the **Texture Atlas** packing checkbox
+
+
+
+4. Click **Packing Settings**
+
+Here it is recommended to check `Power of 2`; do not check `Premultiply` or `Bleed`
+After completing the packing settings, click **OK**
+
+
+5. Return to the export window, select the export folder, and click **Export**
+
+
+
+6. You will get the following three files:
+
+
+
+spineboy.skel contains the skeleton animation data, spineboy.atlas contains the texture atlas information, and there may be multiple exported images, each representing a page in the texture atlas.
+
+## 2. Import Assets into the Galacean Editor
+After exporting assets from the Spine editor, the second step is to import the assets into the Galacean editor. Open the editor and drag the exported files directly into the [Assets Panel](/en/docs/assets/interface/) to complete the upload.
+
+
+
+You can also click the upload button in the assets panel to upload:
+
+
+
+After the upload is complete, you will see the uploaded spine assets in the assets panel.
+
+### SpineSkeletonData Asset
+
+
+
+The SpineSkeletonData asset stores skeleton data and references the generated SpineAtlas asset.
+After clicking the asset, you can preview the Spine animation in the inspector. In the preview panel, you can switch `skins` and `animation clips`:
+
+
+
+### SpineAtlas Asset
+
+
+
+The SpineAtlas asset stores the texture atlas file and includes references to the required Texture assets.
+After clicking the asset, you can view its referenced Texture assets and Spine's atlas information in the inspector.
+
+
+
+### Asset Update
+If you need to update your Spine assets, re-export the assets from the Spine editor and re-import them into the Galacean editor to overwrite the original files.
+
+
+## 3. Adding Spine Animation
+
+After uploading the assets, the third step is to add the Spine animation to the scene. There are three ways to do this:
+
+1. Drag and Drop to Add
+
+Drag and drop is the quickest way. Click on the SpineSkeletonData asset, hold it, and drag it into the viewport. This will quickly create an entity with the SpineAnimationRenderer component added, and the asset will be set to the selected SpineSkeletonData asset.
+
+
+
+2. Quick Add
+
+Click the quick add button in the top left corner, select `2D Object`>`SpineAnimationRenderer`,
+
+
+
+After adding, you will see a new entity with the SpineAnimationRenderer component attached. Click the Resource property and select the uploaded SpineSkeletonData asset to see the Spine animation.
+
+
+
+3. Manual Add
+
+The manual add method is similar to the quick add method, but you need to manually create a new entity in the node tree and add the SpineAnimationRenderer component via the AddComponent button in the inspector.
+
+
+
+添加了 SpineAnimationRenderer 组件后,同样需要指定组件的 Resource,也就是 SpineAnimationRenderer 组件要渲染的 SpineSkeletonData 资产。
+
+### SpineAnimationRenderer 组件配置 {/*examples*/}
+以上三种添加 Spine 动画的方法实际上本质其实是相同的,都是通过给实体 `添加 SpineAnimationRenderer 组件` ,来让 Spine 动画添加至场景中的。
+
+SpineAnimationRenderer 组件的配置如下:
+
+
+
+通过 SpineAnimationRenderer 组件能够配置 Spine 动画的资产以及默认状态:
+
+- Resource:Spine 动画的资源 ( SpineSkeletonData 资产 )
+- Animation:默认播放的动画名称
+- Loop:默认播放的动画是否循环
+- Skin:默认的皮肤名称
+- Scale:默认的缩放系数
+- Priority:渲染优先级
+
+## 4. 项目导出 {/*examples*/}
+最终,完成场景编辑器后,可以参考[项目导出](/en/docs/assets/build/)流程,导出编辑器项目。
+
+
+下一章节:[在代码中使用 Galacean Spine 运行时](/en/docs/graphics/2D/spine/runtime)
diff --git a/docs/en/graphics/2D/spine/example.md b/docs/en/graphics/2D/spine/example.md
new file mode 100644
index 0000000000..aa6b57e5b8
--- /dev/null
+++ b/docs/en/graphics/2D/spine/example.md
@@ -0,0 +1,41 @@
+---
+order: 3
+title: Spine Example
+type: Graphics
+group: Spine
+label: Graphics/2D/Spine/example
+---
+
+**Animation Control**
+
+This example demonstrates how to orchestrate a spine animation queue using the setAnimation and addAnimation APIs:
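+
+A minimal sketch of such a queue, assuming `spineRenderer` is the entity's SpineAnimationRenderer component and that it exposes the standard Spine `AnimationState` through a `state` property ("walk" and "jump" are hypothetical clip names):
+
+```typescript
+const { state } = spineRenderer; // assumed accessor for the Spine AnimationState
+// Play "walk" immediately on track 0, looping
+state.setAnimation(0, "walk", true);
+// Queue "jump" to start 2 seconds after the current entry, without looping
+state.addAnimation(0, "jump", false, 2);
+```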
+
+
+**Follow Shooting**
+
+This example demonstrates how to achieve aiming and shooting effects by modifying the IK bone position:
+
+
+**Partial Skin Change**
+
+This example demonstrates how to achieve partial skin change by modifying the attachments in the slots:
+
+
+**Full Skin Change**
+
+This example demonstrates how to achieve a full skin change using the setSkin method:
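+
+Under the hood this relies on the standard Spine `Skeleton` API (a sketch; the `skeleton` accessor on `spineRenderer` and the "boy" skin name are assumptions):
+
+```typescript
+const { skeleton } = spineRenderer; // assumed accessor for the Spine Skeleton
+skeleton.setSkinByName("boy"); // hypothetical skin name
+skeleton.setSlotsToSetupPose(); // reset attachments to the new skin's setup pose
+```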
+
+
+**Skin Mixing**
+
+This example demonstrates how to achieve mixing effects by combining new skins at runtime:
+
+
+**Physics**
+
+This example demonstrates physics-based animation effects in spine version 4.2:
+
+
+
+
+Next Chapter: [Version and Performance](/en/docs/graphics/2D/spine/other)
diff --git a/docs/en/graphics/2D/spine/other.md b/docs/en/graphics/2D/spine/other.md
new file mode 100644
index 0000000000..e4d5764a11
--- /dev/null
+++ b/docs/en/graphics/2D/spine/other.md
@@ -0,0 +1,32 @@
+---
+order: 4
+title: Version and Performance
+type: Graphics
+group: Spine
+label: Graphics/2D/Spine/other
+---
+
+### Spine Version
+@galacean/engine-spine has supported Spine 4.x since version 1.2.
+From version 1.2 onwards, the major and minor versions of the @galacean/engine-spine package correspond exactly to the Spine version, as follows:
+- @galacean/engine-spine <= 1.2 corresponds to spine version 3.8
+- @galacean/engine-spine 4.0 corresponds to spine version 4.0
+- @galacean/engine-spine 4.1 corresponds to spine version 4.1
+- @galacean/engine-spine 4.2 corresponds to spine version 4.2
+- .....
+
+Currently, the 4.2 beta version has been released, and versions 4.1 and 4.0 will be released gradually.
+
+### Version Upgrade
+After upgrading to editor version 1.3, besides upgrading the engine version in the editor's [project settings](/en/docs/interface/menu/#项目设置), note that the Spine editor version of the exported JSON or binary assets must [stay consistent](https://zh.esotericsoftware.com/spine-versioning#%E5%90%8C%E6%AD%A5%E7%89%88%E6%9C%AC) with the runtime version. So after upgrading the editor to 1.3, `you also need to re-export your Spine assets as version 4.2 and upload them to the editor, overwriting the original files to complete the asset update`.
+
+### Performance Suggestions
+Here are some methods to optimize spine animation performance:
+
+1. Export the skeleton in binary file (.skel) format, as binary files are smaller and load faster.
+2. It is recommended to pack attachments into as few atlas pages as possible, and group attachments into atlas pages according to the drawing order to prevent unnecessary material switching. Please refer to: [Spine Texture Packer: Folder Structure](https://zh.esotericsoftware.com/spine-texture-packer#%E6%96%87%E4%BB%B6%E5%A4%B9%E7%BB%93%E6%9E%84) to learn how to arrange atlas regions in your Spine atlas.
+3. Use the clipping feature sparingly. Spine's clipping implementation is done through dynamic triangle clipping, which is very performance-intensive.
+4. Minimize the number of atlas page textures; that is, try to keep the number of exported textures to one.
+
+### Questions
+For any questions about Spine, feel free to [create an issue](https://github.com/galacean/engine-spine/issues/new) on @galacean/engine-spine.
diff --git a/docs/en/graphics/2D/spine/overview.md b/docs/en/graphics/2D/spine/overview.md
new file mode 100644
index 0000000000..80d0c19cc6
--- /dev/null
+++ b/docs/en/graphics/2D/spine/overview.md
@@ -0,0 +1,23 @@
+---
+order: 0
+title: Spine Overview
+type: Graphics
+group: Spine
+label: Graphics/2D/Spine/overview
+---
+
+Spine animation is a 2D skeletal animation tool designed for game development. It binds images to bones and then controls the bones to create animations. It gives programmers the control and flexibility they need over animations, while also providing a more efficient and streamlined workflow for artists and designers.
+Compared to traditional frame-by-frame animation, Spine animation has the following advantages:
+
+- **Smaller size:** Traditional animations require providing an image for each frame. Spine animation only saves the animation data of the bones, which takes up very little space.
+- **Lower art requirements:** Spine animation requires fewer art resources, freeing up manpower and budget to invest in game development.
+- **Smoothness:** Spine animation uses interpolation algorithms to calculate intermediate frames, ensuring that your animations always remain smooth.
+- **Equip attachments:** Images are bound to bones to create animations. If needed, you can easily change the character's equipment to meet different requirements. You can even change the character's appearance to achieve animation reuse.
+- **Blending:** Animations can be blended. For example, a character can shoot while walking, running, jumping, or swimming.
+- **Programmatic animation:** Bones can be controlled through code, allowing for effects such as shooting following the mouse, looking at enemies, or leaning forward when going uphill.
+
+This section will introduce:
+- [How to use Spine animation in the Galacean editor](/en/docs/graphics/2D/spine/editor)
+- [How to use the Galacean spine runtime in code](/en/docs/graphics/2D/spine/runtime)
+- [Spine animation examples](/en/docs/graphics/2D/spine/example)
+- [Other content (versions, performance)](/en/docs/graphics/2D/spine/other)
diff --git a/docs/en/graphics/2D/sprite.md b/docs/en/graphics/2D/sprite.md
index f0cae2e8c9..aaa5229132 100644
--- a/docs/en/graphics/2D/sprite.md
+++ b/docs/en/graphics/2D/sprite.md
@@ -6,20 +6,20 @@ group: 2D
label: Graphics/2D
---
-[Sprite](/apis/core/#Sprite) is the most important asset in 2D projects. It retrieves graphic data from [Texture2D](/en/docs/graphics-texture-2d) and customizes the desired rendering result by setting properties such as [region](/apis/core/#Sprite-region) and [pivot](/apis/core/#Sprite-pivot). When assigned to a [SpriteRenderer](/apis/core/#SpriteRenderer), a node with a sprite renderer can display 2D images in a 3D space. When assigned to a [SpriteMask](/en/docs/graphics/2D/spriteMask), a node with a sprite mask can achieve masking effects on corresponding 2D elements. Let's delve deeper into the properties and usage of sprites.
+[Sprite](/en/apis/core/#Sprite) is the most important asset in 2D projects. It obtains graphical source data from [Texture2D](/en/docs/graphics/texture/2d/) and customizes the desired rendering result by setting properties such as [region](/en/apis/core/#Sprite-region) and [pivot](/en/apis/core/#Sprite-pivot). If assigned to a [SpriteRenderer](/en/apis/core/#SpriteRenderer), the node with the sprite renderer can display 2D images in three-dimensional space. If assigned to a [SpriteMask](/en/docs/graphics/2D/spriteMask/), the node with the sprite mask can achieve masking effects for corresponding 2D elements. Next, let's delve into the properties and usage of sprites.
## Properties
-| Property | Type | Description |
-| :------------------------------------ | :-------------------------------- | :------------------------------------------------------------------------------------------------------ |
-| [texture](/apis/core/#Sprite-texture) | [Texture2D](/apis/core/#Texture2D) | Reference to the texture |
-| [width](/apis/core/#Sprite-width) | Number | The width of the sprite. If the developer does not customize the sprite width, it defaults to texture pixel width / 100 |
-| [height](/apis/core/#Sprite-height) | Number | The height of the sprite. If the developer does not customize the sprite height, it defaults to texture pixel height / 100 |
-| [region](/apis/core/#Sprite-region) | [Rect](/apis/math/#Rect) | The position of the sprite on the original texture, ranging from 0 to 1 |
-| [pivot](/apis/core/#Sprite-pivot) | [Vector2](/apis/math/#Vector2) | The position of the sprite's center in the region on the original texture, ranging from 0 to 1 |
-| [border](/apis/core/#Sprite-border) | [Vector4](/apis/math/#Vector4) | When the renderer's drawing mode is nine-slice or tiling, the border configuration affects the final rendering result. x, y, z, w correspond to the distances from the left, bottom, right, and top edges respectively |
+| Property Name | Property Type | Description |
+| :----------------------------------- | :-------------------------------- | :------------------------------------------------------------------------------------------------------ |
+| [texture](/en/apis/core/#Sprite-texture) | [Texture2D](/en/apis/core/#Texture2D) | Reference to the texture used |
+| [width](/en/apis/core/#Sprite-width) | Number | Width of the sprite. If the developer does not customize the sprite width, it defaults to the texture pixel width / 100 |
+| [height](/en/apis/core/#Sprite-height) | Number | Height of the sprite. If the developer does not customize the sprite height, it defaults to the texture pixel height / 100 |
+| [region](/en/apis/core/#Sprite-region) | [Rect](/en/apis/math/#Rect) | Position of the sprite on the original texture, range 0 to 1 |
+| [pivot](/en/apis/core/#Sprite-pivot) | [Vector2](/en/apis/math/#Vector2) | Position of the sprite's center point in the region on the original texture, range 0 to 1 |
+| [border](/en/apis/core/#Sprite-border) | [Vector4](/en/apis/math/#Vector4) | When the renderer's drawing mode is nine-slice or tiled, the border configuration affects the final rendering effect. x, y, z, w correspond to the distances from the left, bottom, right, and top edges, respectively |
-The region determines the content displayed by the sprite, allowing you to select a rectangular area in the texture for display, with any excess automatically filtered out, as shown below:
+The region determines the display content of the sprite. You can select a rectangular area in the texture to display, and the excess part will be automatically filtered out, as shown below:
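+
+From a script, the region can be set with a normalized `Rect` (a sketch; the `(x, y, width, height)` values all range from 0 to 1):
+
+```typescript
+// Show only the bottom-left quarter of the texture
+sprite.region = new Rect(0, 0, 0.5, 0.5);
+```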
@@ -33,33 +33,33 @@ The pivot represents the position of the sprite's center in the region, as shown
#### Upload Sprite
-To upload a sprite asset, right-click on a blank space in the **[Assets Panel](/en/docs/assets/interface)**, then select **Upload** → **Sprite** → **Choose the corresponding image**. This will upload the sprite asset successfully, and the current asset list will synchronize with a texture asset named `image_name.png` and a sprite asset named `image_name-spr.png`.
+In the **[Assets Panel](/en/docs/assets/interface/)**, right-click on the blank area and select **Upload** → **Sprite** → **Select the corresponding image** to upload the sprite asset. After a successful upload, the current asset list will synchronously add a texture asset named `image_name.png` and a sprite asset named `image_name-spr.png`.
-
+
#### Create Blank Sprite
-To create a blank sprite asset, right-click on a blank space in the **[Assets Panel](/en/docs/assets/interface)**, then select **Create** → **Sprite**.
+In the **[Assets Panel](/en/docs/assets/interface/)**, right-click on the blank area and select **Create** → **Sprite** to create a blank sprite asset.
-
+
#### Script Creation
-Similarly, in scripts, we can create a sprite using the following code:
+Similarly, in the script, we can create a sprite with the following code:
```typescript
-// Create a blank sprite.
+// Create a blank sprite
const sprite = new Sprite(engine);
-// Creating a textured sprite.
+// Create a sprite with a texture
const spriteWithTexture = new Sprite(engine, texture2D);
```
-### Setting Properties
+### Set Properties
-Here, we specifically explain the setting of the pivot in the editor. For the pivot, the bottom-left corner of the texture is `(0, 0)`, with the X-axis from left to right and the Y-axis from bottom to top. The editor provides some commonly used pivot shortcut values, as follows:
+Here, we specifically explain the setting of the pivot in the editor. For the pivot, the bottom-left corner of the texture is `(0, 0)`, the X-axis goes from left to right, and the Y-axis goes from bottom to top. The editor has some built-in common pivot shortcut values, as shown below:
-If the built-in values do not meet the requirements, you can customize your own pivot, as shown below:
+If the built-in values do not meet your needs, you can customize your own pivot, as shown below:
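+
+In a script, the same customization is a single assignment (a sketch; pivot values are normalized to the 0–1 range):
+
+```typescript
+// Place the pivot at the bottom-center of the sprite's region
+sprite.pivot = new Vector2(0.5, 0);
+```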
diff --git a/docs/en/graphics/2D/spriteAtlas.md b/docs/en/graphics/2D/spriteAtlas.md
index a46e323251..edf71042fc 100644
--- a/docs/en/graphics/2D/spriteAtlas.md
+++ b/docs/en/graphics/2D/spriteAtlas.md
@@ -6,62 +6,63 @@ group: 2D
label: Graphics/2D
---
-[SpriteAtlas](/apis/core/#SpriteAtlas) is a collection of sprites that combines multiple sprite textures into one sprite atlas to merge drawing commands during rendering. It has the following advantages:
+[SpriteAtlas](/en/apis/core/#SpriteAtlas) is a collection of sprite resources that combines multiple sprite textures into a single sprite atlas to merge draw calls during rendering. It has the following advantages:
-- Better performance (merge drawing commands);
-- Less memory usage (packing algorithm reduces texture size);
-- Fewer requests (reduce loading requests by reducing fragmented files);
+- Better performance (merged draw calls);
+- Less video memory (packing algorithm reduces texture size);
+- Fewer requests (reduces the number of load requests by reducing fragmented files);
-In the example below, only one drawing command is called per frame in the sprite atlas:
+In the example of the sprite atlas below, only one draw call is made per frame:
## Editor Usage
-### Create Sprite Atlas
+### Creating a Sprite Atlas
-Right-click inside the **[Asset Panel](/en/docs/assets/interface)**, select `Create` from the `Feature List`, and choose `Sprite Atlas`. This will create an empty sprite atlas asset.
+Right-click in the **[Assets Panel](/en/docs/assets/interface/)**, select `Create` from the `Function List`, and choose `Sprite Atlas`. This creates a blank sprite atlas asset.
-Select the `Sprite Atlas` asset to view detailed information in the **[Inspector Panel](/en/docs/interface/inspector)**.
+Select the `Sprite Atlas` asset to view detailed information about the asset in the **[Inspector Panel](/en/docs/interface/inspector)**.
-### Add Sprites
+### Adding Sprites
-After determining the relationship between the `Sprite Atlas` and `Sprites`, you need to add the `Sprites` to the corresponding `Sprite Atlas`. This can be done by operating the `Sprite` asset or the `Sprite Atlas` asset. The following explains both methods.
+After determining the inclusion relationship between the `Sprite Atlas` and the `Sprite`, you need to add the `Sprite` to the corresponding `Sprite Atlas`. This step can be achieved by operating the `Sprite` asset or by operating the `Sprite Atlas` asset. Next, we will practice both methods.
-#### Method 1: Operate Sprite
+#### Method 1: Operating the Sprite
-Left-click on the `Sprite` asset that needs to be added, and in the **[Inspector Panel](/en/docs/interface/inspector)**, find the `Hierarchy` of the sprite. Select `Pack into Atlas` to choose the desired `Sprite Atlas` asset to pack into.
+Left-click to select the `Sprite` asset to be added. In the **[Inspector Panel](/en/docs/interface/inspector)**, find the sprite's `Affiliation`, and select `Pack into Atlas` to choose the `Sprite Atlas` asset you want to pack into.
-#### Method 2: Operate Sprite Atlas
+#### Method 2: Operating the Sprite Atlas
-Left-click on the target `Sprite Atlas` asset, and in the **[Inspector Panel](/en/docs/interface/inspector)**, find the list of sprites packed in the atlas. Select `Add Sprite` to choose the desired `Sprite` asset to pack (selecting a folder will add all sprites under that folder).
+Left-click to select the target `Sprite Atlas` asset. In the **[Inspector Panel](/en/docs/interface/inspector)**, find the sprite list of the atlas, and select `Add Sprite` to choose the `Sprite` asset you want to pack. (If you select a folder, all sprites in the folder directory will be added)
-### Remove Sprites
+### Removing Sprites
-#### Method 1: Operate Sprite
+#### Method 1: Operating the Sprite
-Left-click on the `Sprite` asset that needs to be removed from the atlas, and in the **[Inspector Panel](/en/docs/interface/inspector)**, find the sprite's `Hierarchy` (make sure the path of the target atlas matches). Click the remove button to remove the sprite from the target atlas.
+Left-click to select the `Sprite` asset to be removed from the atlas. In the **[Inspector Panel](/en/docs/interface/inspector)**, find the sprite's `Affiliation` (make sure the target atlas path matches), and click the remove button to remove the sprite from the target atlas.
-#### Method 2: Operate Sprite Atlas
+#### Method 2: Operating the Sprite Atlas
-Left-click on the `Sprite Atlas` asset to be operated, and in the **[Inspector Panel](/en/docs/interface/inspector)**, find the list of sprites in the atlas. Locate the sprite to be removed and click the remove button.
+Left-click to select the `Sprite Atlas` asset to be operated on. In the **[Inspector Panel](/en/docs/interface/inspector)**, find the sprite list of the atlas, find the sprite object to be removed, and click the remove button.
-### Quickly Operate Sprites
+### Quick Operations on Sprites
-After a `Sprite` asset is added to a `Sprite Atlas`, you can quickly operate the sprite in the `Sprite Atlas` **[Inspector Panel](/en/docs/interface/inspector)**, and its properties will be synchronized with the `Sprite` asset.
+After the `Sprite` asset is added to the `Sprite Atlas`, you can quickly operate the sprite in the **[Inspector Panel](/en/docs/interface/inspector)** of the `Sprite Atlas`. Its properties will be synchronously modified in the `Sprite` asset.
+
### Settings
@@ -69,39 +70,43 @@ After a `Sprite` asset is added to a `Sprite Atlas`, you can quickly operate the
-| Setting Name | Definition |
+| Setting Name | Description |
| ------------------ | ---------------------------------------- |
-| Texture Max Width | Maximum width limit of the packed texture |
-| Texture Max Height | Maximum height limit of the packed texture |
-| Edge Padding | Padding width for the sprite packing |
-| Allow Rotation (Disabled) | Whether to improve the space utilization of the atlas packing through rotation |
-| Trim Blank Space (Disabled) | Whether to improve the space utilization of the atlas packing through blank space trimming |
+| Max Texture Width | Maximum width limit for the packed texture (1, 2048] |
+| Max Texture Height | Maximum height limit for the packed texture (1, 2048] |
+| Edge Padding | Edge padding width for the packed sprite |
+| Allow Rotation (Disabled) | Whether to improve atlas packing space utilization by rotation |
+| Trim Whitespaces (Disabled) | Whether to improve atlas packing space utilization by trimming whitespaces |
+
+If you encounter the following warning during packaging, it means that the size of the atlas exceeds the maximum width and height of the texture. You can solve this by adjusting the `Max Texture Width` and `Max Texture Height` or **rearranging** the scattered images for packaging.
+
+
#### Export Settings
-| Property | Value |
-| --------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
-| Wrap Mode U ([wrapModeU](/apis/core/#Texture-wrapModeU)) | Clamping Mode ([Clamp](/apis/core/#TextureWrapMode-Clamp)), Repeating Mode ([Repeat](/apis/core/#TextureWrapMode-Repeat)), Mirrored Repeat Mode ([Mirror](/apis/core/#TextureWrapMode-Mirror)) |
-| Wrap Mode V ([wrapModeV](/apis/core/#Texture-wrapModeV)) | Clamping Mode ([Clamp](/apis/core/#TextureWrapMode-Clamp)), Repeating Mode ([Repeat](/apis/core/#TextureWrapMode-Repeat)), Mirrored Repeat Mode ([Mirror](/apis/core/#TextureWrapMode-Mirror)) |
-| Filter Mode ([filterMode](/apis/core/#Texture-filterMode)) | Point Filtering ([Point](/apis/core/#TextureFilterMode-Point)), Bilinear Filtering ([Bilinear](/apis/core/#TextureFilterMode-Bilinear)), Trilinear Filtering ([Trilinear](/apis/core/#TextureFilterMode-Trilinear)) |
-| Anisotropic Filtering Level ([anisoLevel](/apis/core/#Texture-anisoLevel)) | Anisotropic level, 1 ~ 16 |
-| Texture Mapping ([Mipmap](/apis/core/#Texture-generateMipmaps)) | true , false |
+| Property | Value |
+| --- | --- |
+| Wrap Mode U ([wrapModeU](/en/apis/core/#Texture-wrapModeU)) | Clamp ([Clamp](/en/apis/core/#TextureWrapMode-Clamp)), Repeat ([Repeat](/en/apis/core/#TextureWrapMode-Repeat)), Mirror ([Mirror](/en/apis/core/#TextureWrapMode-Mirror)) |
+| Wrap Mode V ([wrapModeV](/en/apis/core/#Texture-wrapModeV)) | Clamp ([Clamp](/en/apis/core/#TextureWrapMode-Clamp)), Repeat ([Repeat](/en/apis/core/#TextureWrapMode-Repeat)), Mirror ([Mirror](/en/apis/core/#TextureWrapMode-Mirror)) |
+| Filter Mode ([filterMode](/en/apis/core/#Texture-filterMode)) | Point ([Point](/en/apis/core/#TextureFilterMode-Point)), Bilinear ([Bilinear](/en/apis/core/#TextureFilterMode-Bilinear)), Trilinear ([Trilinear](/en/apis/core/#TextureFilterMode-Trilinear)) |
+| Anisotropic Filtering Level ([anisoLevel](/en/apis/core/#Texture-anisoLevel)) | Anisotropic level, 1 ~ 16 |
+| Mipmap ([Mipmap](/en/apis/core/#Texture-generateMipmaps)) | true, false |
-### Best Practices {#best-practices}
+### Best Practices
-Click on the `Sprite Atlas` asset, adjust the `Texture Max Width` and `Texture Max Height` in the `Pack Settings`, and call `Pack and Preview` in the `Pack Objects` to ensure the utilization of the atlas is at a relatively high level.
+Click on the `Sprite Atlas` asset, adjust the `Max Texture Width` and `Max Texture Height` in the `Packaging Settings`, and then call `Pack and Preview` in the `Packaging Object` to ensure a high level of atlas utilization.

-The left side of the preview image shows the size information of the exported images, while the right side shows the information about the utilization of the atlas (representing the percentage of the sum of all individual image areas occupying the final large image). You can adjust the packing settings based on this value to achieve better results.
+The left side of the preview image shows the size information of the exported image, and the right side shows the atlas utilization information (representing the percentage of the total area of all scattered images occupying the final large image). You can adjust the packing settings based on this value to achieve better results.
-## Script Usage {#script-usage}
+## Script Usage {/*examples*/}
-### Atlas Generation
+### Atlas Generation {/*examples*/}
-Galacean provides a command-line tool for sprite atlas, developers can generate atlases following these steps:
+Galacean provides a command-line tool for sprite atlas generation. Developers can generate an atlas by following these steps:
1. Install the package
@@ -115,35 +120,35 @@ npm i @galacean/tools-atlas -g
galacean-tool-atlas p inputPath -o outputName
```
-Where `inputPath` represents the folder path that needs to be packed, and `outputName` represents the output sprite atlas file name. If you get the result shown below, it means the packing was successful.
+Here, `inputPath` represents the path of the folder to be packed, and `outputName` represents the name of the output sprite atlas file. If you get the result shown in the image below, it means the packing was successful.
-| Property | Description |
-| -------------- | ------------------------------------------- |
-| f/format | Sprite atlas format for packing (default: "galacean") |
-| o/output | Output sprite atlas file name (default: "galacean") |
-| a/algorithm | Algorithm for packing sprite atlas (default: "maxrects") |
+| Property | Description |
+| -------------- | --------------------------------------------- |
+| f/format | Format of the output sprite atlas (default: "galacean") |
+| o/output | Name of the output sprite atlas file (default: "galacean") |
+| a/algorithm | Algorithm for packing the sprite atlas (default: "maxrects") |
| ar/allowRotate | Whether the sprite atlas supports rotation (default: false) |
-| p/padding | Distance between each sprite in the atlas and its border (default: 1) |
-| mw/maxWidth | Maximum width of the final sprite atlas (default: 1024) |
-| mh/maxHeight | Maximum height of the final sprite atlas (default: 1024) |
-| s/square | Force packing into a square (default: false) |
-| pot | Force packing into power of 2 (default: false) |
+| p/padding | Distance between each sprite and its border in the atlas (default: 1) |
+| mw/maxWidth | Maximum width of the final packed sprite atlas (default: 1024) |
+| mh/maxHeight | Maximum height of the final packed sprite atlas (default: 1024) |
+| s/square | Force packing into a square (default: false) |
+| pot | Force width and height to be a power of 2 (default: false) |
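+
+For example, extending the command shown earlier with the size options from this table (a sketch; `./sprites` and `gameUI` are placeholder names, and the flag spellings follow the short names listed above):
+
+```bash
+galacean-tool-atlas p ./sprites -o gameUI -mw 2048 -mh 2048
+```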
-For more information, please refer to the [Atlas Packing Tool](https://github.com/galacean/tools/blob/main/packages/atlas/README.md).
+For more details, refer to [Atlas Packing Tool](https://github.com/galacean/tools/blob/main/packages/atlas/README.md).
-### Usage
+### Usage {/*examples*/}
-1. Upload the atlas image and atlas file to the same directory on CDN, for example, the addresses of the file and image should be `https://*cdnDir*/*atlasName*.atlas` and `https://*cdnDir*/*atlasName*.png` respectively.
+1. Upload the atlas image and atlas file to the same directory on the CDN. For example, the addresses of the file and image should be `https://*cdnDir*/*atlasName*.atlas` and `https://*cdnDir*/*atlasName*.png`, respectively.
-2. Load and Use
+2. Load and use
```typescript
engine.resourceManager
.load({
url: "https://*cdnDir*/*atlasName*.atlas",
- type: AssetType.SpriteAtlas,
+ type: AssetType.SpriteAtlas
})
.then((atlas) => {
// Get all sprites.
@@ -155,8 +160,8 @@ engine.resourceManager
});
```
-## Notes {#notes}
+## Notes {/*examples*/}
-1. Pack sprites that are drawn in sequence into the same atlas to significantly improve performance (reduce the number of draw command calls);
-2. When cleaning up sprite atlases, make sure that all sprites in the atlas are no longer in use;
-3. When packing sprite atlases, it is necessary to coordinate the number and size of sprites to avoid generating multiple sprite atlases at once;
+1. Pack sprites that are drawn consecutively into the same atlas to significantly improve performance (reducing the number of draw calls);
+2. When cleaning up the sprite atlas, ensure that all sprites in the atlas are no longer in use;
+3. When packing the sprite atlas, it is necessary to coordinate the number and size of sprites to avoid generating multiple sprite atlases in one packing session;
diff --git a/docs/en/graphics/2D/spriteMask.md b/docs/en/graphics/2D/spriteMask.md
index 5250890166..1f8165cf0a 100644
--- a/docs/en/graphics/2D/spriteMask.md
+++ b/docs/en/graphics/2D/spriteMask.md
@@ -6,73 +6,73 @@ group: 2D
label: Graphics/2D
---
-The Sprite Mask component is used to achieve masking effects on [sprites](/en/docs/graphics/2D/spriteRenderer) and [text](/en/docs/graphics/2D/text) in 3D/2D scenes.
+The Sprite Mask component is used to apply masking effects to [Sprite Renderer](/en/docs/graphics/2D/spriteRenderer/) and [Text Renderer](/en/docs/graphics/2D/text/) in 3D/2D scenes.
-Control the effect on [sprites](/en/docs/graphics/2D/sprite}) by the parameters provided by [SpriteMask](/apis/core/#SpriteMask).
+Control the interaction with [Sprite](/en/docs/graphics/2D/sprite/) using parameters provided by [SpriteMask](/en/apis/core/#SpriteMask).
-| Parameter | Type | Description |
-| :--------------- | :----- | :----------------------------------------------------------------------------------------------- |
-| influenceLayers | number | The mask currently affects the masking layers, with a default value of SpriteMaskLayer.Everything, indicating that it affects all masking layers |
-| alphaCutoff | number | The lower limit of the effective alpha value of the current mask (range: 0~1), meaning that alpha values in the sprite's texture less than alphaCutoff will be discarded |
+| Parameter | Type | Description |
+| :-------------- | :----- | :----------------------------------------------------------------------------------------------- |
+| influenceLayers | number | The mask layers affected by the current mask. Default is SpriteMaskLayer.Everything, meaning it affects all mask layers. |
+| alphaCutoff | number | The lower limit of the effective alpha value for the current mask (range: 0~1). Pixels with alpha values less than alphaCutoff will be discarded. |
-[SpriteMaskLayer](/apis/core/#SpriteMaskLayer) declares the masking layers provided by the engine, with a total of 32 layers declared as Layer0~Layer31. Masking layers are unrelated to rendering; they are only used to help developers associate `SpriteMask` and `SpriteRenderer`. One prerequisite for a `SpriteMask` object to mask a `SpriteRenderer` object is that the masking layers of the two intersect.
+[SpriteMaskLayer](/en/apis/core/#SpriteMaskLayer) declares the mask layers provided by the engine. There are 32 mask layers in total, named Layer0~Layer31. Mask layers are unrelated to rendering and are only used to help developers set how `SpriteMask` and `SpriteRenderer` interact. For a `SpriteMask` object to mask a `SpriteRenderer` object, their mask layers must intersect.
-The `influenceLayers` of `SpriteMask` indicates which `SpriteRenderer` within the mask layers will be masked, and the `maskLayer` of `SpriteRenderer` indicates which masking layers the sprite is in, as shown below:
+The `influenceLayers` of `SpriteMask` indicates which mask layers the mask will affect for `SpriteRenderer`. The `maskLayer` of `SpriteRenderer` indicates which mask layers the sprite belongs to, as shown below:
-In the above image, the spriteMask affects sprites in `Layer1` and `Layer30`, spriteRenderer0 is in `Layer2` with no intersection, so it is not affected by spriteMask. spriteRenderer1 is in `Layer1`, intersecting with the masking layers affected by spriteMask, so spriteRenderer1 is affected by spriteMask.
+In the image above, the spriteMask affects sprites in `Layer1` and `Layer30`. spriteRenderer0 is in `Layer2`, so there is no intersection, and spriteRenderer0 does not interact with spriteMask. spriteRenderer1 is in `Layer1`, which intersects with the mask layers affected by spriteMask, so spriteRenderer1 interacts with spriteMask.
## Usage
-### Adding the Sprite Mask Component
+### Adding a Sprite Mask Component
-When we need to mask a sprite, we first need to create an entity and add the Sprite Mask component, as shown below:
+When we need to mask a sprite, we first need to create an entity and add a sprite mask component, as shown below:

### Setting the Mask Area
-The Sprite Mask component represents the mask area using an image. Here, we set the sprite resource through the component's `sprite` parameter, as shown below:
+The sprite mask component uses an image to represent the mask area. We set the sprite resource through the component's `sprite` parameter, as shown below:

### Setting the Sprite's Mask Type
-After the above two steps, you may notice that the mask still has no effect. This is because the current sprite's mask type is still the default (None). We set the `mask interaction` of the sprite in the scene to the inner mask type, as shown below:
+After the above two steps, you may find that the mask still has no effect. This is because the current sprite's mask type is still the default (None). We set the `mask interaction` of the sprite in the scene to the inner mask type, as shown below:

-### Set alpha cutoff
+### Setting alpha cutoff
-This parameter represents the lower limit of the current mask's valid `alpha` value (range: `0~1`), which means that alpha values in the sprite's texture that are less than the alpha cutoff will be discarded (i.e., not treated as the masking area). We can adjust the value of this property dynamically to see the actual effect, as shown below:
+This parameter represents the lower limit of the effective `alpha` value for the current mask (range: `0~1`). In other words, any alpha value in the sprite's texture that is less than the alpha cutoff will be discarded (i.e., it will not be considered as a mask area). We can dynamically adjust the value of this property to see the actual effect, as shown below:

-Similarly, in the script, we can use the following code to apply sprite masking:
+Similarly, in the script, we can use the following code to apply the sprite mask:
```typescript
-// Create a mask entity.
+// Create a mask entity
const spriteEntity = rootEntity.createChild(`spriteMask`);
-// Add a SpriteMask component to the entity.
+// Add the SpriteMask component to the entity
const spriteMask = spriteEntity.addComponent(SpriteMask);
-// Creating a sprite object from a texture.
+// Create a sprite object from the texture
const sprite = new Sprite(engine, texture);
-// Set sprite.
+// Set the sprite
spriteMask.sprite = sprite;
-// Textures in the mask's sprite with an alpha value less than 0.5 will be discarded.
+// Pixels in the mask sprite's texture with alpha below 0.5 will be discarded
spriteMask.alphaCutoff = 0.5;
-// Mask is effective for all sprites in the mask layer.
+// The mask affects sprites in all mask layers
spriteMask.influenceLayers = SpriteMaskLayer.Everything;
-// Mask is only valid for sprites in the mask layer Layer0.
+// The mask only affects sprites in mask layer Layer0
spriteMask.influenceLayers = SpriteMaskLayer.Layer0;
-// Mask is valid for sprites in mask layers Layer0 and Layer1.
+// The mask affects sprites in mask layers Layer0 and Layer1
spriteMask.influenceLayers = SpriteMaskLayer.Layer0 | SpriteMaskLayer.Layer1;
-// Set the mask interaction
+// Set the mask interaction type
spriteRenderer.maskInteraction = SpriteMaskInteraction.VisibleInsideMask;
-// Set which mask layer the sprite is in.
+// Set which mask layer the sprite belongs to
spriteRenderer.maskLayer = SpriteMaskLayer.Layer0;
```
diff --git a/docs/en/graphics/2D/text.md b/docs/en/graphics/2D/text.md
index 5f28cc561b..741f97040a 100644
--- a/docs/en/graphics/2D/text.md
+++ b/docs/en/graphics/2D/text.md
@@ -6,61 +6,61 @@ group: 2D
label: Graphics/2D
---
-[TextRenderer](/apis/core/#TextRenderer) component is used to display text in 3D/2D scenes.
+[TextRenderer](/en/apis/core/#TextRenderer) component is used to display text in 3D/2D scenes.
## Editor Usage
-### Adding Text Component
+### Add Text Component
-When you need to display text, you first need to add a text component to an entity, as shown below:
+To display text, you need to add a text component to an entity, as shown below:
-
+
### Parameter Description
-Select an entity with a TextRenderer component, you can set all related properties in the right-side inspector to configure the text component:
-
+Select an entity with the TextRenderer component, and you can set all related properties in the inspector on the right to configure the text component:
+
-The properties are described as follows:
+The property descriptions are as follows:
| Property | Description |
| :--- | :--- |
-| `Text` | Text to be displayed |
+| `Text` | The text to be displayed |
| `Color` | Text color |
| `FontSize` | Font size of the text |
| `Font` | Custom font |
-| `Width` | Width of the text in 3D space, used for bounding box calculation and determining line breaks when displaying multiline text |
-| `Height` | Height of the text in 3D space, used for bounding box calculation and determining line breaks when displaying multiline text |
-| `LineSpacing` | Spacing between lines |
+| `Width` | The width of the text in 3D space, used for bounding box calculation and determining line breaks when multi-line text is needed |
+| `Height` | The height of the text in 3D space, used for bounding box calculation and determining line breaks when multi-line text is needed |
+| `LineSpacing` | Line spacing |
| `FontStyle` | Font style settings: bold/italic |
-| `HorizontalAlignment` | Horizontal alignment, options are: Left/Center/Right |
-| `VerticalAlignment` | Vertical alignment, options are: Top/Center/Bottom |
-| `EnableWrapping` | Enable wrapping mode, when enabled, text will wrap based on the set width. If width is set to 0, the text will not be rendered |
-| `OverflowMode` | Handling method when the total height of the text exceeds the set height, options are: Overflow/Truncate. Overflow means direct overflow display, Truncate means only content within the set height will be displayed, the specific display content also depends on the vertical alignment of the text |
-| `Mask Interaction` | Mask type, used to determine if the text needs masking, and if masking is required, whether to display content inside or outside the mask |
-| `Mask Layer` | Mask layer to which the text belongs, used for matching with SpriteMask, default is Everything, indicating it can be masked with any SpriteMask |
-| `priority` | Rendering priority, the smaller the value, the higher the rendering priority, and will be rendered first |
+| `HorizontalAlignment` | Horizontal alignment options: Left/Center/Right |
+| `VerticalAlignment` | Vertical alignment options: Top/Center/Bottom |
+| `EnableWrapping` | Whether to enable wrapping mode. When wrapping mode is enabled, the text will wrap according to the set width. If the width is set to 0, the text will not render |
+| `OverflowMode` | Handling method when the total height of the text exceeds the set height. Options: Overflow/Truncate. Overflow means the text will overflow and display, Truncate means only the content within the set height will be displayed. The specific display content is also related to the vertical alignment of the text |
+| `Mask Interaction` | Mask type, used to set whether the text needs a mask, and if so, whether to display the content inside or outside the mask |
+| `Mask Layer` | The mask layer to which the text belongs, used to match with SpriteMask. The default is Everything, meaning it can be masked by any SpriteMask |
+| `priority` | Rendering priority. The smaller the value, the higher the rendering priority, and the earlier it will be rendered |
-### Setting Displayed Text
+### Set Display Text
After adding the text component, you can set the Text property to display the desired text, as shown below:
-
+
-### Setting Custom Font
+### Set Custom Font
-To make the text display more diverse, developers can upload their own font files. The editor currently supports font file formats: **.ttf**, **.otf**, **.woff**
+To make the text display more diverse, developers can upload their own font files. The editor currently supports the following font file formats: **.ttf**, **.otf**, **.woff**
-
+
## Script Usage
-1. Create a [TextRenderer](/apis/core/#TextRenderer) component to display text
-2. Set a [Font](/apis/core/#Font) object through the font property
-3. Set the text to be displayed through the text property
-4. Set the font size through the fontSize property
-5. Set the text color through the color property
+1. Create a [TextRenderer](/en/apis/core/#TextRenderer) component to display text
+2. Set the [Font](/en/apis/core/#Font) object through the font property
+3. Set the text to be displayed through the text property
+4. Set the font size through the fontSize property
+5. Set the text color through the color property
```typescript
import {
@@ -74,34 +74,34 @@ import {
} from "@galacean/engine";
const textEntity = rootEntity.createChild("text");
-// Add a TextRenderer component to the entity.
+// Add the TextRenderer component to the entity
const textRenderer = textEntity.addComponent(TextRenderer);
-// Set the Font object via font.
+// Set the Font object via font
textRenderer.font = Font.createFromOS(engine, "Arial");
-// Set the text to be displayed through text.
+// Set the text to display via text
textRenderer.text = "Galacean can write now!";
-// Set the font size via fontSize.
+// Set the font size via fontSize
textRenderer.fontSize = 36;
-// Set the text color via color.
+// Set the text color via color
textRenderer.color.set(1, 0, 0, 1);
```
-### Setting Width and Height
+### Set Width and Height
-You can set the size of the text in 3D space using width/height, mainly for the following purposes:
-1. Used for bounding box calculation
-2. Used to determine line break rules when displaying multiline text
+You can set the size of the text in 3D space through width/height, which mainly has the following uses:
+1. Used for bounding box calculation
+2. When multi-line text is needed, the width and height will be used to determine the line break principle
```typescript
-// Set width.
+// Set the width
textRenderer.width = 10;
-// Set height.
+// Set the height
textRenderer.height = 10;
```
-### Setting Line Spacing
+### Set Line Spacing
-When displaying multiline text, you can set the vertical spacing between two lines using lineSpacing.
+When you need to display multiple lines of text, you can set the vertical spacing between two lines of text through `lineSpacing`.
```typescript
// Set line spacing
@@ -110,51 +110,51 @@ textRenderer.lineSpacing = 0.1;
### Multi-line Text Display
-When the text is too long, you may want the text to be displayed on multiple lines. In this case, you can enable wrapping by setting the `enableWrapping` field to true. Once wrapping is enabled, the text will wrap based on the width set earlier. If the width is set to 0, the text will not be rendered.
+When the text is too long, you may want it to be displayed across multiple lines. You can enable wrapping mode by setting the `enableWrapping` field. Once wrapping is enabled, the text wraps according to the width set earlier; if the width is set to 0, the text is not rendered.
```typescript
-// Enable wrapping mode
+// Turn on wrapping mode
textRenderer.enableWrapping = true;
```
### Text Truncation
-When displaying multi-line text, there may be too many lines of text. In this case, you can use the `overflowMode` field to determine whether to truncate and only display content within the set height. The specific content displayed also depends on the vertical alignment of the text (see: Text Alignment), as shown below:
+When displaying multi-line text, there may be too many lines. In this case, you can set the `overflowMode` field to decide whether to truncate the display and keep only the content within the set height. What is actually displayed also depends on the vertical alignment of the text (see: Text Alignment), as follows:
```typescript
-// Set text support overflow.
+// Text overflow
textRenderer.overflowMode = OverflowMode.Overflow;
-// Set text support truncate.
+// Text truncation
textRenderer.overflowMode = OverflowMode.Truncate;
```
### Text Alignment
-Text alignment is used to determine how text should be displayed within a specified width and height. The following attributes are available:
+Text alignment is used to set how the text is displayed within the specified width and height, as follows:
-| Attribute | Type | Description |
-| :----------------------------------------------------------------- | :------------------------------------------------------------- | :------------------------------------------------------------------------------ |
-| [horizontalAlignment](/apis/core/#TextRenderer-horizontalAlignment) | [TextHorizontalAlignment](/apis/core/#TextHorizontalAlignment) | Horizontal alignment: Left/Center/Right represent left/center/right alignment |
-| [verticalAlignment](/apis/core/#TextRenderer-horizontalAlignment) | [TextVerticalAlignment](/apis/core/#TextVerticalAlignment) | Vertical alignment: Top/Center/Bottom represent top/center/bottom alignment |
+| Property Name | Property Type | Description |
+| :------------------------------------------------------------------- | :-------------------------------------------------------------- | :-------------------------------------------------------------------------- |
+| [horizontalAlignment](/en/apis/core/#TextRenderer-horizontalAlignment) | [TextHorizontalAlignment](/en/apis/core/#TextHorizontalAlignment) | Horizontal alignment: Left/Center/Right represent left-aligned/center-aligned/right-aligned display respectively |
+| [verticalAlignment](/en/apis/core/#TextRenderer-verticalAlignment)   | [TextVerticalAlignment](/en/apis/core/#TextVerticalAlignment)   | Vertical alignment: Top/Center/Bottom represent top-aligned/center-aligned/bottom-aligned display respectively |
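+
+For example, a minimal usage sketch (assuming the `TextHorizontalAlignment` and `TextVerticalAlignment` enums are imported from `@galacean/engine`):
+
+```typescript
+// Center the text horizontally and vertically within the set width/height
+textRenderer.horizontalAlignment = TextHorizontalAlignment.Center;
+textRenderer.verticalAlignment = TextVerticalAlignment.Center;
+```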
-### Text Font Styles
+### Text Font Style
-Text font styles are used to set whether the text should be displayed in bold or italic. The following attributes are available:
+The text font style is used to set whether the text is displayed in bold or italic, as follows:
-| Attribute | Type | Description |
-| :------------------------------------------------ | :--------------------------------- | :---------------------------------------------- |
-| [fontStyle](/apis/core/#TextRenderer-fontStyle) | [FontStyle](/apis/core/#FontStyle) | Font style: None/Bold/Italic represent normal/bold/italic display |
+| Property Name | Property Type | Description |
+| :------------------------------------------------- | :------------------------------ | :------------------------------------------------ |
+| [fontStyle](/en/apis/core/#TextRenderer-fontStyle) | [FontStyle](/en/apis/core/#FontStyle) | Font style: None/Bold/Italic represent normal/bold/italic display respectively |
-Usage:
+Usage is as follows:
```typescript
-// Normal display.
+// Normal display
textRenderer.fontStyle = FontStyle.None;
-// Bold display.
+// Bold display
textRenderer.fontStyle = FontStyle.Bold;
-// Italic display.
+// Italic display
textRenderer.fontStyle = FontStyle.Italic;
-// Display in both bold and italic.
+// Bold and italic display
textRenderer.fontStyle = FontStyle.Bold | FontStyle.Italic;
```
@@ -164,12 +164,11 @@ textRenderer.fontStyle = FontStyle.Bold | FontStyle.Italic;
### Custom Fonts
-[Font](/apis/core/#Font) is a font resource used to represent the font used for text.
+[Font](/en/apis/core/#Font) is a font resource used to represent the font used by the text.
-| Attribute | Type | Description |
-| :-------------------------------- | :------- | :-------------------------------------------------------------------------- |
-| [name](/apis/core/#Sprite-name) | string | Font resource name, used to uniquely identify a font resource, currently used to indicate the required system font |
-```
+| Property Name | Property Type | Description |
+| :----------------------------------- | :------------ | :-------------------------------------------------------------------------- |
+| [name](/en/apis/core/#Font-name)     | string        | Font resource name, used to uniquely identify a font resource. Currently, this field is used to specify the required system font |
```typescript
const font = Font.createFromOS(engine, "Arial");
diff --git a/docs/en/graphics/_meta.json b/docs/en/graphics/_meta.json
new file mode 100644
index 0000000000..a83b596243
--- /dev/null
+++ b/docs/en/graphics/_meta.json
@@ -0,0 +1,38 @@
+{
+ "camera": {
+ "title": "Camera"
+ },
+ "background": {
+ "title": "Background"
+ },
+ "light": {
+ "title": "Light"
+ },
+ "renderer": {
+ "title": "Renderer"
+ },
+ "model": {
+ "title": "Model"
+ },
+ "mesh": {
+ "title": "Mesh"
+ },
+ "material": {
+ "title": "Material"
+ },
+ "shader": {
+ "title": "Shader"
+ },
+ "texture": {
+ "title": "Texture"
+ },
+ "2D": {
+ "title": "2D"
+ },
+ "particle": {
+ "title": "Particle"
+ },
+ "postProcess": {
+ "title": "Post Process"
+ }
+}
diff --git a/docs/en/graphics/background/sky.md b/docs/en/graphics/background/sky.md
index 21ab33eed9..61e9f9f3b9 100644
--- a/docs/en/graphics/background/sky.md
+++ b/docs/en/graphics/background/sky.md
@@ -6,35 +6,35 @@ group: Background
label: Graphics/Background
---
-Sky is a type of background that is drawn before the camera renders. This type of background is very useful for 3D games and applications as it can provide a sense of depth, making the environment appear much larger than it actually is. The sky itself can contain any objects (such as clouds, mountains, buildings, and other unreachable objects) to create a sense of a distant three-dimensional environment. Galacean can also use the sky to generate realistic environmental lighting in the scene, for more details refer to [Baking](/en/docs/graphics-light-bake).
+The sky is a type of background drawn before the camera renders. This type of background is useful for 3D games and applications because it provides a sense of depth, making the environment appear much larger than its actual size. The sky itself can contain any objects (such as clouds, mountains, buildings, and other unreachable objects) to create the feeling of a distant three-dimensional environment. Galacean can also use the sky to produce realistic ambient lighting in the scene. For more details, refer to [Baking](/en/docs/graphics/light/bake/).
-In Sky mode, developers can set the `material` and `mesh` themselves, and with Galacean's built-in `Skybox` and `Procedural Sky`, they can easily set the desired sky effect.
+In sky mode, developers can set `material` and `mesh` themselves. With Galacean's built-in `skybox` and `procedural sky`, you can easily set up the desired sky effect.
-## Setting up Skybox
+## Setting the Skybox
-In the editor, you can set up a skybox for the background by following these steps:
+In the editor, simply follow these steps to set the skybox for the background:
### 1. Create Skybox Texture
-> You can download free HDR textures from [Poly Haven](https://polyhaven.com/) or [BimAnt HDRI](http://hdri.bimant.com/)
+> You can download free HDR maps from [Poly Haven](https://polyhaven.com/) or [BimAnt HDRI](http://hdri.bimant.com/)
-The skybox texture is a [cubemap texture](/en/docs/graphics-texture-cube), first prepare the HDR, then follow the path **[Asset Panel](/en/docs/assets/interface)** -> **Right-click to upload** -> **Select TextureCube(.hdr)** -> **Choose the corresponding HDR texture** -> **Cubemap asset created** to complete the operation.
+A skybox texture is a [cube texture](/en/docs/graphics/texture/cube/). After preparing the HDR, follow the path **[Asset Panel](/en/docs/assets/interface)** -> **Right-click Upload** -> **Select TextureCube(.hdr)** -> **Choose the corresponding HDR map** -> **Cube texture asset creation complete**.

### 2. Create Skybox Material
-After creating the cubemap asset, follow the path **[Asset Panel](/en/docs/assets/interface)** -> **Right-click to create** -> **Select Material** -> **Select the generated asset** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Click on the Shader property in the Base column** -> **Select Sky Box** -> **Click on HDR in the Base column** -> **Select the cubemap created in the first step** to create the skybox material.
+After creating the cube texture asset, follow the path **[Asset Panel](/en/docs/assets/interface)** -> **Right-click Create** -> **Select Material** -> **Select the generated asset** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Click the Shader property in the Base section** -> **Select Sky Box** -> **Click HDR in the Base section** -> **Choose the cube texture created in the first step** to create the skybox material.

-### 3. Set up Skybox
+### 3. Set the Skybox
-Finally, just follow the path **[Hierarchy Panel](/en/docs/interface/hierarchy)** -> **Select Scene** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Background section** -> **Set Mode to Sky** -> **Select the material created in the second step for Material** -> **Set Mesh to the built-in Cuboid** to see the background of the scene change to a skybox.
+Finally, follow the path **[Hierarchy Panel](/en/docs/interface/hierarchy)** -> **Select Scene** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Background section** -> **Set Mode to Sky** -> **Select the material created in the second step** -> **Set Mesh to the built-in Cuboid** to see the background of the scene change to the skybox.

-### Code for setting up Skybox
+### Code to Set Skybox
```typescript
// 创建天空盒纹理
@@ -59,13 +59,13 @@ background.sky.material = skyMaterial;
background.sky.mesh = PrimitiveMesh.createCuboid(engine, 2, 2, 2);
```
-## Setting up Procedural Sky
+## Setting Procedural Sky
-Procedural Sky is the default background in the editor for 3D projects. You can also follow the path **[Hierarchy Panel](/en/docs/interface/hierarchy)** -> **Select Scene** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Background section** -> **Set Mode to Sky** -> **Select the built-in SkyMat material** -> **Set Mesh to the built-in Sphere**
+Procedural sky is the default background in 3D projects in the editor. You can also follow the path **[Hierarchy Panel](/en/docs/interface/hierarchy)** -> **Select Scene** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Background section** -> **Set Mode to Sky** -> **Select the built-in SkyMat material** -> **Set Mesh to the built-in Sphere**

-### Code for setting up Procedural Sky
+### Code to Set Procedural Sky
```typescript
// 创建大气散射材质
@@ -79,17 +79,18 @@ background.sky.mesh = PrimitiveMesh.createSphere(engine);
### Properties
-In the **[Inspector Panel](/en/docs/interface/inspector)** of the atmospheric scattering material, you can see many adjustable properties:
+In the atmospheric scattering material's **[Inspector Panel](/en/docs/interface/inspector)**, you can see many adjustable properties:
-
-| Property Name | Explanation |
-| :------------------------------------------------------------------------- | :-------------------------------------------------------------------------------------------- |
-| [exposure](/apis/core/#SkyProceduralMaterial-exposure) | The exposure of the sky, the higher the value, the brighter the sky. |
-| [sunMode](/apis/core/#SkyProceduralMaterial-sunMode) | The method used to generate the sun in the sky, including `None`, `Simple`, and `HighQuality`, where None does not generate a sun, Simple generates a simple sun, and HighQuality generates a sun with a customizable appearance. |
-| [sunSize](/apis/core/#SkyProceduralMaterial-sunSize) | The size of the sun, the larger the value, the larger the sun. |
-| [sunSizeConvergence](/apis/core/#SkyProceduralMaterial-sunSizeConvergence) | The convergence of the sun's size, only effective when the sun generation mode is `HighQuality`. |
-| [atmosphereThickness](/apis/core/#SkyProceduralMaterial-atmosphereThickness) | The density of the atmosphere, higher density absorbs more light. |
-| [skyTint](/apis/core/#SkyProceduralMaterial-skyTint) | The color of the sky. |
-| [groundTint](/apis/core/#SkyProceduralMaterial-groundTint) | The color of the ground. |
+> The built-in atmospheric scattering material cannot have its properties freely adjusted; developers can create and adjust their own.
+
+| Property Name | Description |
+| :-------------------------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------ |
+| [exposure](/en/apis/core/#SkyProceduralMaterial-exposure) | The exposure of the sky, the higher the value, the brighter the sky. |
+| [sunMode](/en/apis/core/#SkyProceduralMaterial-sunMode) | The method used to generate the sun in the sky, including `None`, `Simple`, and `HighQuality`. None does not generate a sun, Simple generates a simple sun, and HighQuality generates a sun with a definable appearance. |
+| [sunSize](/en/apis/core/#SkyProceduralMaterial-sunSize) | The size of the sun, the larger the value, the larger the sun. |
+| [sunSizeConvergence](/en/apis/core/#SkyProceduralMaterial-sunSizeConvergence) | The convergence of the sun's size, effective only when the sun generation mode is `HighQuality`. |
+| [atmosphereThickness](/en/apis/core/#SkyProceduralMaterial-atmosphereThickness) | The density of the atmosphere, higher density absorbs more light. |
+| [skyTint](/en/apis/core/#SkyProceduralMaterial-skyTint) | The color of the sky. |
+| [groundTint](/en/apis/core/#SkyProceduralMaterial-groundTint) | The color of the ground. |
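+
+Since the built-in material cannot be adjusted, a minimal sketch of creating your own atmospheric scattering material and tweaking it (the property values here are illustrative):
+
+```typescript
+// Create a custom atmospheric scattering material and adjust its properties
+const skyMaterial = new SkyProceduralMaterial(engine);
+skyMaterial.exposure = 1.5; // brighter sky
+skyMaterial.atmosphereThickness = 1.2; // a denser atmosphere absorbs more light
+scene.background.sky.material = skyMaterial;
+```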
diff --git a/docs/en/graphics/background/solidColor.md b/docs/en/graphics/background/solidColor.md
index d2f9da007d..39ebc0296c 100644
--- a/docs/en/graphics/background/solidColor.md
+++ b/docs/en/graphics/background/solidColor.md
@@ -6,15 +6,15 @@ group: Background
label: Graphics/Background
---
-When the background type of the scene is set to Solid Color, the rendering area of the canvas will be filled with the corresponding solid color background before camera rendering.
+When the background type of the scene is set to solid color, the rendering area of the canvas will be filled with the corresponding solid color background before the camera renders.
-## Set Solid Color Background
+## Setting a Solid Color Background
-Navigate to **[Hierarchy Panel](/en/docs/interface/hierarchy)** -> **Select Scene** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Background section** and set **Mode** to **Solid Color**. Then choose the desired background color, and you will see the background of the scene change in real-time.
+Navigate to **[Hierarchy Panel](/en/docs/interface/hierarchy)** -> **Select Scene** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Background Section** and set **Mode** to **Solid Color**, then choose the desired background color. You will see the background of the scene change in real-time.

-Similarly, you can also set this in scripts using the following code:
+Similarly, you can also set it in the script with the following code:
```typescript
// 获取当前场景的背景实例
@@ -29,9 +29,9 @@ background.solidColor.set(0, 0, 0, 0);
## Properties
-It is important to note that all background-related properties are within the scene's `background` property. You need to modify these properties after obtaining the instance of this property for the changes to take effect.
+It is important to note that the background-related properties all live in the scene's `background` property; changes take effect only after you obtain this property instance and modify its properties.
-| Property | Description |
-| :---------- | :----------- |
-| solidColor | Set the background color |
+| Property | Function |
+| :--------- | :---------------- |
+| solidColor | Sets the background color |
diff --git a/docs/en/graphics/background/texture.md b/docs/en/graphics/background/texture.md
index 394f66de27..28d37f5e97 100644
--- a/docs/en/graphics/background/texture.md
+++ b/docs/en/graphics/background/texture.md
@@ -6,15 +6,15 @@ group: Background
label: Graphics/Background
---
-When the background type of the scene is set to texture, the rendering area of the canvas will be filled with the corresponding texture according to the fill rule before camera rendering.
+When the background type of the scene is set to texture, the rendering area of the canvas will be filled with the corresponding texture according to the fill rules before the camera renders.
-## Set Texture Background
+## Setting Texture Background
-Based on the path **[Hierarchy Panel](/en/docs/interface/hierarchy)** -> **Select Scene** -> **[Inspector Panel](/en/docs/interface/inspector)** -> Set **Mode** to **Texture** in the **Background** section, then choose the desired texture. You can see the background of the scene change in real-time.
+Following the path **[Hierarchy Panel](/en/docs/interface/hierarchy)** -> **Select Scene** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Background Section**, set **Mode** to **Texture** and then select the desired texture; you will see the background of the scene change in real time.

-Similarly, you can also set it in scripts using the following code:
+Similarly, you can also set it in a script with the following code:
```typescript
// 获取当前场景的背景实例
@@ -32,23 +32,23 @@ background.textureFillMode = BackgroundTextureFillMode.Fill;
## Properties
-It is important to note that all background-related properties are in the `background` property of the scene. Only after obtaining this property instance can you modify the related properties to take effect.
+It should be noted that the background-related properties all live in the scene's `background` property; obtain this property instance and modify its properties for the changes to take effect.
-| Property | Description |
-| :--------------- | :------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| texture | Set the background texture |
-| textureFillMode | Set the fill mode of the background texture, options are [AspectFitWidth](/apis/core/#BackgroundTextureFillMode-AspectFitWidth), [AspectFitHeight](/apis/core/#BackgroundTextureFillMode-AspectFitHeight), or [Fill](/apis/core/#BackgroundTextureFillMode-Fill), default is `BackgroundTextureFillMode.AspectFitHeight` |
+| Property | Function |
+| :-------------- | :-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| texture | Set the background texture |
+| textureFillMode | Set the fill mode of the background texture, optional [AspectFitWidth](/en/apis/core/#BackgroundTextureFillMode-AspectFitWidth), [AspectFitHeight](/en/apis/core/#BackgroundTextureFillMode-AspectFitHeight) or [Fill](/en/apis/core/#BackgroundTextureFillMode-Fill), default is `BackgroundTextureFillMode.AspectFitHeight` |
-### Fill Modes
+### Fill Mode
-Set the texture adaptation mode by `background.textureFillMode = BackgroundTextureFillMode.AspectFitWidth`.
+Set the texture adaptation mode through `background.textureFillMode = BackgroundTextureFillMode.AspectFitWidth`.
-Currently, there are three fill modes for texture adaptation:
+Currently, there are three texture adaptation modes:
-| Fill Mode | Description |
-| ------------------------------------------------------------------------ | ----------------------------------------------------- |
-| [AspectFitWidth](/apis/core/#BackgroundTextureFillMode-AspectFitWidth) | Maintain aspect ratio, scale the texture width to fit the canvas width, center vertically. |
-| [AspectFitHeight](/apis/core/#BackgroundTextureFillMode-AspectFitHeight) | Maintain aspect ratio, scale the texture height to fit the canvas height, center horizontally. |
-| [Fill](/apis/core/#BackgroundTextureFillMode-Fill) | Fill the width and height of the canvas with the texture. |
+| Fill Mode | Description |
+| ----------------------------------------------------------------------- | -------------------------------------------------- |
+| [AspectFitWidth](/en/apis/core/#BackgroundTextureFillMode-AspectFitWidth) | Maintain aspect ratio, scale the texture width to the width of the Canvas, centered vertically. |
+| [AspectFitHeight](/en/apis/core/#BackgroundTextureFillMode-AspectFitHeight) | Maintain aspect ratio, scale the texture height to the height of the Canvas, centered horizontally. |
+| [Fill](/en/apis/core/#BackgroundTextureFillMode-Fill) | Fill the width and height of the Canvas with the texture. |
-Please paste the Markdown content you need to be translated.
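+
+A minimal sketch of loading a texture and applying it as the background with a fill mode (the texture URL here is a placeholder):
+
+```typescript
+engine.resourceManager
+  .load<Texture2D>({
+    url: "https://example.com/background.png", // hypothetical URL
+    type: AssetType.Texture2D,
+  })
+  .then((texture) => {
+    const background = scene.background;
+    background.mode = BackgroundMode.Texture;
+    background.texture = texture;
+    // Keep the aspect ratio and fit the texture's width to the canvas width
+    background.textureFillMode = BackgroundTextureFillMode.AspectFitWidth;
+  });
+```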
diff --git a/docs/en/graphics/camera/camera.md b/docs/en/graphics/camera/camera.md
index 402546fc4b..c851dc3bc2 100644
--- a/docs/en/graphics/camera/camera.md
+++ b/docs/en/graphics/camera/camera.md
@@ -6,52 +6,52 @@ group: Camera
label: Graphics/Camera
---
-A camera is an abstract concept in a graphics engine for [3D projection](https://en.wikipedia.org/wiki/3D_projection), similar to a camera or eye in the real world. Galacean's camera implements automatic frustum culling, rendering only objects within the view frustum.
+A camera is an abstract concept in a graphics engine for [3D projection](https://en.wikipedia.org/wiki/3D_projection), similar to a camera or eyes in the real world. Without a camera, the canvas will render nothing. Galacean's camera implements automatic frustum culling, rendering only objects within the frustum.
## Types of Cameras
### Perspective Projection
-Perspective projection follows our model of objects appearing larger when closer and smaller when farther away. Take a look at the diagram of the perspective model:
+Perspective projection conforms to our model of objects appearing larger when closer and smaller when farther away. Here is a diagram of the perspective model:
-As shown in the diagram, the near clipping plane ([nearClipPlane](/apis/core/#Camera-nearClipPlane)), far clipping plane ([farClipPlane](/apis/core/#Camera-farClipPlane)), and field of view ([fieldOfView](/apis/core/#Camera-fieldOfView)) form a view frustum ([_View Frustum_](https://en.wikipedia.org/wiki/Viewing_frustum)). Objects inside the view frustum are projected onto the camera and rendered on the canvas, while objects outside the view frustum are clipped.
+As shown in the diagram above, the near clipping plane ([nearClipPlane](/en/apis/core/#Camera-nearClipPlane)), far clipping plane ([farClipPlane](/en/apis/core/#Camera-farClipPlane)), and field of view ([fieldOfView](/en/apis/core/#Camera-fieldOfView)) form a view frustum ([_View Frustum_](https://en.wikipedia.org/wiki/Viewing_frustum)). Objects within the frustum are projected into the camera and rendered on the canvas, while objects outside the frustum are clipped.
### Orthographic Projection
-Orthographic projection shows objects at the same size regardless of their distance from the viewer. The visible area generated by the orthographic projection model is called an orthographic box, depicted as follows:
+In orthographic projection, objects appear the same size regardless of their distance from the camera. The visible area created by orthographic projection is called a box-shaped view volume, as shown below:
-As shown in the diagram, there are top, bottom, left, and right values. Galacean simplifies the orthographic properties to align more with developers' habits, using only [orthographicSize](/apis/core/#Camera-orthographicSize). The relationships between the properties and [orthographicSize](/apis/core/#Camera-orthographicSize) are as follows:
+As shown in the diagram above, there are top, bottom, left, and right boundaries. Galacean simplifies orthographic properties to better suit developers' habits, using only [orthographicSize](/en/apis/core/#Camera-orthographicSize). The relationships between the various properties and [orthographicSize](/en/apis/core/#Camera-orthographicSize) are as follows:
- `top = orthographicSize`
- `bottom = -orthographicSize`
- `right = orthographicSize * aspectRatio`
- `left = -orthographicSize * aspectRatio`
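+
+A small worked example of these relationships: with `orthographicSize = 5` and a 16:9 canvas, the frustum bounds come out as follows.
+
+```typescript
+camera.isOrthographic = true;
+camera.orthographicSize = 5;
+// With aspectRatio = 16 / 9 ≈ 1.78:
+// top = 5, bottom = -5
+// right = 5 * 1.78 ≈ 8.9, left ≈ -8.9
+```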
-### Choosing Between Projections
+### How to Choose
-By comparing perspective and orthographic projections, we can identify their differences:
+Comparing perspective projection and orthographic projection reveals their differences:
-- Visual area model
-- Presence of the near/far effect
+- View volume model
+- Whether objects appear larger when closer and smaller when farther away
-The following example demonstrates the visual differences between orthographic and perspective cameras. In summary, choose an orthographic camera for 2D effects and a perspective camera for 3D effects.
+The following example visually demonstrates the difference between rendering with an orthographic camera and a perspective camera. In short, choose an orthographic camera for 2D effects and a perspective camera for 3D effects.
## Camera Orientation
-In Galacean, local coordinates and world coordinates follow the `right-hand coordinate system`, so the camera's `forward` direction is along the `-Z` axis, and the direction of the camera's view is also in the `-Z` direction.
+In Galacean, both local and world coordinates follow the `right-hand coordinate system`, so the camera's `forward` direction is the `-Z` axis, and the camera's viewing direction is also the `-Z` direction.
## Getting Started
-Now that we've covered the basic concepts of cameras, let's get started:
+Having introduced the basic concepts of the camera, let's get started:
-- Add a [camera component](/en/docs/graphics-camera-component) to the scene
-- Use [camera controls](/en/docs/graphics-camera-control) to easily manipulate the [camera component](/en/docs/graphics-camera-component)
-- Utilize [multiple cameras](/docs/graphics/camera/multiCamera/) in the scene
-- Access [camera depth textures](/docs/graphics/camera/texture/)
+- Add a [camera component](/en/docs/graphics/camera/component/) to the scene
+- Use [camera controls](/en/docs/graphics/camera/control/) to more conveniently manipulate the [camera component](/en/docs/graphics/camera/component/)
+- Use [multiple cameras](/en/docs/graphics/camera/multiCamera/) in the scene
+- Obtain [camera textures](/en/docs/graphics/camera/texture/)
diff --git a/docs/en/graphics/camera/component.md b/docs/en/graphics/camera/component.md
index b27976bde1..1070262f35 100644
--- a/docs/en/graphics/camera/component.md
+++ b/docs/en/graphics/camera/component.md
@@ -8,15 +8,15 @@ label: Graphics/Camera
The camera component can project a 3D scene onto a 2D screen. Based on the camera component, we can customize various rendering effects.
-First, you need to attach the camera component to an activated [Entity](/en/docs/core/entity) in the scene. The editor project usually comes with a camera component, but you can also add it manually.
+First, you need to mount the camera component onto an activated [Entity](/en/docs/core/entity) in the scene. Editor projects usually come with a camera component by default, but you can also add one manually.
-After adding it, you can view the camera properties in the inspector, and the camera preview in the bottom left corner allows you to easily see the camera effect when the project is running:
+After adding it, you can view the camera properties in the inspector, and the camera preview in the lower left corner allows you to conveniently see the camera effect during the actual project runtime:
-You can also attach the camera component to an [Entity](/en/docs/core/entity) in scripts using the following code:
+You can also mount the camera component to an [Entity](/en/docs/core/entity) in the script with the following code:
```typescript
// 创建实体
@@ -27,11 +27,11 @@ const camera = entity.addComponent(Camera);
## Properties
-Customize rendering effects by modifying the properties of the camera component. Below are the properties exposed in the **[Inspector Panel](/en/docs/interface/inspector)** for the camera component.
+By modifying the properties of the camera component, you can customize the rendering effects. Below are the property settings exposed by the camera component in the **[Inspector Panel](/en/docs/interface/inspector)**.
-You can also get the camera component in scripts and set the corresponding properties.
+You can also get the camera component and set the corresponding properties through the script.
```typescript
// 从挂载相机的节点上获取相机组件
@@ -58,71 +58,51 @@ camera.enablePostProcess = true;
camera.enableHDR = true;
```
-The functionality of each property is as follows:
-
-| Type | Property | Description |
-| :------- | :------------------------------------------------------------ | :------------------------------------------------------------------------------------------------------ |
-| General | [isOrthographic](/apis/core/#Camera-isOrthographic) | Determines whether to use orthographic or perspective projection. Default is `false`. |
-| | [nearClipPlane](/apis/core/#Camera-nearClipPlane) | Near clipping plane |
-| | [farClipPlane](/apis/core/#Camera-farClipPlane) | Far clipping plane |
-| | [viewport](/apis/core/#Camera-viewport) | Viewport, determines the area in the target device where the content is rendered. |
-| | [priority](/apis/core/#Camera-priority) | Rendering priority, used to determine the order in which the camera's content is rendered in a multi-camera scenario. |
-| | [enableFrustumCulling](/apis/core/#Camera-enableFrustumCulling) | Whether to enable frustum culling, default is `true`. |
-| | [clearFlags](/apis/core/#Camera-clearFlags) | Flags to clear the canvas buffer before rendering with this camera. |
-| | [cullingMask](/apis/core/#Camera-cullingMask) | Culling mask, used to selectively render rendering components in the scene. |
-| | [aspectRatio](/apis/core/#Camera-aspectRatio) | Aspect ratio of the rendering target, usually automatically calculated based on the canvas size, but can be manually changed (not recommended). |
-| | [renderTarget](/apis/core/#Camera-renderTarget) | Render target, determines where the content is rendered. |
-| | [pixelViewport](/apis/core/#Camera-pixelViewport) | The viewport of the camera on the screen (in pixel coordinates). In pixel screen coordinates, the top left corner is (0, 0) and the bottom right corner is (1.0, 1.0). |
-| Perspective Projection | [fieldOfView](/apis/core/#Camera-fieldOfView) | Field of view |
-| Orthographic Projection | [orthographicSize](/apis/core/#Camera-orthographicSize) | Half size of the camera in orthographic mode |
-| Rendering Related | [depthTextureMode](/apis/core/#Camera-depthTextureMode) | Depth texture mode, default is `DepthTextureMode.None`. If enabled, the `camera_DepthTexture` depth texture can be used in shaders. |
-| | [opaqueTextureEnabled](/apis/core/#Camera-opaqueTextureEnabled) | Whether to enable opaque texture, default is off. If enabled, the `camera_OpaqueTexture` opaque texture can be used in shaders in the transparent queue. |
-| | [opaqueTextureDownsampling](/apis/core/#Camera-opaqueTextureDownsampling) | When enabling opaque texture, downsampling can be set based on clarity requirements and performance considerations. |
-| | [msaaSamples](/apis/core/#Camera-msaaSamples) | Multi-sample anti-aliasing samples when use independent canvas mode, such as `enableHDR`、`enablePostProcess`、`opaqueTextureEnabled`. |
-| | [enableHDR](/apis/core/#Camera-enableHDR) | Enable HDR rendering, allowing the shader output color to be stored using floating point numbers, which can get a wider range of values for post-processing and other situations. |
-| | [enablePostProcess](/apis/core/#Camera-enablePostProcess) | Enable post process. The specific configuration refs to [Post Process Tutorial](/docs/graphics/postProcess/postProcess).|
-
-### Clipping Masks
-
-The camera component can selectively render rendering components in the scene by setting `cullingMask`.
+The functionality corresponding to each property is as follows:
+
+| Type | Property | Description |
+| :-- | :-- | :-- |
+| General | [isOrthographic](/en/apis/core/#Camera-isOrthographic) | Determines whether to use perspective or orthographic projection: `false` for perspective, `true` for orthographic. Default is `false`. |
+| | [nearClipPlane](/en/apis/core/#Camera-nearClipPlane) | Near clipping plane. Objects closer to the camera than this value will not be rendered properly. |
+| | [farClipPlane](/en/apis/core/#Camera-farClipPlane) | Far clipping plane. Objects farther from the camera than this value will not be rendered properly. |
+| | [viewport](/en/apis/core/#Camera-viewport) | Viewport, determines the range of content rendered to the target device. Modifying this value can determine the final rendering position in the rendering target. |
+| | [priority](/en/apis/core/#Camera-priority) | Rendering priority, used to determine the order in which cameras render their content in the case of multiple cameras. |
+| | [enableFrustumCulling](/en/apis/core/#Camera-enableFrustumCulling) | Whether to enable frustum culling. When enabled, objects outside the rendering range will be culled. Default is `true`. |
+| | [clearFlags](/en/apis/core/#Camera-clearFlags) | Flags to clear the canvas buffer before rendering this camera. By setting these flags, you can selectively retain the results of the previous camera rendering. |
+| | [cullingMask](/en/apis/core/#Camera-cullingMask) | Culling mask, used to selectively render rendering components in the scene. |
+| | [aspectRatio](/en/apis/core/#Camera-aspectRatio) | Aspect ratio of the rendering target, generally automatically calculated based on the canvas size, but can also be manually changed (not recommended). |
+| | [renderTarget](/en/apis/core/#Camera-renderTarget) | Rendering target, determines which target the content is rendered to. |
+| | [pixelViewport](/en/apis/core/#Camera-pixelViewport) | The camera's viewport on the screen (in pixel coordinates). If the rendering target is the canvas and the viewport is the entire canvas, the top-left corner is (0, 0) and the bottom-right corner is (canvas.width, canvas.height). |
+| Perspective Projection | [fieldOfView](/en/apis/core/#Camera-fieldOfView) | Field of view, default is 45 degrees, with a valid range of (0, 180). |
+| Orthographic Projection | [orthographicSize](/en/apis/core/#Camera-orthographicSize) | In orthographic mode, half the distance from the top to the bottom of the camera's view. |
+| Rendering Related | [depthTextureMode](/en/apis/core/#Camera-depthTextureMode) | Depth texture mode, default is `DepthTextureMode.None`. If enabled, the `camera_DepthTexture` depth texture can be used in the shader. For details, refer to [Camera Texture](/en/docs/graphics/camera/texture/). |
+| | [opaqueTextureEnabled](/en/apis/core/#Camera-opaqueTextureEnabled) | Whether to enable opaque texture. Default is off. If enabled, the `camera_OpaqueTexture` opaque texture can be used in the shader of the transparent queue. |
+| | [opaqueTextureDownsampling](/en/apis/core/#Camera-opaqueTextureDownsampling) | When opaque texture is enabled, downsampling can be set according to clarity and performance requirements. |
+| | [msaaSamples](/en/apis/core/#Camera-msaaSamples) | Number of samples for multi-sample anti-aliasing, effective only when an independent canvas is in use, e.g., when `enableHDR`, `enablePostProcess`, or `opaqueTextureEnabled` is enabled. |
+| | [enableHDR](/en/apis/core/#Camera-enableHDR) | Whether to enable HDR rendering, allowing the shader's output color to be stored using floating-point numbers, providing a wider range of values for post-processing and other scenarios. |
+| | [enablePostProcess](/en/apis/core/#Camera-enablePostProcess) | Whether to enable post-processing. For post-processing configuration, see [Post-Processing Tutorial](/en/docs/graphics/postProcess/postProcess). |
+
+### Culling Mask
+
+The camera component can selectively render the rendering components in the scene by setting the `cullingMask`.
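+
+A minimal sketch (assuming `Layer` is imported from `@galacean/engine`): put an entity on `Layer1` and let the camera render only that layer.
+
+```typescript
+// Only entities on Layer1 will be rendered by this camera
+entity.layer = Layer.Layer1;
+camera.cullingMask = Layer.Layer1;
+```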
-### Render Targets
+### Render Target
-The camera component can render the rendering results to different targets by setting `renderTarget`.
+The camera component can render the result to different targets by setting the `renderTarget`.
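+
+A hedged sketch of off-screen rendering, assuming an engine version where `RenderTarget` accepts a `Texture2D` color attachment:
+
+```typescript
+// Render this camera's output into a 1024x1024 texture instead of the canvas
+const colorTexture = new Texture2D(engine, 1024, 1024);
+const renderTarget = new RenderTarget(engine, 1024, 1024, colorTexture);
+camera.renderTarget = renderTarget;
+```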
### Frustum Culling
-The `enableFrustumCulling` property is enabled by default because for a 3D world, the logic "what is not visible does not need to be rendered" is a very natural optimization. Disabling frustum culling means turning off this optimization. If you want to keep this optimization but only want a node to always render, you can set the bounding box of the node's renderer to be infinitely large.
+The `enableFrustumCulling` property is enabled by default because, in a 3D world, "things that are not visible do not need to be rendered" is a very natural logic and is the most basic performance optimization. Disabling frustum culling means turning off this optimization. If you want to keep this optimization but always render a specific node, you can set the bounding box of the node's renderer to be infinite.
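+
+One hypothetical way to do this, assuming the renderer's mesh exposes writable `bounds`:
+
+```typescript
+// Enlarge the mesh bounds so the renderer is never frustum-culled
+const renderer = entity.getComponent(MeshRenderer);
+renderer.mesh.bounds.min.set(-Number.MAX_VALUE, -Number.MAX_VALUE, -Number.MAX_VALUE);
+renderer.mesh.bounds.max.set(Number.MAX_VALUE, Number.MAX_VALUE, Number.MAX_VALUE);
+```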
## Methods
-The camera component provides various methods (mainly related to `rendering` and `space transformation`) to facilitate developers in achieving the desired customization capabilities.
-
-| Type | Property | Description |
-| :-------- | :---------------------------------------------------------------- | :---------------------------------------- |
-| Rendering | [resetProjectionMatrix](/apis/core/#Camera-resetProjectionMatrix) | Reset the custom projection matrix to automatic mode. |
-| | [resetAspectRatio](/apis/core/#Camera-resetAspectRatio) | Reset the custom aspect ratio to automatic mode. |
-| | [render](/apis/core/#Camera-render) | Manual rendering. |
-| | [setReplacementShader](/apis/core/#Camera-setReplacementShader) | Set the global rendering replacement shader. |
-| | [resetReplacementShader](/apis/core/#Camera-resetReplacementShader)| Clear the global rendering replacement shader. |
-| Space Transformation | [worldToViewportPoint](/apis/core/#Camera-worldToViewportPoint) | Convert a point from world space to viewport space. |
-| | [viewportToWorldPoint](/apis/core/#Camera-viewportToWorldPoint) | Convert a point from viewport space to world space. |
-| | [viewportPointToRay](/apis/core/#Camera-viewportPointToRay) | Generate a world space ray from a point in viewport space. |
-| | [screenToViewportPoint](/apis/core/#Camera-screenToViewportPoint) | Convert a point from screen space to viewport space. |
-| | [viewportToScreenPoint](/apis/core/#Camera-viewportToScreenPoint) | Convert a point from viewport space to screen space. |
-| | [worldToScreenPoint](/apis/core/#Camera-worldToScreenPoint) | Convert a point from world space to screen space. |
-| | [screenToWorldPoint](/apis/core/#Camera-screenToWorldPoint) | Convert a point from screen space to world space. |
-| | [screenPointToRay](/apis/core/#Camera-screenPointToRay) | Generate a world space ray from a point in screen space. |
-
-## Get Camera Component
-
-Assuming you know which node the camera component is mounted on, you can directly retrieve it using `getComponent` or `getComponentsIncludeChildren`:
+The camera component provides various methods (mainly related to `rendering` and `space transformation`) to facilitate developers in achieving the desired customization capabilities. Before that, you need to learn how to get the camera component. If you know which node the camera component is mounted on, you can directly get it through `getComponent` or `getComponentsIncludeChildren`:
```typescript
// 从挂载相机的节点上获取相机组件
@@ -131,14 +111,39 @@ const camera = entity.getComponent(Camera);
const cameras = entity.getComponentsIncludeChildren(Camera, []);
```
-If you are unsure about the node where the camera component is mounted, you can also access all camera components in the scene using a more hacky approach:
+If you are not sure which node the camera component is mounted on, you can also get all the camera components in the scene in a more hacky way:
```typescript
-// Retrieve all camera components in this scene (not recommended)
-const cameras = scene._activeCameras;
+// Get all camera components in this scene (not recommended)
+const cameras = scene._componentsManager._activeCameras;
```
-## onBeginRender and onEndRender
+| Type | Property | Description |
+| :-- | :-- | :-- |
+| Rendering | [resetProjectionMatrix](/en/apis/core/#Camera-resetProjectionMatrix) | Reset the custom projection matrix and revert to automatic mode. |
+| | [resetAspectRatio](/en/apis/core/#Camera-resetAspectRatio) | Reset the custom rendering aspect ratio and revert to automatic mode. |
+| | [render](/en/apis/core/#Camera-render) | Manual rendering. |
+| | [setReplacementShader](/en/apis/core/#Camera-setReplacementShader) | Set a global rendering replacement shader. |
+| | [resetReplacementShader](/en/apis/core/#Camera-resetReplacementShader) | Clear the global rendering replacement shader. |
+| Space Transformation | [worldToViewportPoint](/en/apis/core/#Camera-worldToViewportPoint) | Convert a point from world space to viewport space. |
+| | [viewportToWorldPoint](/en/apis/core/#Camera-viewportToWorldPoint) | Convert a point from viewport space to world space. |
+| | [viewportPointToRay](/en/apis/core/#Camera-viewportPointToRay) | Generate a world space ray from a point in viewport space. |
+| | [screenToViewportPoint](/en/apis/core/#Camera-screenToViewportPoint) | Convert a point from screen space to viewport space. |
+| | [viewportToScreenPoint](/en/apis/core/#Camera-viewportToScreenPoint) | Convert a point from viewport space to screen space. |
+| | [worldToScreenPoint](/en/apis/core/#Camera-worldToScreenPoint) | Convert a point from world space to screen space. |
+| | [screenToWorldPoint](/en/apis/core/#Camera-screenToWorldPoint) | Convert a point from screen space to world space. |
+| | [screenPointToRay](/en/apis/core/#Camera-screenPointToRay) | Generate a world space ray from a point in screen space. |
+
+### Shader Replacement
-Camera components also include [onBeginRender](/apis/core/#Script-onBeginRender) and [onEndRender](/apis/core/#Script-onEndRender) additional lifecycle callbacks. The sequence of these callbacks can be referred to in the [Script Lifecycle Sequence Diagram](/en/docs/script)
+By using `setReplacementShader` to globally replace shaders, you can observe specific rendering effects:
+
+
+
+### Space Transformation
+
+Note that the Z component of the point passed into methods such as `screenToWorldPoint` and `viewportToWorldPoint` represents the distance from the returned point to the camera.
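+
+For example, a sketch converting the viewport center at a distance of 10 units into world space:
+
+```typescript
+// (0.5, 0.5) is the viewport center; Z = 10 is the distance from the camera
+const worldPoint = camera.viewportToWorldPoint(new Vector3(0.5, 0.5, 10), new Vector3());
+```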
+
+## onBeginRender and onEndRender
+The camera component additionally includes two lifecycle callbacks, [onBeginRender](/en/apis/core/#Script-onBeginRender) and [onEndRender](/en/apis/core/#Script-onEndRender). Their sequence can be referenced in the [script lifecycle sequence diagram](/en/docs/script/class/#脚本生命周期).
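+
+A minimal sketch of hooking both callbacks from a script mounted in the scene:
+
+```typescript
+class RenderHook extends Script {
+  onBeginRender(camera: Camera): void {
+    // Called before this camera renders
+  }
+  onEndRender(camera: Camera): void {
+    // Called after this camera has rendered
+  }
+}
+```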
diff --git a/docs/en/graphics/camera/control.md b/docs/en/graphics/camera/control.md
index 4941f7bef1..64c592de4a 100644
--- a/docs/en/graphics/camera/control.md
+++ b/docs/en/graphics/camera/control.md
@@ -6,56 +6,57 @@ group: Camera
label: Graphics/Camera
---
-Camera controls are components used in conjunction with camera components to display three-dimensional scenes. These components customize corresponding parameters based on different functions, controlling the display of the three-dimensional scene by affecting the camera's properties.
+Camera controls are components that work together with the camera component to display a 3D scene. These components customize corresponding parameters based on different functions and control the display of the 3D scene by affecting the camera's properties.
-Camera controls inherit powerful scripts and are mounted on an `Entity` containing the `Camera` component. Therefore, they can naturally access the `Camera` and respond to external inputs and perform corresponding operations in lifecycle functions. **These controls cannot currently be added or operated in the editor and must be added by developers in scripts.**
+Camera controls inherit from powerful scripts and are mounted on an `Entity` that contains the `Camera` component. Therefore, they can naturally access the `Camera`, respond to external inputs in lifecycle functions, and perform corresponding operations. **These controls cannot currently be added in the editor and must be added by developers in the script.**
+
+> Note that before adding camera controls, make sure the `Camera` component has already been added to the node.
## Orbit Controller
-`OrbitControl` is used to simulate orbit interaction, suitable for 360-degree rotation interaction around a target object. It is important to note that **the orbit controller must be added after adding the camera component**.
+`OrbitControl` is used to simulate orbit interaction, suitable for 360-degree rotation interaction around a target object. Note that **the orbit controller must be added after the camera component**.
-| Property | Description |
-| :---------------- | :-------------------------------------------------------------- |
-| `target` | The position to observe |
-| `autoRotate` | Whether to auto-rotate, default is false, rotation speed can be adjusted through autoRotateSpeed |
-| `autoRotateSpeed` | The speed of auto-rotation |
-| `enableDamping` | Whether to enable camera damping, default is true |
-| `dampingFactor` | Rotation damping parameter, default is 0.1 |
-| `enableKeys` | Whether to support keyboard operation (arrow keys) |
-| `enablePan` | Whether to support camera panning, default is true |
-| `keyPanSpeed` | Magnitude of operation when the key is continuously pressed |
-| `enableRotate` | Whether to support camera rotation, default is true |
-| `rotateSpeed` | Camera rotation speed, default is 1.0 |
-| `enableZoom` | Whether to support camera zoom, default is true |
-| `minAzimuthAngle` | Minimum radians for reasonable range of horizontal operations on onUpdate, default is negative infinity |
-| `maxAzimuthAngle` | Maximum radians for reasonable range of horizontal operations on onUpdate, default is positive infinity |
-| `minDistance` | Minimum value for reasonable range of distance operations on onUpdate |
-| `maxDistance` | Maximum value for reasonable range of distance operations on onUpdate |
-| `minPolarAngle` | Minimum radians for reasonable range of vertical operations on onUpdate |
-| `maxPolarAngle` | Maximum radians for reasonable range of vertical operations on onUpdate |
+| Property | Description |
+| :--------------- | :----------------------------------------------------------------- |
+| `target` | The target position to observe |
+| `autoRotate` | Whether to auto-rotate, default is false, can adjust speed via autoRotateSpeed |
+| `autoRotateSpeed`| Speed of auto-rotation |
+| `enableDamping` | Whether to enable camera damping, default is true |
+| `dampingFactor` | Rotation damping parameter, default is 0.1 |
+| `enableKeys` | Whether to support keyboard operations (arrow keys) |
+| `enablePan` | Whether to support camera panning, default is true |
+| `keyPanSpeed` | Magnitude of operation when the keyboard is continuously pressed |
+| `enableRotate` | Whether to support camera rotation, default is true |
+| `rotateSpeed` | Camera rotation speed, default is 1.0 |
+| `enableZoom` | Whether to support camera zoom, default is true |
+| `minAzimuthAngle`| Minimum azimuth angle for horizontal operations during onUpdate, default is negative infinity |
+| `maxAzimuthAngle`| Maximum azimuth angle for horizontal operations during onUpdate, default is positive infinity |
+| `minDistance` | Minimum distance for reasonable operations during onUpdate |
+| `maxDistance` | Maximum distance for reasonable operations during onUpdate |
+| `minPolarAngle` | Minimum polar angle for vertical operations during onUpdate |
+| `maxPolarAngle` | Maximum polar angle for vertical operations during onUpdate |
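+
+A minimal usage sketch, assuming the control comes from the `@galacean/engine-toolkit-controls` package:
+
+```typescript
+import { OrbitControl } from "@galacean/engine-toolkit-controls";
+
+// The entity must already have a Camera component
+const control = cameraEntity.addComponent(OrbitControl);
+control.target.set(0, 1, 0); // orbit around this point
+control.autoRotate = true;
+```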
## Free Controller
-`FreeControl` is generally used for roaming control, commonly seen in game scenes. It is important to note that **the free controller must be added after adding the camera component**.
+`FreeControl` is generally used for roaming control, commonly seen in game scenes. Note that **the free controller must be added after the camera component**.
-| Property | Description |
-| :-------------- | :--------------------------------------- |
-| `floorMock` | Whether to simulate the floor, default is true |
-| `floorY` | Used in conjunction with `floorMock` to declare the position information of the floor |
-| `movementSpeed` | Movement speed |
-| `rotateSpeed` | Rotation speed |
+| Property | Description |
+| :-------------- | :---------------------------------------- |
+| `floorMock` | Whether to simulate the ground, default is true |
+| `floorY` | Used with `floorMock`, declares the ground position |
+| `movementSpeed` | Movement speed |
+| `rotateSpeed` | Rotation speed |
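+
+A minimal usage sketch, again assuming the toolkit package:
+
+```typescript
+import { FreeControl } from "@galacean/engine-toolkit-controls";
+
+const control = cameraEntity.addComponent(FreeControl);
+control.movementSpeed = 10;
+control.rotateSpeed = 1;
+control.floorMock = true; // constrain roaming to a mocked ground
+control.floorY = 0;
+```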
-#### Orthogonal Controller
+## Orthographic Controller
-`OrthoControl` is typically used to control scaling and translation in 2D scenes:
+`OrthoControl` is generally used to control zooming and panning in 2D scenes:
| Property | Description |
-| :---------- | :---------- |
+| :---------- | :-----------|
| `zoomSpeed` | Zoom speed |
-
diff --git a/docs/en/graphics/camera/multiCamera.md b/docs/en/graphics/camera/multiCamera.md
index c75207514b..5f06e2da37 100644
--- a/docs/en/graphics/camera/multiCamera.md
+++ b/docs/en/graphics/camera/multiCamera.md
@@ -6,12 +6,12 @@ group: Camera
label: Graphics/Camera
---
-In the case of multiple cameras, many customized rendering effects can be achieved by combining the camera component's [viewport](/apis/core/#Camera-viewport), [cullingMask](/apis/core/#Camera-cullingMask), [clearFlags](/apis/core/#Camera-clearFlags), and other properties.
+In the case of multiple cameras, many customized rendering effects can be achieved by combining camera component properties such as [viewport](/en/apis/core/#Camera-viewport), [cullingMask](/en/apis/core/#Camera-cullingMask), [clearFlags](/en/apis/core/#Camera-clearFlags), and others.
-For example, by setting the [viewport](/apis/core/#Camera-viewport), multiple cameras can render scene content at different positions on the canvas.
+For example, by setting the [viewport](/en/apis/core/#Camera-viewport), multiple cameras can render scene content in different positions on the canvas.
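+
+For example, a sketch of a split-screen setup, where `leftCamera` and `rightCamera` are assumed to be two camera components in the scene:
+
+```typescript
+// Left half of the canvas (viewport is normalized: x, y, width, height)
+leftCamera.viewport = new Vector4(0, 0, 0.5, 1);
+// Right half of the canvas, rendered after the left camera
+rightCamera.viewport = new Vector4(0.5, 0, 0.5, 1);
+rightCamera.priority = 1;
+```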
-Another example is achieving picture-in-picture effects by setting the [cullingMask](/apis/core/#Camera-cullingMask).
+Another example is achieving a picture-in-picture effect by setting the [cullingMask](/en/apis/core/#Camera-cullingMask).
diff --git a/docs/en/graphics/camera/texture.md b/docs/en/graphics/camera/texture.md
new file mode 100644
index 0000000000..4c631a4e74
--- /dev/null
+++ b/docs/en/graphics/camera/texture.md
@@ -0,0 +1,19 @@
+---
+order: 4
+title: Camera Textures
+type: Graphics
+group: Camera
+label: Graphics/Camera
+---
+
+## Depth Texture
+
+The camera can enable the depth texture through the [depthTextureMode](/en/apis/core/#Camera-depthTextureMode) property. Once enabled, the depth texture can be accessed in the shader via the `camera_DepthTexture` property. Depth textures can be used to implement soft particles, water-edge transitions, and some simple post-processing effects.
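+
+A minimal sketch of enabling it, assuming the `DepthTextureMode.PrePass` mode:
+
+```typescript
+// Render a depth pre-pass; shaders can then sample camera_DepthTexture
+camera.depthTextureMode = DepthTextureMode.PrePass;
+```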
+
+
+
+Note: Depth textures only render non-transparent objects.
+
+## Opaque Texture
+
+The camera can enable the opaque texture through the [opaqueTextureEnabled](/en/apis/core/#Camera-opaqueTextureEnabled) property. Once enabled, the `camera_OpaqueTexture` can be used in the shader of the transparent queue. Additionally, you can set downsampling according to clarity and performance requirements via [opaqueTextureDownsampling](/en/apis/core/#Camera-opaqueTextureDownsampling).
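+
+A minimal sketch, with the downsampling enum value as an assumption:
+
+```typescript
+camera.opaqueTextureEnabled = true;
+// Halve the resolution of the grabbed opaque texture to save bandwidth
+camera.opaqueTextureDownsampling = Downsampling.TwoX;
+```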
diff --git a/docs/en/graphics/light/ambient.md b/docs/en/graphics/light/ambient.md
index 279adb8ab7..8a3f28b4f5 100644
--- a/docs/en/graphics/light/ambient.md
+++ b/docs/en/graphics/light/ambient.md
@@ -2,37 +2,37 @@
order: 4
title: Ambient Light
type: Graphics
-group: Lighting
+group: Light
label: Graphics/Light
---
-In addition to real-time computed direct light sources, we generally need to pre-bake ambient light as ambient light for real-time sampling. This method can effectively capture global ambient light and atmosphere, making objects better blend into their environment.
+In addition to real-time computed direct light sources, we generally need to pre-bake lighting offline as ambient light for real-time sampling. This method effectively captures the global illumination and atmosphere of the environment, making objects blend better into their surroundings.

## Editor Usage
-### 1. Ambient Diffuse Reflection
+### 1. Ambient Diffuse
-| Property | Description |
+| Property | Function |
| :-- | :-- |
-| Source | Specify whether the diffuse reflection source is `Background` or `Solid Color`, with the default source being `Background`. `Background` means using the baked spherical harmonic parameters as the diffuse reflection color; `Solid Color` means using a solid color as the diffuse reflection color. |
-| Intensity | Diffuse reflection intensity |
+| Source | Specifies whether the diffuse source is `Background` or `Solid Color`, with the default source being `Background`. `Background` means using the baked spherical harmonics parameters as the diffuse color; `Solid Color` means using a solid color as the diffuse color. |
+| Intensity | Diffuse intensity |
-### 2. Ambient Specular Reflection
+### 2. Ambient Specular
-| Property | Description |
+| Property | Function |
| :-- | :-- |
-| Source | Specify whether the specular reflection source is `Background` or `Custom`, with the default source being `Background`. `Background` means using the pre-filtered environment map obtained based on the background baking as the specular reflection; `Custom` means you can bake an HDR map separately as the environment reflection. |
-| Intensity | Specular reflection intensity |
+| Source | Specifies whether the specular source is `Background` or `Custom`, with the default source being `Background`. `Background` means using the pre-filtered environment map baked from the background as the specular reflection; `Custom` means you can separately bake an HDR map as the environment reflection. |
+| Intensity | Specular intensity |
## Script Usage
-After obtaining the baked product URL through the [baking tutorial](/en/docs/graphics-light-bake), load and parse it using the engine's EnvLoader:
+After obtaining the URL of the baked product through the [baking tutorial](/en/docs/graphics/light/bake/), load and parse it through the engine's EnvLoader:
```typescript
engine.resourceManager
diff --git a/docs/en/graphics/light/bake.md b/docs/en/graphics/light/bake.md
index 2bfd0cabb1..5437c15a81 100644
--- a/docs/en/graphics/light/bake.md
+++ b/docs/en/graphics/light/bake.md
@@ -2,42 +2,42 @@
order: 5
title: Baking
type: Graphics
-group: Lighting
+group: Light
label: Graphics/Light
---
-Baking refers to Galacean precomputing lighting calculations and baking the results into a binary file (including [diffuse spherical harmonics parameters](https://en.wikipedia.org/wiki/Spherical_harmonics) and [pre-filtered environment maps](https://learnopengl.com/PBR/IBL/Specular-IBL/)), then sampling them in real-time during runtime.
+Baking refers to Galacean performing lighting calculations in advance and baking the results into a binary file (containing [diffuse spherical harmonics parameters](https://en.wikipedia.org/wiki/Spherical_harmonics) and [pre-filtered environment maps](https://learnopengl.com/PBR/IBL/Specular-IBL/)), which is then sampled in real time at runtime.
-We provide baking tools in the [Editor](https://galacean.antgroup.com/editor) and [glTF Viewer](https://galacean.antgroup.com/#/gltf-viewer).
+We provide baking tools in the [editor](https://galacean.antgroup.com/editor) and the [glTF viewer](https://galacean.antgroup.com/engine/gltf-viewer).
## Editor Usage
### 1. Baking Switch
-The editor defaults to automatic baking, which will automatically bake when modifying the background (color, exposure, rotation, etc.) or changing the baking resolution.
+The editor has automatic baking enabled by default. It will automatically bake after modifying the background (color, exposure, rotation, etc.) or changing the baking resolution.
-You can also turn off automatic baking and manually bake when needed.
+You can also disable automatic baking and perform manual baking when needed.
### 2. Baking Resolution
-Represents the resolution of the pre-filtered environment map after baking, defaulting to 128 resolution. The baked product is about 500KB at 128 resolution; the product at 64 resolution is about 125KB. You can choose the appropriate baking resolution based on the scene.
+This indicates the resolution of the pre-filtered environment map after baking. The default resolution is 128, with the baked product being approximately 500KB; a resolution of 64 results in a baked product of about 125KB. You can choose the appropriate baking resolution based on the scene.
### 3. Setting Background
-Refer to the [background tutorial](/en/docs/graphics-background-sky) to set the scene's background. The editor will perform lighting baking based on the set baking resolution and baking switch. Any modifications to the background (color, rotation, exposure, changing HDR textures, etc.) will be automatically baked according to the baking switch. **If you want to set a solid color or transparent background without baking the solid color background, you can first turn off the automatic baking switch, then switch to a [solid color background](/en/docs/graphics-background-solidColor).**
+Refer to the [background tutorial](/en/docs/graphics/background/sky/) to set the scene background. The editor will perform lighting baking based on the baking resolution and baking switch settings. Any modifications to the background (color, rotation, exposure, changing HDR maps, etc.) will depend on the baking switch to decide whether to bake automatically. **If you want to set a solid color background or a transparent background but do not want to bake the solid color background, you can first turn off the automatic baking switch and then switch to the [solid color background](/en/docs/graphics/background/solidColor/).**
## glTF Viewer
-We also provide baking tools in the official [glTF Viewer](https://galacean.antgroup.com/#/gltf-viewer). Simply drag and drop an HDR texture onto the webpage to automatically download the baked product:
+We also provide a baking tool in the [glTF viewer](https://galacean.antgroup.com/engine/gltf-viewer) on our official website. Simply drag and drop an HDR map onto the webpage to automatically download the baked product:

diff --git a/docs/en/graphics/light/directional.md b/docs/en/graphics/light/directional.md
index 0f4fdf34b6..70405883cd 100644
--- a/docs/en/graphics/light/directional.md
+++ b/docs/en/graphics/light/directional.md
@@ -6,22 +6,19 @@ group: Lighting
label: Graphics/Light
---
-**Directional Light** represents light that is emitted uniformly from a specific direction, with parallel light rays. The sunlight shining on the Earth's surface can be considered as directional light because the distance between the Sun and the Earth is much greater than the Earth's radius. Therefore, the sunlight shining on the Earth can be seen as a set of parallel light rays coming from the same direction, which is directional light.
+**Directional Light** represents light that is emitted uniformly in a certain direction, with parallel light rays. Sunlight hitting the Earth's surface can be considered directional light because the distance between the Sun and the Earth is much greater than the Earth's radius. Therefore, sunlight hitting the Earth can be seen as a set of parallel light rays coming from the same direction, i.e., directional light.
-Directional light has 3 main properties: _Color_ ([color](/apis/core/#DirectLight-color)), _Intensity_ ([intensity](/apis/core/#DirectLight-intensity)), and _Direction_ ([direction](/apis/core/#DirectLight-direction)). The _Direction_ is represented by the orientation of the node where the directional light is located.
-
-| Property | Description |
-| :-------- | :------------------------------- |
-| Intensity | Controls the intensity of the parallel light, **the higher the value, the brighter** |
-| Color | Controls the color of the parallel light, default is white |
-| Culling Mask | Controls which objects the light needs to illuminate, default is Everything. Needs to be used in conjunction with the Entity's Layer |
-
-> Directional lights can cast shadows. For more information, see the [shadow documentation](/en/docs/graphics/light/shadow).
-
+Directional light has 3 main characteristics: _color_ ([color](/en/apis/core/#DirectLight-color)), _intensity_ ([intensity](/en/apis/core/#DirectLight-intensity)), and _direction_ ([direction](/en/apis/core/#DirectLight-direction)). The _direction_ is represented by the orientation of the node where the directional light is located.
+| Property | Function |
+| :----------- | :----------------------------------------------------------------------- |
+| Intensity | Controls the intensity of the parallel light, **the higher the value, the brighter** |
+| Color | Controls the color of the parallel light, default is white |
+| Culling Mask | Controls the objects that need to be illuminated by the light, default is Everything. Needs to be used in conjunction with the Entity's Layer |
+> Directional light can cast shadows. For related configurations, refer to the [shadow documentation](/en/docs/graphics/light/shadow).
## Script Usage
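+
+As a reference, here is a minimal sketch of creating a directional light from a script; the entity name and rotation values are illustrative assumptions:
+
+```typescript
+import { DirectLight } from "@galacean/engine";
+
+// Create an entity to host the directional light
+const lightEntity = rootEntity.createChild("light");
+const directLight = lightEntity.addComponent(DirectLight);
+
+// Color and intensity; the higher the intensity, the brighter the light
+directLight.color.set(0.8, 0.8, 1.0, 1.0);
+directLight.intensity = 1.0;
+
+// The light direction is the orientation of the entity
+lightEntity.transform.setRotation(-45, -45, 0);
+```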
diff --git a/docs/en/graphics/light/light.md b/docs/en/graphics/light/light.md
index 6540ba9d21..5437541c90 100644
--- a/docs/en/graphics/light/light.md
+++ b/docs/en/graphics/light/light.md
@@ -1,33 +1,37 @@
---
order: 0
-title: Overview of Lighting
+title: Lighting Overview
type: Graphics
group: Lighting
label: Graphics/Light
---
-Proper use of lighting can provide realistic rendering effects. This section contains the following relevant information:
+Proper use of lighting can provide realistic rendering effects. This section includes the following information:
- Types of Light Sources
- - [Directional Light](/en/docs/graphics-light-directional)
- - [Point Light](/en/docs/graphics-light-point)
- - [Spotlight](/en/docs/graphics-light-spot)
- - [Ambient Light](/en/docs/graphics-light-ambient)
-- [Baking](/en/docs/graphics-light-bake)
-- [Shadows](/en/docs/graphics-light-shadow})
+ - Direct Light
+ - [Directional Light](/en/docs/graphics/light/directional/)
+ - [Point Light](/en/docs/graphics/light/point/)
+ - [Spotlight](/en/docs/graphics/light/spot/)
+ - Indirect Light
+ - [Ambient Light](/en/docs/graphics/light/ambient/)
+- [Baking](/en/docs/graphics/light/bake/)
+- [Shadows](/en/docs/graphics/light/shadow/)
+
+> It should be noted that the number of direct lights affects performance. The engine limits each type of direct light to no more than 10; it is generally recommended to use ambient light plus a few direct lights for accents.
## Direct Light
-Direct light generally shines from a specific area or direction, reflects once, and enters the eye (camera) directly, as shown in the example below:
+Direct light usually shines from an area or in a specific direction, entering the eye (camera) directly after one reflection, as shown in the following example:
-## Ambient Light
+## Indirect Light
-Ambient light emits from all directions and enters the eye, as shown in the example below:
+Ambient light is emitted from all around and enters the eye, as shown in the following example:
## Real-time Lighting and Baked Lighting
-Real-time lighting refers to Galacean calculating lighting in real-time during runtime. Baked lighting refers to Galacean precomputing lighting and [baking](/en/docs/graphics-light-bake) the results into a binary file (including [diffuse irradiance coefficients](https://en.wikipedia.org/wiki/Spherical_harmonics) and [pre-filtered environment maps](https://learnopengl.com/PBR/IBL/Specular-IBL/)), then sampling it in real-time during runtime.
+Real-time lighting refers to Galacean calculating lighting in real-time during runtime. Baked lighting refers to Galacean performing lighting calculations in advance and [baking](/en/docs/graphics/light/bake/) the results into binary files (including [diffuse spherical harmonics parameters](https://en.wikipedia.org/wiki/Spherical_harmonics) and [pre-filtered environment maps](https://learnopengl.com/PBR/IBL/Specular-IBL)), and then sampling them in real-time during runtime.
diff --git a/docs/en/graphics/light/shadow.md b/docs/en/graphics/light/shadow.md
index 3ca103cda5..c8c9a51c84 100644
--- a/docs/en/graphics/light/shadow.md
+++ b/docs/en/graphics/light/shadow.md
@@ -6,67 +6,45 @@ group: Lighting
label: Graphics/Light
---
-Shadows can effectively enhance the three-dimensionality and realism of the rendered image. In real-time rendering, the so-called ShadowMap technology is generally used to draw shadows. Simply put, the light source is used as a virtual camera to render the depth of the scene. Then, when rendering the image from the scene camera, if the depth of the object is deeper than the previously saved depth information, it is considered to be blocked by other objects, and the shadow is rendered accordingly.
+Shadows can effectively enhance the three-dimensionality and realism of the rendered scene. In real-time rendering, the so-called ShadowMap technique is usually used to draw shadows. Simply put, the scene's depth is first rendered with the light source acting as a virtual camera; then, when rendering the scene from the camera's perspective, if an object's depth is deeper than the previously saved depth information, it is considered occluded by another object, and a shadow is rendered accordingly.
## Scene Configuration
-
+
-There are some configurations in the scene that can affect the global shadow:
+There are some configurations in the scene that can affect global shadows:
-| Parameters | Application |
+| Parameter | Application |
| :-- | :-- |
-| [Cast Shadow](/apis/core/#Scene-castShadows) | Whether to cast shadows. This is the master switch. |
-| [Transparent](/apis/core/#Scene-enableTransparentShadow) | Whether to cast transparent shadows. When turned on, transparent objects can also cast shadows. |
-| [Resolution](/apis/core/#Scene-shadowResolution) | Shadowmap resolution. |
-| [Cascades](/apis/core/#Scene-shadowCascades) | Cascade shadow quantity settings. Generally used to split shadow resolution in large scenes. |
-| [ShadowTwoCascadeSplits](/apis/core/#Scene-shadowTwoCascadeSplits) | Parameters for dividing two-level cascade shadows. |
-| [ShadowFourCascadeSplits](/apis/core/#Scene-shadowFourCascadeSplits) | Parameters for dividing four-level cascade shadows. |
-| [Distance](/apis/core/#Scene-shadowDistance) | Farthest shadow distance. Shadows cannot be seen beyond this distance. |
-| [Fade Border](/apis/core/#Scene-shadowFadeBorder) | Shadow attenuation distance, which indicates the proportion of the shadow distance from which attenuation begins. The range is [0~1]. When it is 0, it means no attenuation. |
+| [Cast Shadow](/en/apis/core/#Scene-castShadows) | Whether to cast shadows. This is the main switch. |
+| [Transparent](/en/apis/core/#Scene-enableTransparentShadow) | Whether to cast transparent shadows. When enabled, transparent objects can also cast shadows. |
+| [Resolution](/en/apis/core/#Scene-shadowResolution) | The resolution of the Shadowmap. The `Low` option uses a resolution of 512, the `Medium` option uses a resolution of 1024, the `High` option uses a resolution of 2048, and the `VeryHigh` option uses a resolution of 4096. |
+| [Cascades](/en/apis/core/#Scene-shadowCascades) | The number of [cascaded shadows](https://learn.microsoft.com/en-us/windows/win32/dxtecharts/cascaded-shadow-maps). Generally used for large scenes to divide the shadowmap resolution, which can improve shadow aliasing at different distances. After enabling two-level cascaded shadows, you can configure it through [ShadowTwoCascadeSplits](/en/apis/core/#Scene-shadowTwoCascadeSplits), and after enabling four-level cascaded shadows, you can configure it through [ShadowFourCascadeSplits](/en/apis/core/#Scene-shadowFourCascadeSplits). |
+| [Distance](/en/apis/core/#Scene-shadowDistance) | The farthest shadow distance (distance from the camera), beyond which shadows are not visible. |
+| [Fade Border](/en/apis/core/#Scene-shadowFadeBorder) | The shadow fade distance, indicating the proportion of the shadow distance at which fading starts, ranging from [0~1]. A value of 0 means no fading. |
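+
+These scene-level settings can also be configured from a script. The following is a minimal sketch, assuming the `ShadowResolution` and `ShadowCascadesMode` enums corresponding to the options above:
+
+```typescript
+import { ShadowResolution, ShadowCascadesMode } from "@galacean/engine";
+
+// Main switch for shadow casting
+scene.castShadows = true;
+// Shadowmap resolution (Low / Medium / High / VeryHigh)
+scene.shadowResolution = ShadowResolution.High;
+// Split the shadowmap into four cascades for large scenes
+scene.shadowCascades = ShadowCascadesMode.FourCascades;
+// Shadows farther than this distance from the camera are not visible
+scene.shadowDistance = 50;
+```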
## Light Configuration
-To cast shadows, you need a [directional light](/en/docs/graphics/light/directional) in the scene, and then you can configure some properties that determine the shadowmap:
+To cast shadows, there needs to be a [directional light](/en/docs/graphics/light/directional) in the scene. Currently, the engine can only enable shadows for one directional light `DirectLight`, mainly because shadow rendering doubles the DrawCall count, which can severely impact rendering performance. If no [main light (scene.sun)](/en/apis/core/#Scene-sun) is specified, the engine defaults to selecting the light with the highest intensity to cast shadows:
-| Parameters | Application |
-| :------------------------------------------------ | :----------------------------------- |
-| [Shadow Type](/apis/core/#Light-shadowType) | Shadow casting type. |
-| [Shadow Bias](/apis/core/#Light-shadowBias) | Shadow bias. |
-| [Normal Bias](/apis/core/#Light-shadowNormalBias) | Shadow normal bias. |
-| [Near Plane](/apis/core/#Light-shadowNearPlane) | Near plane when rendering depth map. |
-| [Strength](/apis/core/#Light-shadowStrength) | Shadow strength. |
+| Parameter | Application |
+| :------------------------------------------------ | :------------------------------------------------- |
+| [Shadow Type](/en/apis/core/#Light-shadowType) | The type of shadow casting. Different types affect rendering performance and visual effects. |
+| [Shadow Bias](/en/apis/core/#Light-shadowBias) | The offset of the shadow. Prevents shadow distortion. |
+| [Normal Bias](/en/apis/core/#Light-shadowNormalBias) | The normal offset of the shadow. Avoids shadow distortion. |
+| [Near Plane](/en/apis/core/#Light-shadowNearPlane) | The near clipping plane when rendering the depth map. Affects the shadow clipping plane and precision. |
+| [Strength](/en/apis/core/#Light-shadowStrength) | The strength of the shadow. Controls the transparency of the shadow. |
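+
+Correspondingly, a minimal script-side sketch (the `ShadowType` value shown is an assumption; choose the type that balances quality and performance for your scene):
+
+```typescript
+import { ShadowType } from "@galacean/engine";
+
+// Enable soft shadows on the directional light
+directLight.shadowType = ShadowType.SoftLow;
+// Small bias values help avoid shadow acne without detaching the shadow
+directLight.shadowBias = 1;
+directLight.shadowNormalBias = 1;
+// 1.0 means fully opaque shadows
+directLight.shadowStrength = 1.0;
+```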
-Here we need to explain the shadow bias:
-
-
-
-Due to depth accuracy issues, artifacts are generated when sampling from the camera. So it is usually necessary to set the shadow bias to produce clean shadows, as shown in the right figure. But if the offset is too large, the shadow will deviate from the projected object, and you can see that the shadow and the heel in the right picture are separated. Therefore, this parameter is a parameter that needs to be carefully adjusted when using shadows.
-
-Currently, the engine only supports casting shadows for one directional light `DirectLight`, mainly because the rendering of shadows doubles the DrawCall, which will seriously affect the rendering performance. Generally speaking, `DirectLight` is used to imitate sunlight, so only one is supported. There are two points to note about the shadow of a directional light.
-
-### Cascade Shadows
-
-First is cascade shadows. Since a directional light is only the direction of the light, the position of the light source is meaningless. So it is difficult to determine how to set the frustum used when drawing the depth map starting from the light source. And if the depth map is only rendered once in the entire scene, the objects in the distance are very small, which will seriously waste the depth map and produce a lot of blanks. So the engine uses the Stable Cascade Shadows (CSSM) technique:
-
-
-
-This technique divides the camera's view cone into two or four blocks, and then renders the scene two or four times along the direction of the light, and determines the size of each block by dividing the parameters, thereby maximizing the utilization of the depth map. The engine uses four-level cascade shadows by default when shadows are turned on, so the size of each level can be controlled by adjusting shadowFourCascadeSplits.
-
-### Shadow selection
-
-It was mentioned above that **only one directional light `DirectLight` can be used to turn on shadows**, but what happens if shadows are turned on for two `DirectLight` in the scene? In the absence of a determined main light, the engine will choose the light with the strongest light intensity to cast shadows by default. Light intensity is determined by the intensity of the light and the brightness of the light color. The light color is converted to a de-brightness value using the Hue-Saturation-Brightness formula.
-
-## Projectors and receivers
+## Shadow Casters and Receivers
-In the [mesh renderer component](/en/docs/graphics/renderer/meshRenderer), `receiveShadows` can determine whether the object receives shadows, and `castShadows` can determine whether the object casts shadows.
+In the [Mesh Renderer Component](/en/docs/graphics/renderer/meshRenderer), `receiveShadows` determines whether the object receives shadows, and `castShadows` determines whether the object casts shadows.
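+
+For example, a minimal sketch using the two properties above:
+
+```typescript
+import { MeshRenderer } from "@galacean/engine";
+
+const renderer = entity.getComponent(MeshRenderer);
+// This object will cast shadows onto other objects
+renderer.castShadows = true;
+// This object will display shadows cast by other objects
+renderer.receiveShadows = true;
+```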
-## Transparent shadows
+## Transparent Shadows
-Starting from version `1.3`, the engine supports casting shadows of `alpha cutoff` objects and `transparent` objects. Among them, transparent objects casting shadows need to turn on the `Transparent` switch in the scene panel:
+Starting from version `1.3`, the engine supports casting shadows for `Alpha Cutoff` objects and `Transparent` objects. For transparent objects to cast shadows, you need to enable the `Transparent` switch in the scene panel:
-
+
diff --git a/docs/en/graphics/light/spot.md b/docs/en/graphics/light/spot.md
index c7e83e35d2..4128da77a3 100644
--- a/docs/en/graphics/light/spot.md
+++ b/docs/en/graphics/light/spot.md
@@ -6,20 +6,20 @@ group: Lighting
label: Graphics/Light
---
-A **spotlight** is like a flashlight in real life, emitting light in a cone from a specific point in a particular direction.
+A **spotlight** is like a flashlight in real life, emitting light in a cone from a point in a specific direction.
-Spotlights have several main characteristics: _color_ ([color](/apis/core/#SpotLight-color)), _intensity_ ([intensity](/apis/core/#SpotLight-intensity)), _effective distance_ ([distance](/apis/core/#SpotLight-distance)), _spread angle_ ([angle](/apis/core/#SpotLight-angle)), and _penumbra angle_ ([penumbra](/apis/core/#SpotLight-penumbra)). The spread angle indicates when there is light when the angle between the light source and the direction is less than a certain value, and the penumbra angle indicates that within the effective angle range, the light intensity gradually decays to 0 as the angle increases.
+The spotlight has several main characteristics: _color_ ([color](/en/apis/core/#SpotLight-color)), _intensity_ ([intensity](/en/apis/core/#SpotLight-intensity)), _effective distance_ ([distance](/en/apis/core/#SpotLight-distance)), _scatter angle_ ([angle](/en/apis/core/#SpotLight-angle)), and _penumbra attenuation angle_ ([penumbra](/en/apis/core/#SpotLight-penumbra)). The scatter angle indicates the cone angle around the light's direction within which there is light, and the penumbra attenuation angle indicates that, within the effective angle range, the light intensity gradually attenuates to 0 as the angle increases.
-| Property | Description |
+| Attribute | Function |
| :--------------------- | :------------------------------------------------------------------------ |
-| Angle | Indicates when there is light when the angle between the light source and the direction is less than a certain value |
+| Angle (Scatter Angle) | Indicates the angle within which there is light relative to the direction of the light source |
| Intensity | Controls the intensity of the spotlight, **the higher the value, the brighter** |
-| Color | Controls the color of the spotlight |
-| Distance | Effective distance, light intensity decays with distance |
-| Penumbra | Indicates that within the effective angle range, the light intensity gradually decays to 0 as the angle increases |
-| Culling Mask | Controls which objects the light needs to illuminate, default is Everything. Needs to be used with Entity's Layer |
+| Color | Controls the color of the spotlight |
+| Distance | Effective distance, light intensity attenuates with distance |
+| Penumbra (Attenuation Angle) | Indicates that within the effective angle range, the light intensity gradually attenuates to 0 as the angle increases |
+| Culling Mask | Controls the objects that need to be illuminated by the light, default is Everything. Needs to be used in conjunction with the Entity's Layer |
### Script Usage
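+
+A minimal sketch of creating a spotlight from a script (the entity name, angle, and distance values are illustrative assumptions):
+
+```typescript
+import { SpotLight } from "@galacean/engine";
+
+const lightEntity = rootEntity.createChild("spotLight");
+const spotLight = lightEntity.addComponent(SpotLight);
+
+// Scatter angle and penumbra attenuation angle, in radians
+spotLight.angle = Math.PI / 12;
+spotLight.penumbra = Math.PI / 24;
+// Effective distance; intensity attenuates with distance
+spotLight.distance = 100;
+spotLight.color.set(0.3, 0.3, 1, 1);
+```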
diff --git a/docs/en/graphics/material/editor.md b/docs/en/graphics/material/editor.md
index c5faf3dc72..e18ab5784e 100644
--- a/docs/en/graphics/material/editor.md
+++ b/docs/en/graphics/material/editor.md
@@ -1,12 +1,12 @@
---
order: 2
-title: Editor Usage
+title: Using the Editor
type: Material
-group: Grid
+group: Mesh
label: Graphics/Material
---
-## Editor Usage
+## Using the Editor
### 1. Manually Create Material
@@ -14,14 +14,15 @@ label: Graphics/Material
### 2. Import Model
-Refer to the [Import and Use of Models](/en/docs/graphics-model-use) tutorial, we can first import the model into the editor. In general, the model is automatically bound with materials, and users do not need to take any action. If you want to modify the material, you need to click the `duplicate & remap` button to generate a copy of the material, and then edit this copy of the material.
+Refer to the [Import and Use Model](/en/docs/graphics/model/use/) tutorial to first import the model into the editor. Generally, the model is already automatically bound to its materials and no action is needed; if you want to modify a material, click the `duplicate & remap` button to generate a copy of the material and then edit that copy.
-Switching shaders will not reset shader data. For example, if the base color is red, even if you switch shaders, the base color will remain red.
+Switching shaders will not reset shader data. For example, if the base color is red, the base color will remain red even if the shader is switched.
### 3. Adjust Material
-For specific operations, please refer to the [Shader Tutorial](/en/docs/graphics-shader).
+For specific operations, see the [Shader Tutorial](/en/docs/graphics/shader/intro/).
+
diff --git a/docs/en/graphics/material/material.md b/docs/en/graphics/material/material.md
index 2410726503..3c8e3ccdb4 100644
--- a/docs/en/graphics/material/material.md
+++ b/docs/en/graphics/material/material.md
@@ -6,13 +6,13 @@ group: Mesh
label: Graphics/Material
---
-Material refers to a set of properties used to describe the appearance and surface characteristics of an object. The material determines how the model interacts with light during the rendering process, thereby affecting its visual presentation.
+Materials refer to a collection of properties used to describe the appearance and surface characteristics of an object. Materials determine how a model interacts with light during rendering, thus affecting its visual presentation.
-This section contains the following relevant information:
+This section includes the following related information:
-- [Material Composition](/en/docs/graphics-material-composition)
-- [Editor Usage](/en/docs/graphics-material-editor)
-- [Script Usage](/en/docs/graphics-material-script})
+- [Material Composition](/en/docs/graphics/material/composition/)
+- [Editor Usage](/en/docs/graphics/material/editor/)
+- [Script Usage](/en/docs/graphics/material/script/)
diff --git a/docs/en/graphics/material/script.md b/docs/en/graphics/material/script.md
index 19e3e7ded5..9bb7e0de34 100644
--- a/docs/en/graphics/material/script.md
+++ b/docs/en/graphics/material/script.md
@@ -6,9 +6,9 @@ group: Mesh
label: Graphics/Material
---
-The materials exported by the editor only include the basic [Material](/apis/core/#Material) class, while you can create the engine's pre-packaged [PBRMaterial](/apis/core/#PBRMaterial), [UnlitMaterial](/apis/core/#UnlitMaterial), [BlinnPhongMaterial](/apis/core/#BlinnPhongMaterial) through code.
+Materials exported by the editor only include the base [Material](/en/apis/core/#Material) class, while through code you can create the engine's pre-packaged [PBRMaterial](/en/apis/core/#PBRMaterial), [UnlitMaterial](/en/apis/core/#UnlitMaterial), and [BlinnPhongMaterial](/en/apis/core/#BlinnPhongMaterial).
-## Get Material
+## Getting Material
### 1. Get from an existing renderer
@@ -26,7 +26,7 @@ const material = renderer.getMaterial();
### 2. Replace the material in the renderer
-You can also directly replace the material type, for example, assign a PBR material to a model:
+We can also directly replace the material type, for example, reassigning a PBR material to the model:
```typescript
// Get the renderer you want to modify
@@ -54,7 +54,7 @@ const unlitMaterial = new UnlitMaterial(engine);
const customMaterial = new Material(engine, Shader.find("***"));
```
-## Modify Material
+## Modifying Material
### 1. Modify built-in materials
diff --git a/docs/en/graphics/mesh/bufferMesh.md b/docs/en/graphics/mesh/bufferMesh.md
index c1a335a5cb..f000b30697 100644
--- a/docs/en/graphics/mesh/bufferMesh.md
+++ b/docs/en/graphics/mesh/bufferMesh.md
@@ -6,33 +6,33 @@ group: Mesh
label: Graphics/Mesh
---
-[BufferMesh](/apis/core/#BufferMesh}) allows free manipulation of vertex buffer and index buffer data, as well as some instructions related to geometry drawing. It is efficient, flexible, and concise. Developers can use this class to efficiently and flexibly implement custom geometries.
+[BufferMesh](/en/apis/core/#BufferMesh) allows free manipulation of vertex buffers and index buffer data, as well as some instructions related to geometry drawing. It is efficient, flexible, and concise. Developers who want to efficiently and flexibly implement custom geometries can use this class.
-## Schematic Diagram
+## Diagram
-Let's take a look at the schematic diagram of `BufferMesh`
+Let's first take a look at the diagram of `BufferMesh`.

`BufferMesh` has three core elements:
-| Name | Description |
-| :---------------------------------------------------- | :---------------------------------------------------------------------- |
-| [VertexBufferBinding](/apis/core/#VertexBufferBinding) | Vertex buffer binding, used to pack vertex buffer and vertex stride (in bytes). |
-| [VertexElement](/apis/core/#VertexElement) | Vertex element, used to describe vertex semantics, vertex offset, vertex format, and vertex buffer binding index information. |
-| [IndexBufferBinding](/apis/core/#IndexBufferBinding) | Index buffer binding (optional), used to pack index buffer and index format. |
+| Name | Description |
+| :---------------------------------------------------- | :----------------------------------------------------------------------- |
+| [VertexBufferBinding](/en/apis/core/#VertexBufferBinding) | Vertex buffer binding, used to package vertex buffers and vertex strides (bytes). |
+| [VertexElement](/en/apis/core/#VertexElement) | Vertex element, used to describe vertex semantics, vertex offsets, vertex formats, and vertex buffer binding indices. |
+| [IndexBufferBinding](/en/apis/core/#IndexBufferBinding) | Index buffer binding (optional), used to package index buffers and index formats. |
-Among them, [IndexBufferBinding](/apis/core/#IndexBufferBinding}) is optional, which means there are only two necessary core elements. They are set through the [setVertexBufferBindings()](/apis/core/#BufferMesh-setVertexBufferBindings) interface and the [setVertexElements()](/apis/core/#BufferMesh-setVertexElements) interface. The last step is to add a submesh through [addSubMesh](/apis/core/#BufferMesh-addSubMesh) and set the vertex or index drawing count. [SubMesh](/apis/core/#SubMesh) contains three attributes: starting drawing offset, drawing count, primitive topology. Developers can add multiple [SubMesh](/apis/core/#SubMesh) independently, and each sub-geometry can correspond to a unique material.
+Among them, [IndexBufferBinding](/en/apis/core/#IndexBufferBinding) is optional, which means there are only two required core elements, set through the [setVertexBufferBindings()](/en/apis/core/#BufferMesh-setVertexBufferBindings) interface and the [setVertexElements()](/en/apis/core/#BufferMesh-setVertexElements) interface. The final step is to add a [SubMesh](/en/apis/core/#SubMesh) through [addSubMesh](/en/apis/core/#BufferMesh-addSubMesh) and set the number of vertices or indices to draw. [SubMesh](/en/apis/core/#SubMesh) contains three properties: start drawing offset, drawing count, and primitive topology. Developers can add multiple [SubMesh](/en/apis/core/#SubMesh) instances, and each sub-geometry can correspond to an independent material.
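+
+Put together, a minimal sketch of these steps looks like this (a single position-only vertex buffer; the values are illustrative):
+
+```typescript
+import {
+  Buffer,
+  BufferBindFlag,
+  BufferMesh,
+  BufferUsage,
+  VertexBufferBinding,
+  VertexElement,
+  VertexElementFormat
+} from "@galacean/engine";
+
+const mesh = new BufferMesh(engine);
+
+// One triangle: three positions, 12 bytes (3 floats) per vertex
+const vertices = new Float32Array([-1, -1, 0, 1, -1, 0, 0, 1, 0]);
+const vertexBuffer = new Buffer(engine, BufferBindFlag.VertexBuffer, vertices, BufferUsage.Static);
+
+// Bind the buffer with its vertex stride, and describe the POSITION semantic
+mesh.setVertexBufferBindings([new VertexBufferBinding(vertexBuffer, 12)]);
+mesh.setVertexElements([new VertexElement("POSITION", 0, VertexElementFormat.Vector3, 0)]);
+
+// Draw 3 vertices starting at offset 0
+mesh.addSubMesh(0, 3);
+```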
-## Common Use Cases
+## Common Cases
-Here are a few common use cases of [MeshRenderer](/apis/core/#MeshRenderer) and [BufferMesh](/apis/core/#BufferMesh) because this class is more low-level and flexible, detailed code examples are provided.
+Here are some common use cases of [MeshRenderer](/en/apis/core/#MeshRenderer) and [BufferMesh](/en/apis/core/#BufferMesh). Because this class is low-level and flexible, detailed code examples are provided.
### Interleaved Vertex Buffer
-This is a common way to implement custom Mesh, Particle, etc., with advantages such as compact memory usage and fewer CPU data uploads to the GPU per frame. The main feature of this case is that multiple [VertexElement](/apis/core/#VertexElement) correspond to one *VertexBuffer* ([Buffer](/apis/core/#Buffer)), and only one *VertexBuffer* is used to associate different vertex elements with the Shader.
+This is a common approach for implementing custom meshes, particles, and the like; its advantages are compact video memory usage and fewer CPU-to-GPU data uploads per frame. The main feature of this case is that multiple [VertexElement](/en/apis/core/#VertexElement) correspond to one *VertexBuffer* ([Buffer](/en/apis/core/#Buffer)), and only one *VertexBuffer* is used to associate different vertex elements with the Shader.
```typescript
// add MeshRenderer component
@@ -67,7 +67,7 @@ renderer.mesh = mesh;
-It has advantages when mixing dynamic and static vertex buffers, such as _position_ being static, but _color_ being dynamic. Independent vertex buffer can update only color data to the GPU. The main feature of this case is that one [VertexElement](/apis/core/#VertexElement) corresponds to one _VertexBuffer_, and you can independently update data by calling the [setData](/apis/core/#Buffer-setData) method of the [Buffer](/apis/core/#Buffer) object.
+It has advantages when mixing dynamic vertex buffers and static vertex buffers, such as _position_ being static but _color_ being dynamic. Independent vertex buffers can update only the color data to the GPU. The main feature of this case is that one [VertexElement](/en/apis/core/#VertexElement) corresponds to one _VertexBuffer_, and the [setData](/en/apis/core/#Buffer-setData) method of the [Buffer](/en/apis/core/#Buffer) object can be called separately to update the data independently.
```typescript
// add MeshRenderer component
@@ -111,7 +111,7 @@ renderer.mesh = mesh;
-GPU instance rendering is a common technique in 3D engines, where objects with the same geometry shape can be rendered at different positions simultaneously, significantly improving rendering performance. The main feature of this example is the use of the [VertexElement](/apis/core/#VertexElement) instance functionality. The last parameter of its constructor represents the instance step rate (the number of instances to draw for each vertex advancement in the buffer, non-instance elements must be 0), while [BufferMesh](/apis/core/#BufferMesh)'s [instanceCount](/apis/core/#BufferMesh-instanceCount) indicates the number of instances.
+GPU Instance rendering is a common technique in 3D engines. For example, it allows rendering objects with the same geometric shape at different positions in one go, significantly improving rendering performance. The main feature of this example is the use of the instance functionality of [VertexElement](/en/apis/core/#VertexElement). The last parameter of its constructor indicates the instance step rate (the number of instances drawn per vertex advance in the buffer, non-instance elements must be 0). The [instanceCount](/en/apis/core/#BufferMesh-instanceCount) of [BufferMesh](/en/apis/core/#BufferMesh) indicates the number of instances.
```typescript
// add MeshRenderer component
@@ -151,7 +151,7 @@ renderer.mesh = mesh;
## Index Buffer
-Using an index buffer allows reusing vertices within a vertex buffer, thus saving memory. The usage is simple, by adding an index buffer object on top of the original base. The following code is modified from the first **interleaved vertex buffer** example.
+Using an index buffer allows reusing vertices in the vertex buffer, thereby saving video memory. Its usage is straightforward, just adding an index buffer object on top of the original setup. The following code is modified based on the first **Interleaved Vertex Buffer** example.
```typescript
// add MeshRenderer component
diff --git a/docs/en/graphics/mesh/mesh.md b/docs/en/graphics/mesh/mesh.md
index 94aaf09a75..ff021fa119 100644
--- a/docs/en/graphics/mesh/mesh.md
+++ b/docs/en/graphics/mesh/mesh.md
@@ -1,38 +1,38 @@
---
order: 0
-title: Grid Overview
+title: Mesh Overview
type: Graphics
-group: Grid
+group: Mesh
label: Graphics/Mesh
---
-A grid is a data object of the [Mesh Renderer](/en/docs/graphics-renderer-meshRenderer), which describes various information about vertices (such as position, topology, vertex color, UV, etc.).
+A mesh is the data object of the [Mesh Renderer](/en/docs/graphics/renderer/meshRenderer/), which describes various information of vertices (position, topology, vertex color, UV, etc.).
## Mesh Assets
-Mesh assets are typically sourced from:
+Mesh assets generally come from:
-- Importing models to acquire [model-embedded mesh assets](/en/docs/graphics-model-assets) created by third-party tools through [model importation](/en/docs/graphics-model-importGlTF}).
-- Editor's [built-in mesh assets](/en/docs/graphics-mesh-primitiveMesh}).
-- Developers creating [mesh assets](/en/docs/graphics-mesh-primitiveMesh}) themselves.
+- By [importing models](/en/docs/graphics/model/importGlTF/), obtaining [model built-in mesh assets](/en/docs/graphics/model/assets/) created by third-party tools
+- [Built-in mesh assets](/en/docs/graphics/mesh/primitiveMesh/) of the editor
+- Developers [creating mesh assets](/en/docs/graphics/mesh/primitiveMesh/) themselves
## Usage
-When setting a mesh for the mesh renderer, simply select the corresponding mesh asset.
+When you need to set a mesh for the mesh renderer, you only need to select the corresponding mesh asset.
-Similarly, in scripts, the use of meshes will be more flexible, but also more complex. Let's first look at
+Correspondingly, using meshes in scripts is more flexible, but also more complex. First, let's look at the available mesh types:
-| Type | Description |
-| :----------------------------------------------- | :--------------------------------------------------------------------------------------------------------------------------------------------------- |
-| [ModelMesh](/en/docs/graphics-mesh-modelMesh) | Encapsulates methods for setting vertex and index data, making it very simple and easy to use. Developers who want to quickly customize geometries can use this class. |
-| [BufferMesh](/en/docs/graphics-mesh-bufferMesh) | Allows for free manipulation of vertex buffer and index buffer data, as well as some geometry drawing-related instructions. It is efficient, flexible, and concise. Developers who want to efficiently and flexibly implement custom geometries can use this class. |
-| [Primitive Mesh](/en/docs/graphics-mesh-primitiveMesh) | Essentially a preset ModelMesh, containing common shapes like cuboids, spheres, planes, cylinders, tori, cylinders with capsules, etc. |
+| Type | Description |
+| :----------------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| [ModelMesh](/en/docs/graphics/mesh/modelMesh/) | Encapsulates commonly used methods for setting vertex data and index data, very simple and easy to use. Developers can use this class to quickly customize geometry |
+| [BufferMesh](/en/docs/graphics/mesh/bufferMesh/) | Allows free manipulation of vertex buffers and index buffer data, as well as some instructions related to geometry drawing. It is efficient, flexible, and concise. Developers can use this class to efficiently and flexibly implement custom geometry |
+| [Built-in Geometry](/en/docs/graphics/mesh/primitiveMesh/) | Essentially pre-set ModelMesh, including commonly used cuboids, spheres, planes, cylinders, toruses, cones, and capsules. |
## Usage
-In the editor, meshes appear in the form of mesh assets, which can be accessed through
+In the editor, meshes appear in the form of mesh assets; in scripts, we can use them as follows:
```typescript
const meshRenderer = entity.addComponent(MeshRenderer);
@@ -43,6 +43,6 @@ meshRenderer.mesh = new BufferMesh(engine);
## Common Geometries
-Constructing mesh data for geometries manually can be a painful process, so Galacean provides some practical geometries.
+Constructing geometric mesh data by yourself is a rather painful process, so Galacean has built in some practical geometries.
-- [Primitive Meshes](/en/docs/graphics-model}): Includes common shapes like cuboids, spheres, planes, cylinders, tori, cylinders with capsules, etc.
+- [Built-in Geometry](/en/docs/graphics/mesh/primitiveMesh/): Includes commonly used cuboids, spheres, planes, cylinders, toruses, cones, and capsules, as sketched below.
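+
+A minimal sketch of using one of them (assuming an existing entity):
+
+```typescript
+import { MeshRenderer, PrimitiveMesh } from "@galacean/engine";
+
+const renderer = entity.addComponent(MeshRenderer);
+// Create a 1x1x1 cuboid with the built-in geometry helper
+renderer.mesh = PrimitiveMesh.createCuboid(engine, 1, 1, 1);
+```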
diff --git a/docs/en/graphics/mesh/primitiveMesh.md b/docs/en/graphics/mesh/primitiveMesh.md
index ed46f2f73e..128d752b2a 100644
--- a/docs/en/graphics/mesh/primitiveMesh.md
+++ b/docs/en/graphics/mesh/primitiveMesh.md
@@ -1,34 +1,34 @@
---
order: 3
-title: 原始网格
-type: 图形
-group: 网格
-label: 图形/网格
+title: Primitive Mesh
+type: Graphics
+group: Mesh
+label: Graphics/Mesh
---
-常用几何体统一在 [PrimitiveMesh](/apis/core/#PrimitiveMesh) 中提供。
+[PrimitiveMesh](/en/apis/core/#PrimitiveMesh) provides convenient methods for creating mesh objects such as cubes and spheres.
-## 编辑器使用
+## Editor Usage
-编辑器已经内置了`立方体`、`球`、`圆柱体` 等基础几何体,可以直接在左侧节点树点击 `+` 置入模型:
+The editor has built-in basic geometries such as `Cube`, `Sphere`, and `Cylinder`, which can be placed directly by clicking `+` in the node tree on the left:
-当然,我们也可以在组件面板点击 `1` 添加 `Mesh Renderer`组件,点击 `2` 绑定想要的基础几何体:
+Of course, we can also click `1` in the component panel to add the `Mesh Renderer` component, and click `2` to bind the desired basic geometry:
-内置几何体无法满足需求?您可以在 **[资产面板](/en/docs/assets/interface)** 中 **右键** → **Create** → **PrimitiveMesh** 创建一个 `Mesh` 资产,并通过调整 `Mesh` 的各项参数来满足需求。
+Built-in geometries not meeting your needs? You can create a `Mesh` asset in the **[Asset Panel](/en/docs/assets/interface)** by **right-clicking** → **Create** → **PrimitiveMesh**, and adjust the parameters of the `Mesh` to meet your requirements.
-## 脚本使用
+## Script Usage
-目前提供的几何体如下:
+The currently provided geometries are as follows:
-- [createCuboid](/apis/core/#PrimitiveMesh-createCuboid) **立方体**
+- [createCuboid](/en/apis/core/#PrimitiveMesh-createCuboid) **Cuboid**
```typescript
const entity = rootEntity.createChild("cuboid");
@@ -41,7 +41,7 @@ material.emissiveColor.set(1, 1, 1, 1);
renderer.setMaterial(material);
```
-- [createSphere](/apis/core/#PrimitiveMesh-createSphere) **球体**
+- [createSphere](/en/apis/core/#PrimitiveMesh-createSphere) **Sphere**
```typescript
const entity = rootEntity.createChild("sphere");
@@ -54,7 +54,7 @@ material.emissiveColor.set(1, 1, 1, 1);
renderer.setMaterial(material);
```
-- [createPlane](/apis/core/#PrimitiveMesh-createPlane) **平面**
+- [createPlane](/en/apis/core/#PrimitiveMesh-createPlane) **Plane**
```typescript
const entity = rootEntity.createChild("plane");
@@ -67,7 +67,7 @@ material.emissiveColor.set(1, 1, 1, 1);
renderer.setMaterial(material);
```
-- [createCylinder](/apis/core/#PrimitiveMesh-createCylinder) **圆柱**
+- [createCylinder](/en/apis/core/#PrimitiveMesh-createCylinder) **Cylinder**
```typescript
const entity = rootEntity.createChild("cylinder");
@@ -80,7 +80,7 @@ material.emissiveColor.set(1, 1, 1, 1);
renderer.setMaterial(material);
```
-- [createTorus](/apis/core/#PrimitiveMesh-createTorus) **圆环**
+- [createTorus](/en/apis/core/#PrimitiveMesh-createTorus) **Torus**
```typescript
const entity = rootEntity.createChild("torus");
@@ -93,7 +93,7 @@ material.emissiveColor.set(1, 1, 1, 1);
renderer.setMaterial(material);
```
-- [createCone](/apis/core/#PrimitiveMesh-createCone) **圆锥**
+- [createCone](/en/apis/core/#PrimitiveMesh-createCone) **Cone**
```typescript
const entity = rootEntity.createChild("cone");
@@ -106,7 +106,7 @@ material.emissiveColor.set(1, 1, 1, 1);
renderer.setMaterial(material);
```
-- [createCapsule](/apis/core/#PrimitiveMesh-createCapsule) **胶囊体**
+- [createCapsule](/en/apis/core/#PrimitiveMesh-createCapsule) **Capsule**
```typescript
const entity = rootEntity.createChild("capsule");
diff --git a/docs/en/graphics/model/assets.md b/docs/en/graphics/model/assets.md
index c6b95e7092..5b7bd3d1cf 100644
--- a/docs/en/graphics/model/assets.md
+++ b/docs/en/graphics/model/assets.md
@@ -6,57 +6,56 @@ group: Model
label: Graphics/Model
---
-After the model is imported, new model assets will be added to the **[Assets Panel](/en/docs/assets/interface)**. Clicking on the asset thumbnail will display basic information about the model.
+After the model is imported, the new model asset will be added to the **[Asset Panel](/en/docs/assets/interface)**. By clicking on the asset thumbnail, you can see the basic information of this model.
-| Area | Function | Description |
+| Area | Function | Explanation |
| :--------- | :--------------- | :----------------------------------------------------------------- |
-| Viewport | Preview | Similar to a glTF viewer, developers can easily observe the model from different angles and animations |
-| Basic Info | URL | CDN link of the model |
-| | DrawCall | Number of draw calls for this model |
-| | ComputeTangents | Processing of tangent information in the model's vertex data |
-| Material Remapping | Material list in the model | Corresponding remapped materials |
-| Export | Cut first frame | Whether to trim the first frame |
+| View Area | Preview | Similar to glTF viewer, developers can easily observe the model's different animations from various angles |
+| Basic Info | URL | The CDN link of the model |
+| | DrawCall | The number of draw calls issued when rendering this model |
+| | ComputeTangents | Processing of tangent information in the model's vertex data |
+| Material Remapping | Material List in the Model | Corresponding remapped materials |
+| Export | Cut first frame | Whether to cut the first frame |
| | isGLB | Whether to export in GLB format |
-| | Export glb/glTF | Export the model locally |
+| | Export glb/glTF | Export the model to local |
## Sub-assets of the Model
-Hover over the model asset thumbnail, click on the triangle button that appears on the right side, and information about the sub-assets contained in the model asset, such as meshes, textures, animations, and materials, will be displayed in the resource panel.
+Hover the mouse over the model asset thumbnail and click the triangle button that appears on the right. The mesh, textures, animations, materials, and other sub-asset information contained in the model asset will be displayed in the resource panel.
-### Mesh Sub-asset
+### Mesh Sub-assets
-Clicking on the mesh sub-asset thumbnail will display basic information about the mesh as follows:
+Click on the mesh sub-asset thumbnail to see the basic information of the mesh as follows:
-| Area | Function | Description |
-| :--------- | :--------------- | :----------------------------- |
-| Vertex Data| Vertex Info List | Format and stride of vertex information |
-| Submesh | Submesh List | Drawing information of submeshes |
+| Area | Function | Explanation |
+| :------- | :------------- | :------------------------ |
+| Vertex Data | Vertex Info List | The format and stride of the vertex information |
+| Sub-mesh | Sub-mesh List | Drawing information of the sub-mesh |
-### Texture Sub-asset
+### Texture Sub-assets
-The basic information of a texture sub-asset is the only difference from a [texture](/en/docs/graphics-texture) asset, as most texture information is read-only.
+The basic information of texture sub-assets differs from standalone [texture](/en/docs/graphics/texture/texture/) assets only in that most of the texture information is read-only.
-### Material Sub-Asset
+### Material Sub-assets
-Similarly, the [material](/en/docs/graphics-material) sub-asset is as follows:
+Similarly, [material](/en/docs/graphics/material/material/) sub-assets work the same way:
-In general, users do not need to perform any operations on the material that comes with the model; however, in certain scenarios, developers may want to manually adjust the material, such as changing the color. In this case, we can duplicate the original material by clicking **duplicate & remap**, and then make modifications based on the original material parameters:
+In general, users do not need to perform any operations on the materials that come with the model; however, in certain scenarios, developers may want to manually tweak the materials, such as changing the color. In this case, we can duplicate the original material by clicking **duplicate & remap**, and then modify it based on the original material parameters:
-### Animation Sub-Asset
+### Animation Sub-assets
-Animation sub-assets appear in the model asset in the form of [animation clips](/en/docs/animation/clip) and are also **read-only**.
+Animation sub-assets appear in the model assets in the form of [animation clips](/en/docs/animation/clip), and they are also **read-only**.
-
diff --git a/docs/en/graphics/model/glTF.md b/docs/en/graphics/model/glTF.md
index 03cb364fec..ac5f87d15c 100644
--- a/docs/en/graphics/model/glTF.md
+++ b/docs/en/graphics/model/glTF.md
@@ -8,56 +8,59 @@ label: Graphics/Model
> For more details, please visit the [glTF official website](https://www.khronos.org/gltf/)
-**glTF** (GL Transmission Format) is a specification released by [Khronos](https://www.khronos.org/) that efficiently transmits and loads 3D scenes. It is considered the "JPEG" format in the 3D field, covering features of traditional model formats like FBX and OBJ. Its plugin mechanism allows users to flexibly customize desired functionalities, as seen [here](https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos).
+**glTF** (GL Transmission Format) is a specification released by [khronos](https://www.khronos.org/) that enables efficient transmission and loading of 3D scenes. It is the "JPEG" format in the 3D field, covering the functionalities of traditional model formats like FBX and OBJ. It supports almost all features in 3D scenes, and its [plugin mechanism](https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos) allows users to flexibly customize and implement desired functionalities.
## Ecosystem
-## Export Products
+## Exported Products
-The export products of glTF are generally divided into two types:
+The exported products of glTF are generally divided into two types:
-- **(.gltf + .bin + png)**: Suitable for scenes with large image sizes, separating images and models to asynchronously load models and textures.
-- **(.glb)**: Suitable for scenes with large model files, saving all data in binary format. The model can only be displayed after all data is parsed. Galacean supports both types of products.
+- **(.gltf + .bin + png)**: Suitable for scenarios with large image sizes, so images and models are separated, allowing asynchronous loading of models and textures.
+- **(.glb)**: Suitable for scenarios with large model files, where all data is saved in binary format. The model can only be displayed after all data is parsed.
-The choice of export type can be determined based on the actual project requirements.
+Both types of products are supported in Galacean. The choice of export type can be decided based on the actual project requirements.
## Galacean's Support for glTF
-**glTF 2.0** is the recommended primary 3D scene transmission format by Galacean. Galacean provides excellent support for the core features and plugins of **glTF 2.0**:
+**glTF 2.0** is currently the recommended 3D scene transmission format for Galacean. Galacean provides good support for the core functionalities and plugins of **glTF 2.0**:
- Supports meshes, materials, and texture information in glTF, compiling them into runtime mesh assets, material assets, and texture assets.
-- Supports animations in glTF (including skeletal animations and BlendShapes).
-- Supports node information in glTF (including pose information), compiling them into runtime entity objects while maintaining the original hierarchy.
-- Supports glTF cameras, compiling them into runtime camera components.
-- Supports some glTF plugins.
+- Supports animations in glTF (including skeletal animations and BlendShape).
+- Supports node information in glTF (including pose information), which will be compiled into runtime entity objects while maintaining the original hierarchy.
+- Supports cameras in glTF, compiling them into runtime camera components.
+- Supports some plugins of glTF.
-glTF has many features, and the official website offers numerous [examples](https://github.com/KhronosGroup/glTF-Sample-Models/tree/master/2.0) for reference. Galacean also provides a replicated version for quick browsing. You can switch between different glTF models using the **glTF List** below.
+glTF has many features, and the official website provides a large number of [examples](https://github.com/KhronosGroup/glTF-Sample-Models/tree/master/2.0) for reference. Galacean also provides a replicated version for quick browsing. You can switch between different glTF models through the following **glTF List**.
### Plugin Support
-Galacean currently supports the following glTF plugins. If a glTF file contains any of these plugins, the corresponding functionalities will be automatically loaded:
+Galacean currently supports the following glTF plugins. If the glTF file contains the corresponding plugins, the respective functionalities will be automatically loaded:
-| Plugin | Functionality |
-| :----------------------------------------------------------------------------------------------------------------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------- |
-| [KHR_draco_mesh_compression](https://github.com/oasis-engine/engine/blob/main/packages/loader/src/gltf/extensions/KHR_draco_mesh_compression.ts) | Supports Draco compressed models, saving memory. |
-| [KHR_lights_punctual](https://github.com/oasis-engine/engine/blob/main/packages/loader/src/gltf/extensions/KHR_lights_punctual.ts) | Supports multiple light sources, parsed as engine light sources. See [Lighting Tutorial](/en/docs/graphics-light) for details. |
-| [KHR_materials_pbrSpecularGlossiness](https://github.com/oasis-engine/engine/blob/main/packages/loader/src/gltf/extensions/KHR_materials_pbrSpecularGlossiness.ts) | Supports PBR [Specular-Glossiness Workflow](/apis/core/#PBRSpecularMaterial). |
-| [KHR_materials_unlit](https://github.com/oasis-engine/engine/blob/main/packages/loader/src/gltf/extensions/KHR_materials_unlit.ts) | Supports [Unlit Materials](/en/docs/graphics-shader-unlit). |
-| [KHR_materials_variants](https://github.com/oasis-engine/engine/blob/main/packages/loader/src/gltf/extensions/KHR_materials_variants.ts) | Allows multiple materials for a renderer, then switches materials using the [setMaterial](/apis/core/#Renderer-setMaterial) interface. |
-| [KHR_mesh_quantization](https://github.com/oasis-engine/engine/blob/main/packages/loader/src/gltf/extensions/KHR_mesh_quantization.ts) | Supports [vertex data compression](https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos/KHR_mesh_quantization#extending-mesh-attributes), saving memory by converting vertex data to integers. |
-| [KHR_texture_transform](https://github.com/oasis-engine/engine/blob/main/packages/loader/src/gltf/extensions/KHR_texture_transform.ts) | Supports texture scaling and offset transformations. Refer to the [TilingOffset](https://oasisengine.cn/#/examples/latest/tiling-offset) example. |
-| [KHR_materials_clearcoat](https://github.com/ant-galaxy/oasis-engine/blob/main/packages/loader/src/gltf/extensions/KHR_materials_clearcoat.ts) | Supports clearcoat extension for materials. Refer to the [Clearcoat](https://oasisengine.cn/#/examples/latest/pbr-clearcoat) example. |
-| [GALACEAN_materials_remap](https://github.com/ant-galaxy/oasis-engine/blob/main/packages/loader/src/gltf/extensions/GALACEAN_materials_remap.ts) | Supports editor material mapping. |
+| Plugin | Functionality |
+| :-- | :-- |
+| [KHR_draco_mesh_compression](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_draco_mesh_compression) | Supports Draco compressed models, saving video memory |
+| [KHR_texture_basisu](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_texture_basisu) | Supports KTX2 texture compression, saving video memory |
+| [KHR_lights_punctual](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_lights_punctual) | Supports multiple light sources, parsed into engine light sources. See [Lighting Tutorial](/en/docs/graphics/light/light/) for details |
+| [KHR_materials_pbrSpecularGlossiness](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Archived/KHR_materials_pbrSpecularGlossiness) | Supports PBR [Specular-Glossiness Workflow](/en/apis/core/#PBRSpecularMaterial) |
+| [KHR_materials_unlit](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_unlit) | Supports [Unlit Materials](/en/docs/graphics/shader/builtins/unlit/) |
+| [KHR_materials_variants](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_variants) | Allows multiple materials in the renderer, with material switching via the [setMaterial](/en/apis/core/#Renderer-setMaterial) interface |
+| [KHR_mesh_quantization](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_mesh_quantization) | Supports [vertex data compression](https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos/KHR_mesh_quantization#extending-mesh-attributes), saving video memory. For example, vertex data is usually floating-point numbers, but this plugin can save it as integers |
+| [KHR_texture_transform](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_texture_transform) | Supports texture scaling and offset transformations. See the [TilingOffset](/en/embed/tiling-offset) example for reference |
+| [KHR_materials_clearcoat](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_clearcoat) | Supports the clear coat extension of materials. See the [Clearcoat](/en/embed/pbr-clearcoat) example for reference |
+| [KHR_materials_ior](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_ior) | Supports setting the index of refraction for materials |
+| [KHR_materials_anisotropy](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_anisotropy) | Supports setting anisotropy for materials. See the [Anisotropy](/en/embed/pbr-anisotropy) example for reference |
+| [GALACEAN_materials_remap](https://github.com/galacean/engine/blob/main/packages/loader/src/gltf/extensions/GALACEAN_materials_remap.ts) | Supports editor material mapping |
-### Plugin Extension
+### Plugin Extensions
-If the built-in plugins provided by the official cannot meet your needs, we also offer a way to extend plugins.
+If the built-in plugins do not meet your needs, we also provide a method to extend plugins.
-For example, if Unity exports the following glTF plugin and wants to extend `Unity_Material_Plugin` based on materials to generate new custom materials, and then wants to add a light on a node based on the light plugin `Unity_Light_Plugin`:
+For example, if Unity exports the following glTF plugin and you want to extend the `Unity_Material_Plugin` to generate a new custom material based on the material, and then use the `Unity_Light_Plugin` to add a light to a node:
```json
{
@@ -84,7 +87,7 @@ For example, if Unity exports the following glTF plugin and wants to extend `Uni
#### 1. Custom Creation and Parsing
-Following the example above, we register a material plugin, where the second parameter `GLTFExtensionMode.CreateAndParse` indicates that this plugin is used for creating instances and parsing:
+Following the example above, we register a material plugin. The second parameter `GLTFExtensionMode.CreateAndParse` indicates that this plugin is used for creating instances and parsing:
```ts
@registerGLTFExtension("Unity_Material_Plugin", GLTFExtensionMode.CreateAndParse)
@@ -100,7 +103,7 @@ class UnityMaterialPluginParser extends GLTFExtensionParser {
#### 2. Incremental Parsing
-Following the example above, we register a light plugin, where the second parameter `GLTFExtensionMode.AdditiveParse` indicates that this plugin performs incremental parsing based on the original instance, such as adding a light source to this entity:
+Following the example above, we register a light plugin. The second parameter `GLTFExtensionMode.AdditiveParse` indicates that this plugin performs incremental parsing on the original instance, such as adding a light source to this entity:
```ts
@registerGLTFExtension("Unity_Light_Plugin", GLTFExtensionMode.AdditiveParse)
diff --git a/docs/en/graphics/model/importGlTF.md b/docs/en/graphics/model/importGlTF.md
index 97902df77c..9198dddd39 100644
--- a/docs/en/graphics/model/importGlTF.md
+++ b/docs/en/graphics/model/importGlTF.md
@@ -1,38 +1,38 @@
---
order: 2
-title: Importing Models
+title: Import Model
type: Graphics
-group: Models
+group: Model
label: Graphics/Model
---
-> Models are exported from modeling software such as [Blender](https://docs.blender.org/manual/en/2.80/addons/io_scene_gltf2.html) in FBX or glTF format, or can be downloaded from model websites like [Sketchfab](https://sketchfab.com/).
+> Models are exported in FBX or glTF format using modeling software such as [Blender](https://docs.blender.org/manual/en/2.80/addons/io_scene_gltf2.html), or can be downloaded from model websites such as [Sketchfab](https://sketchfab.com/).
-Once you have your model ready, you can import it into the Galacean editor for editing. You can import models in the following file formats:
+Once the model is ready, you can import it into the Galacean editor for editing. You can import models using the following file formats:
- **(.gltf + .bin + images)**
- **(.glb + images)**
- **(.fbx)**
-It's important to note that the editor will convert FBX files into a runtime-parseable [glTF format](/en/docs/graphics-model-glTF). Next, let's see how to import model files into the editor.
+It should be noted that the editor will convert FBX to a [glTF format](/en/docs/graphics/model/glTF/) that can also be parsed at runtime. Next, let's practice how to import model files into the editor.
## Drag and Drop Import
-Drag the model file or a compressed **.zip** file into the Assets panel:
+Drag the model file or a compressed **.zip** file into the Asset Panel:
## Button Upload
-Click on the top right **Assets panel** -> **GLTF/GLB/FBX**
+Click **GLTF/GLB/FBX** in the upper right corner of the **Asset Panel**
-## Right-Click Upload
+## Right-click Upload
-Navigate to **Assets panel** -> **Right-click** -> **Upload** -> **GLTF/GLB/FBX**
+Follow **Asset Panel** -> **Right-click** -> **Upload** -> **GLTF/GLB/FBX**
-After importing, the imported model asset will appear in the **[Assets panel](/en/docs/assets/interface)**. Let's explore what the [model asset contains](/en/docs/graphics-model-assets).
+After the import is complete, the imported model assets will be added to the **[Asset Panel](/en/docs/assets/interface)**. Let's [see what the model assets contain](/en/docs/graphics/model/assets/).
diff --git a/docs/en/graphics/model/model.md b/docs/en/graphics/model/model.md
index 65747f9eb1..f629f5112e 100644
--- a/docs/en/graphics/model/model.md
+++ b/docs/en/graphics/model/model.md
@@ -6,19 +6,19 @@ group: Model
label: Graphics/Model
---
-A model typically refers to a three-dimensional model created by designers using 3D modeling software, containing a series of information such as [mesh](/en/docs/graphics-mesh), [material](/en/docs/graphics-material), [texture](/en/docs/graphics-texture), and [animation](/en/docs/animation-overview). In Galacean, it is also considered as an asset. The model asset workflow is usually as follows:
+Models typically refer to 3D models created by designers using 3D modeling software, containing a series of [meshes](/en/docs/graphics/mesh/mesh/), [materials](/en/docs/graphics/material/material/), [textures](/en/docs/graphics/texture/texture/), and [animations](/en/docs/animation/overview/). In Galacean, they are also considered assets. The model asset workflow is usually as follows:
```mermaid
flowchart LR
- Model exported from modeling software --> Import model into Galacean editor --> Adjust model
+ Modeling software exports model --> Import model into Galacean editor --> Adjust model
```
-This chapter mainly addresses the following questions that developers may encounter:
+This chapter mainly addresses the following issues developers might encounter:
-- Requirements for model formats. The editor currently supports importing models in `glTF` or `FBX` formats, but ultimately, the editor will convert them into a [glTF](/en/docs/graphics-model-glTF) format that can also be parsed at runtime.
-- [Importing models](/en/docs/graphics-model-importGlTF}) into the editor
-- What are [model assets](/en/docs/graphics-model-assets})
-- [Loading and using models](/en/docs/graphics-model-use})
-- [Restoring artistic effects in the editor](/en/docs/graphics-model-restoration})
-- [Model optimization](/en/docs/graphics-model-opt})
+- Requirements for model formats. The editor currently supports importing models in `glTF` or `FBX` formats, but the editor will ultimately convert them into the [glTF](/en/docs/graphics/model/glTF/) format, which can also be parsed at runtime.
+- [Importing models](/en/docs/graphics/model/importGlTF/) into the editor
+- What are [model assets](/en/docs/graphics/model/assets/)
+- [Loading and using models](/en/docs/graphics/model/use/)
+- [Restoring artistic effects in the editor](/en/docs/graphics/model/restoration/)
+- [Model optimization](/en/docs/graphics/model/opt/)
diff --git a/docs/en/graphics/model/opt.md b/docs/en/graphics/model/opt.md
index 4d0c3b5827..f9bc071598 100644
--- a/docs/en/graphics/model/opt.md
+++ b/docs/en/graphics/model/opt.md
@@ -6,7 +6,7 @@ group: Model
label: Graphics/Model
---
-Model optimization generally starts with the following points:
+Model optimization generally starts from the following points:
- Mesh: **Reduce the number of vertices and faces**, **compress mesh data**
- Texture: **Adjust texture size** (e.g., from **1024 \* 1024** -> **512 \* 512**), use **compressed textures**
@@ -14,12 +14,11 @@ Model optimization generally starts with the following points:
## Best Practices
-In the editor, we can optimize models in the following ways:
+In the editor, we can optimize the model in the following ways:
-1. Use [Quantize](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Khronos/KHR_mesh_quantization/README.md) to compress mesh data, select the GlTF Quantize option when exporting the project to quantize compress the mesh
-1. Further compress mesh data using [Meshopt](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/EXT_meshopt_compression/README.md)
+1. Use [Quantize](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Khronos/KHR_mesh_quantization/README.md) to compress mesh data. When exporting the project, check the GlTF Quantize option to quantize and compress the mesh.
+1. Use [Meshopt](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Vendor/EXT_meshopt_compression/README.md) to further compress mesh data.
-Compression may have some impact on the model mesh accuracy, but in most cases, it is difficult to distinguish with the naked eye.
-
+Compression may affect the precision of the model mesh, but in most cases, it is difficult to distinguish with the naked eye.
diff --git a/docs/en/graphics/model/restoration.md b/docs/en/graphics/model/restoration.md
index b4ccb46095..8bc24d2d5d 100644
--- a/docs/en/graphics/model/restoration.md
+++ b/docs/en/graphics/model/restoration.md
@@ -12,56 +12,55 @@ label: Graphics/Model
The Galacean engine currently has 3 ways to debug materials:
-1. Modify material properties through code, refer to the [tutorial](/en/docs/graphics-material).
+1. Modify material properties through code; refer to the [tutorial](/en/docs/graphics/material/material).
-2. Visual debugging through the Galacean Editor, refer to the [tutorial](/en/docs/graphics-material).
+2. Debug visually through the Galacean Editor; refer to the [tutorial](/en/docs/graphics/material/material).
-3. **Export after adjusting in 3D modeling software [glTF](/en/docs/graphics-model-glTF)**
+3. **Adjust in 3D modeling software and then export to [glTF](/en/docs/graphics/model/glTF/)**
-The first two methods directly use the engine for rendering, what you see is what you get, with no visual differences.
+The first two methods directly use the engine for rendering, providing a WYSIWYG experience with no visual differences.
-However, designers generally use the third method, adjusting visual effects in modeling software such as C4D, Blender, and then exporting to the engine for preview, only to find that the rendering results are inconsistent, with significant deviations, mainly due to:
+However, designers generally use the third method: adjusting the visual effects in modeling software like C4D or Blender, then exporting to the engine for preview, only to find that the rendering results are inconsistent or even significantly different. The main reasons are:
-- **Different software rendering algorithms.**
+- **Different rendering algorithms in different software.**
-- **Different lighting.**
+- **Different lighting conditions.**
- **Some assets cannot be exported to glTF files.**
-To achieve the maximum visual fidelity in the face of these differences, you can use the following methods:
+To address these differences, you can achieve the highest degree of visual fidelity through the following methods:
-- **Through baking textures, [export Unlit materials to the engine](/en/docs/graphics-material-Unlit)**
+- **Export Unlit materials to the engine through baked maps.**
-- **Use the same environment map (usually an HDRI file), direct lighting, and other variables.**
+- **Use the same environment maps (usually HDRI files), direct lighting, and other variables.**
-- **In the modeling software, only adjust properties and assets that can be exported to glTF.**
+- **Only adjust properties and assets in the modeling software that can be exported to glTF.**
-If you encounter the above problems, you can refer to this tutorial first, identify the specific reasons, and then refer to the corresponding solutions. If you still cannot resolve the issue, you can contact our team, as we will continuously improve this tutorial.
+If you encounter the above issues, you can refer to this tutorial to identify the specific reasons and then follow the corresponding solutions. If the problem persists, you can contact our team, and we will continuously improve this tutorial.
## Reasons
-### Rendering Algorithm Differences
+### Differences in Rendering Algorithms
-Currently, the most widely used algorithm in real-time rendering is the PBR algorithm, which has advantages such as energy conservation, physical correctness, and ease of operation. However, the specific implementation algorithms of different software are different, resulting in different rendering results. Galacean uses the **Cook-Torrance BRDF** reflectance equation and has been optimized for mobile devices.
+Currently, the most widely used algorithm in real-time rendering is the PBR algorithm, which has advantages like energy conservation, physical correctness, and ease of use. However, the specific implementation algorithms in different software are different, resulting in different rendering outcomes. Galacean uses the **Cook-Torrance BRDF** reflectance equation, optimized for mobile devices.
-It is worth mentioning that although different algorithms can cause certain visual differences, the physical laws remain consistent. For example, the higher the metallicness, the stronger the environmental reflection, and the weaker the diffuse reflection; the rougher the surface, the blurrier the environmental reflection, as shown in the image below:
+It is worth mentioning that although different algorithms can cause some visual differences, their physical principles are consistent. For example, the higher the metallicity, the stronger the environmental reflection and the weaker the diffuse reflection; the higher the roughness, the more blurred the environmental reflection, as shown below:

-### Lighting Differences
+### Differences in Lighting
-Similar to the real world, 3D scenes can also add [direct and ambient light](/en/docs/graphics-light). By default, the Galacean scene **does not** have light sources, only a bluish [solid color diffuse reflection](/apis/core/#AmbientLight-diffuseSolidColor), as shown in the left image below; whereas many modeling software come with light sources:
+Like the real world, 3D scenes can also add [direct and ambient light](/en/docs/graphics/light/light/). By default, Galacean scenes have **no** light sources, only a blue-tinted [solid color diffuse](/en/apis/core/#AmbientLight-diffuseSolidColor), as shown in the first image on the left; whereas many modeling software come with built-in light sources:

-The ambient light is based on [cubemap textures](/en/docs/graphics-texture-cube) in IBL mode, requiring binding an HDRI texture to simulate the surrounding environment, which can be downloaded from the [internet](https://polyhaven.com/hdris). By default, the Galacean scene does not have an HDRI texture bound, while many modeling software come with a visually appealing surrounding environment:
+Ambient light based on [cube textures](/en/docs/graphics/texture/cube) enables IBL mode, requiring an HDRI map to simulate the surrounding environment, which can be [downloaded online](https://polyhaven.com/hdris). By default, Galacean scenes do not have an HDRI map bound, whereas many modeling software come with a visually appealing surrounding environment:

-### glTF Support Differences
-
-The communication channel between the Galacean engine and modeling software is the [glTF file](/en/docs/graphics-model-glTF). glTF supports standard [PBR properties](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-material-pbrmetallicroughness) and [common material properties](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-material), and supports plugins like [ClearCoat](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_clearcoat), as shown in the image below. Therefore, as long as the operations in the modeling software can be exported to glTF, the engine can load them through the loader, while those additional operations, such as some parameters of [vRay](https://www.chaosgroup.com/cn/vray/3ds-max) materials, cannot be exported to glTF files.
+### Differences in glTF Support
+The connection channel between the Galacean engine and modeling software is the [glTF file](/en/docs/graphics/model/glTF/). glTF supports standard [PBR properties](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-material-pbrmetallicroughness) and [general material properties](https://www.khronos.org/registry/glTF/specs/2.0/glTF-2.0.html#reference-material), and supports plugins like [ClearCoat](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_clearcoat), as shown below. Therefore, as long as the operations in the modeling software can be exported to glTF, the engine can load them through the loader. However, those additional operations, such as some parameters of [vRay](https://www.chaosgroup.com/cn/vray/3ds-max) materials, cannot be exported to glTF files.

@@ -69,70 +68,68 @@ The communication channel between the Galacean engine and modeling software is t
## Solution
-The primary prerequisite for ensuring visual fidelity is to debug materials in the same scene, with the same lighting, and the same ambient lighting, and then choose between real-time rendering or baking solutions.
+The primary prerequisite for ensuring visual fidelity is to debug the material in the same scene, i.e., the same lighting, the same ambient light, etc., and then choose either a real-time rendering solution or a baking solution.
### Unified Lighting
-- Direct Lighting
+- Direct Light
-As mentioned earlier, the engine does not come with direct lighting by default. Therefore, the simplest way to maintain fidelity is to remove lights in the modeling software, ensuring that both the modeling software and the Galacean engine only have ambient lighting (best performance).
+As mentioned earlier, the engine does not come with direct light by default. The simplest way to maintain fidelity is to delete the lights in the modeling software, ensuring that both the modeling software and the Galacean engine only have ambient light (best performance).
-If certain scenes indeed require direct lighting, ensure that the modeling software can export the [glTF light plugin](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_lights_punctual) (Google search keywords "\***\* modeling software KHR_lights_punctual" ), such as when exporting from Blender, select **Punctual Lights**.
+If some scenes indeed require adding direct light, please ensure that the modeling software can export the [glTF light plugin](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_lights_punctual) (Google search keywords "\*\*\* modeling software KHR_lights_punctual"), such as selecting **Punctual Lights** when exporting glTF from Blender.
-If the modeling software does not support exporting this lighting plugin, you can transfer to Blender for export or verbally describe the lighting data to the developers.
+If the modeling software does not support exporting this lighting plugin, you can transfer it to Blender for export, or verbally describe the lighting data to the developers.
-- Ambient Lighting
+- Ambient Light
-As mentioned earlier, the engine does not come with an environment map, i.e., HDRI map, by default, but modeling software usually includes it, such as Blender:
+As mentioned earlier, the engine does not come with an environment map, i.e., HDRI map, by default, but modeling software usually does, such as Blender:
-You can download your favorite HDRI images from [online](https://polyhaven.com/hdris), then debug in the modeling software. Once satisfied, deliver the final HDRI to the developers (as glTF does not support exporting HDR).
+We can first [download](https://polyhaven.com/hdris) our favorite HDRI images from the internet, then debug them in the modeling software. Once satisfied, deliver the final HDRI to the developers (since glTF does not support exporting HDR).
-The method to bind an environment map in modeling software is simple. You can Google search keywords "\*\*\* modeling software environment IBL" , using Blender as an example:
+Binding an environment map in the modeling software is very simple. You can Google search the keyword "\*\*\* modeling software environment IBL". Taking Blender as an example:
-### Real-Time Rendering Solution
+### Real-time Rendering Solution
- Rendering Solution
-After unifying the lighting, you can choose a rendering solution. If you want materials to be affected by lighting, have real-time light interaction, or have transparency and refraction requirements, you should choose a real-time rendering solution, i.e., the engine's PBR solution.
+After unifying the lighting, we can choose the rendering solution. If you want the material to be affected by lighting, interact with light and shadow in real-time, or have some transparency and refraction requirements, then you should choose the real-time rendering solution, i.e., the engine's PBR solution.
-- Debugging Materials
+- Debugging Material
-As mentioned earlier, Galacean PBR uses the **Cook-Torrance BRDF** reflectance equation, which is closest to the Principled BSDF - GGX algorithm in Blender:
+As mentioned earlier, Galacean PBR uses the **Cook-Torrance BRDF** reflectance equation, which is relatively close to the Principled BSDF - GGX algorithm in Blender:
-You can refer to how to debug material parameters that can be exported to glTF through the [Blender official tutorial](https://docs.blender.org/manual/en/2.80/addons/io_scene_gltf2.html#). Similarly, for other modeling software, you can Google search the keywords "\*\*\* modeling software export glTF".
+You can refer to the [Blender official tutorial](https://docs.blender.org/manual/en/2.80/addons/io_scene_gltf2.html#) to debug the material parameters that can be exported to glTF. The same applies to other modeling software; you can Google the keyword "\*\*\* modeling software export glTF".
-Another convenient way to reference is to import a glTF demo in the modeling software ([click to download](https://gw.alipayobjects.com/os/bmw-prod/85faf9f8-8030-45b2-8ba3-09a61b3db0c3.glb)). This demo has comprehensive PBR properties that you can use for debugging. For example, after importing into Blender, the material panel will display as follows:
+Another relatively simple reference method is to import the glTF demo in the modeling software ([click to download](https://gw.alipayobjects.com/os/bmw-prod/85faf9f8-8030-45b2-8ba3-09a61b3db0c3.glb)). The PBR properties in this demo are quite comprehensive and can be used as a reference for debugging. For example, after importing into Blender, the material panel displays as follows:

-- Export Validation
+- Export Verification
-After exporting to glTF, you can drag the file into the [glTF Viewer](https://galacean.antgroup.com/#/gltf-viewer) to check if the colors, textures, parameters, etc., are correct:
+After exporting the glTF, you can drag the file into the [glTF Viewer](https://galacean.antgroup.com/engine/gltf-viewer) to check whether the corresponding colors, textures, parameters, etc., are correct:
### Baking Solution
-Different from real-time rendering, if your rendering scene is completely static, without the need for light-shadow interactions, refraction, transparency effects, etc., then using a baking solution will better suit your artistic creation. This is because the baking solution can ignore issues related to lighting, glTF support, etc. You can confidently use the built-in renderer of modeling software, powerful plugins like [vRay](https://www.chaosgroup.com/cn/vray/3ds-max), and finally export to the [glTF Unlit plugin](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_unlit).
-
-We also provide several tutorials for baking solutions. You can learn more details by searching for keywords like "\*\*\* modeling software bake KHR_materials_unlit":
+Unlike real-time rendering, if your rendering scene is completely static, does not require light and shadow interaction, refraction, transparency, etc., then using a baking solution will better meet your artistic creation needs. This is because the baking solution can ignore the lighting, glTF support issues mentioned above; you can safely use the built-in renderer of the modeling software, powerful plugins like [vRay](https://www.chaosgroup.com/cn/vray/3ds-max), and finally export to [glTF Unlit plugin](https://github.com/KhronosGroup/glTF/tree/main/extensions/2.0/Khronos/KHR_materials_unlit) through baked maps.
-- [C4D Baking Tutorial](/en/docs/art-bake-c4d)
+We also provide several tutorials for the baking solution. You can also Google keywords like "\*\*\* modeling software baking KHR_materials_unlit" to learn more details:
-- [Blender Baking Tutorial](/en/docs/art-bake-blender)
+- [C4D Baking Tutorial](/en/docs/art/bake-c4d/)
-- [Exporting Unlit Materials](/en/docs/graphics-material-Unlit)
+- [Blender Baking Tutorial](/en/docs/art/bake-blender/)
-### Galacean Preview Plugin (Under Development)
+### Galacean Preview Plugin (Planned)
-In the future, we will invest in plugin development to embed the Galacean preview plugin in various modeling software, ensuring a WYSIWYG experience and eliminating steps like glTF file validation.
+In the future, we will also invest in plugin development to embed the Galacean preview plugin in various modeling software, ensuring a WYSIWYG experience and eliminating the glTF file verification step.
diff --git a/docs/en/graphics/model/use.md b/docs/en/graphics/model/use.md
index e2e2b97b60..bce6b17fab 100644
--- a/docs/en/graphics/model/use.md
+++ b/docs/en/graphics/model/use.md
@@ -6,18 +6,18 @@ group: Model
label: Graphics/Model
---
-When loading and using model assets, you will generally encounter the following two situations:
+Loading and using model assets generally involves the following two scenarios:
-- Models that are preloaded with the scene file and used in scripts
-- Models that are not preloaded, loaded and used in scripts
+- Using models preloaded with the scene file in the script
+- Loading and using models not preloaded in the script
-In the editor, **models placed in the scene** will be preloaded with the scene file. Follow the steps **Asset Panel** -> **Left-click and drag the model thumbnail** -> **Drag it to the [Viewport](/en/docs/interface/viewport)** -> **Release the left mouse button** -> **Adjust the coordinates** to place the model in the corresponding scene.
+In the editor, **models placed in the scene** will be preloaded with the scene file. Follow the steps **Assets Panel** -> **Left-click and drag the model thumbnail** -> **Drag to [Viewport](/en/docs/interface/viewport)** -> **Release the left mouse button** -> **Adjust coordinates** to place the model in the corresponding scene.
-> The editor cannot directly adjust the scale property of model nodes, so in most cases, you need to drag the model node under an entity node and then adjust the scale property of the entity node.
+> The editor cannot directly adjust the scale property of the model node. Therefore, in most cases, you need to drag the model node under an entity node and then adjust the scale property of the entity node.
-In this case, during runtime, you just need to find the specific node in the scene to access the corresponding model object.
+In this case, you only need to find the specific node in the scene at runtime to get the corresponding model object.
```typescript
// Find the model node by its name
@@ -28,7 +28,7 @@ const model2 = scene.findEntityByPath("ModelPath");
## Loading Models
-As long as we have the URL information of the model, we can easily load it.
+As long as we have the URL information of the model, we can easily load the model.
```typescript
engine.resourceManager
@@ -41,15 +41,15 @@ engine.resourceManager
});
```
-In the editor, you can directly get the URL of the model asset (**[Asset Panel](/en/docs/assets/interface)** -> **Right-click on the model asset thumbnail** -> **Copy file info / Copy relative path**):
+In the editor, you can directly get the URL of the model asset (**[Assets Panel](/en/docs/assets/interface)** -> **Right-click the model asset thumbnail** -> **Copy file info / Copy relative path**):
-For models not imported into the editor, the corresponding URL is the path where the model asset is stored.
+For models not imported into the editor, the corresponding URL is the path where the model assets are stored.
## Loading Progress
-When loading models, you can also get the total task/detailed task loading progress through the [onProgress](/apis/core/#AssetPromise-onProgress) event.
+When loading models, you can also get the loading progress of the total task/detailed task through the [onProgress](/en/apis/core/#AssetPromise-onProgress) event.
```typescript
this.engine.resourceManager
@@ -67,13 +67,13 @@ this.engine.resourceManager
## Using Models
-The loaded model object will return a root node containing rendering and animation information, and its usage is no different from regular nodes.
+The loaded model object will return a root node containing rendering information and animation information. Its usage is no different from ordinary nodes.
-### 1. Selecting the Scene Root Node
+### 1. Select Scene Root Node
-glTF may contain multiple scene root nodes `sceneRoots`, and developers can manually select the root node they wish to instantiate.
+glTF may contain multiple scene root nodes `sceneRoots`. Developers can manually select the root node they wish to instantiate.
```typescript
engine.resourceManager
@@ -86,9 +86,9 @@ engine.resourceManager
});
```
-### 2. Playing Animations
+### 2. Play Animation
-If the model contains animation information, you can get the [Animator](/apis/core/#Animator) component from the root node and then choose to play any animation clip.
+If the model carries animation information, you can get the [Animator](/en/apis/core/#Animator) component from the root node and then choose to play any animation clip.
```typescript
engine.resourceManager
@@ -107,9 +107,9 @@ engine.resourceManager
});
```
-### 3. Multiple Material Switching
+### 3. Multi-Material Switching
-The glTF [Multiple Material Extension](https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos/KHR_materials_variants) can be used to switch materials.
+The glTF [multi-material extension](https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos/KHR_materials_variants) can be used to switch materials.
```typescript
engine.resourceManager
diff --git a/docs/en/graphics/particle/renderer-animation-module.md b/docs/en/graphics/particle/renderer-animation-module.md
index bbe26e052d..0b9ff81e13 100644
--- a/docs/en/graphics/particle/renderer-animation-module.md
+++ b/docs/en/graphics/particle/renderer-animation-module.md
@@ -6,7 +6,7 @@ group: Particle
label: Graphics/Particle
---
-[`TextureSheetAnimationModule`](/apis/core/#TextureSheetAnimationModule) inherits from `ParticleGeneratorModule` and is used to control the texture sheet animation of a particle system.
+[`TextureSheetAnimationModule`](/en/apis/core/TextureSheetAnimationModule) inherits from `ParticleGeneratorModule` and is used to control the texture sheet animation of the particle system.
@@ -14,9 +14,9 @@ label: Graphics/Particle
| Property | Description |
| --------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------ |
-| [startFrame](/apis/core/#TextureSheetAnimationModule-startFrame) | [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the starting frame of the texture sheet |
-| [frameOverTime](/apis/core/#TextureSheetAnimationModule-frameOverTime) | [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the curve of how the frames change over time in the texture sheet |
-| [type](/apis/core/#TextureSheetAnimationModule-type) | Enum `TextureSheetAnimationType` representing the type of texture sheet animation |
-| [cycleCount](/apis/core/#TextureSheetAnimationModule-cycleCount) | Type `number` representing the cycle count of the texture sheet animation |
-| [tiling](/apis/core/#TextureSheetAnimationModule-tiling) | Object `Vector2` representing the tiling of the texture sheet. Can be accessed and modified using `get` and `set` methods |
+| [startFrame](/en/apis/core/TextureSheetAnimationModule#startFrame) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object, representing the start frame of the texture sheet |
+| [frameOverTime](/en/apis/core/TextureSheetAnimationModule#frameOverTime) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object, representing the curve of the texture sheet frame over time |
+| [type](/en/apis/core/TextureSheetAnimationModule#type) | `TextureSheetAnimationType` enum, representing the type of texture sheet animation |
+| [cycleCount](/en/apis/core/TextureSheetAnimationModule#cycleCount) | `number` type, representing the cycle count of the texture sheet animation |
+| [tiling](/en/apis/core/TextureSheetAnimationModule#tiling) | `Vector2` object, representing the tiling of the texture sheet. Can be accessed and modified through `get` and `set` methods |
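+
+As a rough sketch of driving these properties from a script (reaching the module through `particleRenderer.generator.textureSheetAnimation` and the constant-curve constructor are assumptions based on the table above):
+
+```typescript
+import { ParticleCompositeCurve, Vector2 } from "@galacean/engine";
+
+// Play a 3 x 3 sprite sheet, starting from the first frame
+const { textureSheetAnimation } = particleRenderer.generator;
+textureSheetAnimation.enabled = true;
+textureSheetAnimation.tiling = new Vector2(3, 3); // 9 frames laid out in a grid
+textureSheetAnimation.cycleCount = 1; // run through the sheet once per lifetime
+textureSheetAnimation.startFrame = new ParticleCompositeCurve(0); // assumed constant-curve constructor
+```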
diff --git a/docs/en/graphics/particle/renderer-color-module.md b/docs/en/graphics/particle/renderer-color-module.md
index b080e8dde7..d16ffd5cfe 100644
--- a/docs/en/graphics/particle/renderer-color-module.md
+++ b/docs/en/graphics/particle/renderer-color-module.md
@@ -6,18 +6,18 @@ group: Particle
label: Graphics/Particle
---
-[`ColorOverLifetimeModule`](/apis/core/#ColorOverLifetimeModule) inherits from `ParticleGeneratorModule` and is used to handle color changes during the lifetime of a particle system.
+[`ColorOverLifetimeModule`](/en/apis/core/ColorOverLifetimeModule) inherits from `ParticleGeneratorModule` and is used to handle color changes during the lifecycle of a particle system.
## Properties
-| Property | Description |
-| ----------------------------------------------------- | ------------------------------------------------------------------------------------------------------- |
-| [color](/apis/core/#ColorOverLifetimeModule-color) | An [ParticleCompositeGradient](/apis/core/#ParticleCompositeGradient) object representing the color gradient over the particle's lifetime |
+| Property | Description |
+| ------------------------------------------------- | -------------------------------------------------------------------------------------------------------- |
+| [color](/en/apis/core/ColorOverLifetimeModule#color) | [ParticleCompositeGradient](/en/apis/core/ParticleCompositeGradient) object, representing the color gradient of particles over their lifecycle |
## Gradient Editing
-For the [ParticleCompositeGradient](/apis/core/#ParticleCompositeGradient) object, there is a built-in gradient editor in the editor. The top of the gradient bar represents the color key, and the bottom represents the alpha value key. Each key's position on the gradient bar represents its time. Double-clicking on an existing key creates a new key, and long-pressing a key and dragging downwards deletes the key.
+For the [ParticleCompositeGradient](/en/apis/core/ParticleCompositeGradient) object, the editor has a built-in gradient editor. The top of the gradient bar represents color keys, and the bottom represents alpha value keys. Each key's position on the gradient bar represents its time. Double-clicking an existing key can create a new key, and long-pressing a key and dragging it downwards can delete the key.
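+
+The same gradient can also be assembled in code. A hedged sketch (the `ParticleGradient`, `GradientColorKey`, and `GradientAlphaKey` constructor shapes are assumptions inferred from the API names above):
+
+```typescript
+import {
+  Color,
+  GradientAlphaKey,
+  GradientColorKey,
+  ParticleCompositeGradient,
+  ParticleGradient
+} from "@galacean/engine";
+
+// Fade particles from opaque red to transparent green over their lifecycle
+const { colorOverLifetime } = particleRenderer.generator;
+colorOverLifetime.enabled = true;
+colorOverLifetime.color = new ParticleCompositeGradient(
+  new ParticleGradient(
+    [new GradientColorKey(0, new Color(1, 0, 0)), new GradientColorKey(1, new Color(0, 1, 0))],
+    [new GradientAlphaKey(0, 1), new GradientAlphaKey(1, 0)]
+  )
+);
+```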
diff --git a/docs/en/graphics/particle/renderer-emission-module.md b/docs/en/graphics/particle/renderer-emission-module.md
index 1ad1501eed..d678d9ecdf 100644
--- a/docs/en/graphics/particle/renderer-emission-module.md
+++ b/docs/en/graphics/particle/renderer-emission-module.md
@@ -6,37 +6,37 @@ group: Particle
label: Graphics/Particle
---
-[EmissionModule](/apis/core/#EmissionModule) is the emission module of `ParticleGeneratorModule`. This module is used to handle the emission behavior of the particle system, including particle emission rate, emission shape, and burst behavior.
+[EmissionModule](/en/apis/core/EmissionModule) is the emission module of `ParticleGeneratorModule`. This module is used to handle the emission behavior of the particle system, including particle emission rate, emission shape, and burst behavior.
## Properties
-| Property | Description |
-| --------------------------------------------------------------- | ---------------------------------------------------------------------------------------------------------------- |
-| [rateOverTime](/apis/core/#EmissionModule-rateOverTime) | This is a [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object that represents the emission rate of particles. The default value is `10`. |
-| [rateOverDistance](/apis/core/#EmissionModule-rateOverDistance) | This is a [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object that represents the distance emission rate of particles. The default value is `0`. |
-| [shape](/apis/core/#EmissionModule-shape) | This is a `BaseShape` object that represents the shape of the emitter. |
+| Property | Description |
+| --------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------- |
+| [rateOverTime](/en/apis/core/EmissionModule#rateOverTime) | This is a [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object, representing the particle emission rate. The default value is `10`. |
+| [rateOverDistance](/en/apis/core/EmissionModule#rateOverDistance) | This is a [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object, representing the particle distance emission rate. The default value is `0`. |
+| [shape](/en/apis/core/EmissionModule#shape) | This is a `BaseShape` object, representing the shape of the emitter. |
## Methods
-| Method | Description |
-| -------------------------------------------------------------------- | ------------------------ |
-| [addBurst(burst: Burst)](/apis/core/#EmissionModule-addBurst) | Adds a burst behavior |
-| [removeBurst(burst: Burst)](/apis/core/#EmissionModule-removeBurst) | Removes a burst behavior |
-| [removeBurstByIndex(index: number)](/apis/core/#EmissionModule-removeBurstByIndex) | Removes a burst behavior by index |
-| [clearBurst()](/apis/core/#EmissionModule-clearBurst) | Clears all burst behaviors |
+| Method | Description |
+| ---------------------------------------------------------------------------------- | ------------------------- |
+| [addBurst(burst: Burst)](/en/apis/core/EmissionModule#addBurst) | Add a burst behavior |
+| [removeBurst(burst: Burst)](/en/apis/core/EmissionModule#removeBurst) | Remove a burst behavior |
+| [removeBurstByIndex(index: number)](/en/apis/core/EmissionModule#removeBurstByIndex) | Remove a burst behavior by index |
+| [clearBurst()](/en/apis/core/EmissionModule#clearBurst) | Clear all burst behaviors |
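+
+For example, a hedged sketch of combining the rate and burst APIs from a script (the `Burst` constructor taking a time plus a count curve is an assumption):
+
+```typescript
+import { Burst, ParticleCompositeCurve } from "@galacean/engine";
+
+// Emit 10 particles per second, plus a one-off burst of 30 particles at t = 0.5 s
+const { emission } = particleRenderer.generator;
+emission.rateOverTime = new ParticleCompositeCurve(10);
+emission.addBurst(new Burst(0.5, new ParticleCompositeCurve(30)));
+```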
## Shapes
-The engine currently has the following built-in emitter shapes, which provide corresponding auxiliary displays when selecting the particle component.
+Currently, the engine has the following built-in emitter shapes, providing corresponding shape auxiliary displays when the particle component is selected.
-| Emitter Shape Type | Description |
-| ---------------------------------------------------------------- | -------------------------------------- |
-| [BoxShape](/apis/core/#EmissionModule-BoxShape) | `BaseShape` object, emitter shape is a cube |
-| [CircleShape](/apis/core/#EmissionModule-CircleShape) | `BaseShape` object, emitter shape is a circle |
-| [ConeShape](/apis/core/#EmissionModule-ConeShape) | `BaseShape` object, emitter shape is a cone-like |
-| [HemisphereShape](/apis/core/#EmissionModule-HemisphereShape) | `BaseShape` object, emitter shape is a hemisphere |
-| [SphereShape](/apis/core/#EmissionModule-SphereShape) | `BaseShape` object, emitter shape is a sphere |
+| Emitter Shape Type | Description |
+| --------------------------------------------------------------- | ------------------------------------ |
+| [BoxShape](/en/apis/core/EmissionModule#BoxShape) | `BaseShape` object, emitter shape is a cube |
+| [CircleShape](/en/apis/core/EmissionModule#CircleShape) | `BaseShape` object, emitter shape is a circle |
+| [ConeShape](/en/apis/core/EmissionModule#ConeShape) | `BaseShape` object, emitter shape is a cone |
+| [HemisphereShape](/en/apis/core/EmissionModule#HemisphereShape) | `BaseShape` object, emitter shape is a hemisphere |
+| [SphereShape](/en/apis/core/EmissionModule#SphereShape) | `BaseShape` object, emitter shape is a sphere |
-Please paste the Markdown content you need to be translated.
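+A hedged sketch of assigning one of these shapes from a script (the `ConeShape` property names and units are assumptions):
+
+```typescript
+import { ConeShape } from "@galacean/engine";
+
+// Emit particles from a cone, like a torch pointing along the entity's axis
+const coneShape = new ConeShape();
+coneShape.angle = 25;   // assumed opening angle, in degrees
+coneShape.radius = 0.1; // assumed base radius
+particleRenderer.generator.emission.shape = coneShape;
+```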
diff --git a/docs/en/graphics/particle/renderer-main-module.md b/docs/en/graphics/particle/renderer-main-module.md
index 097c0a8d82..8dde5f534f 100644
--- a/docs/en/graphics/particle/renderer-main-module.md
+++ b/docs/en/graphics/particle/renderer-main-module.md
@@ -6,36 +6,46 @@ group: Particle
label: Graphics/Particle
---
-[MainModule](/apis/core/#MainModule) is the main module of `ParticleGeneratorModule`, containing the most basic particle generation parameters. These properties are mostly used to control the initial state of newly created particles.
+[MainModule](/en/apis/core/MainModule) is the main module of `ParticleGeneratorModule`, containing the most basic particle generation parameters. These properties are mostly used to control the initial state of newly created particles.
## Properties
-| Property | Description |
-| --------------------------------------------------------- | ------------------------------------------------------- |
-| [duration](/apis/core/#MainModule-duration) | Duration of the particle generator (in seconds) |
-| [isLoop](/apis/core/#MainModule-isLoop) | Specifies if the particle generator loops |
-| [startDelay](/apis/core/#MainModule-startDelay) | Delay at the start of particle emission (in seconds) |
-| [startLifetime](/apis/core/#MainModule-startLifetime) | Initial lifetime of particles upon emission |
-| [startSpeed](/apis/core/#MainModule-startSpeed) | Initial speed of particles when first generated |
-| [startSize3D](/apis/core/#MainModule-startSize3D) | Whether to specify particle size along each axis |
-| [startSize](/apis/core/#MainModule-startSize) | Initial size of particles when first generated |
-| [startSizeX](/apis/core/#MainModule-startSizeX) | Initial size along the x-axis when particles are emitted |
-| [startSizeY](/apis/core/#MainModule-startSizeY) | Initial size along the y-axis when particles are emitted |
-| [startSizeZ](/apis/core/#MainModule-startSizeZ) | Initial size along the z-axis when particles are emitted |
-| [startRotation3D](/apis/core/#MainModule-startRotation3D) | Whether to enable 3D particle rotation |
-| [startRotation](/apis/core/#MainModule-startRotation) | Initial rotation of particles when first generated |
-| [startRotationX](/apis/core/#MainModule-startRotationX) | Initial rotation along the x-axis when particles are emitted |
-| [startRotationY](/apis/core/#MainModule-startRotationY) | Initial rotation along the y-axis when particles are emitted |
-| [startRotationZ](/apis/core/#MainModule-startRotationZ) | Initial rotation along the z-axis when particles are emitted |
-| [flipRotation](/apis/core/#MainModule-flipRotation) | Rotates some particles in the opposite direction |
-| [startColor](/apis/core/#MainModule-startColor) | Initial color mode of particles |
-| [gravityModifier](/apis/core/#MainModule-gravityModifier) | Proportion of gravity defined by Physics.gravity applied to this particle generator |
-| [simulationSpace](/apis/core/#MainModule-simulationSpace) | Selects the space in which particles are simulated, either world space or local space |
-| [simulationSpeed](/apis/core/#MainModule-simulationSpeed) | Overrides the default playback speed of the particle generator |
-| [scalingMode](/apis/core/#MainModule-scalingMode) | Controls how the particle generator applies its Transform component to the particles it emits |
-| [playOnEnabled](/apis/core/#MainModule-playOnEnabled) | If set to true, the particle generator will automatically start playing when enabled |
-| [maxParticles](/apis/core/#MainModule-maxParticles) | Maximum number of particles |
-
-Please paste the Markdown content you need to be translated.
+
+
+You can debug each property one by one in the provided example to help you better understand and control the main particle module, thereby achieving various complex and beautiful visual effects.
+
+Duration [duration](/en/apis/core/MainModule#duration) determines how long the particle generator runs, in seconds. A longer duration means the particle system will generate more particles, creating a continuous effect.
+
+Is Loop [isLoop](/en/apis/core/MainModule#isLoop), when set to true, makes the particle generator restart automatically after the duration ends, forming a continuous effect such as smoke or flowing water.
+
+Start Delay [startDelay](/en/apis/core/MainModule#startDelay) determines the delay time before the particle generator starts emitting after being activated, which is useful for effects that require a time difference, such as fireworks.
+
+Start Lifetime [startLifetime](/en/apis/core/MainModule#startLifetime) determines how long each particle can live before disappearing. A longer lifetime means particles will stay on the screen longer.
+
+Start Speed [startSpeed](/en/apis/core/MainModule#startSpeed) determines the speed at which particles are emitted. Higher initial speed will cause particles to spread quickly, like an explosion effect; lower speed will make particles drift slowly, like a smoke effect.
+
+Start Size 3D [startSize3D](/en/apis/core/MainModule#startSize3D) allows setting different sizes along the particles' x, y, and z axes to achieve anisotropic particle effects, such as elongated flames. The initial size [startSize](/en/apis/core/MainModule#startSize) controls the size of each particle when it is first generated: a larger initial size suits large clouds of smoke or flames, while a smaller one suits fine dust or splashes. For per-axis control, [startSizeX](/en/apis/core/MainModule#startSizeX), [startSizeY](/en/apis/core/MainModule#startSizeY), and [startSizeZ](/en/apis/core/MainModule#startSizeZ) set the particle size along the x, y, and z axes respectively, making particle shapes more varied and refined.
+
+Start Rotation 3D [startRotation3D](/en/apis/core/MainModule#startRotation3D) allows particles to rotate in 3D space, adding depth and complexity, such as leaves falling in three-dimensional space. The initial rotation [startRotation](/en/apis/core/MainModule#startRotation) sets the rotation angle of particles when they are emitted, suitable for effects where particles need to move in a specific direction, such as directional flames. [startRotationX](/en/apis/core/MainModule#startRotationX), [startRotationY](/en/apis/core/MainModule#startRotationY), and [startRotationZ](/en/apis/core/MainModule#startRotationZ) control the initial rotation around the x, y, and z axes respectively, increasing the freedom of particle movement. Flip Rotation [flipRotation](/en/apis/core/MainModule#flipRotation) ranges from 0 to 1 and makes a portion of particles rotate in the opposite direction, adding randomness and naturalness, suitable for simulating complex motion trajectories.
+
+Start Color [startColor](/en/apis/core/MainModule#startColor) determines the color of particles, which can be used to simulate different material effects, such as the red-orange color of flames or the gray-white color of smoke.
+
+Gravity Modifier [gravityModifier](/en/apis/core/MainModule#gravityModifier) adjusts the degree to which particles are affected by gravity, making particles look more realistic, such as falling raindrops or rising smoke.
+
+Simulation Space [simulationSpace](/en/apis/core/MainModule#simulationSpace) determines whether particles move relative to the world or to the generator itself. World space suits effects fixed in place, such as smoke; local space suits effects that follow an object, such as a flame trail.
+
+Simulation Speed [simulationSpeed](/en/apis/core/MainModule#simulationSpeed) speeds up or slows down the motion of all particles, useful for slow-motion or time-acceleration effects.
+
+Scaling Mode [scalingMode](/en/apis/core/MainModule#scalingMode) determines how the particle generator applies transforms such as position, rotation, and scale to the particles it emits. Using scalingMode ensures that the transform relationship between the generator and its particles behaves as expected. scalingMode has the following modes:
+
+- Local: Particles inherit the local transform of the particle generator; that is, particle transforms happen in the generator's local coordinate system.
+
+- World: Particles inherit the global transform of the particle generator; that is, particle transforms happen in the world coordinate system.
+
+- Hierarchy: Particles inherit transforms from the entire transform hierarchy; that is, particles take the generator's parent and higher-level transforms into account.
+
+Play On Enabled [playOnEnabled](/en/apis/core/MainModule#playOnEnabled), if set to true, makes the particle generator start playing automatically when it is enabled. Turning this on ensures the particle system starts emitting as soon as it launches, suitable for scenes that need the effect to appear immediately.
+
+Max Particles [maxParticles](/en/apis/core/MainModule#maxParticles) limits the maximum number of particles in the particle system to avoid performance issues. Larger values suit effects that need many particles, such as thick smoke.
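+
+Putting several of these properties together, a hedged sketch of a smoke-like setup (setting curve values through an assumed `.constant` shortcut on `ParticleCompositeCurve`):
+
+```typescript
+import { ParticleSimulationSpace } from "@galacean/engine";
+
+// Configure the main module for a slow, continuous smoke effect
+const { main } = particleRenderer.generator;
+main.duration = 5;               // one 5-second emission cycle
+main.isLoop = true;              // restart after each cycle
+main.startLifetime.constant = 3; // each particle lives for 3 seconds
+main.startSpeed.constant = 0.4;  // slow drift
+main.simulationSpace = ParticleSimulationSpace.World; // smoke stays where it was emitted
+main.maxParticles = 500;         // cap the particle count for performance
+```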
diff --git a/docs/en/graphics/particle/renderer-rotation-module.md b/docs/en/graphics/particle/renderer-rotation-module.md
index def01bf50b..132d8d5472 100644
--- a/docs/en/graphics/particle/renderer-rotation-module.md
+++ b/docs/en/graphics/particle/renderer-rotation-module.md
@@ -1,21 +1,21 @@
---
order: 4
-title: Lifecycle Rotation Module
+title: Rotation Over Lifetime Module
type: Graphics
group: Particle
label: Graphics/Particle
---
-[`RotationOverLifetimeModule`](/apis/core/#RotationOverLifetimeModule) inherits from `ParticleGeneratorModule` and is used to control the rotation changes of particles within the lifecycle of a particle system.
+[`RotationOverLifetimeModule`](/en/apis/core/RotationOverLifetimeModule) inherits from `ParticleGeneratorModule` and is used to control the rotation changes of the particle system over its lifetime.
## Properties
-| Property | Description |
-| ------------------------------------------------------------------ | --------------------------------------------------------------------------------------------------- |
-| [separateAxes](/apis/core/#RotationOverLifetimeModule-separateAxes) | A `boolean` indicating whether rotation is done separately on each axis. If disabled, only the z-axis will be used |
-| [rotationX](/apis/core/#RotationOverLifetimeModule-rotationX) | A [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the x-axis rotation of particles within their lifetime |
-| [rotationY](/apis/core/#RotationOverLifetimeModule-rotationY) | A [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the y-axis rotation of particles within their lifetime |
-| [rotationZ](/apis/core/#RotationOverLifetimeModule-rotationZ) | A [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the z-axis rotation of particles within their lifetime |
+| Property | Description |
+| ------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------- |
+| [separateAxes](/en/apis/core/RotationOverLifetimeModule#separateAxes) | `boolean` type, indicates whether to rotate separately on each axis. If disabled, only the z-axis will be used |
+| [rotationX](/en/apis/core/RotationOverLifetimeModule#rotationX) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object, represents the x-axis rotation of the particle over its lifetime |
+| [rotationY](/en/apis/core/RotationOverLifetimeModule#rotationY) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object, represents the y-axis rotation of the particle over its lifetime |
+| [rotationZ](/en/apis/core/RotationOverLifetimeModule#rotationZ) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object, represents the z-axis rotation of the particle over its lifetime |
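+
+In script form, a minimal sketch (assuming the module is reached through `particleRenderer.generator.rotationOverLifetime` and that `ParticleCompositeCurve` accepts a constant):
+
+```typescript
+import { ParticleCompositeCurve } from "@galacean/engine";
+
+// Spin every particle 45 degrees per second around its z-axis
+const { rotationOverLifetime } = particleRenderer.generator;
+rotationOverLifetime.enabled = true;
+rotationOverLifetime.rotationZ = new ParticleCompositeCurve(45);
+```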
diff --git a/docs/en/graphics/particle/renderer-size-module.md b/docs/en/graphics/particle/renderer-size-module.md
index 40e3d8ba91..fd075db8e7 100644
--- a/docs/en/graphics/particle/renderer-size-module.md
+++ b/docs/en/graphics/particle/renderer-size-module.md
@@ -1,32 +1,32 @@
---
order: 3
-title: Lifecycle Size Module
+title: Size Over Lifetime Module
type: Graphics
group: Particle
label: Graphics/Particle
---
-[`SizeOverLifetimeModule`](/apis/core/#SizeOverLifetimeModule) is a subclass of `ParticleGeneratorModule` used to handle size changes of particles during their lifecycle.
+[`SizeOverLifetimeModule`](/en/apis/core/SizeOverLifetimeModule) is a subclass of `ParticleGeneratorModule` used to handle size changes over the lifetime of a particle system.
## Properties
-| Property | Description |
-| ---------------------------------------------------------------- | ------------------------------------------------------------------------------------------------- |
-| [separateAxes](/apis/core/#SizeOverLifetimeModule-separateAxes) | A boolean value that specifies whether the size changes independently on each axis |
-| [sizeX](/apis/core/#SizeOverLifetimeModule-sizeX) | A [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the size change curve on the x-axis |
-| [sizeY](/apis/core/#SizeOverLifetimeModule-sizeY) | A [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the size change curve on the y-axis |
-| [sizeZ](/apis/core/#SizeOverLifetimeModule-sizeZ) | A [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the size change curve on the z-axis |
-| [size](/apis/core/#SizeOverLifetimeModule-size) | A [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object to get or set the size change curve of particles |
+| Property | Description |
+| ------------------------------------------------------------- | --------------------------------------------------------------------------------------------------- |
+| [separateAxes](/en/apis/core/SizeOverLifetimeModule#separateAxes) | Boolean value specifying whether the size changes independently for each axis |
+| [sizeX](/en/apis/core/SizeOverLifetimeModule#sizeX) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object representing the size change curve along the x-axis |
+| [sizeY](/en/apis/core/SizeOverLifetimeModule#sizeY) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object representing the size change curve along the y-axis |
+| [sizeZ](/en/apis/core/SizeOverLifetimeModule#sizeZ) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object representing the size change curve along the z-axis |
+| [size](/en/apis/core/SizeOverLifetimeModule#size) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object to get or set the size change curve of particles |
-## Curve Editor
+## Polyline Editing
-For the [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object, a curve editor is built into the editor for visual adjustment of the curve.
+For the [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object, a polyline editor is built into the editor for visual curve adjustments.
-Alternatively, in code:
+Or in code:
```ts
sizeOverLifetime.enabled = true;
diff --git a/docs/en/graphics/particle/renderer-velocity-module.md b/docs/en/graphics/particle/renderer-velocity-module.md
index 4f2e0bb75f..96cfff665d 100644
--- a/docs/en/graphics/particle/renderer-velocity-module.md
+++ b/docs/en/graphics/particle/renderer-velocity-module.md
@@ -1,22 +1,23 @@
---
order: 5
-title: Life Speed Module
+title: Lifecycle Velocity Module
type: Graphics
group: Particle
label: Graphics/Particle
---
-### Life Speed Module
+### Lifecycle Velocity Module
-[`VelocityOverLifetimeModule`](/apis/core/#VelocityOverLifetimeModule) inherits from `ParticleGeneratorModule` and is used to control the speed changes during the lifetime of the particle system.
+[`VelocityOverLifetimeModule`](/en/apis/core/VelocityOverLifetimeModule) inherits from `ParticleGeneratorModule` and is used to control the velocity changes of a particle system over its lifetime.
## Properties
-| Property | Description |
+| Property | Description |
| ------------------------------------------------------------ | --------------------------------------------------------------------------------------------------- |
-| [space](/apis/core/#VelocityOverLifetimeModule-velocityZ) | Selects the space for speed changes, which can be world space or local space |
-| [velocityX](/apis/core/#VelocityOverLifetimeModule-velocityX) | [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the x-axis rotation of particles during their lifetime |
-| [velocityY](/apis/core/#VelocityOverLifetimeModule-velocityY) | [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the y-axis rotation of particles during their lifetime |
-| [velocityZ](/apis/core/#VelocityOverLifetimeModule-velocityZ) | [ParticleCompositeCurve](/apis/core/#ParticleCompositeCurve) object representing the z-axis rotation of particles during their lifetime |
+| [space](/en/apis/core/VelocityOverLifetimeModule#space) | Selects the space for velocity changes, which can be world space or local space |
+| [velocityX](/en/apis/core/VelocityOverLifetimeModule#velocityX) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object representing the x-axis velocity of particles over their lifetime |
+| [velocityY](/en/apis/core/VelocityOverLifetimeModule#velocityY) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object representing the y-axis velocity of particles over their lifetime |
+| [velocityZ](/en/apis/core/VelocityOverLifetimeModule#velocityZ) | [ParticleCompositeCurve](/en/apis/core/ParticleCompositeCurve) object representing the z-axis velocity of particles over their lifetime |
+
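+In code, a minimal sketch (assuming `particleRenderer` is an existing `ParticleRenderer`):
+
+```ts
+const { velocityOverLifetime } = particleRenderer.generator;
+velocityOverLifetime.enabled = true;
+// Constant upward drift of 1 unit/s along the y-axis
+velocityOverLifetime.velocityY = new ParticleCompositeCurve(1);
+```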
diff --git a/docs/en/graphics/particle/renderer.md b/docs/en/graphics/particle/renderer.md
index 3841680462..a9ec72326d 100644
--- a/docs/en/graphics/particle/renderer.md
+++ b/docs/en/graphics/particle/renderer.md
@@ -1,26 +1,26 @@
---
order: 0
-title: Particle Renderer
+title: Particle
type: Graphics
group: Particle
label: Graphics/Particle
---
-The Particle Renderer [ParticleRenderer](/apis/core/#ParticleRenderer) of Galacean Engine is a commonly used rendering component with rich properties, allowing for vibrant particle effects by adjusting various property values.
+The particle (Particle Renderer) [ParticleRenderer](/en/apis/core/ParticleRenderer) of the Galacean Engine is a commonly used rendering component with rich properties. By adjusting various property values, you can achieve colorful particle effects.
-
+
-## Particle Component
+## Components
-The particle component can be added to an already activated Entity in the scene through a shortcut on the hierarchy tree panel or the inspector panel.
+The particle component can be attached to an activated Entity in the scene via the shortcut at the top of the hierarchy panel, or by adding the component in the inspector panel.

-Once added, you can view the particle properties in the inspector panel. The particle panel at the bottom left of the view window can control the playback of particle effects in the view window.
+After adding, you can view the particle properties in the inspector panel. The particle panel at the bottom left corner of the view window can control the playback of particle effects in the view window.
-
+
-You can also attach the particle component in scripts.
+You can also attach the particle component in a script:
```ts
// Create an entity
@@ -31,13 +31,13 @@ let particleRenderer = particleEntity.addComponent(ParticleRenderer);
## Rendering Material
-[ParticleMaterial](/apis/core/#ParticleMaterial) is the default material for particles.
+[ParticleMaterial](/en/apis/core/ParticleMaterial) is the default material for particles.
-In the editor, create a particle material by adding a material and selecting the particle material. After editing, go back to the particle observer panel to select and use the material.
+In the editor, create one by adding a material and selecting the particle material type. After editing, return to the particle inspector panel to select and use it.
-Or in scripts:
+Or in the script:
```ts
// Add particle material
@@ -45,42 +45,50 @@ const material = new ParticleMaterial(engine);
particleRenderer.setMaterial(material);
```
-| Property | Description |
-| ----------------------------------------------------- | ----------- |
-| [baseColor](/apis/core/#ParticleMaterial-baseColor) | Base color |
-| [baseTexture](/apis/core/#ParticleMaterial-baseColor) | Base texture |
+| Property | Description |
+| ---------------------------------------------------- | ----------- |
+| [baseColor](/en/apis/core/ParticleMaterial#baseColor) | Base Color |
+| [baseTexture](/en/apis/core/ParticleMaterial#baseTexture) | Base Texture |
## Playback Control
-The particle panel that appears when selecting an entity with a particle component allows you to control the playback of particle effects in the view window.
+The particle panel that appears when an entity with a particle component is selected allows you to control the playback of particle effects in the view window.
-It is important to note that adjustments made to particle playback on this panel are only for preview purposes in the view window and do not change the properties of the particle component. If you need to change the playback-related properties of the particle, adjustments need to be made in the observer panel.
+It should be noted that adjustments to particle playback on this panel are only for previewing in the view window and do not change the properties of the particle component. If you need to change the playback-related properties of the particle, you need to adjust them in the inspector panel.
-
+
-| Preview Options | Description |
-| --------------- | ------------------------------------------------------------------ |
-| Restart | Stop the current particle effect playback and immediately restart |
-| Stop | Stop the playback of the particle effect and reset to the initial state |
-| Pause | Pause the particle effect on the selected entity and its child nodes |
-| Play | Start playing the particle effect on the selected entity and its child nodes |
-| Speed | Adjust the current playback speed |
-| Preview | Choose to play the particle effect on the selected entity and its child nodes, or play all particle effects in the scene |
+| Preview Playback Options | Description |
+| ------------------------ | ------------------------------------------------ |
+| Replay | Stops the current particle effect playback and immediately starts playing from the beginning |
+| Stop | Stops the particle effect playback and resets to the initial state |
+| Pause / Play | Pauses / Starts playing the particle effect |
+| Selected / Global | Plays the currently selected particle or all particles in the scene |
+| Bounding Box | Bounding box of the currently selected particle |
+
+Or in code:
+
+```ts
+// Play
+particleRenderer.generator.play();
+// Stop
+particleRenderer.generator.stop();
+// Adjust the playback speed
+particleRenderer.generator.main.simulationSpeed = 2;
+```
## Particle Generator
-The `ParticleRenderer`'s [generator](/apis/core/#ParticleGenerator) property is mainly responsible for particle generation and playback functions. The functions related to particle generation consist of multiple modules, including the main module, emitter module, life size module, life color module, life speed module, life rotation module, and texture table animation module. In the editor's particle observer panel, you can visually see each module and its sub-options.
+The [generator](/en/apis/core/ParticleGenerator) property of `ParticleRenderer` is mainly responsible for the generation and playback of particles. The functions related to particle generation are composed of multiple modules: the main module, emitter module, lifetime size module, lifetime color module, lifetime velocity module, lifetime rotation module, and texture sheet animation module. In the editor's particle inspector panel, you can visually see each module and its options.
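+
+For example, a minimal sketch of tweaking two of these modules in script (assuming `particleRenderer` is an existing `ParticleRenderer`):
+
+```ts
+const generator = particleRenderer.generator;
+// Main module: initial speed of newly spawned particles
+generator.main.startSpeed.constant = 0.5;
+// Emission module: emit 20 particles per second
+generator.emission.rateOverTime.constant = 20;
+```
+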
## Other Parameters
-| Property | Description |
-| ------------------------------------------------------------- | --------------------------------------------------------------- |
-| [velocityScale](/apis/core/#ParticleRenderer-velocityScale) | Specifies the extent to which particles stretch based on their velocity |
-| [lengthScale](/apis/core/#ParticleRenderer-lengthScale) | Defines the extent to which particles stretch in their direction of motion, defined as the ratio of the particle's length to its width |
-| [pivot](/apis/core/#ParticleRenderer-pivot) | The pivot of the particle |
-| [renderMode](/apis/core/#ParticleRenderer-renderMode) | The rendering mode of the particle |
-| [mesh](/apis/core/#ParticleRenderer-mesh) | The mesh of the particle, valid when `renderMode` is `Mesh` |
-
-{ /*examples*/ }
+| Property | Description |
+| --- | --- |
+| [velocityScale](/en/apis/core/ParticleRenderer#velocityScale) | Specifies the extent to which particles stretch based on their speed |
+| [lengthScale](/en/apis/core/ParticleRenderer#lengthScale) | Defines the extent to which particles stretch in their direction of motion, defined as the ratio of the particle's length to its width |
+| [pivot](/en/apis/core/ParticleRenderer#pivot) | The pivot of the particle |
+| [renderMode](/en/apis/core/ParticleRenderer#renderMode) | The rendering mode of the particle |
+| [mesh](/en/apis/core/ParticleRenderer#mesh) | The mesh of the particle, effective when `renderMode` is `Mesh` |
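+
+In code, a sketch of setting these renderer-level parameters (`ParticleRenderMode` and `Vector3` come from `@galacean/engine`):
+
+```ts
+// Stretch particles along their direction of motion
+particleRenderer.renderMode = ParticleRenderMode.StretchBillboard;
+particleRenderer.lengthScale = 2;
+particleRenderer.velocityScale = 0.5;
+particleRenderer.pivot = new Vector3(0, 0.5, 0);
+```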
diff --git a/docs/en/graphics/renderer/meshRenderer.md b/docs/en/graphics/renderer/meshRenderer.md
index 5169e9e2f0..5c602af9da 100644
--- a/docs/en/graphics/renderer/meshRenderer.md
+++ b/docs/en/graphics/renderer/meshRenderer.md
@@ -1,26 +1,26 @@
---
order: 1
-title: Grid Renderer
+title: Mesh Renderer
type: Graphics
group: Renderer
label: Graphics/Renderer
---
-[MeshRenderer](/apis/core/#MeshRenderer) is a grid rendering component. When an entity is equipped with a mesh rendering component, you only need to set its `mesh` and `material` to render the object.
+[MeshRenderer](/en/apis/core/#MeshRenderer) is a mesh renderer component that uses a mesh object (such as a cube) as the source of its geometry. When an entity has a mesh renderer component attached, you only need to set its `mesh` and `material` to render the object.
## Usage
-In the editor **[Hierarchy Panel](/en/docs/interface/hierarchy)**, you can quickly create a node with a cuboid mesh renderer attached ( **Hierarchy Panel** -> **Right-click** -> **3D Object** -> **Cuboid** ).
+In the editor **[Hierarchy Panel](/en/docs/interface/hierarchy)**, you can quickly create a node with a cuboid mesh renderer ( **Hierarchy Panel** -> **Right Click** -> **3D Object** -> **Cuboid** ).
-Alternatively, you can attach a mesh renderer to existing nodes in the scene and set any [mesh](/en/docs/graphics-mesh) and [material](/en/docs/graphics-material). ( **Select Node** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Add Component** -> **Mesh Renderer** ).
+You can also attach a mesh renderer to an existing node in the scene and set any [mesh](/en/docs/graphics/mesh/mesh/) and [material](/en/docs/graphics/material/material/). ( **Select Node** -> **[Inspector Panel](/en/docs/interface/inspector)** -> **Add Component** -> **Mesh Renderer** ).
-Corresponding script usage is as follows:
+The corresponding usage in the script is as follows:
```typescript
const cubeEntity = rootEntity.createChild("cube");
@@ -35,18 +35,18 @@ In the editor, you can easily set the properties of the mesh renderer.
-| Setting | Explanation |
-| :--------------- | :----------------------------------------------------- |
-| `material` | Information about the [material](/en/docs/graphics-material) of the object to be rendered |
-| `mesh` | Information about the [mesh](/en/docs/graphics-mesh) of the object to be rendered |
-| `receiveShadows` | Whether to receive shadows |
-| `castShadows` | Whether to cast shadows |
-| `priority` | Rendering priority of the renderer, the smaller the value, the higher the priority, default is 0 |
+| Setting | Explanation |
+| :---------------- | :-------------------------------------------------------- |
+| `material` | [Material](/en/docs/graphics/material/material/) information of the object to be rendered |
+| `mesh` | [Mesh](/en/docs/graphics/mesh/mesh/) information of the object to be rendered |
+| `receiveShadows` | Whether to receive shadows |
+| `castShadows` | Whether to cast shadows |
+| `priority` | Rendering priority of the renderer, the smaller the value, the higher the priority, default is 0 |
-Compared to the basic [renderer](/en/docs/graphics-renderer), the mesh renderer can also be set to support vertex colors (vertex color data is included in the vertex information of the mesh).
+Compared to the basic [Renderer](/en/docs/graphics/renderer/renderer/), the mesh renderer can also set whether to support vertex colors (vertex color data is included in the vertex information of the mesh).
-| Property | Explanation |
-| :----------------- | :----------------- |
+| Property | Explanation |
+| :------------------ | :-------------------- |
| `enableVertexColor` | Whether to support vertex colors |
```typescript
@@ -57,5 +57,4 @@ meshRenderer.enableVertexColor = true;
## Methods
-The mesh renderer **does not introduce** any additional methods. However, it is important to note that in many cases, the mesh renderer's mesh contains **several sub-meshes**. If you want each sub-mesh to correspond to **different materials**, you can specify the corresponding **mesh index** during setup, otherwise, the same material will be used by default.
-
+The mesh renderer does **not add** any new methods, but it is important to note that in many cases, the mesh renderer's mesh contains **several sub-meshes**. If you want each sub-mesh to correspond to **different materials**, you can specify the corresponding **mesh index** when setting it, otherwise, the same material will be used by default.
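+
+A minimal sketch (assuming the mesh contains two sub-meshes and `materialA` / `materialB` are existing materials):
+
+```typescript
+// Material index 0 maps to the first sub-mesh, index 1 to the second
+meshRenderer.setMaterial(0, materialA);
+meshRenderer.setMaterial(1, materialB);
+```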
diff --git a/docs/en/graphics/renderer/order.md b/docs/en/graphics/renderer/order.md
index 16b982d322..1abbc9eb33 100644
--- a/docs/en/graphics/renderer/order.md
+++ b/docs/en/graphics/renderer/order.md
@@ -1,22 +1,22 @@
---
order: 3
-title: Rendering Order
+title: Render Order
type: Graphics
group: Renderer
label: Graphics/Renderer
---
-The rendering order of the renderer will affect the **performance** and **accuracy** of the rendering. In Galacean, for each camera, components are placed in the corresponding **render queue** according to a unified **determination rule**.
+The render order of the renderer affects the **performance** and **accuracy** of rendering. In Galacean, for each camera, components are placed in the corresponding **render queue** according to a unified **determination rule**.
## Render Queue
-Galacean has divided the rendering into three render queues, in the following order:
+Galacean is divided into three render queues, in the order of rendering:
-- Non-transparent render queue (**Opaque**)
-- Transparent cutout render queue (**AlphaTest**)
-- Transparent render queue (**Transparent**)
+- Opaque Render Queue (**Opaque**)
+- Alpha Test Render Queue (**AlphaTest**)
+- Transparent Render Queue (**Transparent**)
-The assignment of the renderer to a queue is determined by whether the renderer material is **transparent** and the **threshold** of transparent cutout.
+The queue to which the renderer is assigned is determined by whether the renderer material is **transparent** and the **alpha test threshold**.
```mermaid
flowchart TD
@@ -27,9 +27,9 @@ flowchart TD
D -->|No| F[Opaque render queue]
```
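+
+In script, the same rule can be expressed through the material (a sketch using `BaseMaterial` properties; `material` is assumed to exist):
+
+```typescript
+// Transparent material → Transparent render queue
+material.isTransparent = true;
+
+// Opaque material with an alpha test threshold → AlphaTest render queue
+material.isTransparent = false;
+material.alphaCutoff = 0.5;
+
+// Opaque material without a threshold → Opaque render queue
+material.alphaCutoff = 0;
+```
+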
-## Determination Rule
+## Determination Rules
-The determination rule for rendering order in Galacean is as follows:
+The determination rules for render order in Galacean are as follows:
```mermaid
flowchart TD
@@ -44,20 +44,20 @@ flowchart TD
### Renderer Priority
-The engine provides a `priority` property for the renderer to modify the rendering order in the render queue. The default value is 0, the **smaller the priority (can be negative), the higher the priority** of rendering.
+The engine provides the `priority` property for the renderer to modify the render order in the render queue. The default value is 0. **The smaller the priority (it can be negative), the higher the rendering priority**.
### Material Priority
-The engine provides a `priority` property for the material to modify the rendering order of different rendering data from the same renderer in the render queue. The default value is 0, the **smaller the priority (can be negative), the higher the priority** of rendering.
+The engine provides the `priority` property for the material to modify the render order of different render data from the same renderer in the render queue. The default value is 0. **The smaller the priority (it can be negative), the higher the rendering priority**.
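+
+For example, a sketch (lower values render earlier):
+
+```typescript
+// This renderer renders before renderers with the default priority of 0
+renderer.priority = -1;
+// Within the same renderer, render data using this material renders first
+material.priority = -1;
+```
+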
-### Distance from Renderer Component Bounds to Camera
+### Distance from Renderer Component Bounding Box to Camera
-The calculation of the distance from the renderer component bounds to the camera depends on the type of [camera](/en/docs/graphics-camera). In an orthographic camera, it is the distance between the center point of the renderer bounds and the camera along the camera's view direction. In a perspective camera, it is the direct distance from the center point of the renderer bounds to the camera position.
+The calculation method of the distance from the renderer component bounding box to the camera depends on the [camera](/en/docs/graphics/camera/camera/) type. In an orthographic camera, it is the distance from the center of the renderer bounding box to the camera along the camera view direction. In a perspective camera, it is the direct distance from the center of the renderer bounding box to the camera position.
-
+
-> It is important to note that the impact of distance on rendering order is different in different render queues. In the non-transparent render queue and transparent cutout render queue, the rendering order is **from near to far**, while in the transparent render queue, the rendering order is **from far to near**.
+> It should be noted that in different render queues, the rules for the impact of distance on render order are different. In the opaque render queue and alpha test render queue, the render order is **from near to far**, while in the transparent render queue, the render order is **from far to near**.
### Stability
-Currently, when different renderers have the same `renderer priority` and `distance from renderer component bounds to camera`, Galacean ensures the stability of rendering order through **`renderer.instanceId`**, but it cannot guarantee the stability of rendering order within the **same renderer**.
+Currently, when different renderers have the same `renderer priority` and the same `distance from the renderer component bounding box to the camera`, Galacean ensures the stability of the render order through **`renderer.instanceId`**, but the render order within the **same renderer** cannot be guaranteed to be stable.
diff --git a/docs/en/graphics/renderer/renderer.md b/docs/en/graphics/renderer/renderer.md
index 62defd9298..f46590a77a 100644
--- a/docs/en/graphics/renderer/renderer.md
+++ b/docs/en/graphics/renderer/renderer.md
@@ -6,36 +6,36 @@ group: Renderer
label: Graphics/Renderer
---
-The renderer is responsible for displaying graphics [**components**](/en/docs/core/component), which will display corresponding rendering effects based on different data sources. By attaching a renderer to a node and setting the corresponding rendering data, various complex 3D scenes can be displayed.
+The renderer is the [**component**](/en/docs/core/component) responsible for displaying graphics. It produces the corresponding rendering effect based on different data sources. By attaching a renderer to a node and setting the corresponding rendering data, various complex 3D scenes can be displayed.
-## Renderer Types
+## Types of Renderers
-In Galacean, the following built-in renderers are available:
+In Galacean, the following types of renderers are built-in:
-- [Mesh Renderer](./meshRenderer): Renders objects by setting `mesh` and `material`.
-- [Skinned Mesh Renderer](./skinnedMeshRenderer): Based on the [Mesh Renderer](./meshRenderer), it includes additional capabilities for `skeletal animation` and `Blend Shape`, making object animations more natural.
-- [Sprite Renderer](/en/docs/graphics/2D/spriteRenderer/): Displays 2D images in the scene by setting `sprite` and `material` (default built-in sprite material).
-- [Sprite Mask Renderer](/en/docs/graphics/2D/spriteMask/): Used to implement masking effects on sprite renderers.
+- [Mesh Renderer](/en/docs/graphics/renderer/meshRenderer/): Renders objects by setting `mesh` and `material`.
+- [Skinned Mesh Renderer](/en/docs/graphics/renderer/skinnedMeshRenderer): Based on the [Mesh Renderer](/en/docs/graphics/renderer/meshRenderer/), it additionally includes capabilities for `skeletal animation` and `Blend Shape`, making the animation effects of objects more natural.
+- [Sprite Renderer](/en/docs/graphics/2D/spriteRenderer/): By setting `sprite` and `material` (default built-in sprite material), 2D images can be displayed in the scene.
+- [Sprite Mask Renderer](/en/docs/graphics/2D/spriteMask/): Used to implement masking effects for the sprite renderer.
- [Text Renderer](/en/docs/graphics/2D/text/): Displays text in the scene.
- [Particle Renderer](/en/docs/graphics/particle/renderer/): Displays particle effects in the scene.
-Further understanding of the rendering order of various renderers in the engine can be achieved through [render sorting](./order).
+You can learn more about the rendering order of various renderers in the engine through [Rendering Order](/en/docs/graphics/renderer/order/).
## Properties
-`Renderer` is the base class for all renderers in Galacean and includes the following properties:
+`Renderer` is the base class for all renderers in Galacean, and it includes the following properties:
-| Property | Description |
-| :---------------- | :-------------------------------------------------- |
+| Property | Description |
+| :---------------- | :--------------------------------------------------- |
| `receiveShadows` | Whether to receive shadows |
| `castShadows` | Whether to cast shadows |
-| `priority` | Rendering priority of the renderer, lower values mean higher priority, default is 0 |
-| `shaderData` | Data required for rendering, including constants and macro switches |
-| `materialCount` | Total number of materials in the renderer |
+| `priority` | The rendering priority of the renderer, the smaller the value, the higher the priority, default is 0 |
+| `shaderData` | Data used by rendering, including constants and macro switches |
+| `materialCount` | Total number of materials contained in the renderer |
| `bounds` | World bounding box of the renderer |
-| `isCulled` | Whether the renderer is culled in the current frame |
+| `isCulled` | Whether the renderer is culled in the current frame |
-These properties can be accessed from any renderer derived from `Renderer`.
+You can get these properties from any renderer derived from `Renderer`.
```typescript
const renderer = cubeEntity.getComponent(Renderer);
@@ -48,15 +48,19 @@ console.log("bounds", renderer.bounds);
console.log("isCulled", renderer.isCulled);
```
+Below shows how to get the overall bounding box of multiple `Renderers`:
+
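+A sketch (assuming `entity` is the model root; `BoundingBox` comes from the math library):
+
+```typescript
+const renderers = entity.getComponentsIncludeChildren(Renderer, []);
+const totalBounds = new BoundingBox();
+totalBounds.copyFrom(renderers[0].bounds);
+for (let i = 1; i < renderers.length; i++) {
+  // Merge each renderer's world bounds into the total
+  BoundingBox.merge(totalBounds, renderers[i].bounds, totalBounds);
+}
+```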
+
+
## Methods
-The `Renderer` base class mainly provides methods for setting and getting materials, it is important to note that a renderer may contain multiple materials, so the following methods are more like **manipulating an array of materials**.
+The `Renderer` base class mainly provides methods for setting and getting materials. It is important to note that a renderer may contain multiple materials, so the following methods are more like **manipulating an array of materials**.
-| Method | Description |
-| :---------------------- | :------------------------ |
-| `setMaterial` | Set a material in the array |
-| `getMaterial` | Get a material from the array |
-| `getMaterials` | Get the array of materials |
-| `getInstanceMaterial` | Get a copy of a material from the array |
-| `getInstanceMaterials` | Get copies of the array of materials |
+| Method | Description |
+| :-------------------- | :------------------------- |
+| `setMaterial` | Sets a material in the array |
+| `getMaterial` | Gets a material from the array |
+| `getMaterials` | Gets the array of materials |
+| `getInstanceMaterial` | Gets a copy of a material from the array |
+| `getInstanceMaterials`| Gets copies of the array of materials |
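+
+For example, a sketch:
+
+```typescript
+// The shared material asset: modifications affect every renderer using it
+const sharedMaterial = renderer.getMaterial();
+// A per-renderer copy that is safe to modify independently
+const instanceMaterial = renderer.getInstanceMaterial();
+```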
diff --git a/docs/en/graphics/renderer/skinnedMeshRenderer.md b/docs/en/graphics/renderer/skinnedMeshRenderer.md
index c888e4510c..314472a2ad 100644
--- a/docs/en/graphics/renderer/skinnedMeshRenderer.md
+++ b/docs/en/graphics/renderer/skinnedMeshRenderer.md
@@ -6,11 +6,11 @@ group: Renderer
label: Graphics/Renderer
---
-The Skinned Mesh Renderer inherits from the [Mesh Renderer](/en/docs/graphics-renderer-meshRenderer) and provides additional capabilities for `skeletal animation` and `Blend Shapes`, making the animation effects of rendered objects more natural and realistic.
+The Skinned Mesh Renderer inherits from the [Mesh Renderer](/en/docs/graphics/renderer/meshRenderer/), additionally encapsulating the capabilities of `skeletal animation` and `Blend Shape`, making the animation effects of rendered objects more natural and realistic.
## Properties
-The properties of the Skinned Mesh Renderer are closely related to `skeletal animation` and `Blend Shapes`.
+The properties of the Skinned Mesh Renderer are mostly related to `skeletal animation` and `Blend Shape`.
| Setting | Description |
| :------------------ | :----------------------------- |
@@ -19,7 +19,7 @@ The properties of the Skinned Mesh Renderer are closely related to `skeletal ani
| `rootBone` | The root bone node corresponding to the Skinned Mesh Renderer |
| `blendShapeWeights` | The blend weights of BlendShapes |
-Models exported from the art workflow generally already have all the bone and BlendShape information set up. Developers only need to combine with the [animation system](/en/docs/animation-overview) to play specific animation clips.
+In models exported from the art workflow, all bone and BlendShape information is generally pre-set. Developers only need to play the specified animation clips in conjunction with the [animation system](/en/docs/animation/overview).
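+
+For example, BlendShape weights can also be driven directly from a script (a sketch, assuming `entity` hosts the skinned mesh and index 0 is a valid BlendShape):
+
+```typescript
+const skinnedMeshRenderer = entity.getComponent(SkinnedMeshRenderer);
+// Fully apply the first BlendShape target
+skinnedMeshRenderer.blendShapeWeights[0] = 1.0;
+```
+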
## Skeletal Animation
diff --git a/docs/en/graphics/shader/_meta.json b/docs/en/graphics/shader/_meta.json
new file mode 100644
index 0000000000..caef6528ad
--- /dev/null
+++ b/docs/en/graphics/shader/_meta.json
@@ -0,0 +1,9 @@
+{
+ "intro": { "title": "Shader Introduction" },
+ "class": { "title": "Shader Object" },
+ "builtins": { "title": "Built-in Shaders" },
+ "custom": { "title": "Custom Shaders" },
+ "assets": { "title": "Shader Assets" },
+ "shaderLab": { "title": "ShaderLab" },
+ "shaderAPI": { "title": "Shader API [Experimental]" }
+}
diff --git a/docs/en/graphics/shader/assets.mdx b/docs/en/graphics/shader/assets.mdx
new file mode 100644
index 0000000000..7b459597ef
--- /dev/null
+++ b/docs/en/graphics/shader/assets.mdx
@@ -0,0 +1,25 @@
+---
+title: Shader Assets
+---
+
+Shader assets define Shader objects in the Galacean editor project and currently include two types:
+
+- Shader Main File
+
+With the `.gs` suffix, it is the entry file compiled by ShaderLab.
+
+- Shader Chunk
+
+With the `.glsl` suffix, it is a reusable ShaderLab code snippet introduced via the `#include` macro.
+
+
+ `EditorProperties` and `EditorMacros` can only be declared in the Shader main file and cannot be introduced via the `#include` macro.
+
+
+## Shader Asset Creation
+
+
+
+The editor provides three Shader asset file templates: `Unlit Shader`, `PBR Shader`, and `Shader Chunk`. As with script components, selecting a Shader asset lets you preview its code in the Inspector panel; double-clicking the asset opens the code editor page, where you can edit the Shader code in real time.
+
+Go to [Shader API Tutorial](/en/docs/graphics/shader/shaderAPI/) to learn how to extend Shaders based on the templates.
diff --git a/docs/en/graphics/shader/builtins/blinnPhong.md b/docs/en/graphics/shader/builtins/blinnPhong.md
new file mode 100644
index 0000000000..3777c30808
--- /dev/null
+++ b/docs/en/graphics/shader/builtins/blinnPhong.md
@@ -0,0 +1,28 @@
+---
+title: Blinn Phong
+---
+
+[BlinnPhongMaterial](/en/apis/core/#BlinnPhongMaterial) is one of the classic materials. Although it is not physically based, its efficient rendering algorithm and comprehensive optical components make it applicable to many scenarios even today.
+
+
+
+## Editor Usage
+
+
+
+## Parameter Introduction
+
+| Parameter | Application |
+| :-- | :-- |
+| [baseColor](/en/apis/core/#BlinnPhongMaterial-baseColor) | Base color. **Base color \* Base texture = Final base color.** |
+| [baseTexture](/en/apis/core/#BlinnPhongMaterial-baseTexture) | Base texture. Used in conjunction with the base color, it is a multiplicative relationship. |
+| [specularColor](/en/apis/core/#BlinnPhongMaterial-specularColor) | Specular reflection color. **Specular reflection color \* Specular reflection texture = Final specular reflection color.** |
+| [specularTexture](/en/apis/core/#BlinnPhongMaterial-specularTexture) | Specular reflection texture. Used in conjunction with the specular reflection color, it is a multiplicative relationship. |
+| [normalTexture](/en/apis/core/#BlinnPhongMaterial-normalTexture) | Normal texture. You can set the normal texture to create a bump effect visually, and control the bump degree through normal intensity. |
+| [normalIntensity](/en/apis/core/#BlinnPhongMaterial-normalIntensity) | Normal intensity. Used to control the bump degree. |
+| [emissiveColor](/en/apis/core/#BlinnPhongMaterial-emissiveColor) | Emissive color. **Emissive color \* Emissive texture = Final emissive color. Even without lighting, it can render color.** |
+| [emissiveTexture](/en/apis/core/#BlinnPhongMaterial-emissiveTexture) | Emissive texture. Used in conjunction with the emissive color, it is a multiplicative relationship. |
+| [shininess](/en/apis/core/#BlinnPhongMaterial-shininess) | Specular reflection coefficient. The larger the value, the more concentrated the specular reflection effect. |
+| [tilingOffset](/en/apis/core/#BlinnPhongMaterial-tilingOffset) | Scaling and offset of texture coordinates. It is a Vector4 data that controls the scaling and offset of texture coordinates in the uv direction. Refer to [example](/en/embed/tiling-offset) |
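+
+For reference, a minimal script-side sketch (assuming `engine` and `renderer` already exist):
+
+```typescript
+const material = new BlinnPhongMaterial(engine);
+material.baseColor = new Color(1.0, 0.3, 0.3, 1.0);
+// A larger shininess gives a more concentrated specular highlight
+material.shininess = 64;
+renderer.setMaterial(material);
+```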
+
+If you need to use the material through scripts, please refer to the [Material Usage Tutorial](/en/docs/graphics/material/script).
diff --git a/docs/en/graphics/shader/builtins/digitalHuman/eye.mdx b/docs/en/graphics/shader/builtins/digitalHuman/eye.mdx
new file mode 100644
index 0000000000..8bab00dfd6
--- /dev/null
+++ b/docs/en/graphics/shader/builtins/digitalHuman/eye.mdx
@@ -0,0 +1,54 @@
+---
+title: Eyes
+---
+
+The eye shader provides realistic rendering for eyeball models, giving your creations a lifelike artistic effect.
+
+
+
+
+This shader is strongly tied to the model's UV layout. If you need to build eyes from scratch, this shader is not recommended. If you have no shader development experience, it is recommended to use the eyeball geometry from the official example and simply replace the necessary textures. If you need to create your own eye shader variant, refer to the [ShaderLab development tutorial](/en/docs/graphics/shader/shaderLab/).
+
+
+## Import Eye Example
+
+Galacean comes with an eyeball material example to further help you get started. To view this example, [click here](https://galacean.antgroup.com/editor/projects).
+
+1. Navigate to the editor homepage in the Galacean editor.
+2. Select the **Templates** panel, navigate to the template interface, preview, and download the eyeball example to **Project**.
+
+## Eyeball Anatomy
+
+Before starting to render eyes, familiarize yourself with the biological structure of the eyeball to better use the shader.
+
+| Parameter | Description |
+| :-------------------: | :-------------------------------------------------------: |
+| Sclera | The sclera is the opaque membrane on the outer layer of the eyeball, commonly known as the "white of the eye" |
+| Limbus | Also known as the corneoscleral junction, it is the boundary between the cornea and the sclera (white of the eye) |
+| Iris | The iris is a ring of color surrounding the pupil center, forming a hollow ring |
+| Pupil | The pupil is the black part at the center of the eyeball, allowing light to enter the eye and reach the retina |
+| Cornea | The cornea is the transparent part located at the front of the eyeball |
+
+## Eye Textures
+
+| Texture | Parameter | Description |
+| :-------------------------------------------------------------: | :------------------------: | :------------------------------------------------------------------ |
+| | Sclera Texture | Specifies the texture that controls the color of the sclera (white of the eye) and the color of the blood vessels around the eye. If you want the eyeball to be full of blood vessels, you need to modify this texture |
+| | Iris Texture | Controls the color of the iris |
+| | Iris Normal Texture | Specifies the texture to control the normals of the iris, determining the light flow on the surface of the iris |
+| | Cornea Normal Texture | Specifies the texture to control the normals of the sclera, providing subtle texture variations of the sclera (white of the eye) |
+| | Sclera Mask | Specifies a mask texture to control the size of the sclera/iris/limbus/pupil. The `R` channel controls the iris area, the `G` channel controls the limbus area, and the `B` channel controls the pupil scaling area |
+
+## Material Properties
+
+| Parameter | Description |
+| :---------------------: | :-------------------------------------: |
+| Sclera Color | Sclera texture color |
+| Sclera Size | Sclera UV size |
+| Sclera Specular | Sclera (white of the eye) metallicity |
+| Sclera Roughness | Sclera (white of the eye) roughness |
+| Pupil Dilation | Pupil size, adjustable in the xy direction |
+| Limbal Ring Amount | Intensity of the limbal ring |
+| Parallax Layer | Parallax depth |
+| Iris Color | Iris color |
+| Eye Iris Size | Iris size |
diff --git a/docs/en/graphics/shader/builtins/digitalHuman/hair.mdx b/docs/en/graphics/shader/builtins/digitalHuman/hair.mdx
new file mode 100644
index 0000000000..5f84a45cf0
--- /dev/null
+++ b/docs/en/graphics/shader/builtins/digitalHuman/hair.mdx
@@ -0,0 +1,30 @@
+---
+title: Hair
+---
+
+Hair shading is based on the Kajiya-Kay shading model, which approximates the dual-layer anisotropic highlights on the hair surface, commonly known as "angel rings" in hair rendering.
+
+
+
+## Import Example
+
+Galacean provides you with a hair example to further help you get started. To find this example, please [click](https://galacean.antgroup.com/editor/projects).
+
+1. Navigate to the editor homepage in the Galacean Editor.
+2. Select the **Templates** panel, navigate to the template interface, preview and download the hair example to **Project**.
+
+## Dual-layer Highlights
+
+
+## Material Properties
+
+| Parameter | Description |
+| :----------------: | :----------------: |
+| HairFirstWidth | Adjust the width of the first layer "angel ring highlight" of the hair |
+| HairSecondWidth | Adjust the width of the second layer "angel ring highlight" of the hair |
+| HairFirstStrength | The intensity of the first layer highlight of the hair. For example, if you need a stronger highlight effect, you can increase the value |
+| HairSecondStrength | The intensity of the second layer highlight of the hair. For example, if you need a stronger highlight effect, you can increase the value |
+| HairFirstOffest | The offset of the first layer highlight of the hair. If you want to adjust the position of the highlight, you can modify this value |
+| HairSecondOffest | The offset of the second layer highlight of the hair. If you want to adjust the position of the highlight, you can modify this value |
+| HairFirstColor | The color of the first layer highlight of the hair |
+| HairSecondColor | The color of the second layer highlight of the hair |
diff --git a/docs/en/graphics/shader/builtins/digitalHuman/skin.mdx b/docs/en/graphics/shader/builtins/digitalHuman/skin.mdx
new file mode 100644
index 0000000000..0b0f56f0f4
--- /dev/null
+++ b/docs/en/graphics/shader/builtins/digitalHuman/skin.mdx
@@ -0,0 +1,31 @@
+---
+title: Skin
+---
+
+The skin shader uses a Spherical Gaussian model, allowing flexible customization of different Diffusion Profiles to simulate human skin or general subsurface scattering effects.
+
+
+
+## Import Example
+
+Galacean provides you with a skin example to further help you get started. To find this example, please [click](https://galacean.antgroup.com/editor/projects).
+
+1. Navigate to the editor homepage in the Galacean editor.
+2. Select the **Templates** panel, navigate to the template interface, preview, and download the skin example to **Project**.
+
+## Material Properties
+
+| Parameter | Description |
+| :-------------------: | :-----------------------------: |
+| SSSColor | Adjust the skin scattering color |
+| CurvatureTexture | Curvature map, controls the scattering area |
+| CurvaturePower | Intensity of curvature, the best range is between `0-1` |
+
+## Detail Display
+The following demonstration compares the differences between PBR and skin materials.
+
diff --git a/docs/en/graphics/shader/builtins/intro.mdx b/docs/en/graphics/shader/builtins/intro.mdx
new file mode 100644
index 0000000000..0c84ee2afb
--- /dev/null
+++ b/docs/en/graphics/shader/builtins/intro.mdx
@@ -0,0 +1,25 @@
+---
+title: Introduction
+---
+
+Currently, the Galacean engine has many built-in shaders, such as
+
+| Type | Description |
+| :-- | :-- |
+| [Unlit](/en/docs/graphics/shader/builtins/unlit) | The Unlit material is suitable for rendering baked models. It only requires setting a basic texture or color to display high-quality rendering results obtained from offline rendering. However, the downside is that it cannot display real-time light and shadow interactions because Unlit rendering is determined by the texture and is not affected by any lighting. Refer to [Baking Tutorial](/en/docs/art/bake-blender) and [Export Unlit Tutorial](/en/docs/graphics/shader/builtins/unlit) |
+| [Blinn Phong](/en/docs/graphics/shader/builtins/blinnPhong) | The Blinn Phong material is suitable for scenes that do not require high realism. Although it does not follow physical principles, its efficient rendering algorithm and comprehensive optical components make it suitable for many scenes. |
+| [PBR](/en/docs/graphics/shader/builtins/pbr) | The PBR material is suitable for applications that require realistic rendering. Since PBR is based on physical rendering and follows energy conservation, developers can ensure that the rendering effects are physically correct by adjusting parameters such as metallicity, roughness, and lighting. |
+
+You can directly debug the corresponding properties of the built-in shaders in the editor to observe real-time rendering effect changes.
+
+
+
+> Correspondingly, you can achieve the same effect by setting the APIs of PBRMaterial, BlinnPhongMaterial, and UnlitMaterial materials.
+
+| Parameter | Application |
+| :-- | :-- |
+| [isTransparent](/en/apis/core/#BaseMaterial-isTransparent) | Transparency. You can set whether the material is transparent. If set to transparent, you can use [BlendMode](/en/apis/core/#BaseMaterial-blendMode) to set the color blending mode. |
+| [alphaCutoff](/en/apis/core/#BaseMaterial-alphaCutoff) | Alpha cutoff value. You can set the cutoff value. In the shader, fragments with transparency less than this value will be culled. Refer to [Example](/en/embed/blend-mode) |
+| [renderFace](/en/apis/core/#BaseMaterial-renderFace) | Render face. You can decide to render the front, back, or both sides. |
+| [blendMode](/en/apis/core/#BaseMaterial-blendMode) | Color blending mode. When the material is set to transparent, you can set this enumeration to decide the color blending mode. Refer to [Example](/en/embed/blend-mode) |
+| [tilingOffset](/en/apis/core/#BlinnPhongMaterial-tilingOffset) | Texture coordinate scaling and offset. It is a Vector4 data that controls the scaling and offset of texture coordinates in the uv direction. Refer to [Example](/en/embed/tiling-offset) |
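+
+For example, a sketch (assuming `material` derives from `BaseMaterial`; `BlendMode` and `RenderFace` come from `@galacean/engine`):
+
+```typescript
+material.isTransparent = true;
+// Additive color blending for the transparent material
+material.blendMode = BlendMode.Additive;
+// Render both front and back faces
+material.renderFace = RenderFace.Double;
+```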
diff --git a/docs/en/graphics/shader/builtins/pbr.md b/docs/en/graphics/shader/builtins/pbr.md
new file mode 100644
index 0000000000..03f5dae0dd
--- /dev/null
+++ b/docs/en/graphics/shader/builtins/pbr.md
@@ -0,0 +1,68 @@
+---
+title: PBR
+---
+
+PBR stands for **Physically Based Rendering**. First proposed by Disney in 2012 and widely adopted by the games industry since, it follows energy conservation and physical rules. Compared with traditional rendering methods such as **Blinn-Phong**, artists only need to adjust a few simple parameters to achieve correct rendering effects even in complex scenes, and [IBL](/en/docs/graphics/light/ambient) is introduced to simulate global illumination. By adjusting parameters such as metallic and roughness, it is more convenient to tweak the rendering effect.
+
+
+
+## Editor Usage
+
+Based on how light interacts with materials in the real world, dielectrics (i.e., when the metallic value is 0) still reflect about 4% of incoming light, mirroring the surrounding environment. As shown in the model below, the metallic value is 0, but the reflected surrounding environment can still be faintly seen:
+
+
+
+By increasing the metallic value of the material, we can observe that the higher the metallic value, the clearer the reflected surrounding environment becomes, and it starts to shift from white to colorful. This is because metals (i.e., when the metallic value is 1) reflect 100% of the incoming light off the surface of the object, reflecting the colorful surrounding environment:
+
+
+
+In addition, there are many common properties that can be configured, such as anisotropy, roughness, ambient occlusion, emissive light, transparency, etc.:
+
+
+
+
+
+## Parameter Introduction
+
+| Parameter | Application |
+| :-- | :-- |
+| [metallic](/en/apis/core/#PBRMaterial-metallic) | Metallic. Simulates the metallic degree of the material. The higher the metallic value, the stronger the specular reflection, which can reflect more of the surrounding environment. |
+| [roughness](/en/apis/core/#PBRMaterial-roughness) | Roughness. Simulates the roughness of the material. The higher the roughness, the less smooth the micro-surface, and the more blurred the specular reflection. |
+| [roughnessMetallicTexture](/en/apis/core/#PBRMaterial-roughnessMetallicTexture) | Metallic roughness texture. Used in conjunction with metallic roughness, it is a multiplicative relationship. |
+| [baseColor](/en/apis/core/#PBRBaseMaterial-baseColor) | Base color. **Base color** \* **Base color texture** = **Final base color**. The base color is the reflectance value of the object. Unlike traditional diffuse color, it contributes to both specular and diffuse colors. We can control the contribution ratio through the metallic and roughness mentioned above. |
+| [emissiveColor](/en/apis/core/#PBRBaseMaterial-emissiveColor) | Emissive color. Allows rendering of color even without lighting. |
+| [baseTexture](/en/apis/core/#PBRBaseMaterial-baseTexture) | Base color texture. Used in conjunction with the base color, it is a multiplicative relationship. |
+| [normalTexture](/en/apis/core/#PBRBaseMaterial-normalTexture) | Normal texture. Can set the normal texture to create a bump effect visually, and the bump degree can be controlled by the normal strength. |
+| [emissiveTexture](/en/apis/core/#PBRBaseMaterial-emissiveTexture) | Emissive texture. We can set the emissive texture and emissive color ([emissiveColor](/en/apis/core/#PBRBaseMaterial-emissiveColor)) to achieve the emissive effect, rendering color even without lighting. |
+| [occlusionTexture](/en/apis/core/#PBRBaseMaterial-occlusionTexture) | Ambient occlusion texture. We can set the ambient occlusion texture to enhance the shadow details of the object. |
+| [tilingOffset](/en/apis/core/#PBRBaseMaterial-tilingOffset) | Scaling and offset of texture coordinates. It is a Vector4 data that controls the scaling and offset of texture coordinates in the uv direction. Refer to [example](/en/embed/tiling-offset). |
+| [clearCoat](/en/apis/core/#PBRBaseMaterial-clearCoat) | Clear coat strength, default is 0, which means the clear coat effect is not enabled. Refer to [example](/en/embed/pbr-clearcoat). |
+| [clearCoatTexture](/en/apis/core/#PBRBaseMaterial-clearCoatTexture) | Clear coat strength texture, which is a multiplicative relationship with clearCoat. |
+| [clearCoatRoughness](/en/apis/core/#PBRBaseMaterial-clearCoatRoughness) | Clear coat roughness. |
+| [clearCoatRoughnessTexture](/en/apis/core/#PBRBaseMaterial-clearCoatRoughnessTexture) | Clear coat roughness texture, which is a multiplicative relationship with clearCoatRoughness. |
+| [clearCoatNormalTexture](/en/apis/core/#PBRBaseMaterial-clearCoatNormalTexture) | Clear coat normal texture. If not set, it will share the normal of the original material. |
+
+In addition to the above general parameters, PBR provides two workflows, **metallic-roughness** and **specular-glossiness**, corresponding to [PBRMaterial](/en/apis/core/#PBRMaterial) and [PBRSpecularMaterial](/en/apis/core/#PBRSpecularMaterial) respectively.
+
+### PBRMaterial
+
+| Parameter | Application |
+| :-- | :-- |
+| [metallic](/en/apis/core/#PBRMaterial-metallic) | Metallic. Simulates how metallic the material is. The higher the metallic value, the stronger the specular reflection, so more of the surrounding environment is reflected. |
+| [roughness](/en/apis/core/#PBRMaterial-roughness) | Roughness. Simulates how rough the material is. The higher the roughness, the less flat the micro-surface and the blurrier the specular reflection. |
+| [roughnessMetallicTexture](/en/apis/core/#PBRMaterial-roughnessMetallicTexture) | Metallic-roughness texture. Used together with metallic and roughness; the relationship is multiplicative. |
+| [anisotropy](/en/apis/core/#PBRMaterial-anisotropy) | Anisotropy strength. Defaults to 0, which disables anisotropy. Refer to the [example](/en/embed/pbr-anisotropy). |
+| [anisotropyRotation](/en/apis/core/#PBRMaterial-anisotropyRotation) | Anisotropy rotation angle. Rotates by the given angle in tangent/bitangent space. |
+| [anisotropyTexture](/en/apis/core/#PBRMaterial-anisotropyTexture) | Anisotropy texture. The RG channels store the anisotropy direction, which is multiplied with the result of anisotropyRotation; the B channel stores the anisotropy strength, which is multiplied with anisotropy. |
+
+### PBRSpecularMaterial
+
+| Parameter | Application |
+| :-- | :-- |
+| [specularColor](/en/apis/core/#PBRSpecularMaterial-specularColor) | Specular color. Unlike the metallic-roughness workflow, which derives the specular reflection from the metallic value and base color, this directly uses a specular color to express the specular reflection color. (Note: it only takes effect when the metallic-roughness workflow is disabled.) |
+| [glossiness](/en/apis/core/#PBRSpecularMaterial-glossiness) | Glossiness. Simulates smoothness; the opposite of roughness. (Note: it only takes effect when the metallic-roughness workflow is disabled.) |
+| [specularGlossinessTexture](/en/apis/core/#PBRSpecularMaterial-specularGlossinessTexture) | Specular-glossiness texture. Used together with specular color and glossiness; the relationship is multiplicative. |
+
+> **Note**: PBR requires [ambient light](/en/docs/graphics/light/ambient) to be enabled.
+
+If you need to use the material through scripts, refer to the [material usage tutorial](/en/docs/graphics/material/script).
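+
+As a quick reference, a minimal metallic-roughness sketch (assuming `engine` and `renderer` already exist):
+
+```typescript
+const material = new PBRMaterial(engine);
+// A smooth metal: strong, sharp reflections of the environment
+material.metallic = 1.0;
+material.roughness = 0.1;
+renderer.setMaterial(material);
+```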
diff --git a/docs/en/graphics/shader/builtins/thin.mdx b/docs/en/graphics/shader/builtins/thin.mdx
new file mode 100644
index 0000000000..7c4b81e90b
--- /dev/null
+++ b/docs/en/graphics/shader/builtins/thin.mdx
@@ -0,0 +1,35 @@
+---
+title: Thin Film Interference
+---
+
+Thin film interference is the phenomenon where certain surfaces gradually change color as the viewing or lighting angle changes. It is caused by the interference of light waves within microstructures or thin films, and is commonly seen in soap bubbles, feathers, butterfly wings, and the nacre of shells; the thin film interference shader reproduces this effect.
+
+
+
+
+For thin film interference materials, the color depends on the angle of incidence of the light. Illumination can produce a smooth color gradient, but a low-poly model will not show one, because each face reflects light at a different angle.
+
+
+## Import Example
+
+Galacean provides you with a thin film interference example to help you get started. To find this example, please [click](https://galacean.antgroup.com/editor/projects).
+
+1. Navigate to the editor homepage in the Galacean editor.
+2. Select the **Templates** panel, navigate to the template interface, preview, and download the thin film interference example to **Project**.
+
+## Material Properties
+
+| Parameter | Description |
+| :-----------------------: | :-----------------------------------------------------------------: |
+| iridescent ior | This refractive index value determines the degree of light bending. For thin film interference, it affects the color of the resulting light. |
+| iridescence | Controls the intensity of the iridescent color. `1` corresponds to the highest intensity, `0` will only have the PBR effect. |
+| iridescence Thickness | Used to control the thickness of the iridescence, determining the final layers of thin film interference. |
+
+## Detail Display
+The following demonstration compares the differences when only adjusting the iridescent ior with the same PBR properties.
+
diff --git a/docs/en/graphics/shader/builtins/unlit.md b/docs/en/graphics/shader/builtins/unlit.md
new file mode 100644
index 0000000000..416d5120ff
--- /dev/null
+++ b/docs/en/graphics/shader/builtins/unlit.md
@@ -0,0 +1,65 @@
+---
+title: Unlit
+---
+
+In some simple scenes, you might not want to calculate lighting. The engine provides [UnlitMaterial](/en/apis/core/#UnlitMaterial), which uses the most streamlined shader code and only requires color or texture to render. Unlit material is suitable for rendering baked models. It only needs a basic texture or color to display the high-quality rendering results obtained from offline rendering. However, the downside is that it cannot display real-time light and shadow interactions because Unlit is determined by the texture and is not affected by any lighting.
+
+
+
+## Editor Usage
+
+
+
+## Parameter Introduction
+
+| Parameter | Application |
+| :-- | :-- |
+| [baseColor](/en/apis/core/#UnlitMaterial-baseColor) | Base color. **Base color \* Base color texture = Final color.** |
+| [baseTexture](/en/apis/core/#UnlitMaterial-baseTexture) | Base texture. Used in conjunction with the base color, it is a multiplicative relationship. |
+| [tilingOffset](/en/apis/core/#UnlitMaterial-tilingOffset) | Scaling and offset of texture coordinates. It is a Vector4 data that controls the scaling and offset of texture coordinates in the uv direction. Refer to [example](/en/embed/tiling-offset) |
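+
+For reference, a minimal script-side sketch (assuming `engine`, `renderer`, and a loaded `texture` already exist):
+
+```typescript
+const material = new UnlitMaterial(engine);
+material.baseColor = new Color(1.0, 0.8, 0.2, 1.0);
+// Final color = baseColor * baseTexture, unaffected by any lighting
+material.baseTexture = texture;
+renderer.setMaterial(material);
+```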
+
+If you need to use the material through a script, you can go to the [material usage tutorial](/en/docs/graphics/material/script).
+
+## Export Unlit Material from Blender
+
+As introduced in the [baking tutorial](/en/docs/art/bake-blender), if we have already created the baked map and want a **convenient material** where the color is only influenced by the baked texture, without adding lights, debugging normals, or adjusting advanced properties like metallic roughness, then you can try Galacean's [UnlitMaterial](/en/apis/core/#UnlitMaterial). glTF has a dedicated [KHR_materials_unlit](https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos/KHR_materials_unlit) plugin, which Galacean will parse to generate Unlit material.
+
+
+
+Test model: [TREX.zip](https://www.yuque.com/attachments/yuque/0/2021/zip/381718/1623651429048-7f6a3610-d5cb-4a73-97f5-0d37d0c63b2c.zip?_lake_card=%7B%22src%22%3A%22https%3A%2F%2Fwww.yuque.com%2Fattachments%2Fyuque%2F0%2F2021%2Fzip%2F381718%2F1623651429048-7f6a3610-d5cb-4a73-97f5-0d37d0c63b2c.zip%22%2C%22name%22%3A%22TREX.zip%22%2C%22size%22%3A499161%2C%22type%22%3A%22application%2Fx-zip-compressed%22%2C%22ext%22%3A%22zip%22%2C%22status%22%3A%22done%22%2C%22taskId%22%3A%22u458bcbec-d647-4328-8036-3d5eb12860f%22%2C%22taskType%22%3A%22upload%22%2C%22id%22%3A%22ua8a5baad%22%2C%22card%22%3A%22file%22%7D)
+
+Next, we will introduce how to export a glTF file with the unlit plugin using Blender software.
+
+1. Import the model
+
+
+
+2. Modify Shader
+
+The default shader type is BSDF. We need to change the shader type in the surface material properties to **Background**.
+
+
+
+
+
+3. Add Baked Texture
+
+Add the baked texture and connect Color and Shader together.
+
+
+
+
+
+
+
+4. Export glTF
+
+If the preview is normal, export the glTF.
+
+
+
+
+
+Drag the exported glTF file into the editor or [glTF Viewer](https://galacean.antgroup.com/engine/gltf-viewer). If the material type is **UnlitMaterial**, it means that the [KHR_materials_unlit](https://github.com/KhronosGroup/glTF/tree/master/extensions/2.0/Khronos/KHR_materials_unlit) extension has been exported, and Galacean has parsed it into an Unlit material.
+
+
diff --git a/docs/en/graphics/shader/class.mdx b/docs/en/graphics/shader/class.mdx
new file mode 100644
index 0000000000..f55a2d1f9a
--- /dev/null
+++ b/docs/en/graphics/shader/class.mdx
@@ -0,0 +1,54 @@
+---
+title: Shader
+---
+
+In the [Shader Introduction](/en/docs/graphics/shader/intro/), we learned the basic concepts of shaders. The Galacean engine encapsulates other rendering-related information based on shader programs to form a Shader object. The Shader object, together with the [material](/en/docs/graphics/material/material/), determines the final rendering result of the rendered object.
+
+## Shader
+
+The Shader object contains the following:
+
+- name
+
+The name of the Shader instance. Once created, the Shader object is cached by the engine and can be reused by finding it through `Shader.find()` using `Shader.name` as the index. Therefore, the engine requires that the name attribute of the Shader must be unique within a single engine instance.
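+
+For example (assuming a Shader named `custom-water` was created earlier):
+
+```typescript
+// `Shader.name` is the cache key, so lookups reuse the cached object
+const shader = Shader.find("custom-water");
+```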
+
+- SubShader
+
+A Shader object contains at least one SubShader object.
+
+## SubShader
+
+A shader program may run on different GPU hardware platforms and under different rendering pipelines. The Shader object therefore uses key-value tag pairs to specify separate rendering logic for each hardware platform and rendering pipeline; each such specification is a SubShader object.
+
+The SubShader contains the following:
+
+- Tags
+
+Tags are key-value pairs consumed by the rendering pipeline, usually used to specify information such as the hardware platforms and rendering pipelines compatible with the Shader. The current engine's built-in pipeline consumes the following Tags:
+
+| Key | value |
+| ------------- | ------------ |
+| pipelineStage | Forward |
+| pipelineStage | DepthOnly |
+| pipelineStage | ShadowCaster |
+
+Additionally, you can specify the `replacementTag` of the rendering pipeline through [Camera.setReplacementShader](/en/apis/galacean/#Camera-setReplacementShader) to replace the `SubShader` that carries the specified `Tag`.
+
+Tags can be specified through [SubShader.setTag](/en/apis/galacean/#SubShader-setTag) and [ShaderPass.setTag](/en/apis/galacean/#ShaderPass-setTag). For details on specifying Tags in ShaderLab, see the [documentation](/en/docs/graphics/shader/shaderLab/syntax/subShader/#tags).
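+
+A sketch of the replacement flow (the exact signature is an assumption; check the API reference):
+
+```typescript
+// Render with SubShaders whose "pipelineStage" tag matches (assumed signature)
+camera.setReplacementShader(replacementShader, "pipelineStage");
+```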
+
+- Passes
+
+A SubShader contains at least one ShaderPass object.
+
+- RenderStates
+
+[RenderState rendering states](/en/docs/graphics/material/composition/#渲染状态) shared by all ShaderPasses under the SubShader.
+
+## ShaderPass
+
+ShaderPass encapsulates the specific shader program and the [rendering state](/en/docs/graphics/material/composition/#渲染状态) when performing the final rendering.
+
+
+ The ShaderPasses under the same SubShader are rendered sequentially according to the array order, with the rendering effects gradually superimposed to form the final rendering result of the Shader in the current frame.
+
diff --git a/docs/en/graphics/shader/custom.md b/docs/en/graphics/shader/custom.md
index 63c7581595..9db20c4f70 100644
--- a/docs/en/graphics/shader/custom.md
+++ b/docs/en/graphics/shader/custom.md
@@ -1,18 +1,14 @@
---
-order: 4
title: Custom Shaders
-type: Shader
-group: Graphics
-label: Graphics/Shader
---
-In some cases, there may be special rendering requirements in the business, such as water flow effects, which require the use of **custom shaders** to achieve.
+There may be some special rendering requirements in the business, such as water flow effects, which need to be implemented through **custom shaders** (Shader).
## Creating Shaders
-The [Shader class](/apis/core/#Shader) encapsulates vertex shaders, fragment shaders, shader precompilation, platform precision, and platform differences. Its creation and usage are very convenient, allowing users to focus on the shader algorithm itself without worrying about precision or which version of GLSL to use. Here is a simple demo:
+The [Shader class](/en/apis/core/#Shader) encapsulates vertex shaders, fragment shaders, shader precompilation, platform precision, and platform differences. Its creation and use are very convenient, and users only need to focus on the shader algorithm itself without worrying about what precision to use or which version of GLSL to write. Here is a simple demo:
```javascript
import { Material, Shader, Color } from "@galacean/engine";
@@ -43,84 +39,85 @@ Shader.create("demo", vertexSource, fragmentSource);
const material = new Material(engine, Shader.find("demo"));
```
-The `Shader.create()` method is used to add the shader to the engine's cache pool, so it only needs to be created once for the entire runtime. After that, it can be reused using [Shader.find(name)](/apis/core/#Shader-find).
+`Shader.create()` is used to add the shader to the engine's cache pool, so it only needs to be created once during the entire runtime. After that, it can be repeatedly used through [Shader.find(name)](/en/apis/galacean/#Shader-find).
-> Note: The engine has already pre-created shaders such as blinn-phong, pbr, shadow-map, shadow, skybox, framebuffer-picker-color, and trail. Users can directly use these built-in shaders and cannot create shaders with the same name.
+> Note: The engine has already pre-created blinn-phong, pbr, shadow-map, shadow, skybox, framebuffer-picker-color, and trail shaders. Users can directly use these built-in shaders and cannot create them with the same name.
-In the above example, because we did not upload the `u_color` variable, the fragment output is still black (the default value of the uniform). Next, we will introduce the built-in shader variables in the engine and how to upload custom variables.
+In the above example, because we did not upload the `u_color` variable, the fragment output is still black (the default value of the uniform). Next, we will introduce the built-in shader variables of the engine and how to upload custom variables.
## Built-in Shader Variables
-In the example above, we assigned a shader to the material, and at this point, the program can start rendering.
+Above, we assigned a shader to the material, and the program can start rendering at this point.
-> It is important to note that there are two types of variables in shader code: **per-vertex** `attribute` variables and **per-shader** `uniform` variables. (After GLSL 300, they are unified as in variables)
+> It should be noted that there are two types of variables in shader code: **per-vertex** `attribute` variables and **per-shader** `uniform` variables. (Since GLSL 300, both are unified as `in` variables.)
-The engine has automatically uploaded some commonly used variables that users can directly use in shader code. Here are the variables that the engine defaults to upload.
+The engine has automatically uploaded some commonly used variables, which users can directly use in the shader code, such as vertex data and MVP data. Below are the variables uploaded by default by the engine.
### Vertex Inputs
-| Per-Vertex Data | Attribute Name | Data Type |
-| :-------------- | :------------- | :-------- |
-| Position | POSITION | vec3 |
-| Normal | NORMAL | vec3 |
-| Tangent | TANGENT | vec4 |
-| Vertex Color | COLOR_0 | vec4 |
-| Joint Indices | JOINTS_0 | vec4 |
-| Joint Weights | WEIGHTS_0 | vec4 |
-| Texture Coord 1 | TEXCOORD_0 | vec2 |
-| Texture Coord 2 | TEXCOORD_1 | vec2 |
+| Per-vertex Data | Attribute Name | Data Type |
+| :--------------- | :------------- | :-------- |
+| Position | POSITION | vec3 |
+| Normal | NORMAL | vec3 |
+| Tangent | TANGENT | vec4 |
+| Vertex Color | COLOR_0 | vec4 |
+| Bone Index | JOINTS_0 | vec4 |
+| Bone Weight | WEIGHTS_0 | vec4 |
+| First Texture Coord | TEXCOORD_0 | vec2 |
+| Second Texture Coord | TEXCOORD_1 | vec2 |
-### Attributes
+### Properties
#### Renderer
-| Name | Type | Description |
-| :---------------- | :--- | --------------------- |
-| renderer_LocalMat | mat4 | Model local matrix |
-| renderer_ModelMat | mat4 | Model world matrix |
-| renderer_MVMat | mat4 | Model view matrix |
-| renderer_MVPMat | mat4 | Model view projection matrix |
-| renderer_NormalMat| mat4 | Normal matrix |
+| Name | Type | Description |
+| :----------------- | :--- | ----------------------- |
+| renderer_LocalMat | mat4 | Model local coordinate matrix |
+| renderer_ModelMat | mat4 | Model world coordinate matrix |
+| renderer_MVMat | mat4 | Model view matrix |
+| renderer_MVPMat | mat4 | Model view projection matrix |
+| renderer_NormalMat | mat4 | Normal matrix |
#### Camera
-| Name | Type | Description |
-| :-------------------- | :-------- | ------------------------------------------------------------------ |
-| camera_ViewMat | mat4 | View matrix |
-| camera_ProjMat | mat4 | Projection matrix |
-| camera_VPMat | mat4 | View projection matrix |
-| camera_ViewInvMat | mat4 | View inverse matrix |
-| camera_Position | vec3 | Camera position |
-| camera_DepthTexture | sampler2D | Depth information texture |
-| camera_DepthBufferParams | Vec4 | Camera depth buffer parameters: (x: 1.0 - far / near, y: far / near, z: 0, w: 0) |
+| Name | Type | Description |
+| :----------------------- | :-------- | ---------------------------------------------------------------------- |
+| camera_ViewMat | mat4 | View matrix |
+| camera_ProjMat | mat4 | Projection matrix |
+| camera_VPMat | mat4 | View projection matrix |
+| camera_ViewInvMat | mat4 | Inverse view matrix |
+| camera_Position | vec3 | Camera position |
+| camera_DepthTexture | sampler2D | Depth information texture |
+| camera_DepthBufferParams | vec4 | Camera depth buffer parameters: (x: 1.0 - far / near, y: far / near, z: 0, w: 0) |
+| camera_ProjectionParams | vec4 | Projection matrix parameters: (x: flipProjection ? -1 : 1, y: near, z: far, w: 0) |
#### Time
-| Name | Type | Description |
-| :--------------- | :--- | :-------------------------------------------------------- |
-| scene_ElapsedTime | vec4 | Total time elapsed since the engine started: (x: t, y: sin(t), z:cos(t), w: 0) |
-| scene_DeltaTime | vec4 | Time interval from the previous frame: (x: dt, y: 0, z:0, w: 0) |
+| Name | Type | Description |
+| :---------------- | :--- | :--------------------------------------------------------------- |
+| scene_ElapsedTime | vec4 | Total time elapsed since the engine started: (x: t, y: sin(t), z: cos(t), w: 0) |
+| scene_DeltaTime | vec4 | Time interval since the last frame: (x: dt, y: 0, z: 0, w: 0) |
#### Fog
-| Name | Type | Description |
-| :------------- | :--- | :-------------------------------------------------------------------------------------------- |
-| scene_FogColor | vec4 | Color of the fog |
-| scene_FogParams| vec4 | Fog parameters: (x: -1/(end-start), y: end/(end-start), z: density / ln(2), w: density / sqr(ln(2)) |
+| Name | Type | Description |
+| :-- | :-- | :-- |
+| scene_FogColor | vec4 | Color of the fog |
+| scene_FogParams | vec4 | Fog parameters: (x: -1/(end-start), y: end/(end-start), z: density / ln(2), w: density / sqr(ln(2))) |
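+
+To make the packed layout concrete, here is a sketch (illustrative variable names, not engine API) of how these parameters could be computed on the CPU, mirroring the table above:
+
+```typescript
+const start = 10; // linear fog start distance
+const end = 100; // linear fog end distance
+const density = 0.05; // exponential fog density
+
+const sceneFogParams = [
+  -1 / (end - start), // x
+  end / (end - start), // y
+  density / Math.log(2), // z
+  density / Math.log(2) ** 2, // w: density / sqr(ln(2)), as documented above
+];
+// With this layout, a linear fog factor is clamp(distance * x + y, 0, 1).
+```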
-## Uploading Shader Data
+## Upload Shader Data
-> For uploading per-vertex data, refer to [Mesh Renderer](/en/docs/graphics-mesh-modelMesh), which will not be discussed here.
+> For uploading per-vertex data, please refer to [Mesh Renderer](/en/docs/graphics/mesh/modelMesh), which will not be repeated here.
-In addition to built-in variables, we can upload any custom-named variables in the shader. All we need to do is to use the correct interface based on the shader data type. The interfaces for uploading are all stored in [ShaderData](/apis/core/#ShaderData), and instances of shaderData are respectively stored in the four major classes of the engine: [Scene](/apis/core/#Scene), [Camera](/apis/core/#Camera), [Renderer](/apis/core/#Renderer), and [Material](/apis/core/#Material). By calling the interfaces in these shaderData instances to upload variables, the engine will automatically assemble these data at the lower level and perform optimizations such as duplicate checking for performance.
+In addition to built-in variables, we can upload any custom-named variables in the shader. All we need to do is use the correct interface according to the shader data type. All upload interfaces are stored in [ShaderData](/en/apis/core/#ShaderData), and the shaderData instance objects are stored in the engine's four main classes: [Scene](/en/apis/core/#Scene), [Camera](/en/apis/core/#Camera), [Renderer](/en/apis/core/#Renderer), and [Material](/en/apis/core/#Material). We just need to call the interfaces on these shaderData to upload variables, and the engine will automatically assemble these data at the underlying level and perform optimizations such as redundancy checks.

### Benefits of Separating Shader Data
-Shader data is separately stored in the four major classes of the engine: [Scene](/apis/core/#Scene), [Camera](/apis/core/#Camera), [Renderer](/apis/core/#Renderer), and [Material](/apis/core/#Material). One of the benefits of this approach is that the lower level can upload a certain uniform based on the timing of the upload to improve performance. Additionally, by separating shader data that is independent of materials, shared materials can be achieved. For example, two renderers sharing the same material, although both manipulate the same shader, the part of the shader data upload comes from the shaderData of the two renderers, so it will not affect each other's rendering results.
+Shader data is stored separately in the engine's four main classes: [Scene](/en/apis/core/#Scene), [Camera](/en/apis/core/#Camera), [Renderer](/en/apis/core/#Renderer), and [Material](/en/apis/core/#Material). One of the benefits of this approach is that the underlying layer can upload a specific block of uniform data based on the upload timing, improving performance. Additionally, separating material-independent shader data allows for shared materials. For example, two renderers sharing one material can both manipulate the same shader without affecting each other's rendering results because this part of the shader data upload comes from the shaderData of the two renderers.
-For example:
+Example:
```typescript
const renderer1ShaderData = renderer1.shaderData;
@@ -134,22 +131,22 @@ renderer2ShaderData.setFloat("u_progross", 0.8);
### Calling Interfaces
-The types of shader data and the corresponding APIs to call are as follows:
+The types of shader data and their respective API calls are as follows:
-| Shader Type | ShaderData API |
-| :----------------------------------------------------------------------------------------- | :---------------------------------- |
-| `bool`, `int` | setInt( value: number ) |
-| `float` | setFloat( value: number ) |
-| `bvec2`, `ivec2`, `vec2` | setVector2( value:Vector2 ) |
-| `bvec3`, `ivec3`, `vec3` | setVector3( value:Vector3 ) |
-| `bvec4`, `ivec4`, `vec4` | setVector4( value:Vector4 ) |
-| `mat4` | setMatrix( value:Matrix ) |
-| `float[]`, `vec2[]`, `vec3[]`, `vec4[]`, `mat4[]` | setFloatArray( value:Float32Array ) |
-| `bool[]`, `int[]`, `bvec2[]`, `bvec3[]`, `bvec4[]`, `ivec2[]`, `ivec3[]`, `ivec4[]` | setIntArray( value:Int32Array ) |
-| `sampler2D`, `samplerCube` | setTexture( value:Texture ) |
-| `sampler2D[]`, `samplerCube[]` | setTextureArray( value:Texture[] ) |
+| Shader Type | ShaderData API |
+| :-- | :-- |
+| `bool`, `int` | setInt(value: number) |
+| `float` | setFloat(value: number) |
+| `bvec2`, `ivec2`, `vec2` | setVector2(value: Vector2) |
+| `bvec3`, `ivec3`, `vec3` | setVector3(value: Vector3) |
+| `bvec4`, `ivec4`, `vec4` | setVector4(value: Vector4) |
+| `mat4` | setMatrix(value: Matrix) |
+| `float[]`, `vec2[]`, `vec3[]`, `vec4[]`, `mat4[]` | setFloatArray(value: Float32Array) |
+| `bool[]`, `int[]`, `bvec2[]`, `bvec3[]`, `bvec4[]`, `ivec2[]`, `ivec3[]`, `ivec4[]` | setIntArray(value: Int32Array) |
+| `sampler2D`, `samplerCube` | setTexture(value: Texture) |
+| `sampler2D[]`, `samplerCube[]` | setTextureArray(value: Texture[]) |
-The translated text is as follows:
+The code demonstration is as follows:
```glsl
// shader
@@ -190,13 +187,13 @@ shaderData.setTexture("u_samplerCube", textureCube);
shaderData.setTextureArray("u_samplerArray", [texture2D, textureCube]);
```
-> **Note**: For performance reasons, the engine currently does not support uploading struct arrays or individual array elements.
+> **Note**: For performance considerations, the engine does not currently support struct array uploads or individual element uploads of arrays.
-### Macro Switch
+### Macro Switches
-In addition to uniform variables, the engine also considers [macro definitions](https://www.wikiwand.com/en/OpenGL_Shading_Language) in shaders as variables. Enabling or disabling macro definitions will generate different shader variants and affect rendering results.
+In addition to uniform variables, the engine also treats [macro definitions](https://www.wikiwand.com/en/OpenGL_Shading_Language) in shaders as a type of variable. This is because enabling/disabling macro definitions will generate different shader variants, which will also affect the rendering results.
-If there are macro-related operations in the shader:
+For example, if there are these macro-related operations in the shader:
```glsl
#ifdef DISCARD
@@ -208,7 +205,7 @@ If there are macro-related operations in the shader:
#endif
```
-You can control macro variables using [ShaderData](/apis/core/#Shader-enableMacro):
+They are also controlled through [ShaderData](/en/apis/core/#Shader-enableMacro):
```typescript
 // Enable the macro switch
@@ -228,17 +225,10 @@ shaderData.disableMacro("LIGHT_COUNT");
## Encapsulating Custom Materials
-This section provides users with a simple encapsulation example based on all the previous content. We hope it will be helpful to you:
+This section combines all the content above to provide users with a simple encapsulation example. We hope it will be helpful to you:
```typescript
-import {
- Material,
- Shader,
- Color,
- Texture2D,
- BlendFactor,
- RenderQueueType,
-} from "@galacean/engine";
+import { Material, Shader, Color, Texture2D, BlendFactor, RenderQueueType } from "@galacean/engine";
 //-- Shader code
const vertexSource = `
@@ -296,10 +286,8 @@ export class CustomMaterial extends Material {
const depthState = this.renderState.depthState;
target.enabled = true;
- target.sourceColorBlendFactor = target.sourceAlphaBlendFactor =
- BlendFactor.SourceAlpha;
- target.destinationColorBlendFactor = target.destinationAlphaBlendFactor =
- BlendFactor.OneMinusSourceAlpha;
+ target.sourceColorBlendFactor = target.sourceAlphaBlendFactor = BlendFactor.SourceAlpha;
+ target.destinationColorBlendFactor = target.destinationAlphaBlendFactor = BlendFactor.OneMinusSourceAlpha;
depthState.writeEnabled = false;
this.renderQueueType = RenderQueueType.Transparent;
}
diff --git a/docs/en/graphics/shader/intro.mdx b/docs/en/graphics/shader/intro.mdx
new file mode 100644
index 0000000000..f77c87ffbb
--- /dev/null
+++ b/docs/en/graphics/shader/intro.mdx
@@ -0,0 +1,24 @@
+
+
+## Introduction to Shaders {#examples}
+
+A shader is a program that runs on the GPU, typically consisting of two "entry functions" known as the Vertex Shader and the Fragment Shader, corresponding to two different stages of the rendering pipeline. Below is a simplified illustration of the engine's rendering process (rendering pipeline), focusing on the shader part.
+
+```mermaid
+flowchart LR
+  A[CPU Application Layer] -->|Prepare vertex data and other\ndata needed for rendering| B{Vertex Shader}
+  subgraph Shader
+  B --> |Compute the vertex position in 3D space\nand other data needed by later stages| C{Fragment Shader}
+  end
+  C -->|Compute the final pixel values from the\nresults of the previous stage| D[Frame Image]
+```
+
+Simplified Rendering Pipeline
+
+In this chapter, we will cover the following topics:
+
+|Section|Content|
+|:--:|:--:|
+|[Shader Object](/en/docs/graphics/shader/class/)|Overview and basic usage of the Shader object in the engine|
+|[Built-in Shaders](/en/docs/graphics/shader/builtins/intro/)|Common built-in shaders in the engine|
+|[Shader Assets](/en/docs/graphics/shader/assets/)|How to create and modify Shader assets in the editor|
+|[ShaderLab](/en/docs/graphics/shader/shaderLab/intro/)|A more convenient way to create Shaders|
diff --git a/docs/en/graphics/shader/shaderAPI.mdx b/docs/en/graphics/shader/shaderAPI.mdx
new file mode 100644
index 0000000000..a947e039f5
--- /dev/null
+++ b/docs/en/graphics/shader/shaderAPI.mdx
@@ -0,0 +1,357 @@
+---
+title: Shader API (Experimental)
+---
+
+
+ This version is experimental and currently available only in the `editor`. To use it in `Pro Code`, you need to import the `@galacean/engine-toolkit` package. Please note that the API may change in the next version; we will announce any changes promptly.
+
+
+Similar to functions, classes, and properties in TypeScript, shader code also has its own set of APIs. This article helps you write your own Shader based on these APIs and the `ShaderLab` syntax.
+
+## Getting Started
+
+Let's start with the `Unlit template` to briefly introduce our Shader API. First, create an Unlit Shader as shown in the figure below:
+
+
+
+The default Unlit template already has built-in skinning calculations and Shadow Pass, so you can see that skeletal animation and shadows are rendered correctly:
+
+
+
+The key code is as follows. By calling `UsePass "pbr/Default/ShadowCaster"`, the object can be rendered to the Shadowmap; by using the `getSkinMatrix` API, you can get the bone matrix to animate the object.
+
+```ts showLineNumbers {1,11-15} /getSkinMatrix/
+UsePass "pbr/Default/ShadowCaster"
+
+Pass "Example" {
+ #include "Skin.glsl"
+
+ Varyings vert(Attributes attr) {
+ Varyings v;
+
+ vec4 position = vec4(attr.POSITION, 1.0);
+
+ // Skin
+ #ifdef RENDERER_HAS_SKIN
+ mat4 skinMatrix = getSkinMatrix(attr);
+ position = skinMatrix * position;
+ #endif
+
+ gl_Position = renderer_MVPMat * position;
+ v.uv = attr.TEXCOORD_0;
+
+ return v;
+ }
+}
+```
+
+The Unlit Shader is not affected by lighting by default. We can call the API provided by `Light.glsl` to make the Shader's output affected by lighting:
+
+```ts showLineNumbers {1,4} /getDirectLight/
+#include "Light.glsl"
+
+// For this demo, we only compute the first directional light.
+DirectLight light = getDirectLight(0);
+// Attenuation factor: the more perpendicularly the light hits the surface, the brighter it is
+float dotNL = saturate( dot( v.normalWS, -light.direction ) );
+baseColor.rgb *= dotNL * light.color;
+```
+
+
+
+Beyond this, you can also perform vertex color calculations, normal map calculations, [ambient light](/en/docs/graphics/light/ambient) calculations, and other operations. However, we do not recommend doing these on top of the `Unlit template`: the `PBR template` already has these calculations built in, provides a more comprehensive lighting model (anisotropy, Clear Coat, etc.), and offers function overload macros for quick extension.
+
+## PBR Template
+
+We create a new `PBR Shader template` and bind it to the material from before. You can see that the material panel now has built-in configurations for base properties, metallic-roughness, anisotropy, normal, emission, shadow occlusion, Clear Coat, and more, and the material is affected by both direct light and ambient light:
+
+
+
+
+
+
+
+Next, we refer to the `thin film interference` algorithm to see how to overload the implementation of the lighting model:
+
+1. First, create a `DemoPass.glsl` and import it into the main Shader file just now:
+
+```ts showLineNumbers {7-8}
+// PBRShader.gs
+SubShader "Default" {
+ Pass "Forward Pass" {
+ VertexShader = PBRVertex;
+ FragmentShader = PBRFragment;
+
+ // #include "ForwardPassPBR.glsl"
+ #include "./DemoPass.glsl"
+ }
+}
+```
+
+2. Modify the lighting model in `DemoPass.glsl`. As a demo, we only demonstrate modifying the direct light part:
+
+
+ The editor currently does not support displaying the contents of `ForwardPassPBR.glsl` because it is in another repository. However, you can copy the `/Internal/Shader/Advanced/IridescenceForwardPass` file for modification.
+
+
+```ts showLineNumbers {7-8}
+// DemoPass.glsl
+#include "Common.glsl"
+#include "Fog.glsl"
+
+#include "AttributesPBR.glsl"
+#include "VaryingsPBR.glsl"
+// #include "LightDirectPBR.glsl"
+#include "DemoLight.glsl"
+
+#include "LightIndirectPBR.glsl"
+
+#include "VertexPBR.glsl"
+#include "FragmentPBR.glsl"
+```
+
+3. Use the `FUNCTION_SPECULAR_LOBE` macro to override the `direct light specular reflection lighting model`. The algorithm here is copied from the thin-film interference implementation, so you don't need to study it closely. Once the function is defined, `LightDirectPBR.glsl` recognizes it and substitutes it for the default lighting model implementation. Corresponding override macros are provided for the key functions of both direct and indirect light, and they are introduced in detail in the API section below:
+
+```ts showLineNumbers {2,5,16} /specularLobe_iridescence/
+// DemoLight.glsl
+#define FUNCTION_SPECULAR_LOBE specularLobe_iridescence
+
+#include "BRDF.glsl"
+#include "./IridescenceFunction.glsl"
+
+void specularLobe_iridescence(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 attenuationIrradiance, inout vec3 specularColor){
+
+ vec3 thin = DirectBDRFIridescence(surfaceData, incidentDirection, brdfData);
+ vec3 BRDF_Specular = BRDF_Specular_GGX( incidentDirection, surfaceData, surfaceData.normal, brdfData.specularColor, brdfData.roughness);
+  vec3 factor = mix(BRDF_Specular, thin, material_Iridescence);
+
+ specularColor += attenuationIrradiance * factor;
+}
+
+#include "LightDirectPBR.glsl"
+```
+
+
+
+## General API
+
+The API call method is as follows:
+
+```glsl
+#include "Common.glsl"
+
+float f2 = pow2(0.5);
+```
+
+### Common
+
+Provides common macros like `PI`, and general methods like `gammaToLinear`, `pow2`, etc. See [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/Common.glsl) for details.
+
+### Fog
+
+Provides depth fogging methods:
+
+```glsl
+vec4 fog(vec4 color, vec3 positionVS);
+```
+
+### Transform
+
+Provides transform-related [system variables](/en/docs/graphics/shader/custom/#属性) for model space, view space, world space, camera position, and so on:
+
+```glsl
+mat4 renderer_LocalMat;
+mat4 renderer_ModelMat;
+mat4 camera_ViewMat;
+mat4 camera_ProjMat;
+mat4 renderer_MVMat;
+mat4 renderer_MVPMat;
+mat4 renderer_NormalMat;
+
+vec3 camera_Position;
+vec3 camera_Forward;
+vec4 camera_ProjectionParams;
+```
+
+### Light
+
+Provides methods to get [engine lighting](/en/docs/graphics/light/light), including direct light and indirect light:
+
+```glsl
+// Direct light
+DirectLight getDirectLight(int index);
+PointLight getPointLight(int index);
+SpotLight getSpotLight(int index);
+
+// Indirect light
+EnvMapLight scene_EnvMapLight;
+
+#ifdef SCENE_USE_SH
+ vec3 scene_EnvSH[9];
+#endif
+
+#ifdef SCENE_USE_SPECULAR_ENV
+ samplerCube scene_EnvSpecularSampler;
+#endif
+```
+
+### Normal
+
+Provides some general methods for normal calculation:
+
+```glsl
+// Normal after applying the normal map in tangent space
+vec3 getNormalByNormalTexture(mat3 tbn, sampler2D normalTexture, float normalIntensity, vec2 uv, bool isFrontFacing);
+
+// Compute the TBN matrix using derivatives, for models without tangents
+mat3 getTBNByDerivatives(vec2 uv, vec3 normal, vec3 position, bool isFrontFacing);
+```
+
+### Shadow
+
+Provides shadow-related functions. See [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/Shadow.glsl) for details.
+
+```glsl
+// Get the cascade index for cascaded shadows; e.g., with a cascade count of 4, returns 0~3
+int computeCascadeIndex(vec3 positionWS);
+
+// Get the coordinate in the shadowmap
+vec3 getShadowCoord(vec3 positionWS);
+
+// Get the shadow intensity, including the sampling method and shadow attenuation
+float sampleShadowMap(vec3 positionWS, vec3 shadowCoord);
+```
+
+### Skin
+
+Provides bone calculation methods:
+
+```glsl
+mat4 getSkinMatrix(Attributes attributes);
+```
+
+### BlendShape
+
+Provides BS calculation methods:
+
+```glsl
+void calculateBlendShape(Attributes attributes, inout vec4 position, inout vec3 normal, inout vec4 tangent);
+```
+
+## PBR API
+
+In addition to the general API, PBR also encapsulates a series of APIs such as the BRDF lighting model. These are the core building blocks of `ForwardPassPBR`; when extending other materials, such as SSS or thin-film interference, you may need these functions as well and can reuse them via `#include`.
+
+
+
+### AttributesPBR
+
+Encapsulates all the Attribute variables needed for PBR. See [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/AttributesPBR.glsl) for details.
+
+### VaryingsPBR
+
+Encapsulates all the Varyings variables needed for PBR. See [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/VaryingsPBR.glsl) for details.
+
+### LightDirectPBR
+
+Encapsulates direct light calculations based on the BRDF lighting model. For more details, see [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/LightDirectPBR.glsl).
+
+Generally, you can call it directly:
+
+```glsl
+// Evaluate direct lighting
+evaluateDirectRadiance(varyings, surfaceData, brdfData, shadowAttenuation, color.rgb);
+```
+
+The following function overload macros are provided to override key calculations of the lighting model:
+
+```glsl
+#define FUNCTION_SURFACE_SHADING surfaceShading
+#define FUNCTION_DIFFUSE_LOBE diffuseLobe
+#define FUNCTION_SPECULAR_LOBE specularLobe
+#define FUNCTION_CLEAR_COAT_LOBE clearCoatLobe
+
+void surfaceShading(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 lightColor, inout vec3 color);
+void diffuseLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 attenuationIrradiance, inout vec3 diffuseColor);
+void specularLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 attenuationIrradiance, inout vec3 specularColor);
+float clearCoatLobe(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, vec3 incidentDirection, vec3 color, inout vec3 specularColor);
+```
+
+Refer to the PBR template extension above for the overload method.
+
+### LightIndirectPBR
+
+Encapsulates [ambient light](/en/docs/graphics/light/ambient) calculations based on the BRDF lighting model. For more details, see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/LightIndirectPBR.glsl).
+
+Generally, you can call it directly:
+
+```glsl
+// IBL
+evaluateIBL(varyings, surfaceData, brdfData, color.rgb);
+```
+
+The following function overload macros are provided to override key calculations of the lighting model:
+
+```glsl
+#define FUNCTION_DIFFUSE_IBL evaluateDiffuseIBL
+#define FUNCTION_SPECULAR_IBL evaluateSpecularIBL
+#define FUNCTION_CLEAR_COAT_IBL evaluateClearCoatIBL
+
+void evaluateDiffuseIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, inout vec3 diffuseColor);
+void evaluateSpecularIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, float radianceAttenuation, inout vec3 specularColor);
+float evaluateClearCoatIBL(Varyings varyings, SurfaceData surfaceData, BRDFData brdfData, inout vec3 specularColor);
+```
+
+### VertexPBR
+
+Provides methods required by the PBR vertex shader, such as getting UV coordinates after applying TilingOffset, or getting the world-space position, normal, and tangent after skinning and BlendShape calculations. For more details, see the [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/VertexPBR.glsl).
+
+```glsl showLineNumbers {2, 4}
+Varyings varyings;
+varyings.uv = getUV0(attributes);
+
+VertexInputs vertexInputs = getVertexInputs(attributes);
+
+// positionWS
+varyings.positionWS = vertexInputs.positionWS;
+
+// normalWS、tangentWS、bitangentWS
+#ifdef RENDERER_HAS_NORMAL
+ varyings.normalWS = vertexInputs.normalWS;
+ #ifdef RENDERER_HAS_TANGENT
+ varyings.tangentWS = vertexInputs.tangentWS;
+ varyings.bitangentWS = vertexInputs.bitangentWS;
+ #endif
+#endif
+
+gl_Position = renderer_MVPMat * vertexInputs.positionOS;
+```
+
+### BRDF
+
+The key file of the PBR lighting model, encapsulating general calculation functions related to BRDF, as well as the `SurfaceData` structure and `BRDFData` structure used for subsequent lighting model calculations. For more details, see [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/BRDF.glsl).
+
+### FragmentPBR
+
+Contains a large number of variables passed from the CPU, such as metallic, roughness, maps, etc., and initializes the `SurfaceData` structure through `getSurfaceData`. For more details, see [source code](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/FragmentPBR.glsl).
+
+```glsl showLineNumbers
+BRDFData brdfData;
+
+// Initialize the SurfaceData struct
+SurfaceData surfaceData = getSurfaceData(varyings, aoUV, gl_FrontFacing);
+
+// The data in SurfaceData can be adjusted here before initializing BRDFData
+initBRDFData(surfaceData, brdfData);
+```
+
+### Finally
+
+In addition to the functionality and calling methods of the key APIs above, you can refer to [ForwardPassPBR](https://github.com/galacean/engine-toolkit/blob/main/packages/shaderlab/src/shaders/shadingPBR/ForwardPassPBR.glsl) for how an entire pass file is organized.
diff --git a/docs/en/graphics/shader/shaderLab/create.mdx b/docs/en/graphics/shader/shaderLab/create.mdx
new file mode 100644
index 0000000000..28ba78ab4e
--- /dev/null
+++ b/docs/en/graphics/shader/shaderLab/create.mdx
@@ -0,0 +1,35 @@
+---
+title: Creation
+---
+
+## Create in the Editor
+
+You can add 3 types of ShaderLab templates in the editor: `Unlit`, `PBR`, and Shader Fragments.
+
+
+
+Among them, **`Unlit`** and **`PBR`** are shader templates written using ShaderLab syntax, while **Shader Fragments** are for convenient code segment reuse. In ShaderLab, you can use the `include` macro to reference code segments, which will be automatically expanded and replaced during subsequent compilation. For usage details, see the syntax standard module.
+
+## Create in Script
+
+Currently, `ShaderLab` is not yet integrated into the engine core package. You need to pass in a newly created `ShaderLab` object during engine initialization; otherwise, the engine cannot parse and use shaders written in `ShaderLab` syntax.
+
+1. `ShaderLab` Initialization
+
+```ts
+import { ShaderLab } from "@galacean/engine-shaderlab";
+
+const shaderLab = new ShaderLab();
+// Initialize the Engine with ShaderLab
+const engine = await WebGLEngine.create({ canvas: "canvas", shaderLab });
+```
+
+2. Create Shader
+
+```ts
+// Directly create Shader using ShaderLab
+const shader = Shader.create(galaceanShaderCode);
+```
diff --git a/docs/en/graphics/shader/shaderLab/intro.md b/docs/en/graphics/shader/shaderLab/intro.md
new file mode 100644
index 0000000000..e154f34b3a
--- /dev/null
+++ b/docs/en/graphics/shader/shaderLab/intro.md
@@ -0,0 +1,18 @@
+---
+title: ShaderLab
+---
+
+> In the [Custom Shader](/en/docs/graphics/shader/custom/) section, we learned how to create custom Shaders using the native WebGL GLSL language. In this section, we will introduce another way to create Shaders --- ShaderLab.
+
+`ShaderLab` is a Shader packaging language tailored for the Galacean engine. It allows developers to write custom Shaders using the familiar [GLSL](https://www.khronos.org/files/opengles_shading_language.pdf) syntax while providing additional high-level abstractions and management features to enhance development efficiency. In the [Material Composition](/en/docs/graphics/material/composition/) section, we mentioned that before the introduction of ShaderLab, setting various [render states](/en/docs/graphics/material/composition/#渲染状态) required developers to manually call APIs. With ShaderLab, developers can directly set and specify render states in the "Shader" file. Additionally, they can define material rendering parameters bound to the Shader, which are mapped to the Inspector panel in the editor, making it easier for developers to adjust rendering effects in real-time.
+
+Although ShaderLab introduces convenience for writing shaders, it does not replace GLSL but is compatible with it. Developers can write native GLSL code blocks within the ShaderLab framework, enjoying the advantages of both. The ShaderLab usage process is as follows:
+
+```mermaid
+flowchart LR
+  A[Create Shader] --> B[Edit ShaderLab] --> C[Debug ShaderLab]
+```
+
+Below is a simple example of using ShaderLab, which includes two Shaders. The `normal` Shader defines a vertex shader that only implements MVP transformation and a fragment shader that specifies the pixel color through a Uniform variable. Additionally, the `lines` Shader is a [shadertoy](https://www.shadertoy.com/view/DtXfDr) example modified using ShaderLab.
+
+
diff --git a/docs/en/graphics/shader/shaderLab/syntax/intro.mdx b/docs/en/graphics/shader/shaderLab/syntax/intro.mdx
new file mode 100644
index 0000000000..31d30ce314
--- /dev/null
+++ b/docs/en/graphics/shader/shaderLab/syntax/intro.mdx
@@ -0,0 +1,31 @@
+---
+title: ShaderLab Syntax Standard
+---
+
+## Editing Shaders in the Editor
+
+Double-click the shader asset we created in the previous step to jump to the code editing page.
+
+> A future version will release the Galacean VSCode plugin, which will provide syntax checking, auto-completion, and code synchronization features for `ShaderLab`. Stay tuned.
+
+
+
+## Syntax Standard
+
+The `ShaderLab` syntax framework is as follows:
+
+```glsl
+Shader "ShaderName" {
+ ...
+ SubShader "SubShaderName" {
+ ...
+ Pass "PassName" {
+ ...
+ }
+ ...
+ }
+ ...
+}
+```
+
+It mainly includes [Shader](/en/docs/graphics/shader/shaderLab/syntax/shader/), [SubShader](/en/docs/graphics/shader/shaderLab/syntax/subShader/), and [ShaderPass](/en/docs/graphics/shader/shaderLab/syntax/pass/) modules.
diff --git a/docs/en/graphics/shader/shaderLab/syntax/macro.mdx b/docs/en/graphics/shader/shaderLab/syntax/macro.mdx
new file mode 100644
index 0000000000..fbb28fc7f7
--- /dev/null
+++ b/docs/en/graphics/shader/shaderLab/syntax/macro.mdx
@@ -0,0 +1,45 @@
+---
+title: Macros
+---
+
+ShaderLab supports some macros and macro operators from the standard GLSL syntax:
+- `#define`
+- `#undef`
+- `#if`
+- `#ifdef`
+- `#ifndef`
+- `#else`
+- `#elif`
+- `#endif`
+- `defined`
+
+and the additionally introduced `#include` macro.
+
+
+ShaderLab macros are expanded during the preprocessor stage, so macros cannot affect ShaderLab structure parsing. Keywords such as `Shader`, `SubShader`, `Pass`, `EditorProperties`, and `EditorMacros` cannot be included within branch macros like `#ifdef`.
+
+
+## include Macro
+
+To facilitate code reuse, the `include` macro can be used in ShaderLab to reference code segments, which will be automatically expanded and replaced during subsequent compilation.
+
+```glsl
+#include "{includeKey}"
+```
+
+To enable code segments to be referenced via the `include` macro, we have two ways to declare code segments:
+
+1. Create Shader / Shader Fragment in the editor
+
+The `includeKey` for the created code segment is the file path of the file in the project, such as `/Root/Effect.glsl`.
+
+2. Explicitly register code segments in the script
+
+```ts
+import { ShaderFactory } from '@galacean/engine';
+
+const commonSource = `// shader chunk`;
+ShaderFactory.registerInclude('includeKey', commonSource);
+```
+
+Shader file inclusion supports relative path references. All relative paths are converted based on the main Shader file path. For example, if the Shader file path is `/root/hair/shader.gs` and the included code segment path is `/root/hair/common.glsl`, the relative path for inclusion would be `#include "./common.glsl"`.
diff --git a/docs/en/graphics/shader/shaderLab/syntax/pass.mdx b/docs/en/graphics/shader/shaderLab/syntax/pass.mdx
new file mode 100644
index 0000000000..ea5c3b4273
--- /dev/null
+++ b/docs/en/graphics/shader/shaderLab/syntax/pass.mdx
@@ -0,0 +1,86 @@
+---
+title: Pass
+---
+
+```glsl
+Pass "PassName" {
+ Tag {PipelineStage = "ShadowCaster"}
+
+ ...
+  // Global variable area: common variable declarations, struct declarations, function declarations
+ ...
+
+  // Rendering pipeline and render state settings
+
+  // Specify the vertex and fragment shader entry functions (GLSL)
+  VertexShader = vert;
+  FragmentShader = frag;
+
+  // Specify the render queue
+  RenderQueueType = Transparent;
+}
+```
+
+`Pass` is the basic element of a `Shader` object. A simple shader object may contain only one `Pass`, but more complex shaders can contain multiple `Pass`es. It defines the operations performed at specific stages of the rendering pipeline, such as the shader programs running on the GPU, rendering states, and settings related to the rendering pipeline.
+
+## Rendering State
+
+Render states can be specified in one of the following two ways:
+
+1. Explicit assignment
+
+ ```
+ BlendState = blendState;
+ ```
+
+2. Declaration in the global variable area of the Pass
+
+ ```
+ BlendState blendState {
+ Rendering state property = Property value;
+ }
+ ```
+
+## Specifying uniform Variables
+
+Directly declare them as global variables
+
+```glsl
+mediump vec4 u_color;
+float material_AlphaCutoff;
+mat4 renderer_ModelMat;
+vec3 u_lightDir;
+```
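+
+These globals become material uniforms. As a sketch (assuming a `material` whose shader contains this Pass), they can be driven from script through the ShaderData interfaces described in the [custom shader](/en/docs/graphics/shader/custom/) documentation:
+
+```ts
+import { Vector3, Vector4 } from "@galacean/engine";
+
+const shaderData = material.shaderData;
+shaderData.setVector4("u_color", new Vector4(1.0, 0.0, 0.0, 1.0));
+shaderData.setFloat("material_AlphaCutoff", 0.5);
+shaderData.setVector3("u_lightDir", new Vector3(0.0, 1.0, 0.0));
+```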
+
+## Declaring varying Variables
+
+Specify by defining the output structure of the vertex shader and the input structure of the fragment shader
+
+```glsl
+struct v2f {
+ vec3 color;
+};
+
+v2f vert(a2v o) {
+ ...
+}
+void frag(v2f i) {
+ ...
+}
+```
+
+## Specifying Vertex and Fragment Shaders
+
+Explicitly specify the shader entry functions using `VertexShader` and `FragmentShader`
+
+```
+VertexShader = vert;
+FragmentShader = frag;
+```
+
+## Setting the Render Queue
+
+Specify it using the `RenderQueueType` directive, which is equivalent to setting it through the engine API.
+
+```
+RenderQueueType = Transparent;
+```
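+
+The directive above has the same effect as setting the queue from script; a minimal sketch, assuming an existing `material`:
+
+```ts
+import { RenderQueueType } from "@galacean/engine";
+
+material.renderQueueType = RenderQueueType.Transparent;
+```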
diff --git a/docs/en/graphics/shader/shaderLab/syntax/shader.mdx b/docs/en/graphics/shader/shaderLab/syntax/shader.mdx
new file mode 100644
index 0000000000..70d1f3ea89
--- /dev/null
+++ b/docs/en/graphics/shader/shaderLab/syntax/shader.mdx
@@ -0,0 +1,212 @@
+---
+title: Shader
+---
+
+```glsl
+Shader "ShaderName" {
+ ...
+  // Global variable area: variable declarations, struct declarations, render state declarations, material property definitions
+ ...
+ SubShader "SubShaderName" {
+ ...
+ }
+ ...
+}
+```
+
+In ShaderLab, `Shader` is a collection of shader programs and other engine rendering settings in the traditional rendering pipeline. It allows defining multiple shader programs within the same `Shader` object and instructs Galacean on how to choose and use them during rendering. The `Shader` object has a nested structure, corresponding to the engine's encapsulated [Shader](/en/apis/galacean/#Shader), [SubShader](/en/apis/galacean/#SubShader), and [ShaderPass](/en/apis/galacean/#ShaderPass) objects.
+
+## Material Property Definition
+
+```glsl
+// Uniform
+EditorProperties
+{
+ material_BaseColor("Offset unit scale", Color) = (1,1,1,1);
+ ...
+
+ Header("Emissive")
+ {
+ material_EmissiveColor("Emissive color", Color) = (1,1,1,1);
+ ...
+ }
+ ...
+}
+
+// Macros
+EditorMacros
+{
+ [On] UV_OFFSET("UV Offset", Range(1,100)) = 10;
+ ...
+}
+```
+
+This module is used to define the UI display of the material bound to this Shader in the editor's Inspector panel. ShaderLab material properties use `EditorProperties` to declare Uniform properties and `EditorMacros` to declare macro properties. The declaration format is:
+
+1. Uniform Properties
+
+ ```glsl
+ EditorProperties {
+ propertyName("label in Inspector", type) [= defaultValue];
+ ...
+ [ Header("blockName") {
+ propertyName("label in Inspector", type) [= defaultValue];
+ ...
+ } ]
+ }
+ ```
+
+ > Nested `Header` blocks can be used to hierarchically categorize material properties.
+
+ Supported types are
+
+ | Type | Example |
+ | :-: | :-- |
+ | Bool | propertyName("Property Description", Boolean) = true; |
+ | Int | propertyName("Property Description", Int) = 1; propertyName("Property Description", Range(0,8)) = 1 |
+ | Float | propertyName("Property Description", Float) = 0.5; propertyName("Property Description", Range(0.0, 1.0)) = 0.5; |
+ | Texture2D | propertyName("Property Description", Texture2D); |
+ | TextureCube | propertyName("Property Description", TextureCube); |
+ | Color | propertyName("Property Description", Color) = (0.25, 0.5, 0.5, 1); |
+ | Vector2 | propertyName("Property Description", Vector2) = (0.25, 0.5); |
+ | Vector3 | propertyName("Property Description", Vector3) = (0.25, 0.5, 0.5); |
+ | Vector4 | propertyName("Property Description", Vector4) = (0.25, 0.5, 0.5, 1.0); |
+
+2. Macro Properties
+
+ ```glsl
+ EditorMacros {
+      [[On/Off]] propertyName("label in Inspector"[, type]) [= defaultValue];
+ ...
+ [ Header("blockName") {
+        [[On/Off]] propertyName("label in Inspector"[, type]) [= defaultValue];
+ ...
+ } ]
+ }
+ ```
+
+  All macro properties can be enabled or disabled, with the initial state set by the `[On/Off]` directive. The supported types are
+
+ | Type | Example |
+ | :-: | :-- |
+ | None (Toggle Macro) | macroName("Macro Description"); |
+ | Bool | macroName("Macro Description", Boolean) = true; |
+ | Int | macroName("Macro Description", Int) = 1; macroName("Macro Description", Range(0,8)) = 1; |
+ | Float | macroName("Macro Description", Float) = 0.5; macroName("Macro Description", Range(0.0, 1.0)) = 0.5; |
+ | Color | macroName("Macro Description", Color) = (0.25, 0.5, 0.5, 1); |
+ | Vector2 | macroName("Macro Description", Vector2) = (0.25, 0.5); |
+ | Vector3 | macroName("Macro Description", Vector3) = (0.25, 0.5, 0.5); |
+ | Vector4 | macroName("Macro Description", Vector4) = (0.25, 0.5, 0.5, 1.0); |
+
+> Note that the current version of the ShaderLab material property module only defines the Inspector UI panel for the material bound to this Shader in the editor. It does not declare the corresponding global variables for you in the `ShaderPass`. If the `ShaderPass` code references this variable, you need to explicitly declare it in the global variable module (see below).
+
+## Global Variables {#examples}
+
+You can declare 4 types of global variables in ShaderLab: RenderState, Structs, Functions, and Single Variables.
+
+- RenderState
+
+ Includes BlendState, DepthState, StencilState, RasterState
+
+ - BlendState
+
+ ```glsl
+ BlendState {
+ Enabled[n]: bool;
+ ColorBlendOperation[n]: BlendOperation;
+ AlphaBlendOperation[n]: BlendOperation;
+ SourceColorBlendFactor[n]: BlendFactor;
+ SourceAlphaBlendFactor[n]: BlendFactor;
+ DestinationColorBlendFactor[n]: BlendFactor;
+ DestinationAlphaBlendFactor[n]: BlendFactor;
+ ColorWriteMask[n]: float // 0xffffffff
+ BlendColor: vec4;
+ AlphaToCoverage: bool;
+ }
+ ```
+
+ [n] can be omitted. When using MRT, [n] specifies a particular MRT render state. Omitting it sets all MRT states. BlendOperation and BlendFactor enums are the same as the engine API.
+
+ - DepthState
+
+ ```glsl
+ DepthState {
+ Enabled: bool;
+ WriteEnabled: bool;
+ CompareFunction: CompareFunction;
+ }
+ ```
+
+ CompareFunction enums are the same as the engine API.
+
+ - StencilState
+
+ ```glsl
+ StencilState {
+ Enabled: bool;
+ ReferenceValue: int;
+ Mask: float; // 0xffffffff
+ WriteMask: float; // 0xffffffff
+ CompareFunctionFront: CompareFunction;
+ CompareFunctionBack: CompareFunction;
+ PassOperationFront: StencilOperation;
+ PassOperationBack: StencilOperation;
+ FailOperationFront: StencilOperation;
+ FailOperationBack: StencilOperation;
+ ZFailOperationFront: StencilOperation;
+ ZFailOperationBack: StencilOperation;
+ }
+ ```
+
+ CompareFunction and StencilOperation enums are the same as the engine API.
+
+ - RasterState
+
+ ```glsl
+ RasterState {
+ CullMode: CullMode;
+ DepthBias: float;
+ SlopeScaledDepthBias: float;
+ }
+ ```
+
+ CullMode enums are the same as the engine API.
+
+ Example of setting `BlendState` in `ShaderLab`:
+
+ ```glsl
+ Shader "Demo" {
+ ...
+ BlendState customBlendState
+ {
+ Enabled = true;
+      // Constant assignment
+ SourceColorBlendFactor = BlendFactor.SourceColor;
+      // Variable assignment
+ DestinationColorBlendFactor = material_DstBlend;
+ }
+ ...
+ Pass "0" {
+ ...
+ BlendState = customBlendState;
+ ...
+ }
+ }
+ ```
+
+  The above example shows two ways of assigning values to BlendState properties: *constant assignment* and *variable assignment*; a script-side sketch of the variable path appears at the end of this section:
+
+  - Constant assignment means the right-hand side of the assignment is a specified engine enum value, e.g., `BlendFactor.SourceColor`.
+  - Variable assignment means the right-hand side of the assignment is an arbitrary variable name, whose actual value is supplied by the user at runtime through the `ShaderData.setInt("material_DstBlend", BlendFactor.SourceColor)` API.
+
+- Structs, Functions
+
+ Same as the syntax in GLSL.
+
+- Single Variables
+
+ ```glsl
+ [lowp/mediump/highp] variableType variableName;
+ ```
+
+Similar to other programming languages, global variables in ShaderLab also have scoping and name-overriding rules. In short, a global variable is visible only within the SubShader or Pass in which it is declared, and if a Pass declares a global variable with the same name as one in its enclosing SubShader, the Pass-level variable overrides the SubShader-level one.
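+
+As referenced above, here is a script-side sketch of the *variable assignment* path, assuming an existing `material` whose shader declares the `customBlendState` block from the example:
+
+```ts
+import { BlendFactor } from "@galacean/engine";
+
+// Supplies the runtime value for the material_DstBlend variable referenced
+// in the BlendState block.
+material.shaderData.setInt("material_DstBlend", BlendFactor.SourceColor);
+```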
diff --git a/docs/en/graphics/shader/shaderLab/syntax/subShader.mdx b/docs/en/graphics/shader/shaderLab/syntax/subShader.mdx
new file mode 100644
index 0000000000..afca63c530
--- /dev/null
+++ b/docs/en/graphics/shader/shaderLab/syntax/subShader.mdx
@@ -0,0 +1,37 @@
+---
+title: SubShader
+---
+
+```glsl
+SubShader "SubShaderName" {
+ ...
+  // Global variable area: variable declarations, struct declarations, render state declarations
+ ...
+ Tags {ReplaceTag = "opaque"}
+
+ UsePass "ShaderName/SubShaderName/PassName"
+
+ Pass "PassName" {
+ ...
+ }
+}
+```
+
+## Tags
+
+In the [Shader Object](/en/docs/graphics/shader/class/) section, we learned the basic concepts and uses of Tags. In ShaderLab, you can directly declare and specify them using the `Tags` directive, without needing to manually specify them using the `SubShader.setTag` API method.
+
+
+## UsePass
+
+ If a `SubShader` contains multiple `Pass` objects, you can reuse other `Pass` objects using the `UsePass` directive, such as the engine's built-in PBR Pass: `UsePass "pbr/Default/Forward"`
+
+ | Built-in Shader | Pass Path |
+ | :-----------------: | :-----------------------------: |
+ | PBR | pbr/Default/Forward |
+ | Unlit | unlit/Default/Forward |
+ | Skybox | skybox/Default/Forward |
+ | Particle-shader | particle-shader/Default/Forward |
+ | SpriteMask | SpriteMask/Default/Forward |
+ | Sprite | Sprite/Default/Forward |
+
diff --git a/docs/en/graphics/shader/shaderLab/usage.md b/docs/en/graphics/shader/shaderLab/usage.md
new file mode 100644
index 0000000000..7406c37165
--- /dev/null
+++ b/docs/en/graphics/shader/shaderLab/usage.md
@@ -0,0 +1,17 @@
+---
+title: "Usage"
+---
+
+With custom shader assets written using `ShaderLab`, we can implement user-defined materials by binding the shader to newly created materials.
+
+
+
+- `ShaderLab` Reflecting Material Properties
+
+If we write the `material property definition` module in `ShaderLab`, the properties defined in the module will be exposed in the Inspector panel of the material asset bound to the Shader.
+
+
+
+## An Example of Implementing Planar Shadows Using Multi-Pass Technology
+
+
diff --git a/docs/en/graphics/texture/2d.md b/docs/en/graphics/texture/2d.md
index 47fc3f4528..e889fc08a1 100644
--- a/docs/en/graphics/texture/2d.md
+++ b/docs/en/graphics/texture/2d.md
@@ -1,20 +1,20 @@
---
order: 1
-title: 2D Texture
-type: Graphics
-group: Texture
+title: 2D Texture
+type: Graphics
+group: Texture
label: Graphics/Texture
---
-2D textures ([Texture2D](/apis/core/#Texture2D)) are the most commonly used artistic resources, sampled using 2D UV coordinates.
+2D textures ([Texture2D](/en/apis/core/#Texture2D)) are the most commonly used art resources, sampled using two-dimensional UV coordinates.
## Creation
-In the editor, you can easily import a 2D texture by following the path **[Asset Panel](/en/docs/assets/interface)** -> **Right-click to upload** -> **Select Texture2D** -> **Choose corresponding texture** -> **2D texture asset created**.
+In the editor, you can easily import a 2D texture by following the path **[Asset Panel](/en/docs/assets/interface)** -> **Right-click Upload** -> **Select Texture2D** -> **Choose the corresponding texture** -> **2D texture asset creation complete**.
-Similarly, in scripts, you can load an image and get the corresponding 2D texture using [ResourceManager](/apis/core/#ResourceManager):
+Similarly, in the script, you can load an image through [ResourceManager](/en/apis/core/#ResourceManager) to get the corresponding 2D texture:
```typescript
const textureResource = {
@@ -33,15 +33,15 @@ engine.resourceManager
## Methods
-| Method | Description |
-| :------------- | :------------------------ |
-| setImageSource | Set the image source of the texture |
-| setPixelBuffer | Modify the image data of the texture object |
-| getPixelBuffer | Get the image data of the texture object |
+| Method | Description |
+| :-------------- | :----------------------- |
+| setImageSource | Sets the image data source of the texture |
+| setPixelBuffer | Modifies the image data of the texture object |
+| getPixelBuffer | Retrieves the image data of the texture object |
### setImageSource
-As mentioned earlier, images, canvas drawings, videos, and other image-related data sources can be used as textures. For example, videos can be uploaded to a texture using the [setImageSource](/apis/core/#Texture2D-setImageSource) interface:
+As mentioned earlier, image-related data sources such as images, canvas, and videos can be used as textures. For example, a video can be uploaded to the texture through the [setImageSource](/en/apis/core/#Texture2D-setImageSource) interface:
```typescript
// 拿到视频标签,即 HTMLVideoElement
@@ -51,15 +51,15 @@ const video = document.getElementsByTagName("video")[0];
texture.setImageSource(video);
```
-> `setImageSource` can only synchronize the data for that frame, but videos change every frame. If you need the texture to synchronize with the changes, you should execute it in the onUpdate hook of the script.
+> `setImageSource` can only synchronize the data of that frame, but the video changes every frame. If you need the texture to change synchronously, you need to execute it in the script's onUpdate hook.
-For scenarios like videos that require frequent updates to the texture content, it is recommended to disable mipmap and set the texture usage to Dynamic when creating the texture for better performance. The sample code is as follows:
+For scenarios where the texture content needs to be frequently updated, such as videos, you need to disable mipmap and set the texture usage to Dynamic when creating the texture to achieve better performance. The sample code is as follows:
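+
+A sketch, assuming a playing `<video>` element and an engine version whose `Texture2D` constructor accepts `mipmap` and `usage` parameters:
+
+```typescript
+import { Script, Texture2D, TextureFormat, TextureUsage } from "@galacean/engine";
+
+const video = document.getElementsByTagName("video")[0];
+const texture = new Texture2D(
+  engine,
+  video.videoWidth,
+  video.videoHeight,
+  TextureFormat.R8G8B8A8,
+  false, // disable mipmap: the content is refreshed every frame
+  TextureUsage.Dynamic // hint that the texture is updated frequently
+);
+
+// Synchronize the texture with the video every frame.
+class VideoTextureSync extends Script {
+  onUpdate(): void {
+    texture.setImageSource(video);
+  }
+}
+```
+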
### setPixelBuffer
-Textures correspond to the color values of each pixel at the lowest level, namely the RGBA channels. You can manually fill in the color values of these color channels and then pass them to the texture using the [setPixelBuffer](/apis/core/#Texture2D-setPixelBuffer) interface:
+The underlying texture corresponds to the color value of each pixel, i.e., the RGBA channels. We can manually fill in the color values of these channels and then pass them to the texture through the [setPixelBuffer](/en/apis/core/#Texture2D-setPixelBuffer) interface:
```typescript
const texture = new Texture2D(engine, 1, 1);
@@ -70,7 +70,7 @@ texture.setPixelBuffer(data);
### getPixelBuffer
-Similarly, you can read the color data of these color channels:
+Similarly, we can read the color data of these channels:
```typescript
const texture = new Texture2D(engine, width, height);
@@ -83,11 +83,11 @@ texture.getPixelBuffer(0, 0, width, height, 0, data);
## Usage
-Assigning a texture to the corresponding properties of a material can enable different rendering functions. For example, adding a base color texture can determine the basic color tone of a model. In the editor, simply select the corresponding texture for the respective property.
+Assigning the texture to the corresponding property of the material enables different rendering features. For example, adding a base color texture determines the basic tone of the model. In the editor, you only need to select the corresponding texture for the property.
-Similarly, in scripts, you can set it up like this:
+Similarly, in the script, you can set it like this:
```typescript
const material = new PBRMaterial(engine);
@@ -100,16 +100,15 @@ material.baseTexture = texture;
-To address the issue of black edges appearing at the abrupt change of Alpha values in images with transparent pixels, the editor has a built-in color expansion feature. This feature rewrites the RGB values of all transparent pixels in the image to the RGB values of the nearest non-completely transparent pixel, effectively removing the black edges from the image.
+To solve the problem of black edges appearing at the abrupt changes in Alpha values in images with transparent pixels, the editor has a built-in color expansion function. This function removes the black edges of the image by rewriting the RGB values of all transparent pixels in the image to the RGB values of the nearest non-fully transparent pixels.
| Option | Description |
| :--------------- | :----------------------------------------------- |
-| Alpha Range | Threshold, RGB values are modified for transparent pixels with an Alpha value less than this threshold |
-| Alpha Value | Alpha value after transparent pixel filling |
+| Alpha Range | Threshold; transparent pixels whose Alpha value is below this threshold have their RGB values modified |
+| Alpha Value | The Alpha value assigned to the filled transparent pixels |
## Export Configuration
-The [Asset Build](/en/docs/assets-build) document details the **global configuration** when exporting textures. If the Overwrite option is checked here, the asset will be exported following the **custom configuration** instead of the **global configuration**.
-
+The [Project Release](/en/docs/assets/build) document explains the **global configuration** for texture export in detail. If the Overwrite option is selected here, this asset will follow the **custom configuration** instead of the **global configuration** during export.
diff --git a/docs/en/graphics/texture/compression.md b/docs/en/graphics/texture/compression.md
index 540d9da85d..f1573f44f2 100644
--- a/docs/en/graphics/texture/compression.md
+++ b/docs/en/graphics/texture/compression.md
@@ -6,11 +6,11 @@ group: Texture
label: Graphics/Texture
---
-**[KTX2](https://www.khronos.org/ktx/)** (Khronos Texture Container version 2.0) is the latest texture compression solution introduced by Khronos, supported by Galacean since version 1.1. KTX2 will transcode to the corresponding format of compressed textures (BC/PVRTC/ETC/ASTC) based on the device platform support at runtime.
+**[KTX2](https://www.khronos.org/ktx/)** (Khronos Texture Container version 2.0) is the latest texture compression scheme launched by Khronos, supported by Galacean since version 1.1. KTX2 will transcode to the corresponding format of compressed texture (BC/PVRTC/ETC/ASTC) based on the device platform support at runtime.
## Usage
-In the engine, simply load using `resourceManager`:
+In the engine, simply use `resourceManager` to load:
```typescript
engine.resourceManager.load("xxx.ktx2");
@@ -25,20 +25,22 @@ engine.resourceManager.load({
-To use KTX2 in glTF, the [KHR_texture_basisu](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Khronos/KHR_texture_basisu/README.md) extension must be included.
+Using KTX2 in glTF requires including the [KHR_texture_basisu](https://github.com/KhronosGroup/glTF/blob/main/extensions/2.0/Khronos/KHR_texture_basisu/README.md) extension.
-KTX2 generation can be done using:
+KTX2 can be generated using:
- toktx
- basisu
-- Editor packaging, refer to the '[Project Deployment](/en/docs/assets-build)' document.
-## Compatibility
+### Editor
-KTX2 transcoding utilizes WebAssembly technology, requiring Chrome 57+ and iOS 11.3+ (WebAssembly in 11.0 ~ 11.2 has a [bug](https://bugs.webkit.org/show_bug.cgi?id=181781)).
+When packaging a project, the editor can be configured to generate KTX2; refer to the '[Project Release](/en/docs/assets/build/)' document. Project export is a global configuration; different compression formats can also be configured per texture resource by checking Overwrite in the editor's texture panel to override the global configuration:
-For systems below iOS 16, there is a chance of no return when loading the necessary KTX2 parsing wasm file through a worker, especially when loading wasm for the first time. This can be bypassed by not using a worker on iOS:
+
-```typescript
-WebGLEngine.create({ canvas: "canvas", ktx2Loader: { workerCount: 0 } });
-```
+- ETC1S has a small size and minimal memory usage but lower quality, suitable for albedo, specular, and other maps.
+- UASTC has a larger size and higher quality, suitable for normal maps and similar textures.
+
+## Compatibility
+
+KTX2 transcoding uses WebAssembly technology, requiring Chrome 57+ and iOS 11.3+ (WebAssembly in versions 11.0 ~ 11.2 has a [bug](https://bugs.webkit.org/show_bug.cgi?id=181781)).
diff --git a/docs/en/graphics/texture/cube.md b/docs/en/graphics/texture/cube.md
index 7e96ebbcc3..4d39613d8c 100644
--- a/docs/en/graphics/texture/cube.md
+++ b/docs/en/graphics/texture/cube.md
@@ -6,27 +6,27 @@ group: Texture
label: Graphics/Texture
---
-The difference between a cube texture ([TextureCube](/apis/core/#TextureCube)) and a 2D texture is that it has 6 faces, which means a cube texture is composed of 6 2D textures.
+The difference between a cube texture ([TextureCube](/en/apis/core/#TextureCube)) and a 2D texture is that it has 6 faces, which means a cube texture is composed of 6 2D textures.


-Cube textures and 2D textures have slightly different underlying sampling methods. Textures use 2D coordinates for sampling, while cube textures use 3D coordinates, specifically _direction vectors_ for sampling. Sampling a texture value from a cube texture using an orange direction vector would look like this:
+The underlying sampling method of cube textures is slightly different from that of 2D textures. Textures use two-dimensional coordinates for sampling, while cube textures use three-dimensional coordinates, i.e., _direction vectors_ for sampling. For example, using an orange direction vector to sample a texture value from a cube texture would look like this:

-Due to this sampling characteristic, cube textures can be used to achieve effects like skyboxes and environment reflections.
+Because of this sampling characteristic, cube textures can be used to achieve effects such as skyboxes and environment reflections.
## Creation
-> You can download free HDR textures from [Poly Haven](https://polyhaven.com/) or [BimAnt HDRI](http://hdri.bimant.com/)
+> You can download free HDR maps from [Poly Haven](https://polyhaven.com/) or [BimAnt HDRI](http://hdri.bimant.com/)
-After preparing the HDR, follow the steps **[Assets Panel](/en/docs/assets/interface)** -> **Right-click Upload** -> **Select TextureCube(.hdr)** -> **Choose the corresponding HDR texture** -> **Cube texture asset created** to complete the operation.
+After preparing the HDR, follow the path **[Asset Panel](/en/docs/assets/interface)** -> **Right-click Upload** -> **Select TextureCube(.hdr)** -> **Choose the corresponding HDR map** -> **Cube texture asset creation completed**.

-Similarly, in scripts, loading six corresponding textures in order can also generate the corresponding cube texture.
+Similarly, in the script, you can also get the corresponding cube texture by loading six textures in the correct order.
```typescript
const cubeTextureResource = {
@@ -50,5 +50,5 @@ engine.resourceManager.load(cubeTextureResource).then((resource) => {
## Usage
-Cube textures are mainly used in skyboxes, for more details refer to [Sky Background](/en/docs/graphics-background-sky)
+Cube textures are mainly used in skyboxes. For more details, refer to [Sky Background](/en/docs/graphics/background/sky/)
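+
+As a hedged sketch (assuming a `TextureCube` named `cubeTexture` loaded as above; the sky API names follow the engine's core package), a cube texture can be assigned to the sky like this:
+
+```typescript
+// Minimal sketch: use a loaded TextureCube as the scene sky.
+const skyMaterial = new SkyBoxMaterial(engine);
+skyMaterial.texture = cubeTexture; // the cube texture loaded above
+
+const { background } = scene;
+background.mode = BackgroundMode.Sky;
+background.sky.material = skyMaterial;
+background.sky.mesh = PrimitiveMesh.createCuboid(engine, 1, 1, 1);
+```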
diff --git a/docs/en/graphics/texture/rtt.md b/docs/en/graphics/texture/rtt.md
index 11399d6644..e73cf834da 100644
--- a/docs/en/graphics/texture/rtt.md
+++ b/docs/en/graphics/texture/rtt.md
@@ -1,24 +1,24 @@
---
order: 3
-title: Off-screen Rendering Texture
+title: Offscreen Render Texture
type: Graphics
group: Texture
label: Graphics/Texture
---
-Off-screen rendering texture, as the name suggests, is a texture that can be obtained through off-screen rendering. The underlying technology uses [FBO](https://developer.mozilla.org/en-US/en/docs/Web/API/WebGLRenderingContext/framebufferTexture2D) to output rendering operations to a texture instead of the screen. Users can use this texture to implement post-processing effects, refraction, reflection, dynamic environment mapping, and other artistic creations.
+Offscreen render texture, as the name suggests, can be obtained through offscreen rendering. It uses [FBO](https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/framebufferTexture2D) technology at the underlying level, redirecting rendering operations from the screen to a texture. This texture can be used to achieve post-processing effects, refraction, reflection, dynamic environment mapping, and other artistic creations.
-The engine provides the [RenderTarget](/apis/core/#RenderTarget) class for off-screen rendering and obtaining the corresponding off-screen rendering texture. Currently, the engine supports generating the following types of off-screen rendering textures:
+The engine provides the [RenderTarget](/en/apis/core/#RenderTarget) class for offscreen rendering and obtaining the corresponding offscreen render texture. Currently, the engine supports generating the following offscreen render textures:
| Type | Application |
| :-- | :-- |
-| Color Texture ([Texture](/apis/core/#Texture)) | Single color texture, multiple color textures (MRT), color cube texture |
-| Depth Texture ([Texture](/apis/core/#Texture)) | Depth texture, depth cube texture |
+| Color Texture ([Texture](/en/apis/core/#Texture)) | Can pass in a single color texture, multiple color textures (MRT), or a color cube texture |
+| Depth Texture ([Texture](/en/apis/core/#Texture)) | Can pass in a depth texture or a depth cube texture |
| Texture Combination | Color texture + depth texture, color cube texture + depth cube texture, multiple color textures + depth texture |
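+
+As a hedged sketch (constructor arguments follow the core API; sizes are arbitrary), creating a render target with a single color texture and binding it to a camera looks like this:
+
+```typescript
+// Minimal sketch: render a camera into an off-screen color texture.
+const renderColorTexture = new Texture2D(engine, 1024, 1024);
+const renderTarget = new RenderTarget(engine, 1024, 1024, renderColorTexture);
+// While this is set, the camera renders to the texture instead of the screen.
+camera.renderTarget = renderTarget;
+```
+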
## Usage
-Here is an example using the `onBeginRender` script hook. Before rendering each frame, render the screen `object A` to the `off-screen texture`, then use the off-screen texture as the base texture of `object B` and render object B to the `screen`. Assuming `object A` has a layer of `Layer0` and `object B` has a layer of `Layer1`;
+Here is an example using the `onBeginRender` script hook. Before rendering each frame, first render `Object A` to the `offscreen texture`, then use the offscreen texture as the base texture for `Object B`, and render Object B to the `screen`. Assume the layer of `Object A` is `Layer0` and the layer of `Object B` is `Layer1`.
```
class switchRTScript extends Script {
@@ -52,4 +52,3 @@ class switchRTScript extends Script {
```
-
diff --git a/docs/en/graphics/texture/texture.md b/docs/en/graphics/texture/texture.md
index 87d2e91e93..f825435ef5 100644
--- a/docs/en/graphics/texture/texture.md
+++ b/docs/en/graphics/texture/texture.md
@@ -6,32 +6,33 @@ group: Texture
label: Graphics/Texture
---
-Textures ([Texture](/apis/core/#Texture)) are the most commonly used resources in 3D rendering. When shading models, we need to assign a color value to each fragment. Besides manually setting the color value, we can also choose to read texels from textures for shading to achieve more sophisticated artistic effects.
+Textures ([Texture](/en/apis/core/#Texture)) are one of the most commonly used resources in 3D rendering. When shading a model, we need to set a color value for each fragment. Besides setting the color value manually, we can also choose to read texels from a texture for shading to achieve richer artistic effects.
-> It is worth noting that images, canvas drawings, raw data, videos, etc., can all be used as textures. The Galacean engine currently supports all standard WebGL textures.
+> It is worth noting that images, canvas, raw data, videos, etc., can all be used as textures. The Galacean engine currently supports all WebGL standard textures.
-We will find that many issues in the engine stem from mappings between different spaces (such as MVP transformations), and textures are no exception. Developers not only need to understand the mapping relationship from image space to texture space but also need to understand the mapping rules from texels to pixels.
+We will find that many issues in the engine stem from mapping between different spaces (such as MVP transformations). Textures are no exception. Developers need to understand the mapping relationship from image space to texture space and the mapping rules from texels to pixels.
-This document will mainly cover:
+This article will mainly introduce:
-- Texture types, texture space, and common properties
-- [2D Texture](/en/docs/graphics-texture-2d)
-- [Cube Texture](/en/docs/graphics-texture-cube)
-- [Playing Video with Textures](/en/docs/graphics-texture-2d)
-- [Setting Skybox Textures](/en/docs/graphics-background-sky)
-- [Offscreen Rendering Textures](/en/docs/graphics-texture-rtt)
-- Using [Compressed Textures](/en/docs/graphics-texture-compression)
+- Types of textures, texture space, and common properties
+- [2D Textures](/en/docs/graphics/texture/2d/)
+- [Cube Textures](/en/docs/graphics/texture/cube/)
+- [Playing Videos through Textures](/en/docs/graphics/texture/2d/)
+- [Setting Sky Textures](/en/docs/graphics/background/sky/)
+- [Off-screen Rendering Textures](/en/docs/graphics/texture/rtt/)
+- Using [Compressed Textures](/en/docs/graphics/texture/compression/)
## Texture Types
-| Type | Description |
-| :---------------------------------------- | :-------------------------------------------------------------------------- |
-| [2D Texture](/en/docs/graphics-texture-2d) | The most commonly used artistic resource, sampled using 2D UV coordinates |
-| [Cube Texture](/en/docs/graphics-texture-cube}) | Composed of 6 2D textures, a cube texture can be used for skyboxes, environment reflections, and other effects |
+| Type | Description |
+| :--------------------------------------- | :----------------------------------------------------------------- |
+| [2D Textures](/en/docs/graphics/texture/2d/) | The most commonly used artistic resource, sampled using 2D UV coordinates |
+| [Cube Textures](/en/docs/graphics/texture/cube/) | Composed of 6 2D textures; can be used to achieve skyboxes, environment reflections, and similar effects |
+| 2D Texture Arrays | Occupies only one texture unit, making it well suited for switching between texture atlases |
## Texture Space
-Texture space is determined by the shape of the texture. 2D textures require the use of 2D spatial vectors for texture sampling, while cube textures require the use of 3D spatial vectors for texture sampling.
+Texture space is determined by the shape of the texture. 2D textures require 2D space vectors for texture sampling, while cube textures require 3D space vectors for texture sampling.
@@ -46,63 +47,63 @@ Texture space is determined by the shape of the texture. 2D textures require the
## Common Properties
-Although texture types vary, they all have some similar basic properties and settings:
+Although there are various types of textures, they all have some similar basic properties and settings:
-| Property | Value |
-| :-------------------------------------------------------------- | :------------------------------------------------------------------------------------------------------------------------------------- |
-| U Wrap Mode ([wrapModeU](/apis/core/#Texture-wrapModeU)) | Clamping ([Clamp](/apis/core/#TextureWrapMode-Clamp)), Repeating ([Repeat](/apis/core/#TextureWrapMode-Repeat)), Mirrored Repeat ([Mirror](/apis/core/#TextureWrapMode-Mirror)) |
-| V Wrap Mode ([wrapModeV](/apis/core/#Texture-wrapModeV)) | Clamping ([Clamp](/apis/core/#TextureWrapMode-Clamp)), Repeating ([Repeat](/apis/core/#TextureWrapMode-Repeat)), Mirrored Repeat ([Mirror](/apis/core/#TextureWrapMode-Mirror)) |
-| Filter Mode ([filterMode](/apis/core/#Texture-filterMode)) | Point Filtering ([Point](/apis/core/#TextureFilterMode-Point)), Bilinear Filtering ([Bilinear](/apis/core/#TextureFilterMode-Bilinear)), Trilinear Filtering ([Trilinear](/apis/core/#TextureFilterMode-Trilinear)) |
-| Anisotropic Filtering Level ([anisoLevel](/apis/core/#Texture-anisoLevel)) | 1 to 16, depending on device support |
+| Property | Value |
+| :-------------------------------------------------------------- | :----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
+| Wrap Mode U ([wrapModeU](/en/apis/core/#Texture-wrapModeU)) | Clamp Mode ([Clamp](/en/apis/core/#TextureWrapMode-Clamp)), Repeat Mode ([Repeat](/en/apis/core/#TextureWrapMode-Repeat)), Mirror Repeat Mode ([Mirror](/en/apis/core/#TextureWrapMode-Mirror)) |
+| Wrap Mode V ([wrapModeV](/en/apis/core/#Texture-wrapModeV)) | Clamp Mode ([Clamp](/en/apis/core/#TextureWrapMode-Clamp)), Repeat Mode ([Repeat](/en/apis/core/#TextureWrapMode-Repeat)), Mirror Repeat Mode ([Mirror](/en/apis/core/#TextureWrapMode-Mirror)) |
+| Filter Mode ([filterMode](/en/apis/core/#Texture-filterMode)) | Point Filter ([Point](/en/apis/core/#TextureFilterMode-Point)), Bilinear Filter ([Bilinear](/en/apis/core/#TextureFilterMode-Bilinear)), Trilinear Filter ([Trilinear](/en/apis/core/#TextureFilterMode-Trilinear)) |
+| Anisotropic Filter Level ([anisoLevel](/en/apis/core/#Texture-anisoLevel)) | 1 ~ 16, depending on device support |
### Loop Mode
-The texture sampling range is `[0,1]`, so when the texture UV coordinates exceed this range, we can control how to sample the out-of-range parts by setting the loop mode.
+The texture sampling range is `[0,1]`. When the texture UV coordinates exceed this range, we can control how the exceeding part is sampled by setting the loop mode.
-| Sampling Loop Mode | Explanation |
-| :----------------- | :--------------------------------- |
-| Clamp | Sample edge pixels when out of range |
-| Repeat | Re-sample from [0,1] when out of range |
-| Mirror | Mirror sampling from [1,0] when out of range |
+| Sampling Loop Mode | Explanation |
+| :----------------- | :----------------------------------- |
+| Clamp | Samples the edge texel when out of range |
+| Repeat | Resamples from [0,1] when out of range |
+| Mirror | Mirrors sampling from [1,0] when out of range |
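+
+For instance, a minimal sketch (assuming an existing `texture`) that makes UVs outside [0,1] tile the image:
+
+```typescript
+// Repeat in both U and V so the texture tiles when UVs exceed [0,1].
+texture.wrapModeU = TextureWrapMode.Repeat;
+texture.wrapModeV = TextureWrapMode.Repeat;
+```
+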
### Filter Mode
-Generally, pixels and screen pixels do not correspond exactly. We can control the filtering mode for magnification (Mag) and minification (Min) modes by setting the filter mode.
+Generally, texels and screen pixels do not correspond exactly. We can control the filtering mode in magnification (Mag) and minification (Min) modes by setting the filter mode.
-| Sampling Filter Mode | Explanation |
-| :------------------- | :--------------------------------- |
-| Point | Use the nearest pixel to the sampling point |
-| Bilinear | Use the average value of the nearest 2*2 pixel matrix |
-| Trilinear | In addition to bilinear filtering, also average over mipmap levels |
+| Sampling Filter Mode | Explanation |
+| :------------------- | :------------------------------------------------------------ |
+| Point | Uses the texel closest to the sampling point |
+| Bilinear | Uses the average value of the nearest 2\*2 texel matrix |
+| Trilinear            | On top of bilinear filtering, also blends between adjacent mipmap levels |
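+
+A minimal sketch (assuming an existing `texture`) switching to trilinear filtering:
+
+```typescript
+// Trilinear filtering also blends across mipmap levels when minifying.
+texture.filterMode = TextureFilterMode.Trilinear;
+```
+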
### Anisotropic Filtering Level
-Anisotropic filtering technology can make textures look clearer at oblique angles. As shown in the figure below, the end of the texture becomes clearer as the anisotropic filtering level increases. However, please use it carefully, as the larger the value, the greater the computational load on the GPU.
+Anisotropic filtering technology can make textures appear clearer when viewed at oblique angles. As shown in the figure below, the end of the texture becomes clearer as the anisotropic filtering level increases. However, use it with caution; the higher the value, the greater the GPU computation.
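+
+A minimal sketch (assuming an existing `texture`; values beyond what the device supports are clamped):
+
+```typescript
+// Sharpen textures viewed at grazing angles; higher values cost more GPU time.
+texture.anisoLevel = 4;
+```
+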
## General Settings
-| Setting | Value |
-| :---------------- | :------------------------- |
-| mipmap | Multi-level texture blending (enabled by default) |
+| Setting | Value |
+| :---------------- | :--------------------------- |
+| mipmap            | Mipmapping (enabled by default) |
| flipY | Flip Y-axis (disabled by default) |
| premultiplyAlpha | Premultiply alpha channel (disabled by default) |
| format | Texture format (default R8G8B8A8) |
-### Mipmap
+### mipmap
-The engine defaults to enabling [mipmap](/apis/core/#Texture-generateMipmaps) (multi-level texture blending). Mipmap is used to address the accuracy and performance issues when sampling high-resolution textures from low-resolution screens, allowing for the selection of different resolution textures at appropriate distances, as shown below:
+**The engine enables [mipmap](/en/apis/core/#Texture-generateMipmaps)** (multi-level textures) by default. Mipmaps are used to solve the precision and performance issues of sampling high-resolution textures on low-resolution screens, allowing a texture of appropriate resolution to be selected at the appropriate distance, as shown below:

-It is important to note that WebGL 2.0 supports textures of **any resolution**, which will generate mipmaps layer by layer according to the [mipmap algorithm](http://download.nvidia.com/developer/Papers/2005/NP2_Mipmapping/NP2_Mipmap_Creation.pdf). However, if you are in a WebGL 1.0 environment, be sure to upload **textures with power-of-two dimensions**, such as 1024 \* 512, otherwise Galacean will detect that mipmaps cannot be used in the environment and automatically disable the mipmap feature, leading to unexpected visual results.
+It should be noted that WebGL 2.0 supports textures of **any resolution** and will generate mip levels according to the [mipmap](http://download.nvidia.com/developer/Papers/2005/NP2_Mipmapping/NP2_Mipmap_Creation.pdf) algorithm. However, if your environment is WebGL 1.0, be sure to upload **power-of-two textures**, such as a texture with a resolution of 1024 \* 512. Otherwise, Galacean will detect that mipmaps cannot be used in that environment and will automatically disable them, which may cause unexpected visual results.
-If you need to change the default behavior of mipmap, you can achieve this through scripting. For detailed parameters, refer to the [API](/apis/core/#Texture2D-constructor):
+If you need to change the default behavior of mipmap, you can do so via script. For parameters, see [API](/en/apis/core/#Texture2D-constructor):
```typescript
const texture = new Texture2D(
@@ -114,7 +115,7 @@ const texture = new Texture2D(
); // The 5th parameter
```
-For cube texture scripting, refer to the [API](/apis/core/#TextureCube-constructor):
+For cube texture scripts, see [API](/en/apis/core/#TextureCube-constructor):
```typescript
const cubeTexture = new TextureCube(
@@ -129,27 +130,27 @@ const cubeTexture = new TextureCube(
### flipY
-flipY is used to control whether the texture is flipped along the Y-axis, i.e., upside down. The engine and editor default to disabled. If you need to change the default behavior of flipY, you can achieve this through the [setImageSource](/apis/core/#Texture2D-setImageSource) method:
+flipY is used to control whether the texture is flipped along the Y-axis, i.e., upside down. **The engine and editor disable it by default**. If you need to change the default behavior of flipY, you can do so via the [setImageSource](/en/apis/core/#Texture2D-setImageSource) method:
```typescript
const texture = new Texture2D(engine, width, height);
-texture.setImageSource(img, 0, true); // 第 3 个参数
+texture.setImageSource(img, 0, true); // The 3rd parameter
```
-### premultiplyAlpha
+### premultiplyAlpha
-premultiplyAlpha is used to control whether the texture pre-multiplies the alpha (transparency) channel. **The engine and editor have it turned off by default**. If you need to change the default behavior of premultiplyAlpha, you can do so by using the [setImageSource](/apis/core/#Texture2D-setImageSource) method:
+premultiplyAlpha is used to control whether the texture pre-multiplies the alpha (transparency) channel. **The engine and editor have it turned off by default**. If you need to change the default behavior of premultiplyAlpha, you can do so through the [setImageSource](/en/apis/core/#Texture2D-setImageSource) method:
```typescript
const texture = new Texture2D(engine, width, height);
-texture.setImageSource(img, 0, undefined, true); // 第 4 个参数
+texture.setImageSource(img, 0, undefined, true); // The 4th parameter
```
-### format
+### format
-The engine defaults to using `TextureFormat.R8G8B8A8` as the texture format, which means red, green, blue, and alpha channels each use 1 byte, allowing color values in the range of 0 to 255 for each channel. The engine supports configuring different texture formats, for more details refer to [TextureFormat](/apis/core/#TextureFormat). For example, if we don't need to use the alpha channel, we can use `TextureFormat.R8G8B8`:
+The engine uses `TextureFormat.R8G8B8A8` as the default texture format, meaning the red, green, blue, and alpha channels each use 1 byte, so each channel can store values from 0 to 255. The engine supports configuring different texture formats; see [TextureFormat](/en/apis/core/#TextureFormat) for details. For example, if we do not need the alpha (A) channel, we can use `TextureFormat.R8G8B8`:
```typescript
const texture = new Texture2D(engine, width, height, TextureFormat.R8G8B8);
```
-
diff --git a/docs/en/how-to-contribute.mdx b/docs/en/how-to-contribute.mdx
new file mode 100644
index 0000000000..339398c9d4
--- /dev/null
+++ b/docs/en/how-to-contribute.mdx
@@ -0,0 +1,263 @@
+---
+title: How to Contribute to Documentation
+---
+
+Our documentation is based on Nextra, using `_meta.json` to define file order, page titles, page layouts, and other configurations.
+
+## Capabilities Introduction
+
+### TOC
+
+TOC (*Table-Of-Content*) allows your readers to quickly index specific content. You just need to design appropriate headings and hierarchical structures, and readers can quickly navigate through the document content using the TOC menu on the right side of the page. Of course, you can also use the TOC to constantly check whether the hierarchical structure of your document is reasonable while writing.
+
+### MDX
+
+MDX is a new document format based on `.md` that allows you to introduce React components to enrich the content of the document. The new official website also provides some built-in React components for you to use.
+
+```tsx filename="example.mdx"
+some doc content that you write before...
+
+
+You could write some additional information in that place. You could also use markdown syntax in it, such as a [link](#) or `code`.
+
+```
+
+
+You can certainly continue to use `.md` to write documents! However, you can try using `.mdx` to provide a better reading experience for readers.
+
+
+### Callout
+
+```mdx
+
+your text...
+
+```
+
+The Callout component is similar to the quote syntax `> some text` you use in markdown, but it highlights the content in a block format, making it more intuitive to convey some information.
+The Callout component provides four types: `info`, `warning`, `positive`, and `negative`, to suit different usage scenarios.
+
+For example, for general additional information, we can use the `info` type:
+
+
+ This is a piece of information that needs additional explanation. You can use **bold** syntax or `code block` syntax. Touch callbacks **depend on the physics engine**, so make sure the physics engine is initialized before using this feature.
+
+
+For operations that may affect performance, you can use the `warning` type to add a prominent reminder:
+
+
+Note that image tracking requires specifying the tracked image when adding the feature, and in WebXR, the same image will only be tracked once.
+
+
+For recommended operations or best practices, you can use the `positive` type:
+
+
+Click the `sprite atlas` asset, adjust the `texture max width` and `texture max height` in the `packing settings`, and call `pack and preview` in the `packing object` to ensure a high utilization rate of the atlas.
+
+
+Finally, for `breaking changes` or some highly discouraged practices, you can use the `negative` type:
+
+
+Note that if you do not bind the script asset to the entity's script component, the script will not run.
+
+
+### Comparison
+
+```mdx
+
+```
+
+
+
+### Image Zoom
+
+The new official website introduces `react-medium-image-zoom` to achieve the function of clicking to enlarge images. You can use the following method to achieve this:
+
+````mdx filename="your-doc.mdx"
+import { Image } from '@/mdx'
+
+
+
+````
+
+
+
+
+Since the images in our previous documentation were a mix of markdown syntax and img tags, the new version cannot correctly distinguish between the two when parsing. This makes it impossible to directly apply this feature to all images.
+For example, sometimes we display a button screenshot inline in the document, in which case the zoom component should not be used. Therefore, we need to wait for the old document's image syntax to be updated before we can fully use this feature.
+
+
+### Code Highlighting
+
+The new code highlighting feature is very powerful. You can highlight not only inline code but also standalone code blocks, which additionally support **line numbering**, **filename display**, **line highlighting**, **range highlighting**, and **keyword highlighting**.
+
+#### Inline Highlighting
+
+```md
+`let x = 1{:ts}`
+```
+
+For example, this is a piece of code embedded in text `let x = 1{:ts}`, and you can see that the TypeScript code is highlighted.
+
+#### Filename
+
+The following example shows how to display the filename and how to show line numbers.
+
+````md filename="script-component.mdx"
+```ts showLineNumbers filename="example.ts"
+class CustomScript extends Script {
+ @ignoreClone
+ a:boolean = false;
+ @assignmentClone
+ b:number = 1;
+ @shallowClone
+ c:Vector3[] = [new Vector3(0,0,0)];
+}
+```
+````
+
+The effect produced by the above syntax is as follows:
+
+```ts showLineNumbers filename="example.ts"
+class CustomScript extends Script{
+ @ignoreClone
+ a:boolean = false;
+ @assignmentClone
+ b:number = 1;
+ @shallowClone
+ c:Vector3[] = [new Vector3(0,0,0)];
+}
+```
+
+Scenarios that call for a filename may not be common, but in some cases displaying it is necessary. For example, for file content such as `project.json` or `Scene.json`, showing the filename conveys the information more intuitively.
+
+
+#### Line Highlighting
+
+````md filename="Markdown"
+```ts {1,4-5}
+async function setupDefaultScene(scene: Scene){
+ const root = scene.createRootEntity();
+ const cameraEntity = root.createChild();
+ cameraEntity.transform.setPosition(0, 0, 10);
+ cameraEntity.addComponent(Camera);
+ cameraEntity.addComponent(OrbitControl);
+}
+```
+````
+
+With syntax like `{1,4-5}`, we can highlight a single line and multiple lines at the same time.
+
+```js {1,4-5} showLineNumbers
+async function setupDefaultScene(scene: Scene) {
+ const root = scene.createRootEntity();
+ const cameraEntity = root.createChild();
+ cameraEntity.transform.setPosition(0, 0, 10);
+ cameraEntity.addComponent(Camera);
+ cameraEntity.addComponent(OrbitControl);
+}
+```
+
+#### Keyword Highlighting
+
+In some cases, you may want readers to focus on a key method or a class name. You can then use the ` ```ts /specularTexture/ ` syntax to highlight the specified code.
+
+````md filename="Markdown"
+```ts /specularTexture/
+ engine.resourceManager
+ .load({
+ type: AssetType.Env,
+ url: "https://gw.alipayobjects.com/os/bmw-prod/6470ea5e-094b-4a77-a05f-4945bf81e318.bin",
+ })
+ .then((ambientLight) => {
+ scene.ambientLight = ambientLight;
+ skyMaterial.texture = ambientLight.specularTexture;
+ skyMaterial.textureDecodeRGBM = true;
+ openDebug(ambientLight.specularTexture);
+ engine.run();
+ });
+```
+````
+The effect produced by the above syntax is as follows:
+
+```ts /specularTexture/
+engine.resourceManager
+ .load({
+ type: AssetType.Env,
+ url: "https://gw.alipayobjects.com/os/bmw-prod/6470ea5e-094b-4a77-a05f-4945bf81e318.bin",
+ })
+ .then((ambientLight) => {
+ scene.ambientLight = ambientLight;
+ skyMaterial.texture = ambientLight.specularTexture;
+ skyMaterial.textureDecodeRGBM = true;
+ openDebug(ambientLight.specularTexture);
+ engine.run();
+ });
+```
+
+### Examples
+
+#### Writing Example Code
+
+We currently provide two ways to organize example code:
+
+- Create a `.ts` example file under the `examples/` folder
+- Create an **example folder** under the `examples/` folder and expose the example code through `example-folder/index.ts`
+
+#### Embedding Examples
+
+````mdx filename="light.mdx"
+...
+
+Ambient light emits from all directions and enters the eye, as shown in the example below:
+
+
+
+...
+````
+
+
+
+## New Documentation
+
+Adding new documentation may involve more steps than before (in some cases). It should be noted in advance that in the new official website, the file path is the route. At the same time, the order and titles of the documents need to be configured using `_meta.json`.
+
+For example, to add documentation for the animation system joint:
+
+1. Add the new document `docs/animation/joint.mdx`
+2. Add frontMatter to the document. The currently supported fields are:
+ 1. **title** Document title
+ 2. **group** Document subtitle
+ 3. **banner** Header image
+3. Write the document content
+4. Modify `animation/_meta.json` to define the order of the document and the title in the sidebar
+
+For documentation internationalization, perform the same steps in the `/docs/en` folder.
+
+## Adding a Blog Post
+
+Adding a blog post is similar to adding a document, except that the frontMatter supports a few more fields.
+
+1. Add a new blog document `blog/new_blog.mdx`
+2. Add frontMatter to the blog post. The supported fields are:
+   1. **title** Blog title
+   2. **group** Blog category; multiple group names can be separated by `,`
+   3. **banner** Header image
+   4. **published** Publication date, in year-month-day format, e.g. `2024-03-14`
+   5. **author** Blog author definition, containing the three fields name, avatar, and website
+   6. **searchable** Set to `searchable: false` to prevent the post from being indexed by the built-in search engine
+   7. **summary** Blog summary, displayed on the blog list page
+
+## Adding a Changelog
+
+The process is the same as adding a blog post, except that the file is located in `/changelog`.
+
+## Adding Examples
+
+In addition to the previous single-file example development, we also allow you to split a large example into multiple files, for instance splitting the asset file list into a `json` file or moving the `dat-gui` configuration into `gui-config.ts`.
+
+To split files, simply create an example folder under `examples` and make sure it contains an `index.ts`. When the example page is generated, the other files the code depends on are automatically detected and injected into the workspace.
diff --git a/docs/en/input/framebuffer-picker.md b/docs/en/input/framebuffer-picker.md
index 170af7375a..630d41080e 100644
--- a/docs/en/input/framebuffer-picker.md
+++ b/docs/en/input/framebuffer-picker.md
@@ -1,13 +1,13 @@
---
order: 4
title: Framebuffer Picking
-type: Interaction
+type: Interact
label: Interact
---
-In 3D applications, it is often necessary to pick objects in the scene. [Ray bounding box](/en/docs/physics-manager#ray-detection) is a commonly used method for picking objects on the CPU, **which has good performance but poor accuracy** because bounding boxes are simple and cannot pick complex models.
+In 3D applications, it is often necessary to pick objects in the scene. [Ray-box intersection](/en/docs/physics/manager/#使用射线检测) is a common method for picking on the CPU. **It has good performance but poor accuracy** because bounding boxes are simple and cannot pick complex models.
-When the picking frequency is low, consider using the **pixel-level accuracy** of the `FramebufferPicker` component; when the picking frequency is high, developers need to evaluate whether the performance overhead is suitable for the business scenario because this component will perform CPU-GPU communication at the underlying level, that is, calling `gl.readPixels`.
+When the picking frequency is low, you can consider using the `FramebufferPicker` component with **pixel-level accuracy**. When the picking frequency is high, developers need to evaluate whether the performance overhead suits the business scenario, since this component performs CPU-GPU communication under the hood, i.e., it calls `gl.readPixels`.
diff --git a/docs/en/input/input.md b/docs/en/input/input.md
new file mode 100644
index 0000000000..b6b6d2a036
--- /dev/null
+++ b/docs/en/input/input.md
@@ -0,0 +1,37 @@
+---
+order: 0
+title: Interaction Overview
+type: Interaction
+label: Interact
+---
+
+Galacean provides a basic input system. Built for cross-platform use, the interaction system works consistently on both PC and mobile devices. The current interaction system can accept the following inputs:
+
+- [Touch](/en/docs/input/pointer/)
+- [Keyboard](/en/docs/input/keyboard/)
+- [Wheel](/en/docs/input/wheel/)
+
+## Initialization
+
+When initializing the engine, you can customize the listening sources for **touch**, **keyboard**, and **wheel**.
+
+
+
+```typescript
+// Set the listening source of pointer events to document
+const engine = await WebGLEngine.create({
+ canvas,
+ input: {
+ pointerTarget: document,
+ },
+});
+```
+
+> ⚠️ Do not set the listening source of touch to `window`, because `window` cannot receive `PointerLeave` events, which will cause the touch information to become inconsistent.
+
+> ⚠️ If you set the listening source of the keyboard to a certain `HtmlElement`, you need to set its `tabIndex` so that it can be focused. For example, you can call `canvas.tabIndex = canvas.tabIndex;` once.
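+
+As a hedged sketch (assuming the input options also accept a `keyboardTarget`, mirroring the `pointerTarget` option above; verify against your engine version):
+
+```typescript
+// Listen for keyboard events on the canvas instead of the default target.
+const canvas = document.getElementById("canvas") as HTMLCanvasElement;
+canvas.tabIndex = 0; // make the canvas focusable so it can receive key events
+const engine = await WebGLEngine.create({
+  canvas,
+  input: {
+    keyboardTarget: canvas,
+  },
+});
+```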
+
+## Framebuffer Picking
+
+If the engine's [touch callback](/en/docs/input/pointer/#触控回调) cannot meet your needs, you can try using [framebuffer picking](/en/docs/input/framebuffer-picker/)
+
diff --git a/docs/en/input/keyboard.md b/docs/en/input/keyboard.md
index b48720e30f..657325ed5b 100644
--- a/docs/en/input/keyboard.md
+++ b/docs/en/input/keyboard.md
@@ -1,23 +1,23 @@
---
order: 2
title: Keyboard
-type: Interaction
+type: Interact
label: Interact
---
-Galacean supports developers to query the current keyboard interaction status at any time, and the API calls are very simple.
+Galacean lets developers query the current keyboard interaction state at any time, and the interface is very simple to call.
## Methods
-| Method Name | Method Definition |
-| --------------------------------------------------------- | ---------------------------- |
-| [isKeyHeldDown](/apis/core/#InputManager-isKeyHeldDown) | Returns whether the key is held down continuously |
-| [isKeyDown](/apis/core/#InputManager-isKeyDown) | Returns whether the key was pressed during the current frame |
-| [isKeyUp](/apis/core/#InputManager-isKeyUp) | Returns whether the key was released during the current frame |
+| Method Name | Description |
+| --------------------------------------------------------- | -------------------------- |
+| [isKeyHeldDown](/en/apis/core/#InputManager-isKeyHeldDown) | Returns whether the key is being held down |
+| [isKeyDown](/en/apis/core/#InputManager-isKeyDown) | Returns whether the key was pressed in the current frame |
+| [isKeyUp](/en/apis/core/#InputManager-isKeyUp) | Returns whether the key was released in the current frame |
## Quick Start
-Below are simple examples of checking the key status.
+Below is a simple example of detecting key states.
```typescript
class KeyScript extends Script {
@@ -36,32 +36,31 @@ class KeyScript extends Script {
}
```
-## Practical Use
+## Practical Example
-Let's control the angry bird with the space key this time.
+This time, let's use the spacebar to control Angry Birds.
## State Dictionary
-| Key State | isKeyHeldDown | isKeyDown | isKeyUp |
-| -------------------------------------------------------- | ------------- | --------- | ------- |
-| Key has been held down since the previous frame | true | false | false |
-| Key was pressed during the current frame and not released| true | true | false |
-| Key was released and pressed again during the current frame| true | true | true |
-| Key was pressed and released during the current frame | false | true | true |
-| Key was released during the current frame | false | false | true |
-| Key is not pressed and has no interaction | false | false | false |
-| This scenario will not occur | true | false | true |
-| This scenario will not occur | false | true | false |
+| Key State | isKeyHeldDown | isKeyDown | isKeyUp |
+| --------------------------- | ------------- | --------- | ------- |
+| The key has been held down since the last frame | true | false | false |
+| The key was pressed in the current frame and not released | true | true | false |
+| The key was released and pressed again in the current frame | true | true | true |
+| The key was pressed and released in the current frame | false | true | true |
+| The key was released in the current frame | false | false | true |
+| The key was not pressed and had no interaction | false | false | false |
+| This situation will not occur | true | false | true |
+| This situation will not occur | false | true | false |
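+
+The table above can be exercised with a small sketch (assuming `Keys.Space`; key names follow the Keys enumeration below):
+
+```typescript
+import { Keys, Script } from "@galacean/engine";
+
+class SpaceProbe extends Script {
+  onUpdate(): void {
+    const { inputManager } = this.engine;
+    if (inputManager.isKeyDown(Keys.Space)) {
+      console.log("Space was pressed in this frame");
+    }
+    if (inputManager.isKeyHeldDown(Keys.Space)) {
+      console.log("Space is being held down");
+    }
+    if (inputManager.isKeyUp(Keys.Space)) {
+      console.log("Space was released in this frame");
+    }
+  }
+}
+```
+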
## Keys
-The keyboard keys enumerated by Galacean correspond one-to-one with physical keyboard keys, following the W3C standard, and are compatible with various special keys on different hardware.
+The keyboard Keys enumerated by Galacean correspond one-to-one with the physical keyboard, following W3C standards, and are compatible with various special keys on different hardware.
-Keys Enumeration: [Keys.ts](https://github.com/galacean/engine/blob/main/packages/core/src/input/enums/Keys.ts)
+Keys Enumeration: https://github.com/galacean/engine/blob/main/packages/core/src/input/enums/Keys.ts
-W3C Standard: [W3C UI Events Code](https://www.w3.org/TR/2017/CR-uievents-code-20170601/)
-
-Keyboard Input Design Approach: [Keyboard Input Design](https://github.com/galacean/engine/wiki/Keyboard-Input-design)
+W3C Standard: https://www.w3.org/TR/2017/CR-uievents-code-20170601/
+
+Keyboard Input Design Philosophy: https://github.com/galacean/engine/wiki/Keyboard-Input-design
diff --git a/docs/en/input/pointer.md b/docs/en/input/pointer.md
index 5364b735ef..01e96f6d11 100644
--- a/docs/en/input/pointer.md
+++ b/docs/en/input/pointer.md
@@ -1,23 +1,23 @@
---
order: 1
title: Touch
-type: Interaction
+type: Interact
label: Interact
---
-Galacean's touch is based on [PointerEvent](https://www.w3.org/TR/pointerevents3/), which bridges the gap between [MouseEvent](https://developer.mozilla.org/zh-CN/en/docs/Web/API/MouseEvent) and [TouchEvent](https://developer.mozilla.org/zh-CN/en/docs/Web/API/TouchEvent), unifying the concept and interface of touch.
+Galacean's touch is based on [Pointer](https://www.w3.org/TR/pointerevents3/), which smooths out the differences between [Mouse](https://developer.mozilla.org/zh-CN/docs/Web/API/MouseEvent) and [Touch](https://developer.mozilla.org/zh-CN/docs/Web/API/TouchEvent), unifying touch in both concept and interface.
-## Pointer
+## Pointer
-In Galacean, whether it's a mouse on a PC, a stylus or finger on a mobile device, when it performs corresponding actions within the touch area (**Down**, **Move**, etc), it is instantiated as a [Pointer](/apis/core/#Pointer), and you can access all active touch points in the [InputManager](/apis/core/#InputManager).
+In Galacean, whether it's a mouse on a PC, a stylus or a finger on a mobile device, when it performs the corresponding behavior within the touch range (**Down**, **Move**, etc.), it will be instantiated as a [Pointer](/en/apis/core/#Pointer). You can get all the currently active touch points in the [InputManager](/en/apis/core/#InputManager).
-> It is important to note that each touch point is independent, responding to events and invoking corresponding callback functions.
+> It should be noted that each touch point is independent of the others; each responds to the corresponding events and invokes the corresponding hook functions.
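+
+A minimal sketch of reading the active touch points from a script (property names follow the core API):
+
+```typescript
+import { Script } from "@galacean/engine";
+
+class PointerProbe extends Script {
+  onUpdate(): void {
+    // All currently active touch points, regardless of input device.
+    const pointers = this.engine.inputManager.pointers;
+    for (const pointer of pointers) {
+      console.log(pointer.id, pointer.phase, pointer.position);
+    }
+  }
+}
+```
+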
-### Lifecycle
+### Lifecycle
-Each touch point begins its own life in **PointerDown** or **PointerMove**, fades away in **PointerLeave** or **PointerCancel**, and in Galacean, you can use `Pointer.phase` to get the current status of the touch point.
+Each touch point will start its life in **PointerDown** or **PointerMove**, and leave the stage in **PointerLeave** or **PointerCancel**. In Galacean, you can get the real-time status of this touch point through `Pointer.phase`.
```mermaid
timeline
@@ -38,59 +38,59 @@ timeline
-### Touch Buttons
+### Touch Buttons
-Referring to the [W3C standard](https://www.w3.org/TR/uievents/#dom-mouseevent-button) and [Microsoft documentation](https://learn.microsoft.com/en-us/dotnet/api/system.windows.input.mousebutton?view=windowsdesktop-6.0), Galacean defines touch buttons as follows:
+Referring to the [W3C standard](https://www.w3.org/TR/uievents/#dom-mouseevent-button) and [Microsoft related documentation](https://learn.microsoft.com/en-us/dotnet/api/system.windows.input.mousebutton?view=windowsdesktop-6.0), Galacean defines touch buttons as follows:
| Enumeration | Explanation |
| :---------------------------------------------- | :--------------------------------------------------------------- |
-| [None](/apis/core/#PointerButton-None) | No touch button pressed |
-| [Primary](/apis/core/#PointerButton-Primary) | Primary button of the device, usually left button (mouse) or the only button on a single-button device (finger) |
-| [Secondary](/apis/core/#PointerButton-Secondary) | Secondary button of the device, usually right button (mouse) |
-| [Auxiliary](/apis/core/#PointerButton-Auxiliary) | Auxiliary button of the device, usually the scroll wheel (mouse) |
-| [XButton1](/apis/core/#PointerButton-XButton1) | Extended button of the device, usually the undo button (mouse) |
-| [XButton2](/apis/core/#PointerButton-XButton2) | Extended button of the device, usually the redo button (mouse) |
-| [XButton3](/apis/core/#PointerButton-XButton3) | Extended button |
-| [XButton4](/apis/core/#PointerButton-XButton4) | Extended button |
+| [None](/en/apis/core/#PointerButton-None) | No touch button pressed |
+| [Primary](/en/apis/core/#PointerButton-Primary) | The primary button of the device, usually the left button (mouse) or the only button on a single-button device (finger) |
+| [Secondary](/en/apis/core/#PointerButton-Secondary) | The secondary button of the device, usually the right button (mouse) |
+| [Auxiliary](/en/apis/core/#PointerButton-Auxiliary) | The auxiliary button of the device, usually the wheel (mouse) |
+| [XButton1](/en/apis/core/#PointerButton-XButton1) | The extended button of the device, usually the undo button (mouse) |
+| [XButton2](/en/apis/core/#PointerButton-XButton2) | The extended button of the device, usually the redo button (mouse) |
+| [XButton3](/en/apis/core/#PointerButton-XButton3) | Extended button |
+| [XButton4](/en/apis/core/#PointerButton-XButton4) | Extended button |
| …… | …… |
-结合触控按键可以方便地检测触控点在本帧触发的行为:
+Combining touch buttons can easily detect the behavior of touch points triggered in this frame:
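+
+For example, a hedged sketch of per-frame button queries on the `InputManager`:
+
+```typescript
+const { inputManager } = engine;
+if (inputManager.isPointerDown(PointerButton.Primary)) {
+  console.log("Primary button was pressed in this frame");
+}
+if (inputManager.isPointerHeldDown(PointerButton.Secondary)) {
+  console.log("Secondary button is being held down");
+}
+```
+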
-### 触控回调
+### Touch Callbacks
-只需要为添加了 Collider 组件的 Entity 增加触控回调,就可以实现与渲染物体交互的能力。触控回调已经整合到引擎的[脚本组件生命周期](/en/docs/script#组件生命周期函数)中,用户可以很方便地添加以下事件,同时钩子函数中会携带触发此回调的 Pointer 实例。
+You only need to add touch callbacks to an Entity with a Collider component to enable interaction with rendered objects. Touch callbacks are integrated into the engine's [script lifecycle](/en/docs/script/class/#脚本生命周期), allowing users to easily add the following events. The hook functions will carry the Pointer instance that triggered the callback.
-| 接口 | 触发时机与频率 |
-| :------------------------------------------------- | :------------------------------------------------------------------------- |
-| [onPointerEnter](/apis/core/#Script-onPointerEnter) | 当触控点进入 Entity 的碰撞体范围时触发一次 |
-| [onPointerExit](/apis/core/#Script-onPointerExit) | 当触控点离开 Entity 的碰撞体范围时触发一次 |
-| [onPointerDown](/apis/core/#Script-onPointerDown) | 当触控点在 Entity 的碰撞体范围内按下时触发一次 |
-| [onPointerUp](/apis/core/#Script-onPointerUp) | 当触控点在 Entity 的碰撞体范围内松开时触发一次 |
-| [onPointerClick](/apis/core/#Script-onPointerClick) | 当触控点在 Entity 的碰撞体范围内按下并松开,在松开时触发一次 |
-| [onPointerDrag](/apis/core/#Script-onPointerDrag) | 当触控点在 Entity 的碰撞体范围内按下时**持续**触发,直至触控点解除按下状态 |
+| Interface | Trigger Timing and Frequency |
+| :-------------------------------------------------- | :------------------------------------------------------------------------- |
+| [onPointerEnter](/en/apis/core/#Script-onPointerEnter) | Triggered once when the touch point enters the Entity's collider range |
+| [onPointerExit](/en/apis/core/#Script-onPointerExit) | Triggered once when the touch point leaves the Entity's collider range |
+| [onPointerDown](/en/apis/core/#Script-onPointerDown) | Triggered once when the touch point is pressed within the Entity's collider range |
+| [onPointerUp](/en/apis/core/#Script-onPointerUp) | Triggered once when the touch point is released within the Entity's collider range |
+| [onPointerClick](/en/apis/core/#Script-onPointerClick) | Triggered once when the touch point is pressed and released within the Entity's collider range |
+| [onPointerDrag](/en/apis/core/#Script-onPointerDrag) | Continuously triggered when the touch point is pressed within the Entity's collider range until the touch point is no longer pressed |
-> ⚠️ 触控回调**依赖物理引擎**,使用此功能前请确保物理引擎已初始化完毕。
+> ⚠️ Touch callbacks **depend on the physics engine**. Please ensure the physics engine is initialized before using this feature.
-如下示例:
+Example:
-- 最左边的立方体添加了对 Enter 与 Exit 的响应,当鼠标移动到上方和鼠标移出时便会触发它颜色的改变。
-- 中间的立方体添加了对 Drag 的响应,你可以用鼠标拖拽这个立方体在空间内任意移动。
-- 最右边的立方体添加了对 Click 的响应(先 down 后 up ),当鼠标点击时会触发它颜色的改变。
+- The leftmost cube responds to Enter and Exit events, changing color when the mouse moves over it and when the mouse leaves.
+- The middle cube responds to Drag events, allowing you to drag the cube anywhere in space with the mouse.
+- The rightmost cube responds to Click events (first down, then up), changing color when the mouse clicks on it. A sketch of such a script follows below.
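+
+A hedged sketch of the rightmost cube's click response (assuming the entity has a collider and a `MeshRenderer` with a `BlinnPhongMaterial`):
+
+```typescript
+import { BlinnPhongMaterial, MeshRenderer, Script } from "@galacean/engine";
+
+class ClickColorScript extends Script {
+  onPointerClick(): void {
+    const renderer = this.entity.getComponent(MeshRenderer);
+    const material = renderer.getMaterial() as BlinnPhongMaterial;
+    // Assign a random color each time the entity is clicked.
+    material.baseColor.set(Math.random(), Math.random(), Math.random(), 1);
+  }
+}
+```
+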
-### 射线检测
+### Raycasting
-触控回调是基于射线检测实现的,若要自定义射线检测也十分简单,只需按照如下步骤即可。
+Touch callbacks are implemented based on raycasting. Customizing raycasting is also very simple, just follow these steps.
```mermaid
flowchart LR
- 添加碰撞体组件 --> 获取触控点 --> 通过画布坐标获取射线 --> 射线检测
+ Add Collider Component --> Get Touch Point --> Get Ray from Canvas Coordinates --> Raycasting
```
-添加碰撞体组件可参考[物理相关文档](/en/docs/physics-collider),实现检测部分的代码逻辑如下:
+Refer to [Collider Component](/en/docs/physics/collider/) for adding a collider component. The code logic for implementing the detection part is as follows:
```typescript
// Assume there is currently an active touch point
@@ -104,31 +104,31 @@ if (scene.physics.raycast(ray, 100, hitResult)) {
}
```
-通过下方示例可以更直观地理解此过程,示例中为主相机添加了辅助线,侧视相机可以完整观察到主相机射线检测到碰撞体的过程。
+The following example provides a more intuitive understanding of this process. The main camera is equipped with auxiliary lines, and the side view camera can fully observe the process of the main camera's raycasting detecting the collider.
-## 兼容性
+## Compatibility
-截止 2024 年 2 月,不同平台对 PointerEvent 的兼容性已经达到了 [96.35%](https://caniuse.com/?search=PointerEvent) 。
+As of February 2024, the compatibility of PointerEvent across different platforms has reached [96.35%](https://caniuse.com/?search=PointerEvent).
-设计思路可参考:https://github.com/galacean/engine/wiki/Input-system-design.
+Design ideas can be referenced at: https://github.com/galacean/engine/wiki/Input-system-design.
-> ⚠️ 若遇到平台的兼容性问题,可以在 https://github.com/galacean/polyfill-pointer-event 提 issue 。
+> ⚠️ If you encounter compatibility issues on a platform, you can raise an issue at https://github.com/galacean/polyfill-pointer-event.
## QA
-### 触控在 PC 端正常,但在移动端异常
+### Touch works fine on PC but behaves abnormally on mobile devices
-在移动端,触控会触发 HTML 元素的默认行为,一旦触发默认行为,触控就会从元素上被移除(PointerCancel),可以通过设置监听源的 `touchAction` 解决,若触控的监听源为默认画布:
+On mobile devices, touch interactions trigger the default behavior of HTML elements. Once the default behavior is triggered, the touch interaction is removed from the element (PointerCancel). This can be resolved by setting the `touchAction` of the listening source. If the listening source is the default canvas:
```typescript
(engine.canvas._webCanvas as HTMLCanvasElement).style.touchAction = "none";
```
-### 右键操作失效,弹出菜单栏
+### Right-click operation fails, context menu pops up
-这是由于右键触发系统默认行为导致的,可以加入下列代码阻止:
+This is caused by the default behavior triggered by the right-click. You can add the following code to prevent it:
```typescript
document.oncontextmenu = (e) => {
diff --git a/docs/en/input/wheel.md b/docs/en/input/wheel.md
index 62011ae119..fc7cf376b0 100644
--- a/docs/en/input/wheel.md
+++ b/docs/en/input/wheel.md
@@ -1,17 +1,17 @@
---
order: 3
-title: Scroll Wheel
-type: Interaction
+title: Wheel
+type: Interact
label: Interact
---
-The scroll wheel input in Galacean is based on [WheelEvent](https://www.w3.org/TR/uievents/#interface-wheelevent).
+Galacean's wheel input is based on [WheelEvent](https://www.w3.org/TR/uievents/#interface-wheelevent).
## Usage
-You can use this to implement an example of controlling the camera distance with the scroll wheel.
+Here is an example of using the wheel to control the camera distance.
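+
+A hedged sketch of such a control, attached to the camera entity (`wheelDelta` follows the `InputManager` API):
+
+```typescript
+import { Script } from "@galacean/engine";
+
+class WheelZoom extends Script {
+  onUpdate(): void {
+    const delta = this.engine.inputManager.wheelDelta;
+    if (delta.y !== 0) {
+      // Move the camera along its local z-axis to zoom in or out.
+      this.entity.transform.translate(0, 0, delta.y * 0.01);
+    }
+  }
+}
+```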
diff --git a/docs/en/interface/hierarchy.mdx b/docs/en/interface/hierarchy.mdx
new file mode 100644
index 0000000000..b126dd8dfc
--- /dev/null
+++ b/docs/en/interface/hierarchy.mdx
@@ -0,0 +1,93 @@
+---
+order: 3
+title: Hierarchy Panel
+type: Basics
+group: Interface
+label: Basics/Interface
+---
+
+The hierarchy panel is located on the far left side of the editor. It displays all the nodes in the current scene in a tree structure. The scene node is the parent node of all other nodes, including cameras, lights, meshes, etc.
+
+
+
+In the hierarchy panel, you can:
+
+- Add, delete, or clone a node
+- Copy the path information of a node
+- Adjust the hierarchy of nodes by dragging
+- Fuzzy search for nodes in the scene
+- Temporarily hide a node
+
+## Adding, Deleting, and Copying Nodes
+
+### Adding Nodes
+
+> You can add empty nodes or quickly add nodes with corresponding functional components, such as nodes with camera components, light source components, and basic 3D/2D rendering components.
+
+You can click the "+" button in the hierarchy tree panel to add a node. Note that if you have a node selected, the added node will become a **child node of the selected node**; otherwise, it will default to being a child node of the scene:
+
+
+
+After adding, you can edit the properties of the new node in the **[Inspector Panel](/en/docs/interface/inspector)**.
+
+### Deleting Nodes
+
+To delete a node, you can use the following methods:
+
+- Select the node to be deleted -> Click the delete button
+- Select the node to be deleted -> Press the Delete key
+- Right-click a node -> Delete
+
+
+
+
+ Deleting a node will delete the node and all its child nodes. So when deleting a node, you need to be aware of whether the deleted node will affect other nodes in the scene.
+
+
+### Copying Nodes
+
+> Copying a node will copy the selected node and all its child nodes, essentially calling the engine's [clone](/en/docs/core/clone) capability.
+
+After selecting a node, you can quickly clone the node at the same level by using `Duplicated`.
+
+
+
+You can also choose `copy` and `paste` separately to achieve cross-level copying.
+
+
+
+Additionally, you can quickly copy the selected node using the shortcut key ⌘ + D.
+
+## Node Sorting
+
+To better organize nodes, you can sort nodes by dragging. After selecting a node, you can change its position in the hierarchy tree by dragging it with the left mouse button.
+
+
+
+## Node Search
+
+There is a search box at the top of the hierarchy panel where users can enter the name of a node to search for nodes in the scene. The search box supports fuzzy search, allowing you to enter part of the node's name to find it.
+
+## Node Hiding
+
+Each entity node has an eye button on the right side. Clicking it toggles the node's visibility in the scene.
+
+> Note that adjusting the node's visibility here only affects the workspace and not the `isActive` property in the **[Inspector Panel](/en/docs/interface/inspector)**.
+
+## Shortcuts
+
+The following operations are effective only after selecting a node.
+
+| Operation | Shortcut |
+| :---------------- | :----------------------------------------- |
+| `Delete Node` | Backspace or Delete |
+| `Copy Node` | ⌘ + D |
+| `Select Previous Node` | ↑ |
+| `Select Next Node` | ↓ |
+| `Expand Node` | → |
+| `Collapse Node` | ← |
+
diff --git a/docs/en/interface/inspector.md b/docs/en/interface/inspector.md
index c6c16fe71d..4baf36fcac 100644
--- a/docs/en/interface/inspector.md
+++ b/docs/en/interface/inspector.md
@@ -6,67 +6,65 @@ group: Interface
label: Basics/Interface
---
-The Inspector Panel is located on the right side of the editor and is the most commonly used panel while using the editor. Depending on what you have currently selected, the Inspector Panel will display the corresponding properties. You can use the Inspector Panel to edit almost everything in the scene, such as scenes, entities, components, assets, and more.
+The Inspector panel is located on the right side of the editor and will be the most frequently used panel during your use of the editor. Based on what you currently have selected, the Inspector panel will display the corresponding properties. You can use the Inspector panel to edit almost everything in the scene, such as scenes, entities, assets, etc.
-
-
-
- Scene Inspector
-
-
-
- Entity Inspector
-
-
-
- Asset Inspector
-
-
+
+## Types of Inspectors
-## Property Types
+### Scene Inspector
-The properties in the Inspector Panel can be divided into two main categories:
+
-- **Basic Value Types**: Number adjustments, color selection, property toggles, etc.
-- **Reference Types**: Usually resources, such as material selection, texture selection, etc.
+The scene is at the top of the hierarchy tree. By clicking on the scene, you can see that the Inspector provides adjustments for scene-related effects such as ambient light, background, shadows, fog, etc. For detailed parameters on how to edit these elements, see [Scene](/en/docs/core/scene).
+
+
+
+### Entity Inspector
+
+The Entity Inspector is the most commonly used inspector. Its properties include the current entity's component list. You can easily modify the properties of a component or conveniently add any built-in engine components and custom script components through the **Add Component** button. The Entity Inspector also includes basic information about the current entity, such as `Transform`, `Layer`, etc. For more details, see [Entity](/en/docs/core/entity).
+
+
+
+### Asset Inspector
-### Number Adjustments
+After selecting an asset in the asset panel, the Inspector will display the various properties of the current asset and provide a previewer to show the editing results in real-time. The following image is a screenshot of the Inspector for a material asset.
+
+
+
+## Using Inspector Controls
+
+Inspector controls can be divided into two main categories:
+
+- **Basic Value Types**: Number adjustment, color selection, property toggling, etc.
+- **Reference Types**: Usually resources, such as material selection, texture selection, etc.
-There are many number adjustment entries available in the Inspector. For different properties, the range of numbers that can be adjusted and the size of each adjustment may vary. The most typical example is adjusting the position, rotation, and scale values of the `Transform` component.
+### Number Adjustment Controls
-You can quickly adjust the size of numbers by dragging the slider on the right side of the input box. While dragging, holding down `⌘` (or `ctrl` on Windows) allows for more precise adjustments to the numbers (precision is 1/10 of the original step).
+The Inspector provides many entry points for number adjustments. Depending on the property, the range of adjustable numbers and the size of each adjustment will vary. The most typical example is adjusting the position, rotation, and scale values of the `Transform` component.
-
+You can quickly adjust the number size by dragging the slider on the right side of the input box. While dragging, holding down ⌘ (on Windows, Ctrl) allows for more precise number adjustments (precision is 1/10 of the original step).
-Some adjustable properties appear in the form of sliders. You can drag the slider to quickly adjust the size of numbers, such as the `Intensity` of a light. Similarly, while dragging the slider, holding down `⌘` (or `ctrl` on Windows) allows for more precise adjustments to the numbers.
+
-
+Some adjustable properties appear in the form of sliders. You can drag the slider to quickly adjust the number size, such as the `Intensity` of a light. Similarly, while dragging the slider, holding down `⌘` (on Windows, `ctrl`) allows for more precise number adjustments.
-There are also number adjustment properties that appear in the form of input boxes and buttons, such as the `Near Plane` of a shadow. These properties often have more precise step sizes (such as 0.1, 1, 10). Clicking the button directly increases or decreases the value by the step length.
+
-
+Some number adjustment properties appear in the form of input boxes and buttons, such as the `Near Plane` of shadows. These properties often have more precise step sizes (e.g., 0.1, 1, 10). Clicking the buttons can directly increase or decrease the value by the step length.
-### Color Panel
+
-Some properties need color adjustments, such as lighting, scene background color, or the self-illuminating color of materials. To adjust colors, you need to click on the color button on the left to bring up the color picker. In the color picker, you can use HUE to select colors, adjust the color's transparency; you can also adjust the specific RGBA values of the color in the input box. Click the button to switch between HSLA, RGBA, and HEXA modes.
+### Color Picker
-
+Some properties require color adjustments, such as lighting, the scene background color, or the emissive color of a material. To adjust a color, click the color button on the left to bring up the color picker. In the color picker, you can use the hue bar to select a color and adjust its transparency, or enter exact RGBA values in the input boxes. Click the button to switch between HSLA, RGBA, and HEXA modes.
-### Asset Selection Popup
+
-Some properties need to reference the required assets. In this case, you can click on the input box of the asset selector to bring up the asset selection popup. Different properties require different types of assets, but the asset selector has already been pre-configured with the corresponding filters, so you can select directly.
+### Asset Picker
-The asset selection popup also provides a search box that you can use to find the corresponding assets more accurately.
+Some properties need to reference an asset. In this case, click the input box of the asset picker to bring up the asset picker popup. Different properties require different types of assets, but the picker comes preconfigured with the appropriate filters, so you can make a selection directly.
-
-
-
- Mesh Asset Picker
-
-
-
- Texture Asset Picker
-
-
+The asset picker also provides a search box, which you can use to find the corresponding assets more precisely.
+
diff --git a/docs/en/interface/intro.mdx b/docs/en/interface/intro.mdx
new file mode 100644
index 0000000000..3918cfb376
--- /dev/null
+++ b/docs/en/interface/intro.mdx
@@ -0,0 +1,41 @@
+---
+order: 0
+title: Interface Overview
+type: Basics
+group: Interface
+label: Basics/Interface
+---
+
+## Home
+
+
+
+| No. | Area | Description |
+| --- | ------------ | ------------------------------------------------------------------------------------- |
+| 1 | **Create Project** | You can create a new 3D project or 2D project |
+| 2 | **Projects** | You can view all projects, double-click to enter a project |
+| 3 | **Sidebar** | Besides the projects page, you can also access [business templates](/en/docs/interface/template), documentation, and the editor discussion area |
+
+## Scene Editing Page
+
+
+
+| No. | Area | Description |
+| --- | --- | --- |
+| 1 | Sidebar | Contains the main menu of the editor, panel switch buttons, and personalization settings |
+| 2 | [Hierarchy Panel](/en/docs/interface/hierarchy) | Located on the left side of the editor, it displays all nodes in the entire scene |
+| 3 | [Assets Panel](/en/docs/assets/interface) | Located at the bottom of the editor, it displays all assets included in the current project, such as HDR maps, models, various texture files, scripts, font files, etc. |
+| 4 | [Inspector Panel](/en/docs/interface/inspector) | Located on the right side of the editor, it shows different editing options based on your current selection |
+| 5 | [Main Editing Area](/en/docs/interface/viewport) | Located in the middle of the editor, it is the main operation area of the editor, where you can edit the scene using the mouse and keyboard |
+| 6 | Toolbar | Located at the top of the editor, it provides some quick operations, such as switching Gizmo modes, switching scene views, camera configurations, etc. |
+| 7 | Camera Preview Area | Located at the top left of the main editing area, you can preview the scene from the perspective of the currently selected camera |
+| 8 | [Animation Clip Editing](/en/docs/animation/clip) | Double-click the AnimationClip asset or click the button in the panel menu to invoke it, where you can edit the specified AnimationClip asset |
+| 9 | [Animation Controller Editor](/en/docs/animation/animator) | Double-click the AnimatorController asset or click the button in the panel menu to invoke it, where you can edit the specified AnimatorController asset |
+
+For a detailed introduction to each panel, click the links above.
diff --git a/docs/en/interface/menu.mdx b/docs/en/interface/menu.mdx
new file mode 100644
index 0000000000..bbe2b13124
--- /dev/null
+++ b/docs/en/interface/menu.mdx
@@ -0,0 +1,37 @@
+---
+order: 2
+title: Main Menu
+type: Basics
+group: Interface
+label: Basics/Interface
+---
+
+By clicking the first button on the sidebar, you can bring up the main menu. The main menu provides project settings, new/clone project options, and some editing options.
+
+
+
+### New/Clone Project
+
+Select the **New Project** option to choose among different types of new projects. Clicking **Fork** redirects to the newly cloned project page; the original project is retained.
+
+### Project Settings
+
+Clicking the **Project Settings** option will bring up the project settings popup, which includes operations such as project renaming, engine version management, and snapshot management.
+
+
+
+#### Basic Settings
+
+**Basic** includes the basic information settings of the project:
+
+- Engine Version: Upgrade the engine version to quickly pick up bug fixes and new features.
+- Physics Backend: The physics engine backend; you can choose either _Physics Lite_ or _PhysX_. The former is a lightweight physics engine, while the latter is an advanced physics engine based on [PhysX](https://developer.nvidia.com/physx-sdk).
+- Model Import Options: Options applied when importing models, such as computing tangents and removing lights.
+
+The engine version upgrade operation is irreversible. To avoid breaking the project, the project is automatically cloned during the engine upgrade.
+
+#### Snapshot Management
+
+The **Snapshots** feature allows users to save a snapshot of the project to its history. If data is lost or other problems occur, you can use **Revert** to quickly restore a previously saved snapshot. Select **Add Snapshot** from the menu to save one; clicking a snapshot's name lets you rename it for easier identification later.
+
+
diff --git a/docs/en/interface/shortcut.md b/docs/en/interface/shortcut.md
index 9a82655103..fb7f5ea53a 100644
--- a/docs/en/interface/shortcut.md
+++ b/docs/en/interface/shortcut.md
@@ -1,11 +1,12 @@
---
-order: 7
-title: Keyboard Shortcuts
+order: 8
+title: Shortcuts
type: Basics
group: Interface
label: Basics/Interface
---
-Keyboard shortcuts help improve efficiency in editing scenarios. Users can find mouse (or trackpad) and keyboard viewport controls in the main menu under `Shortcuts`, as well as global and panel-specific shortcuts.
+Shortcuts help improve the efficiency of editing scenes. Users can find the viewport control methods for the mouse (or touchpad) and keyboard, as well as global and panel-specific shortcuts, in the main menu under `Shortcuts`.
+
diff --git a/docs/en/interface/template.md b/docs/en/interface/template.md
new file mode 100644
index 0000000000..3388f33af9
--- /dev/null
+++ b/docs/en/interface/template.md
@@ -0,0 +1,12 @@
+---
+order: 1
+title: Business Template
+type: Basic Knowledge
+group: Interface
+label: Basics/Interface
+---
+
+Business templates are currently divided into categories such as animation, special effects, 2D, 3D, XR, and digital humans. By clicking on a template, you can view project details, preview the project, and replicate the project locally.
+
+
+
diff --git a/docs/en/interface/viewport.md b/docs/en/interface/viewport.md
index f6c2a0dc95..15253091e5 100644
--- a/docs/en/interface/viewport.md
+++ b/docs/en/interface/viewport.md
@@ -1,5 +1,5 @@
---
-order: 5
+order: 6
title: Viewport
type: Basics
group: Interface
@@ -8,78 +8,95 @@ label: Basics/Interface
## Introduction
-The viewport is an interactive interface used to select, position, and modify various types of entities and components in the current scene.
+The viewport window is an interactive interface used to select, position, and modify the various types of entities and components in the current scene.
-
+
-## Scene Navigation
+## Browsing the Scene
-There are two ways to navigate the scene: Standard Mode and Fly Mode. Standard Mode revolves around the central viewpoint, while Fly Mode is suitable for browsing large scenes, allowing the scene camera to move forward, backward, left, right, up, and down in 3D space.
+There are two ways to browse the scene: standard mode and flight mode. Standard mode rotates around the center viewpoint, while flight mode is suitable for browsing large scenes, where the scene camera moves forward, backward, left, right, up, and down in three-dimensional space.
-| Mode | Action | Shortcut |
-| :----------- | :----------- | ---------------------------------------------------------------------- |
-| **Standard Mode** | Orbit | `alt` + Left Mouse Button |
-| | Pan | `alt` + `command` + Left Mouse Button, or press the mouse wheel button |
-| | Zoom | `alt` + `control` + Left Mouse Button, or scroll the mouse wheel, or swipe with two fingers on the trackpad |
-| **Fly Mode** | Orbit Camera | alt + Right Mouse Button |
-| | Move Forward | Up Arrow key, or Right Mouse Button + `W` |
-| | Move Backward| Down Arrow key, or Right Mouse Button + `S` |
-| | Move Left | Left Arrow key, or Right Mouse Button + `A` |
-| | Move Right | Right Arrow key, or Right Mouse Button + `D` |
-| | Move Up | Right Mouse Button + `E` |
-| | Move Down | Right Mouse Button + `Q` |
-| | Change Speed | Right Mouse Button + Mouse Wheel |
+| Mode | Operation | Shortcut Key |
+| :------------ | :-------------- | ---------------------------------------------------------------------- |
+| **Standard Mode** | Orbit | `alt` + left mouse button |
+| | Pan | `alt` + `command` + left mouse button, or press the mouse wheel |
+| | Zoom | `alt` + `control` + left mouse button, or scroll the mouse wheel, or swipe with two fingers on the touchpad |
+| **Flight Mode** | Look around | `alt` + right mouse button |
+| | Move forward | Up arrow key, or right mouse button + `W` |
+| | Move backward | Down arrow key, or right mouse button + `S` |
+| | Move left | Left arrow key, or right mouse button + `A` |
+| | Move right | Right arrow key, or right mouse button + `D` |
+| | Move up | Right mouse button + `E` |
+| | Move down | Right mouse button + `Q` |
+| | Change flight speed | Right mouse button + mouse wheel |
## Toolbar
-The toolbar is located at the top of the viewport window, and hovering the mouse will display the shortcut keys for each item or a brief description.
-
-
-
-| Icon | Name | Description | Shortcut |
-| -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------ | -------------------------------------------------------------------------------------------------------------------------------------------------------- | ------------------------ |
-| | Drag | Drag the view | |
-| | Move Rotate Scale | Transform selected entities | `W` `E` `R` |
-| | Center Anchor/Pivot Anchor| Toggle the anchor point of selected entities | |
-| | Local Coordinates/World Coordinates | Toggle the coordinates of selected entities | |
-| | Focus | Focus the scene camera on the selected entity | `F` |
-| | Scene Camera | The scene camera menu contains options for configuring the scene camera, mainly used to address issues where objects are not visible due to the clipping plane being too far or too close when building a scene. These adjustments do not affect the settings of entities with camera components in the scene. | |
-| | Settings | The settings menu includes options for adjusting the display of auxiliary views, including grids, gizmos (graphics associated with specific components in the scene, including cameras, directional lights, point lights, spotlights), and wireframes. | |
-| | Scene Camera Type | Switch between perspective and orthographic cameras | |
-| | Mode | Conveniently switch between 2D/3D scene modes. In 2D mode, the navigation widget, orthographic/perspective switch, and orbit track in the navigation are disabled. | |
-| | Full Screen/Restore | Maximize the viewport window, minimize the hierarchy, assets, and inspector | |
-| | Screenshot | Take a snapshot of the current scene. Only user-created entities within the scene are displayed, and a series of tools for assistance, such as icons, grids, and gizmos, are not included. After taking a screenshot, the snapshot will be used as the project thumbnail on the homepage. | |
-
-### Auxiliary Element Settings Interface
-
-
-
-| Attribute | Content |
-| ------------------------------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| Grid | Whether the grid in the view is displayed |
-| 3D Icons | Whether auxiliary icons are scaled based on the distance between components and the camera |
-| Navigation Gimzo | Used to display the current direction of the scene camera and allows quick modification of the perspective and projection mode (orthographic/perspective) through mouse operation. When opened, it will be displayed in the lower right corner of the screen. |
-| Outline Display | Whether to display an outline when a certain entity is selected. The outline color of the selected entity is orange, and the outline of child nodes is blue |
-| Camera | Displays the selected camera component with a cone |
-| Light | Displays the light source component |
-| Static Collider | Displays the shape of static colliders |
-| Dynamic Collider | Displays the shape of dynamic colliders |
-| Emission Shape | Displays the shape of the particle emitter |
-
-### Scene Camera Settings
-
-
-
-| Property | Content | Default Value |
-| :------------------- | :----------------------------------------------------------- | :------------------- |
-| Field of View (Fov) | The field of view of the scene camera | 60 |
-| Dynamic Clipping | Automatically calculates the near and far clipping planes of the scene camera relative to the selected entity and camera position | Off |
-| Near Plane | Manually adjust the closest point relative to the scene camera | Enabled when Dynamic Clipping is off |
-| Far Plane | Manually adjust the farthest point relative to the scene camera | Enabled when Dynamic Clipping is off |
-| Speed | The movement speed of the camera in flight mode | 10 |
+The toolbar is located at the top of the viewport window. Hovering the mouse over each item will display its shortcut key or description.
+
+
+
+| Icon | Name | Description | Shortcut Key |
+| --- | --- | --- | --- |
+| | Drag | Drag the view | |
+| | Move Rotate Scale | Transform the selected entity | `W` `E` `R` |
+| | Center Anchor/Pivot Anchor | Switch the anchor point of the selected entity | |
+| | Local Coordinates/World Coordinates | Switch the coordinates of the selected entity | |
+| | Focus | Focus the scene camera on the selected entity | `F` |
+| | Scene Camera | The scene camera menu contains options for configuring the scene camera, mainly used to solve the problem of objects not being visible when the clipping plane is too far or too close while building the scene. These adjustments will not affect the settings of entities with camera components in the scene. | |
+| | Settings | The settings menu contains options for adjusting auxiliary displays in the view, including grids, auxiliary icons (graphics associated with specific components in the scene, including cameras, directional lights, point lights, spotlights), and auxiliary wireframes | |
+| | Scene Camera Type | Switch between perspective/orthographic camera | |
+| | Mode | Switch between 2D and 3D scene modes with a single click. In 2D mode, the navigation gizmo and orthographic/perspective switching are disabled, and orbiting is no longer available. | |
+| | Fullscreen/Restore | Maximize the viewport window, minimize the hierarchy, assets, and inspector | |
+| | Play | Play all particles and animations in the scene | |
+| | Screenshot | Take a snapshot of the current scene. Only user-created entities are shown; auxiliary display tools such as icons, grids, and gizmos are excluded. After the screenshot is taken, it is used as the project thumbnail on the homepage. | |
+
+### Auxiliary Elements Settings Interface
+
+
+
+| Attribute | Content |
+| --- | --- |
+| Grid | Whether the grid in the view is displayed |
+| 3D Icons | Whether auxiliary icons scale based on the distance between the component and the camera |
+| Navigation Gizmo | Displays the current direction of the scene camera and allows quickly modifying the view and projection mode (orthographic/perspective) with the mouse. When enabled, it appears in the lower right corner of the screen. |
+| Outline | Whether to display the outline when an entity is selected. The outline color of the selected entity is orange, and the outline of child nodes is blue |
+| Camera | Display the selected camera component as a cone |
+| Light | Display light source components |
+| Static Collider | Display the shape of static colliders |
+| Dynamic Collider | Display the shape of dynamic colliders |
+| Emission Shape | Display the shape of particle emitters |
+
+### Scene Camera Settings Interface
+
+
+
+| Attribute | Content | Default Value |
+| :---------------------------- | :-------------------------------------------------------------- | :-------------------- |
+| Fov | The field of view of the scene camera | 60 |
+| Dynamic Clipping | Automatically calculate the near and far clipping planes of the scene camera relative to the selected entity and the scene camera position | Off |
+| Near Plane | Manually adjust the nearest point relative to the scene camera | Enabled when dynamic clipping is unchecked |
+| Far Plane | Manually adjust the farthest point relative to the scene camera | Enabled when dynamic clipping is unchecked |
+| Speed | The movement speed of the camera in flight mode | 10 |
+| Opaque Texture | Enable opaque texture for the scene camera | Off |
+| HDR | Enable HDR for the scene camera | Off |
+| Post Process | Enable post-processing for the scene camera | On |
## Preview
-When selecting an entity with a camera component, the real-time preview of the camera will be displayed in the bottom left corner of the view window. This helps users adjust the camera and scene position in real-time. The preview window can be dragged, locked, and switched to windows of different device ratios.
+When an entity with a camera component is selected, a real-time preview of the camera will be displayed in the lower left corner of the view window. This helps users to adjust the camera and scene position in real-time. The preview window can be dragged, locked, and switched to different device aspect ratios.
+
+
+
+
+| Attribute | Content |
+| :-------- | :------------------------------ |
+| Drag | Freely drag the preview window |
+| Position | Position the camera in the scene |
+| Switch Ratio | Switch between preview windows of different devices and aspect ratios |
+| Lock | Lock the camera preview window |
+
+In the hierarchy tree, an entity that contains a camera component can directly sync the relevant properties of the scene camera in the viewport, making it convenient to adjust its position and view.
+
+
-
diff --git a/docs/en/miniProgram/miniProgame.md b/docs/en/miniProgram/miniProgame.md
new file mode 100644
index 0000000000..3c89cfbe80
--- /dev/null
+++ b/docs/en/miniProgram/miniProgame.md
@@ -0,0 +1,241 @@
+---
+order: 0
+title: MiniProgram Project
+type: MiniProgram
+label: MiniProgram
+---
+
+Currently, Galacean has been adapted to Alipay and Taobao Mini Programs. This tutorial assumes that developers already have some Mini Program development skills. If not, please read the following tutorials, download the Mini Program development tools, and apply for an AppId:
+
+- [Alipay Mini Program](https://opendocs.alipay.com/mini/developer)
+- [Taobao Mini Program](https://miniapp.open.taobao.com/docV3.htm?docId=119114&docType=1&tag=dev)
+
+Mini Program project release:
+
+- [Alipay Mini Program Release](https://opendocs.alipay.com/mini/introduce/release)
+- [Taobao Mini Program Release](https://developer.alibaba.com/docs/doc.htm?spm=a219a.7629140.0.0.258775fexQgSFj&treeId=635&articleId=117321&docType=1)
+
+## Project Export
+
+The feature to export Alipay Mini Programs from the Galacean editor is still under development, and the interaction methods and template projects may change in the future.
+
+
+
+## Project Startup
+
+After clicking download, a zip file will be downloaded. The directory structure after unzipping is as follows:
+
+```shell
+.
+├── mini # 📁 Mini Program runtime directory
+│ ├── dist # 📁 Build output
+│ ├── pages # 📁 Mini Program pages
+│ ├── app.json # ⚙️ Project configuration file
+│ ├── app.js # Code entry
+├── public # 📁 Public resources directory
+│ ├── scene.json # Scene file
+│ └── ... # Others
+├── src # 📁 Source code directory
+├── mini.project.json # ⚙️ Engineering configuration file
+├── project.json # ⚙️ Editor-exported project configuration
+└── ... # Others
+```
+
+Next, you can install dependencies and start the project:
+
+```shell
+npm install
+npm run dev
+```
+
+Open it with the Mini Program IDE, and you will see:
+
+
+
+## Local Resource Handling
+
+### Ant Group Internal Users
+
+Use "Upload to CDN" directly (an option in the export panel; see the image above), which uses the group's default CDN. If you want a custom CDN, refer to the instructions for non-Ant Group internal users.
+
+### Non-Ant Group Internal Users
+
+1. Upload the public files to the CDN yourself.
+2. Modify the scene.json file or configure the baseUrl.
+
+## In-Package File Loading (WIP)
+
+Currently, local file loading for Mini Programs is not supported.
+
+## Known Issues
+
+- Mini Programs do not support WebAssembly, so PhysX cannot be used as the physics backend.
+- Local file loading is not supported yet, and files need to be manually uploaded to the CDN.
+
+## Additional Notes
+
+### Using OrbitControl in Mini Program Projects
+
+1. Import the library
+
+```bash
+npm install @galacean/engine-toolkit-controls -S
+```
+
+```typescript
+import { OrbitControl } from "@galacean/engine-toolkit-controls/dist/miniprogram";
+```
+
+2. Add the component
+
+The `OrbitControl` component needs to be added to the camera node.
+
+```typescript
+cameraEntity.addComponent(OrbitControl);
+```
+
+3. Simulate event dispatch
+
+Since Mini Programs do not support `addEventListener` for adding event listeners, you need to manually add event simulation. Additionally, there is a bug with multi-touch on the Mini Program canvas, so add a view layer of the same size and position as the canvas to dispatch touch events:
+
+```html
+<!-- Sketch of the structure: ids, class names, and handler names are illustrative -->
+<view>
+  <canvas onReady="onCanvasReady" id="canvas" type="webgl" />
+  <view onTouchStart="onTouchStart" onTouchMove="onTouchMove" onTouchEnd="onTouchEnd" onTouchCancel="onTouchCancel" />
+</view>
+```
+
+```typescript
+import { dispatchPointerUp, dispatchPointerDown, dispatchPointerMove, dispatchPointerLeave, dispatchPointerCancel } from "@galacean/engine-miniprogram-adapter";
+
+Page({
+ ...
+ onTouchEnd(e) {
+ dispatchPointerUp(e);
+ dispatchPointerLeave(e);
+ },
+ onTouchStart(e) {
+ dispatchPointerDown(e);
+ },
+ onTouchMove(e) {
+ dispatchPointerMove(e);
+ },
+ onTouchCancel(e) {
+ dispatchPointerCancel(e);
+ }
+})
+```
+
+### Creating a Galacean Mini Program Project with Pro Code
+
+> Requires Node.js version >=12.0.0.
+
+Create with yarn:
+
+```bash
+yarn create @galacean/galacean-app --template miniprogram
+```
+
+Create with npm **6.x**:
+
+```bash
+npm init @galacean/galacean-app --template miniprogram
+```
+
+Create with npm **7.x**:
+
+```bash
+npm init @galacean/galacean-app -- --template miniprogram
+```
+
+**Follow the prompts** to complete the subsequent steps, then you can use the mini program development tool to open the project:
+
+
+
+Select the corresponding directory, and if everything goes well, you should see:
+
+
+
+### Using Galacean in an existing Pro code project
+
+This tutorial assumes you already have some development skills. If you are not familiar with mini program development, please read the [mini program development documentation](https://opendocs.alipay.com/mini/developer) in detail.
+
+1. Open `Terminal` in the project directory and install dependencies:
+
+```bash
+# With npm
+npm install @galacean/engine --save
+npm install @galacean/engine-miniprogram-adapter --save
+# With yarn
+yarn add @galacean/engine
+yarn add @galacean/engine-miniprogram-adapter
+```
+
+2. Add the following configuration items to the mini program project configuration file `app.json`:
+
+```json
+{
+ ...
+ "window": {
+ ...
+ "v8WorkerPlugins": "gcanvas_runtime",
+ "v8Worker": 1,
+ "enableSkia": "true"
+ }
+}
+```
+
+3. Add a canvas tag to the axml page where you want to add interaction:
+
+```html
+<canvas onReady="onCanvasReady" id="canvas" type="webgl" />
+```
+
+Use the `onReady` configuration to set up the `canvas` initialization callback. You need to set the `canvas` id, which will be used later.
+
+4. Add a callback function in the page's `.js` file, use `my._createCanvas` to create the required canvas context, and then use Galacean in the `success` callback.
+
+Note:
+
+1. Use `import * as GALACEAN from "@galacean/engine/dist/miniprogram"` to import mini program dependencies.
+2. You need to use `registerCanvas` from '@galacean/engine-miniprogram-adapter' to register the `canvas`.
+
+For details, you can refer to the following code:
+
+```js
+import * as GALACEAN from "@galacean/engine/dist/miniprogram";
+import { registerCanvas } from "@galacean/engine-miniprogram-adapter";
+
+Page({
+ onCanvasReady() {
+ my._createCanvas({
+ id: "canvas",
+ success: (canvas) => {
+      // Register the canvas
+ registerCanvas(canvas);
+      // Fit the canvas size
+ const info = my.getSystemInfoSync();
+ const { windowWidth, windowHeight, pixelRatio, titleBarHeight } = info;
+ canvas.width = windowWidth * pixelRatio;
+ canvas.height = (windowHeight - titleBarHeight) * pixelRatio;
+
+      // Create the engine
+ const engine = new GALACEAN.WebGLEngine(canvas);
+      // The remaining code is the same as the Galacean Web version
+ ...
+ },
+ });
+ }
+})
+```
diff --git a/docs/en/performance/scene-standard.md b/docs/en/performance/scene-standard.md
index f8be4c5baa..a745d9d866 100644
--- a/docs/en/performance/scene-standard.md
+++ b/docs/en/performance/scene-standard.md
@@ -5,36 +5,36 @@ type: Performance
label: Performance
---
-The Galacean Engine supports popular 3D modeling software (C4D, 3ds Max, Maya, Blender) to export *.fbx* files. Considering runtime performance and compatibility issues, artists should pay attention to the 3D scene specifications:
+Galacean Engine supports mainstream 3D modeling software (C4D, 3ds Max, Maya, Blender) to export *.fbx* files. Considering runtime performance and compatibility issues, artists should pay attention to the 3D scene specifications:
-### Models
+### Model
-- **Triangle Faces and Vertex Count:** It is recommended that the face count of a single scene model should not exceed **100,000 faces**. While ensuring visual effects, try to reduce the number of model triangle faces and vertices as they have a significant impact on _GPU_ performance or VRAM usage, especially the rendering performance of triangle faces.
-- **Model Merging:** Artists should merge models that cannot be independently moved to reduce rendering batches as much as possible. Also, be careful not to merge models with a large scene span that may cause issues with model clipping.
+- **Number of Triangles and Vertices:** It is recommended that the models of a single scene stay under **100,000 triangles** in total. While preserving the visual result, reduce the number of triangles and vertices as much as possible: both add _GPU_ cost and memory usage, and triangle count in particular affects rendering performance.
+- **Model Merging:** Artists should merge models that cannot move independently to reduce the number of render batches. At the same time, avoid merging models that span too large an area of the scene, as this prevents them from being culled.
-### Materials
+### Material
-- **Material Merging:** Merge materials as much as possible. Materials serve as the foundation for merging rendering batches in a 3D engine. The prerequisite for all engine-level rendering batch merging is to use the same material, so keep the number of material objects as low as possible.
+- **Material Merging:** Merge materials as much as possible. Materials are the basis for batching in a 3D engine: every engine-level batch merge requires that the same material be used, so keep the number of material objects as small as possible.
- **Material Selection:**
- - Material model selection should be simplified based on the artistic style. For example, for cartoon-style models where lighting is merged into the diffuse texture, you can directly choose _unlit_ materials without the need for complex _PBR_ material models.
- - Prioritize non-transparent materials as they are less performance-intensive compared to transparent materials, whether in terms of material transparency blending or transparent clipping modes.
+  - Choose the simplest material model that the art style allows. For example, cartoon-style models that bake lighting into the diffuse map can simply use _unlit_ materials instead of the complex _PBR_ material model.
+  - Prefer opaque materials: both transparency blending and alpha-clipping modes cost more than opaque rendering.
-### Textures
+### Texture
-Textures consume a significant amount of VRAM resources. Avoid blindly pursuing quality with oversized textures. Evaluate the actual display pixels rasterized by textures in the project to use textures of similar sizes. Using excessively large textures not only fails to yield performance benefits but also wastes VRAM. Preferably use textures with sizes that are powers of 2. Additionally, you can continue to optimize VRAM usage by using [texture compression](/en/docs/graphics-texture-compression) within reasonable texture sizes.
+Textures are the main consumers of memory. Do not blindly chase quality with oversized textures; evaluate how many pixels a texture actually covers after rasterization in the project and choose a similar texture size, since an oversized texture gains no visual benefit and wastes memory. Prefer power-of-two texture dimensions. Within a reasonable texture size, you can further optimize memory usage with [texture compression](/en/docs/graphics/texture/compression/).
-### Nodes
+### Node
-Reduce the number of empty nodes at runtime. Empty nodes consume a certain amount of memory and may introduce potential computational costs for transformations. Artists should strive to delete empty nodes and merge fragmented nodes as much as possible.
+Reduce the number of empty nodes at runtime. Empty nodes occupy memory and may introduce potential [transform](/en/docs/core/transform) calculation costs. Artists should delete empty nodes and merge fragmented nodes wherever possible.
### Animation
-For animation production, it is recommended to use skeletal skinning animation. This is a technique in 3D engines that balances effects and memory usage. However, due to the significant computational cost of skeletal animation, especially in languages like JS that are not proficient in intensive computations, artists should ensure a minimal number of bones in skeletal animations. Keeping it below **25** bones can enhance the performance and memory usage of skeletal animations, especially on devices with limited GPU _uniform_ counts like iPhones.
+Skeletal animation is recommended for animation production; it is the technique that best balances effect and memory in a 3D engine. However, because skeletal animation is computationally expensive, especially in JS, which is not suited to intensive computation, artists should use as few bones as possible. Keeping the count under **25** generally ensures good performance and memory usage, especially on devices such as iPhones that have fewer GPU _uniform_ slots.
### UI
-Avoid wasting the Alpha part of UI elements. For instance, drawing UI elements with nearly full-screen but mostly transparent images can impose a significant rendering burden on the GPU. Additionally, artists should merge UI textures themselves and make efficient use of texture space as relying on editor algorithms for merging may still result in some waste.
+Reduce wasted alpha in UI textures. For example, rendering UI with nearly full-screen yet mostly transparent images places a huge burden on the GPU. In addition, artists should merge UI textures themselves and make full use of the texture space; relying on the editor's packing algorithm may still leave some waste.
### Effects
-Similar to UI textures, **reduce wastage in the size of transparent parts of effect textures**. Additionally, since effects typically have severe OverDraw, such as particles, it is essential to minimize emission frequencies on effects like particles to reduce rendering overhead.
+Effect textures are similar to UI textures: **be sure to reduce wasted transparent area**. In addition, since effects such as particles usually cause severe OverDraw, reduce the emission frequency of particles and other effects as much as possible.
diff --git a/docs/en/performance/stats.md b/docs/en/performance/stats.md
index 1a054e6a6d..90df4bbd05 100644
--- a/docs/en/performance/stats.md
+++ b/docs/en/performance/stats.md
@@ -1,11 +1,11 @@
---
order: 2
-title: 统计面板
-type: 性能
+title: Statistics Panel
+type: Performance
label: Performance
---
-[@galacean/engine-toolkit-stats](https://www.npmjs.com/package/@galacean/engine-toolkit-stats) package is mainly used to display the rendering status of the camera. Just add the `Stats` component to the camera node:
+The [@galacean/engine-toolkit-stats](https://www.npmjs.com/package/@galacean/engine-toolkit-stats) package is mainly used to display the camera's rendering status. You only need to add the `Stats` component to the camera node:
```typescript
import { Engine } from "@galacean/engine";
@@ -15,7 +15,6 @@ cameraEntity.addComponent(Camera);
cameraEntity.addComponent(Stats);
```
-## 示例
+## Example
-
diff --git a/docs/en/physics/collider.md b/docs/en/physics/collider.md
index ef6deead5e..c0d9f10b8b 100644
--- a/docs/en/physics/collider.md
+++ b/docs/en/physics/collider.md
@@ -5,39 +5,39 @@ type: Physics
label: Physics
---
-The biggest advantage of introducing a physics engine is to give physical responses to objects in the scene. Colliders belong to components in the engine. Before using them, we need to understand the types of colliders:
+The biggest advantage of introducing a physics engine is that it gives objects in the scene physical responses. Colliders ([Collider](/en/apis/core/#Collider)) are a type of component in the engine, and there are currently two kinds, which we need to understand before use:
-1. [StaticCollider](/apis/core/#StaticCollider): Static collider, mainly used for stationary objects in the scene;
-2. [DynamicCollider](/apis/core/#DynamicCollider): Dynamic collider, used for objects in the scene that need to be controlled by scripts or respond to physical feedback.
+1. [StaticCollider](/en/apis/core/#StaticCollider): Static collider, mainly used for stationary objects in the scene;
+2. [DynamicCollider](/en/apis/core/#DynamicCollider): Dynamic collider, used for objects in the scene that need to be controlled by scripts or respond to physical feedback.
## Editor Usage
### Adding Collider Component
-The first thing to consider is whether the collider is static or dynamic, then add the corresponding collider component, StaticCollider for static colliders or DynamicCollider for dynamic ones.
+Before adding a physics component to an object, the first thing to consider is whether the collider is static or dynamic, and then add the corresponding collider component, either a static collider [StaticCollider](/en/apis/core/#StaticCollider) or a dynamic collider [DynamicCollider](/en/apis/core/#DynamicCollider).

-### Selecting Collider Shape
+### Selecting the Shape of the Collider
-In fact, each `Collider` is a collection of [ColliderShape](/apis/core/#ColliderShape), meaning each `Collider` can set a composite collider shape by combining `ColliderShape`.
+Next, we need to add a [ColliderShape](/en/apis/core/#ColliderShape) to the collider component. In fact, each `Collider` is a collection of [ColliderShape](/en/apis/core/#ColliderShape), meaning each `Collider` can be set to a composite collider shape by combining [ColliderShape](/en/apis/core/#ColliderShape).
-Currently, four `ColliderShape` types are supported, but the level of support varies depending on the backend physics package, as follows:
+Currently, four types of `ColliderShape` are supported, but the support varies among different backend physics packages, as detailed below:
-| Name | Description | Supported Backend Physics Packages |
+| Name | Description | Supported Backend Physics Packages |
| :--- |:---------|:----------------------------|
-| [BoxColliderShape](/apis/core/#BoxColliderShape) | Box-shaped collider | physics-lite, physics-physx |
-| [SphereColliderShape](/apis/core/#SphereColliderShape) | Sphere-shaped collider | physics-lite, physics-physx |
-| [PlaneColliderShape](/apis/core/#PlaneColliderShape) | Infinite plane collider | physics-physx |
-| [CapsuleColliderShape](/apis/core/#CapsuleColliderShape) | Capsule-shaped collider | physics-physx |
+| [BoxColliderShape](/en/apis/core/#BoxColliderShape) | Box-shaped collider | physics-lite, physics-physx |
+| [SphereColliderShape](/en/apis/core/#SphereColliderShape) | Sphere-shaped collider | physics-lite, physics-physx |
+| [PlaneColliderShape](/en/apis/core/#PlaneColliderShape) | Unbounded plane collider | physics-physx |
+| [CapsuleColliderShape](/en/apis/core/#CapsuleColliderShape) | Capsule-shaped collider | physics-physx |
-The engine supports composite collider shapes, meaning a collider can be composed of BoxColliderShape, SphereColliderShape, and CapsuleColliderShape.
+The engine supports composite collider shapes, meaning the collider itself can be composed of `BoxColliderShape`, `SphereColliderShape`, and `CapsuleColliderShape`.
-It is important to note the relationship between `Collider` and `ColliderShape`. The pose of each `Collider` is consistent with the `Entity` it is attached to, and they are synchronized every frame. The `position` property on `ColliderShape` can be used to set an offset **relative to** the `Collider`.
+It is worth emphasizing the positional relationship between `Collider` and `ColliderShape`. The pose of each `Collider` matches that of the `Entity` it is attached to, and the two are synchronized every frame. A `ColliderShape` can set an offset **relative to** the `Entity` through its `position` property.

-When adding a collider component, the collider shape is not added by default, so you need to click on Add Item to add it, and you will see the collider's auxiliary rendering in the viewport after adding it.
+After adding the collider component, the collider shape is not added by default, so you need to click Add Item to add it. After adding, you will see the auxiliary rendering of the collider appear in the viewport.

@@ -45,50 +45,67 @@ For each collider shape, you can design corresponding size properties. For examp
-However, regardless of the collider shape, you can set the Local Position, which is the local offset relative to the Entity coordinates.
+No matter which collider shape is used, you can set the Local Position, which is the local offset relative to the Entity coordinates.

-### Dynamic Collider Settings
-Unlike static colliders, dynamic colliders are affected by physical laws, so there are many additional physical properties to set.
+The `ColliderShape` also has a noteworthy property, `Trigger`, which switches the `ColliderShape` from `collider mode` to `trigger mode`:
+
+- Trigger mode: the object has no rigid-body shape; contact does not alter its motion, but it can trigger specific script functions.
+- Collider mode: the object has a rigid-body shape; contact can not only trigger script functions but also change the object's motion according to physical laws.
+### Dynamic Collider Settings
+Unlike static colliders, dynamic colliders are subject to physical laws, so there are many additional physical properties to set.
-After modifying these parameters, the viewport will not change because the dynamic collider is affected by gravity by default, so observation can only be done in Play mode.
+After modifying these parameters, the viewport will not change, because dynamic colliders are only simulated at runtime (and are subject to gravity by default); you need to observe them in `preview mode`.
### Note
-- The collision area should be kept as simple as possible to improve the performance of the physics engine detection.
-- The reference coordinate system of the collider is the coordinate system of the dependent Entity.
-- PlaneColliderShape represents a full plane, so there is no display of auxiliary lines, generally used as a floor.
+- Keep the collision area as simple as possible to improve the performance of the physics engine's detection.
+- The reference frame of a collider is the coordinate system of the Entity it is attached to.
+- PlaneColliderShape represents an unbounded plane, so no auxiliary lines are displayed; it is generally used as the ground.
## Script Usage
-There are two types of physical responses:
+### Adding a Collider
+```typescript
+// Add a static collider
+const boxCollider = boxEntity.addComponent(StaticCollider);
+// Add a dynamic collider
+const sphereCollider = sphereEntity.addComponent(DynamicCollider);
+```
-1. Trigger mode: Objects do not have a rigid body shape, but specific script functions can be triggered when contact occurs.
-2. Collider mode: Physics have a rigid body shape, and when contact occurs, not only can script functions be triggered, but the original motion can also be changed according to physical laws.
+### Adding a ColliderShape
+```typescript
+const boxCollider = boxEntity.getComponent(StaticCollider);
+const physicsBox = new BoxColliderShape();
+physicsBox.size = new Vector3(1, 1, 1);
+boxCollider.addShape(physicsBox);
-For these two types, corresponding functions are provided in the script, and the collider component also provides a series of functions to set its own state, such as velocity, mass, and so on.
+// Enable trigger mode
+physicsBox.isTrigger = true;
+```
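+
+Dynamic colliders additionally expose physical state such as mass and velocity. A minimal sketch of commonly used `DynamicCollider` properties (property availability may vary with the engine version):
+
+```typescript
+const sphereCollider = sphereEntity.getComponent(DynamicCollider);
+// Heavier colliders need larger forces to accelerate
+sphereCollider.mass = 1.0;
+// Give the collider an initial velocity
+sphereCollider.linearVelocity = new Vector3(0, 0, 5);
+// Apply a one-off upward force
+sphereCollider.applyForce(new Vector3(0, 100, 0));
+```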
+
+For these two types, corresponding functions are provided in the script.
### Trigger Script Functions
-For trigger mode, first, add a `Collider` to the `Entity` in the scene; when these components come into contact, three functions in the script component will be automatically triggered:
+For trigger mode, you first need to add a `Collider` to the `Entity` in the scene; when these components come into contact, three functions in the script component will be automatically triggered:
-1. [onTriggerEnter](/en/docs/script#component-lifecycle-functions#ontriggerenter): Called when they come into contact.
-2. [onTriggerStay](/en/docs/script#component-lifecycle-functions#ontriggerstay): Called *repeatedly* during contact.
-3. [onTriggerExit](/en/docs/script#component-lifecycle-functions#ontriggerexit): Called when the contact ends.
+1. [onTriggerEnter](/en/docs/script#ontriggerenter): Called when contact begins.
+2. [onTriggerStay](/en/docs/script#ontriggerstay): Called *repeatedly* while the contact persists.
+3. [onTriggerExit](/en/docs/script#ontriggerexit): Called when contact ends.
-You can enable trigger mode by setting `isTrigger` on the `ColliderShape`, but it is important to note that **trigger events are not called between two StaticColliders unless one of them is a `DynamicCollider`**.
+You can enable trigger mode through the `isTrigger` property on `ColliderShape`. Note in particular that **trigger events are not fired between two StaticColliders**; at least one of the shapes involved must belong to a `DynamicCollider`.
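+
+A minimal sketch of a trigger script (assuming, as in the current engine API, that the callbacks receive the other `ColliderShape`):
+
+```typescript
+class TriggerScript extends Script {
+  onTriggerEnter(other: ColliderShape): void {
+    console.log("trigger enter");
+  }
+  onTriggerStay(other: ColliderShape): void {
+    // Called repeatedly while the contact persists
+  }
+  onTriggerExit(other: ColliderShape): void {
+    console.log("trigger exit");
+  }
+}
+// Attach to the entity that owns the trigger-mode shape
+boxEntity.addComponent(TriggerScript);
+```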
### Collider Script Functions
-For collider mode, when `DynamicCollider` interact with each other, three collision-related script functions will be triggered:
-1. [onCollisionEnter](/en/docs/script#component-lifecycle-functions#oncollisionenter): Called when a collision occurs.
-2. [onCollisionStay](/en/docs/script#component-lifecycle-functions#oncollisionstay): Called *repeatedly* during the collision process.
-3. [onCollisionExit](/en/docs/script#component-lifecycle-functions#oncollisionexit): Called when the collision ends.
+For collider mode, when `DynamicCollider` components interact, three collision-related script functions will be triggered:
+1. [onCollisionEnter](/en/docs/script#oncollisionenter): Called when a collision occurs.
+2. [onCollisionStay](/en/docs/script#oncollisionstay): Called *repeatedly* while the collision persists.
+3. [onCollisionExit](/en/docs/script#oncollisionexit): Called when the collision ends.
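+
+The collision callbacks follow the same pattern; a short sketch (the callback parameter type differs across engine versions, so the parameters are omitted here):
+
+```typescript
+class CollisionScript extends Script {
+  onCollisionEnter(): void {
+    console.log("collision enter");
+  }
+  onCollisionExit(): void {
+    console.log("collision exit");
+  }
+}
+sphereEntity.addComponent(CollisionScript);
+```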
-
diff --git a/docs/en/physics/controller.md b/docs/en/physics/controller.md
index 425d212d96..7da4daa30b 100644
--- a/docs/en/physics/controller.md
+++ b/docs/en/physics/controller.md
@@ -5,9 +5,7 @@ type: Physics
label: Physics
---
-The character controller is a very important functional component provided by the physics engine. It allows for easily adding physical behaviors to the motion of animated characters. For example, parameters can be set to prevent a character from climbing steep slopes or to avoid collisions with other colliders during the character's movement. In fact, the character controller is just an advanced encapsulation of colliders, implementing various advanced character control behaviors through collision detection.
-
-Similar to collider components, the creation and usage of character controller components are very similar to collider components.
+The character controller is a very important functional component provided by the physics engine. With a character controller, it is easy to add physical effects to the movement of an animated character. For example, you can set parameters to prevent the character from climbing slopes steeper than a certain angle, or avoid collision feedback with other colliders while the character moves. In fact, the character controller is just an advanced encapsulation of the [collider](/en/docs/physics/collider), implementing various advanced character-control behaviors through collision detection. For this reason, creating and using a character controller component is very similar to working with a collider component.
```typescript
const physicsCapsule = new CapsuleColliderShape();
physicsCapsule.radius = radius;
@@ -15,11 +13,11 @@ physicsCapsule.height = height;
const characterController = capsuleEntity.addComponent(CharacterController);
characterController.addShape(physicsCapsule);
```
-Like collider components, a `ColliderShape` is constructed and added to the component to give the character controller a specific shape. However, two points need to be emphasized here:
-1. Character controllers do not support compound shapes, so only one `ColliderShape` can be added.
-2. Currently, character controllers only support `CapsuleColliderShape` and `BoxColliderShape`, with `CapsuleColliderShape` being the most commonly used.
+Like the collider component, it is constructed by creating a `ColliderShape` and adding it to the component, giving the character controller a specific shape. However, two points need to be emphasized here:
+1. The character controller does not support compound shapes, so only one `ColliderShape` can be added.
+2. The character controller currently only supports `CapsuleColliderShape` and `BoxColliderShape`, with `CapsuleColliderShape` being the most commonly used.
-The behavior of the character controller can be controlled through the parameters and methods of `CharacterController`, with the most important one being the `move` function:
+Subsequent behaviors of the character controller are controlled through various parameters and methods of `CharacterController`, with the most important being the `move` function:
```typescript
class Controller extends Script {
@@ -35,7 +33,7 @@ class Controller extends Script {
}
```
-In the `move` method, you can specify the character's displacement, and this method returns an enum type composite value. By using the enum type `ControllerCollisionFlag`, you can determine if the character controller collides with other collider components:
+You can specify the character's displacement in the `move` method. The method returns a composite flag value of the enumeration type `ControllerCollisionFlag`, through which you can determine whether the character controller has collided with other collider components:
```typescript
export enum ControllerCollisionFlag {
@@ -48,6 +46,6 @@ export enum ControllerCollisionFlag {
}
```
-Based on this, the character's subsequent animations and movements can be determined. In the example below, you can control the character's movement using the keyboard to make it climb or jump over specific obstacles.
+Based on this, you can decide the character's subsequent animation and movement. In the example below, the character can be moved with the keyboard, climbing or jumping over specific obstacles.
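+
+For example, a sketch that uses the returned flags to decide whether the character is grounded (`displacement` and `deltaTime` are assumed to be computed elsewhere):
+
+```typescript
+const flags = characterController.move(displacement, 0.0001, deltaTime);
+if (flags & ControllerCollisionFlag.Down) {
+  // Touching the ground: stop falling and allow jumping
+} else {
+  // In the air: keep applying gravity to the displacement
+}
+```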
diff --git a/docs/en/physics/debug.md b/docs/en/physics/debug.md
index b6cdcd8cb0..6d19ea403d 100644
--- a/docs/en/physics/debug.md
+++ b/docs/en/physics/debug.md
@@ -1,19 +1,18 @@
---
order: 6
-title: Physical Debugging
-type: Physical
+title: Physics Debugging
+type: Physics
label: Physics
---
-Physical colliders are composed of basic physical shapes, including spheres, boxes, capsules, and infinite planes. In practical applications, these collider shapes rarely perfectly align with the rendered objects, making visualization debugging quite challenging.
+Physical colliders are composed of basic physical shapes, including spheres, boxes, capsules, and infinite planes. In practical applications, these collider shapes rarely perfectly overlap with the rendered objects, which brings significant difficulties for visual debugging.
There are two debugging methods:
-1. Using PhysX Visual Debugger (PVD), an official debugging tool developed by Nvidia. However, using this tool requires compiling the debug version of PhysX on your own and connecting the browser to the debugging tool via WebSocket.
-For specific usage instructions, refer to the introduction in the [physx.js](https://github.com/galacean/physX.js) Readme.
-2. We also provide a lightweight [auxiliary line tool](https://github.com/galacean/engine-toolkit/tree/main/packages/auxiliary-lines), which draws wireframes based on the configuration of physical components to assist in configuring and debugging physical components.
-It is easy to use, simply attach the `WireframeManager` script and then associate it with various physical components, or directly link nodes like this:
+1. Use the PhysX Visual Debugger (PVD), an official debugging tool developed by Nvidia. Using it requires compiling the debug version of PhysX yourself and connecting the browser to the debugging tool via WebSocket.
+For specific usage, refer to the introduction in the [physx.js](https://github.com/galacean/physX.js) Readme.
+2. We also provide a lightweight [auxiliary line tool](https://github.com/galacean/engine-toolkit/tree/main/packages/auxiliary-lines), which draws the corresponding wireframes based on the configuration of the physics components to assist in configuring and debugging them.
+It is very easy to use: simply add the `WireframeManager` script, then associate it with the various physics components, or associate it directly with nodes:
```typescript
const wireframe = rootEntity.addComponent(WireframeManager);
wireframe.addCollideWireframe(collider);
```
-
diff --git a/docs/en/physics/joint-basic.md b/docs/en/physics/joint-basic.md
index 9e36281ebe..6dbfe6c523 100644
--- a/docs/en/physics/joint-basic.md
+++ b/docs/en/physics/joint-basic.md
@@ -5,7 +5,7 @@ type: Physics
label: Physics
---
-Physics constraint components are essential components in physics that allow better control of the movement of dynamic collider components, adding interesting interactive responses to scenes. This article mainly introduces the three most basic physics constraint components:
+Physics constraint components are very important physical components. By using constraints, you can better control the movement of dynamic collider components and add interesting interactive responses to the scene. This article mainly introduces the three most basic physics constraint components:
1. Fixed Constraint Component
@@ -17,7 +17,7 @@ Physics constraint components are essential components in physics that allow bet

-All physics constraints have two target objects: one represents the dynamic collider affected by the physics constraint (the physics constraint component is attached to this node), and the other is the position where the constraint is attached or another dynamic collider (set through component configuration).
+All physics constraints act on two objects: one is the dynamic collider affected by the constraint (the constraint component is attached to this node), and the other is the attachment position of the constraint or another dynamic collider (set through the component configuration).
Therefore, the usage of these components is similar. Taking the fixed constraint component `FixedJoint` as an example:
```typescript
@@ -27,15 +27,17 @@ fixedJoint.connectedCollider = prevCollider;
## Local Coordinates and World Coordinates
-Understanding the usage of physics constraint components, one key point is to understand **local coordinates** and **world coordinates**. All physics constraints can configure the `connectedCollider` property.
-In addition, some physics constraint components can also set the position where the physics constraint is attached by configuring the `connectedAnchor` property.
+A key point in using physics constraint components is understanding **local coordinates** and **world coordinates**. All physics constraints can configure the `connectedCollider` property.
+In addition, some physics constraint components can also set the position where the constraint is attached through the `connectedAnchor` property.
-**It is important to note that when `connectedCollider` is set, `connectedAnchor` represents the local coordinates relative to that collider. When `connectedCollider` is null, `connectedAnchor` represents world coordinates.**
+**It is particularly important to note that when `connectedCollider` is set, `connectedAnchor` represents local coordinates relative to that collider. When `connectedCollider` is null, `connectedAnchor` represents world coordinates.**
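+
+A short sketch of the two cases (`prevCollider` is assumed to be a dynamic collider elsewhere in the scene):
+
+```typescript
+const fixedJoint = currentEntity.addComponent(FixedJoint);
+// Case 1: connected to another collider; connectedAnchor is in that collider's local space
+fixedJoint.connectedCollider = prevCollider;
+fixedJoint.connectedAnchor = new Vector3(0, 1, 0);
+
+// Case 2: connected to nothing; connectedAnchor is a fixed point in world space
+fixedJoint.connectedCollider = null;
+fixedJoint.connectedAnchor = new Vector3(0, 5, 0);
+```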
## Hinge Constraint
-Among the three physics constraints mentioned above, the hinge constraint is relatively more complex because, in addition to configuring `connectedCollider` and `connectedAnchor`, you also need to specify the rotation axis direction and rotation radius of the hinge. This can be done by setting the `axis` (default direction is towards the positive x-axis) and `swingOffset` properties.
-The `swingOffset` is also a vector that can be understood as an offset starting from the rotation center determined by `connectedAnchor` and `connectedCollider`, where the dynamic collider rotates around the rotation axis.
+Among the three physics constraints above, the hinge constraint is relatively more complex: besides `connectedCollider` and `connectedAnchor`, you also need to specify the direction of the hinge's rotation axis and the rotation radius.
+These are configured through the `axis` property (the default direction is the positive x-axis) and the `swingOffset` property.
+`swingOffset` is also a vector; it can be understood as an offset from the rotation center determined by `connectedAnchor` and `connectedCollider`: the dynamic collider is moved to that point and then rotates around the axis.
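+
+A minimal hinge sketch under the same assumptions (`doorEntity` and `frameCollider` are hypothetical):
+
+```typescript
+const hingeJoint = doorEntity.addComponent(HingeJoint);
+hingeJoint.connectedCollider = frameCollider;
+// Direction of the hinge's rotation axis (defaults to the positive x-axis)
+hingeJoint.axis = new Vector3(0, 1, 0);
+// Offset from the rotation center determined by connectedAnchor and connectedCollider
+hingeJoint.swingOffset = new Vector3(0.5, 0, 0);
+```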
-The usage of the above physics constraint components can be referenced in:
+For the usage of the above physics constraint components, refer to:
diff --git a/docs/en/physics/manager.md b/docs/en/physics/manager.md
index 15fe4d7fc3..df6a7af2c6 100644
--- a/docs/en/physics/manager.md
+++ b/docs/en/physics/manager.md
@@ -5,11 +5,11 @@ type: Physics
label: Physics
---
-The Physics Manager is responsible for managing all the physics components in the scene and communicating with the physics backend to perform global operations related to the physics scene, such as updates and raycasting, etc. In a multi-scene project, each Scene has its own Physics Manager, and the physics systems between Scenes are isolated and do not affect each other.
+The Physics Manager (PhysicsManager) manages all physics components in the scene and communicates with the physics backend to perform global operations related to the physics scene, such as updates and raycasting. In multi-scene projects, each Scene has its own PhysicsManager, and the physics systems of different Scenes are isolated and do not affect each other.
## Physics Update
-The physics scene and the rendering scene are independent of each other but continuously synchronize their data during program execution. Therefore, just like scripts, the synchronization timing is crucial. Generally, the update frequency of the physics scene is different from the rendering scene, and it can be set in the Physics Manager:
+The physics scene and the rendering scene are independent of each other, but they continuously synchronize their data while the program runs. Therefore, as with scripts, the timing of synchronization is very important. Generally speaking, the update frequency of the physics scene differs from that of the rendering scene, and it can be set in the Physics Manager:
```typescript
/** The fixed time step in seconds at which physics are performed. */
@@ -21,9 +21,10 @@ maxSumTimeStep: number = 1 / 3;
In each rendering frame, the physics engine updates at a fixed time step `fixedTimeStep`.
-If the time interval is greater than `fixedTimeStep`, the maximum time step for a single simulation is determined by `maxSumTimeStep`. In this case, with the default parameters listed above, frame skipping may occur. To reduce the number of physics simulation updates per frame, you can adjust `maxSumTimeStep`.
+If the time interval is greater than `fixedTimeStep`, the maximum time step of a single simulation is determined by `maxSumTimeStep`. In that case, with the default parameters listed above, frame catch-up may occur.
+You can reduce the number of physics simulation updates per frame by adjusting `maxSumTimeStep`.
-If the time interval is less than `fixedTimeStep`, the update is deferred to the next frame. Therefore, in each rendering frame, the physics scene may be updated multiple times or only once. Thus, any updates related to physics components need to be placed in specific update functions. `Script` provides this interface:
+If the time interval is less than `fixedTimeStep`, the update is deferred to the next frame. Therefore, in each rendering frame the physics scene may be updated multiple times or only once, so any update related to physics components must be placed in a specific update function provided by `Script`:
```typescript
export class Script extends Component {
@@ -35,23 +36,23 @@ export class Script extends Component {
}
```
-During the physics scene update, besides calling this function, the Collider and the Entity it is attached to will have their poses synchronized. The timing of the physics update is as follows:
+When the physics scene updates, in addition to calling this function, it also synchronizes the Collider and the pose of the Entity it is attached to. The timing of the physics update is as follows:
-1. Call user logic in `onPhysicsUpdate`
-2. `callColliderOnUpdate` synchronizes the modified Entity's new pose to the physics collider
-3. Update the physics scene
+1. Call the user logic in `onPhysicsUpdate`
+2. `callColliderOnUpdate` synchronizes the new pose of any modified Entity to the physics collider
+3. Update the physics scene
4. `callColliderOnLateUpdate` synchronizes the updated positions of all DynamicColliders to the corresponding Entities
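+
+For example, a minimal sketch of driving a dynamic collider at the fixed physics rate (the `MoveForward` class and force value are illustrative assumptions):
+
+```typescript
+import { DynamicCollider, Script, Vector3 } from "@galacean/engine";
+
+class MoveForward extends Script {
+  private _collider: DynamicCollider;
+
+  onAwake() {
+    this._collider = this.entity.getComponent(DynamicCollider);
+  }
+
+  onPhysicsUpdate() {
+    // Runs at the fixed physics rate, possibly several times per render frame
+    this._collider.applyForce(new Vector3(0, 0, 10));
+  }
+}
+```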
## Using Raycasting
-A ray can be understood as an endless line projected from a point in a specific direction in a 3D world. Raycasting is widely used in 3D applications. Through raycasting, you can pick objects in a 3D scene when the user clicks on the screen, or determine if a bullet can hit a target in a shooting game.
+A ray can be understood as an infinite line emitted from a point in a certain direction in the 3D world. Raycasting is widely used in 3D applications: it lets you pick objects in the 3D scene when the user clicks the screen, and in shooting games it can determine whether a bullet will hit its target.

(_Image source: Internet_)
-When using raycasting, you first need to import the [Ray](/apis/math/#Ray) module in your code. Then, generate a ray - the ray can be custom-generated or converted from screen input to a ray through the camera ([camera](/apis/core/#Camera-viewportPointToRay)). Finally, call the [PhysicsManager.raycast](/apis/core/#PhysicsManager-raycast) method to detect collisions hit by the ray. The code is as follows:
+To use raycasting, first import the [Ray](/en/apis/math/#Ray) module in the code; then generate a ray, either custom-built or converted from screen input through the camera ([camera](/en/apis/core/#Camera-viewportPointToRay)); finally, call the [PhysicsManager.raycast](/en/apis/core/#PhysicsManager-raycast) method to detect the colliders hit by the ray. The code is as follows:
```typescript
// Load the Raycast module
@@ -77,7 +78,6 @@ document.getElementById('canvas').addEventListener('click', (e) => {
});
```
-It is important to note that for an Entity to be raycast-enabled, the Entity must have a **Collider**; otherwise, it will not trigger. If the Colliders hit by the ray have the same distance, the Collider added first will be returned (for example, if two Entities with identical Colliders completely overlap, the Entity with the Collider added first will be returned more accurately).
-
-Additionally, in Galacean, an InputManager is provided, which encapsulates the input source and provides a more user-friendly logic. For usage, you can refer to [here](/en/docs/input).
+Note in particular that raycasting only works on an Entity that has a **Collider**; otherwise it cannot be hit. If the Shapes of the Colliders hit by the ray are at the same distance, the Shape that was added first is returned (for example, if two Entities with identical Colliders overlap completely, the Entity whose Collider, or more precisely whose Shape, was added first is returned).
+Galacean also provides an InputManager, which encapsulates input sources and offers more user-friendly interaction logic. See [here](/en/docs/input) for usage.
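+
+A quick sketch of giving an entity a collider so rays can hit it (the shape size is an assumed example):
+
+```typescript
+import { BoxColliderShape, StaticCollider, Vector3 } from "@galacean/engine";
+
+const collider = entity.addComponent(StaticCollider);
+const shape = new BoxColliderShape();
+shape.size = new Vector3(1, 1, 1);
+collider.addShape(shape);
+```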
diff --git a/docs/en/physics/overall.md b/docs/en/physics/overall.md
index 8391167021..1728b31e7c 100644
--- a/docs/en/physics/overall.md
+++ b/docs/en/physics/overall.md
@@ -1,23 +1,23 @@
---
order: 1
-title: Physics Overview
+title: Overview of Physics
type: Physics
label: Physics
---
-The physics engine is a crucial component in game engines. The industry commonly adopts PhysX for its related functionalities. However, for lightweight scenarios, PhysX results in a very large final application size, exceeding the limits of these projects. Galacean is based on a multi-backend design. On one hand, it compiles to [PhysX.js](https://github.com/galacean/physX.js) through WebAssembly; on the other hand, it also provides a lightweight physics engine. Both are consistent in [API](https://github.com/galacean/engine/tree/main/packages/design/src/physics) design. Users only need to select a specific physics backend when initializing the engine. It can meet the requirements of various scenarios such as lightweight applications and heavyweight games. For the overall design of the physics components, refer to the [Wiki](https://github.com/galacean/engine/wiki/Physical-system-design).
+The physics engine is a very important part of a game engine. The industry generally adopts PhysX for physics functionality, but for lightweight scenarios PhysX makes the final application size very large, exceeding the limits of such projects. Galacean uses a multi-backend design: on one hand, it compiles [PhysX.js](https://github.com/galacean/physX.js) to WebAssembly; on the other hand, it also provides a lightweight physics engine. Both share the same [API](https://github.com/galacean/engine/tree/main/packages/design/src/physics) design, so users only need to choose a specific physics backend when initializing the engine. This meets the needs of scenarios ranging from lightweight applications to heavyweight games. For the overall design of the physics components, refer to the [Wiki](https://github.com/galacean/engine/wiki/Physical-system-design).
-For scenarios requiring various physics components and Raycast picking, such as `InputManager`, the physics engine needs to be initialized before use. Currently, the Galacean engine provides two built-in physics engine backend implementations:
+For scenarios that use physics components, or features of `InputManager` that require raycast picking, the physics engine must be initialized before use. Currently, the Galacean engine provides two built-in physics backend implementations:
-- [physics-lite](https://github.com/galacean/engine/tree/main/packages/physics-lite)
-- [physics-physx](https://github.com/galacean/engine/tree/main/packages/physics-physx)
+ - [physics-lite](https://github.com/galacean/engine/tree/main/packages/physics-lite)
+ - [physics-physx](https://github.com/galacean/engine/tree/main/packages/physics-physx)
-Developers can set the physics backend in the **Project Settings** panel opened in the [main menu](/en/docs/interface/menu) interface.
+Developers can set the physics backend in the **Project Settings** panel opened from the [Main Menu](/en/docs/interface/menu) interface.


-If initializing the engine via script, simply pass the physics backend object into `Engine`:
+If initializing the engine through a script, you only need to pass the physics backend object into the `Engine`:
```typescript
import {LitePhysics} from "@galacean/engine-physics-lite";
@@ -28,7 +28,7 @@ const engine = await WebGLEngine.create({
});
```
-## PhysX Physics Engine Loading and Initialization
+## Loading and Initializing the PhysX Physics Engine
```typescript
import { PhysXPhysics } from "@galacean/engine-physics-physx";
@@ -39,9 +39,8 @@ const engine = await WebGLEngine.create({
});
```
-## Selecting a Physics Backend
-When selecting a physics backend, consider the following factors: functionality, performance, and package size:
+## Choosing a Physics Backend
+Choosing a physics backend requires weighing three factors: functionality, performance, and package size:
1. Functionality: For complete physics engine functionality and high-performance physics simulation, it is recommended to choose the PhysX backend. The Lite backend only supports collision detection.
-2. Performance: PhysX will automatically degrade to pure JavaScript code on platforms that do not support WebAssembly, resulting in reduced performance. However, due to the built-in data structures for scene searching, the performance is still better than the Lite backend.
-3. Package Size: Choosing the PhysX backend will introduce an additional wasm file of nearly 2.5mb (the size of the pure JavaScript version is similar), increasing the package size while decreasing the application initialization speed.
-
+2. Performance: PhysX automatically degrades to pure JavaScript code on platforms that do not support WebAssembly, so performance decreases there. However, thanks to the built-in data structures for scene queries, its performance is still better than that of the Lite backend.
+3. Package Size: Choosing the PhysX backend additionally introduces nearly 2.5 MB of wasm files (the pure JavaScript version is similar in size), increasing the package size and reducing the application's initialization speed.
diff --git a/docs/en/script/attributes.md b/docs/en/script/attributes.md
index 7862fee52e..2b1096d90d 100644
--- a/docs/en/script/attributes.md
+++ b/docs/en/script/attributes.md
@@ -5,7 +5,7 @@ type: Script
label: Script
---
-Script parameters are a very useful feature in the script system. With this feature, you can expose parameters in the script to the editor, allowing you to configure them in the scene editor. You can directly modify various properties of the script on the interface without the need to delve into the code. This intuitive editing method allows non-professional developers to easily debug various states in the script.
+Script parameters are a very useful feature in the script system. With this feature, you can expose parameters in the script to the editor, allowing them to be configured in the scene editor. You can directly modify various properties of the script in the interface without delving into the code. This intuitive editing method allows non-professional developers to easily debug various states in the script.
## Basic Usage
@@ -23,9 +23,9 @@ export default class extends Script {
}
```
-In the above code, we use the `@inspect` decorator to declare a property named `rotate` of type `Number` and expose it to the editor.
+In the above code, we declare a `rotate` property of type `Number` using the `@inspect` decorator and expose it to the editor.
-
+
## Parameter Types
@@ -35,9 +35,9 @@ Currently supported parameter types are:
- `Input`: Input box
- `Slider`: Slider
- `Boolean`: Boolean type
-- `Vector2`: Two-dimensional vector
-- `Vector3`: Three-dimensional vector
-- `Vector4`: Four-dimensional vector
+- `Vector2`: 2D vector
+- `Vector3`: 3D vector
+- `Vector4`: 4D vector
- `Rect`: Rectangle
- `Color`: Color picker, supports RGBA
- `AssetPicker`: Asset picker
@@ -46,7 +46,7 @@ Currently supported parameter types are:
## Parameter Configuration
-The second parameter of the `@inspect` decorator is an object used to configure various properties of the corresponding type of parameter. Different parameter types have different options. For example, `Number` and `Slider` have `min` and `max` configurations, while `Select` has an `options` configuration. To learn more about the configurable properties, you can refer to [@galaean/editor-decorators](https://www.npmjs.com/package/@galacean/editor-decorators?activeTab=readme). Below is an example using a number selector to explain the meanings of various configurations.
+The second parameter of the `@inspect` decorator is an object that configures the options of the corresponding parameter type. Different parameter types have different options: for example, `Number` and `Slider` have `min` and `max` options, while `Select` has an `options` option. For more configurable properties, see [@galacean/editor-decorators](https://www.npmjs.com/package/@galacean/editor-decorators?activeTab=readme). Below, taking the number selector as an example, we explain the meaning of each configuration.
```typescript
import { Script } from '@galacean/engine';
diff --git a/docs/en/script/class.md b/docs/en/script/class.md
index bfc35097e0..55b7d0aa20 100644
--- a/docs/en/script/class.md
+++ b/docs/en/script/class.md
@@ -5,26 +5,25 @@ type: Script
label: Script
---
-The base class for custom scripts is [Script](/apis/core/#Script), which extends [Component](/en/docs/core/component). Therefore, in addition to the basic capabilities of components, it also supports:
+Scripts are the bridge between engine capabilities and game logic: they can extend the engine's functionality, and game logic code can be written in the lifecycle hook functions that script components provide. The base class for custom scripts is [Script](/en/apis/core/#Script), which extends from [Component](/en/docs/core/component). It therefore supports the basic capabilities of components:
-- Mounting on [Entity](/en/docs/core/entity)
-- Conveniently accessing node instances and component instances
+- Mounting to [Entity](/en/docs/core/entity)
+- Conveniently obtaining node instances and component instances
- Following the disable and destroy rules of components
-- ...
-Furthermore, scripts provide a rich set of lifecycle callback functions. As long as specific callback functions are overridden in the script, they do not need to be manually called, and Galacean will automatically execute the relevant scripts at specific times.
+Additionally, it provides a rich set of lifecycle callback functions. As long as a specific callback function is overridden in the script, you do not need to call it manually; Galacean automatically executes the relevant scripts at the appropriate times.
## Script Lifecycle
-
+
-> [onBeginRender](/apis/core/#Script-onBeginRender) and [onEndRender](/apis/core/#Script-onEndRender) have some differences compared to others.
+> [onBeginRender](/en/apis/core/#Script-onBeginRender) and [onEndRender](/en/apis/core/#Script-onEndRender) are somewhat different from the others.
>
-> **They are only called** when the entity has mounted a camera component, meaning when a camera component has been added.
+> **They are called only when the entity has a camera component attached**, that is, only after a camera component has been added.
-### [**onAwake**](/apis/core/#Script-onAwake)
+### onAwake
-If the [isActiveInHierarchy](/apis/core/#Entity-isactiveinhierarchy) of the entity to which the script is added is `true`, the callback function will be called when the script is initialized. If [isActiveInHierarchy](/apis/core/#Entity-isActiveInHierarchy) is `false`, it will be called when the entity is activated, meaning [isActive](/apis/core/#Entity-isActive) is set to `true`. `onAwake` is only called once and is the first in all lifecycles. Typically, initialization-related operations are done in `onAwake`:
+If the [isActiveInHierarchy](/en/apis/core/#Entity-isactiveinhierarchy) of the entity the script is added to is `true`, the callback is invoked when the script is initialized. If [isActiveInHierarchy](/en/apis/core/#Entity-isActiveInHierarchy) is `false`, it is invoked when the entity is activated, i.e., when [isActive](/en/apis/core/#Entity-isActive) is set to `true`. `onAwake` is called only once and comes first in the entire lifecycle. Typically, we perform initialization-related operations in `onAwake`:
```typescript
onAwake() {
@@ -33,19 +32,19 @@ onAwake() {
}
```
-### [**onEnable**](/apis/core/#Script-onEnable)
+### onEnable
-The `onEnable` callback is activated when the [enabled](/apis/core/#Component-enabled) property of the script changes from `false` to `true`, or when the [isActiveInHierarchy](/apis/core/#Entity-isactiveinhierarchy) property of the entity changes from `false` to `true. If the entity is created for the first time and [enabled](/apis/core/#Component-enabled) is `true`, it will be called after `onAwake` and before `onStart`.
+The `onEnable` callback is activated when the [enabled](/en/apis/core/#Component-enabled) property of the script changes from `false` to `true`, or when the [isActiveInHierarchy](/en/apis/core/#Entity-isactiveinhierarchy) property of the entity changes from `false` to `true`. If the entity is created for the first time and [enabled](/en/apis/core/#Component-enabled) is `true`, it will be called after `onAwake` and before `onStart`.
-### [**onDisable**](/apis/core/#Script-ondisable)
+### onDisable
-The `onDisable` callback is activated when the [enabled](/apis/core/#Component-enabled) property of the component changes from `true` to `false`, or when the [isActiveInHierarchy](/apis/core/#Entity-isActiveInHierarchy) property of the entity changes from `true` to `false`.
+The `onDisable` callback is activated when the [enabled](/en/apis/core/#Component-enabled) property of the component changes from `true` to `false`, or when the [isActiveInHierarchy](/en/apis/core/#Entity-isActiveInHierarchy) property of the entity changes from `true` to `false`.
-Note: The determination of [isActiveInHierarchy](/apis/core/#Entity-isActiveInHierarchy) is that the entity is active in the hierarchy tree, meaning the entity is active, and its parent and ancestors up to the root entity are also active for [isActiveInHierarchy](/apis/core/#Entity-isActiveInHierarchy) to be `true`.
+Note: [isActiveInHierarchy](/en/apis/core/#Entity-isActiveInHierarchy) is `true` only when the entity is active in the hierarchy tree, i.e., the entity itself is active and its parent and all ancestors up to the root entity are active as well.
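+
+A small sketch of these two hooks (the class name is illustrative):
+
+```typescript
+import { Script } from "@galacean/engine";
+
+class StateLogger extends Script {
+  onEnable() {
+    console.log("enabled"); // fires when the component becomes active
+  }
+  onDisable() {
+    console.log("disabled"); // fires when the component becomes inactive
+  }
+}
+
+// Toggling the component (or deactivating the entity) triggers the callbacks:
+// entity.getComponent(StateLogger).enabled = false;
+```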
-### [**onStart**](/apis/core/#Script-onStart)
+### onStart
-The `onStart` callback is triggered before the script enters the frame loop for the first time, which is before the first execution of `onUpdate`. `onStart` is typically used to initialize data that may change frequently during `onUpdate`.
+The `onStart` callback is triggered when the script enters the frame loop for the first time, i.e., before the first execution of `onUpdate`. `onStart` is usually used to initialize data that may be frequently modified during `onUpdate`.
```typescript
onStart() {
@@ -57,7 +56,7 @@ onUpdate() {
}
```
-It is important to note that Galacean executes `onStart` callbacks in batches before executing `onUpdate` callbacks in batches. This allows accessing values initialized in other entities in `onUpdate`.
+It is important to note that Galacean executes the `onStart` callbacks in batches before executing the `onUpdate` callbacks in batches. The benefit of this approach is that you can access the initialized values of other entities in `onUpdate`.
```typescript
import { TheScript } from './TheScript'
@@ -71,37 +70,37 @@ onUpdate() {
}
```
-### [**onPhysicsUpdate**](/apis/core/#Script-onPhysicsUpdate)
+### onPhysicsUpdate
-`onPhysicsUpdate` callback function is called at the same frequency as the physics engine update. It may be called multiple times per rendering frame.
+The `onPhysicsUpdate` callback is called at the same frequency as the physics engine updates. It may be called multiple times per render frame.
-### [**onTriggerEnter**](/apis/core/#Script-onTriggerEnter)
+### onTriggerEnter
-The `onTriggerEnter` callback function is called when triggers make contact with each other to handle the logic when triggers meet, such as deleting entities when a trigger occurs.
+The `onTriggerEnter` callback is called when triggers come into contact, letting you handle the logic of trigger contact, such as deleting an entity when the trigger fires.
-### [**onTriggerStay**](/apis/core/#Script-onTriggerStay)
+### onTriggerStay
-The `onTriggerStay` callback function is called **continuously** during the trigger contact, once per frame.
+The `onTriggerStay` callback function is called **continuously** during the trigger contact process, once per frame.
-### [**onTriggerExit**](/apis/core/#Script-onTriggerExit)
+### onTriggerExit
-The `onTriggerExit` callback function is called when two triggers separate, meaning the trigger relationship changes, and it is called only once.
+The `onTriggerExit` callback function is called when two triggers separate, i.e., when the trigger relationship changes, and it is called only once.
-### [**onCollisionEnter**](/apis/core/#Script-onCollisionEnter)
+### onCollisionEnter
-The `onCollisionEnter` callback function is called when colliders collide to handle the logic when colliders meet, such as deleting entities when a collision occurs.
+The `onCollisionEnter` callback function is called when colliders collide to handle the logic when colliders meet, such as deleting an entity when the collision occurs.
-### [**onCollisionStay**](/apis/core/#Script-onCollisionStay)
+### onCollisionStay
-The `onCollisionStay` callback function is called **continuously** during the collider collision, once per frame.
+The `onCollisionStay` callback function is called **continuously** during the collider collision process, once per frame.
-### [**onCollisionExit**](/apis/core/#Script-onCollisionExit)
+### onCollisionExit
-The `onCollisionExit` callback function is called when two colliders separate, meaning the collision relationship changes, and it is called only once.
+The `onCollisionExit` callback function is called when two colliders separate, i.e., when the collision relationship changes, and it is called only once.
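+
+A minimal sketch of one of these hooks (assuming the entity's collider shape has its trigger flag enabled):
+
+```typescript
+import { ColliderShape, Script } from "@galacean/engine";
+
+class DestroyOnTouch extends Script {
+  onTriggerEnter(other: ColliderShape) {
+    // Remove the entity on first contact with another trigger
+    this.entity.destroy();
+  }
+}
+```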
-### [**onUpdate**](/apis/core/#Script-onUpdate)
+### onUpdate
-A key point in game/animation development is updating the behavior, state, and position of objects before each frame rendering. These update operations are usually placed in the `onUpdate` callback. It receives a parameter representing the time difference from the last `onUpdate` execution, of type `number`.
+A key point in game/animation development is to update the behavior, state, and position of objects before each frame is rendered. These update operations are usually placed in the `onUpdate` callback. It receives a parameter representing the time difference since the last `onUpdate` execution, of type `number`.
```typescript
onStart() {
@@ -113,9 +112,9 @@ onUpdate(deltaTime: number) {
}
```
-### [**onLateUpdate**](/apis/core/#Script-onLateUpdate)
+### onLateUpdate
-`onUpdate` is executed before all animation updates, but if we need to perform additional operations after the animation effects (such as animations, particles, etc.) are updated, or if we want to perform other operations only after all components' `onUpdate` have been executed, such as camera following, then we need to use the `onLateUpdate` callback. It receives a parameter representing the time difference from the last `onLateUpdate` execution, of type `number`.
+`onUpdate` is executed before all animation updates. If you need to perform additional operations after effects such as animations and particles have updated, or if you want to do something like camera follow only after all components' `onUpdate` callbacks have run, use the `onLateUpdate` callback. It receives a `number` parameter representing the time difference since the last `onLateUpdate` execution.
```typescript
onStart() {
@@ -131,29 +130,29 @@ onLateUpdate(deltaTime: number) {
}
```
-### [**onBeginRender**](/apis/core/#Script-onBeginRender)
+### onBeginRender
-**Only when the entity is mounted with a camera component**, the `onBeginRender` callback will be called before the [render](/apis/core/#Camera-render) method of the camera component is invoked.
+**Only when the entity has a camera component attached**, the `onBeginRender` callback will be called before the camera component's [render](/en/apis/core/#Camera-render) method is called.
-### [**onEndRender**](/apis/core/#Script-onEndRender)
+### onEndRender
-**Only when the entity is mounted with a camera component**, the `onEndRender` callback will be called after the [render](/apis/core/#Camera-render) method of the camera component is invoked.
+**Only when the entity has a camera component attached**, the `onEndRender` callback will be called after the camera component's [render](/en/apis/core/#Camera-render) method is called.
-### [**onDestroy**](/apis/core/#Script-onDestroy)
+### onDestroy
-When a component or the entity it belongs to calls [destroy](/apis/core/#Entity-destroy), the `onDestroy` callback is invoked, and the components are uniformly recycled at the end of the frame.
+When a component or its entity calls [destroy](/en/apis/core/#Entity-destroy), the `onDestroy` callback will be called, and the component will be uniformly recycled at the end of the frame.
### onPointerXXX
-For details on the input system interface, see [Input Interaction](/en/docs/input).
+For input system interfaces, see [Input Interaction](/en/docs/input).
## Entity Operations
-[Entities](/en/docs/core/entity) are the main objects of scripts. You can modify nodes and components in the editor's scene inspector and dynamically modify them in scripts. Scripts can respond to player input, modify, create, and destroy entities or components, thereby implementing various game logics.
+[Entities](/en/docs/core/entity) are the main objects operated by scripts. You can modify nodes and components in the editor's scene inspector, and you can also dynamically modify them in scripts. Scripts can respond to player input, modify, create, and destroy entities or components to achieve various game logic.
### Accessing Entities and Components
-You can access the entity to which the script is bound at any lifecycle, like:
+You can obtain the entity the script is bound to in any lifecycle callback, for example:
```typescript
class MyScript extends Script {
@@ -163,9 +162,9 @@ class MyScript extends Script {
}
```
-### Getting Other Components
+### Obtaining Other Components
-When we need to get other components on the same node, we use the [getComponent](/apis/core/#Entity-getComponent) API, which helps you find the component you need.
+When we need to get other components on the same node, we use the [getComponent](/en/apis/core/#Entity-getComponent) API, which helps you find the component you need.
```typescript
onAwake() {
@@ -174,11 +173,11 @@ onAwake() {
}
```
-Sometimes there may be multiple components of the same type, the above method will only return the first component found. If you need to find all components, you can use [getComponents](/apis/core/#Entity-getComponents).
+Sometimes there may be multiple components of the same type, and the above method will only return the first found component. If you need to find all components, you can use [getComponents](/en/apis/core/#Entity-getComponents).
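+
+A sketch of collecting all components of one type, using `Animator` as an assumed example:
+
+```typescript
+import { Animator, Script } from "@galacean/engine";
+
+class Example extends Script {
+  onAwake() {
+    const animators: Animator[] = [];
+    this.entity.getComponents(Animator, animators);
+    // animators now holds every Animator on this entity
+  }
+}
+```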
### Transformation
-Taking rotation as an example, rotate the entity in the [onUpdate](/apis/core/#Script-onUpdate) using the [setRotation](/apis/core/#Transform-setRotation) method:
+For example, to rotate an entity in the [onUpdate](/en/apis/core/#Script-onUpdate) method using the [setRotation](/en/apis/core/#Transform-setRotation) method:
```typescript
this.entity.transform.setRotation(0, 5, 0);
@@ -192,9 +191,9 @@ onAwake() {
### Finding Child Nodes
-Sometimes, there may be many objects of the same type in the scene, like multiple particle animations, multiple coins, which are usually managed by a global script. If associating them one by one to this script, the work will be cumbersome. To better manage these objects, we can place them under a unified parent object and then get all child objects through the parent object:
+Sometimes, there are many objects of the same type in the scene, such as multiple particle animations or multiple coins, which are usually managed centrally by a global script. Associating them with this script one by one would be cumbersome. To manage these objects better, we can place them under a unified parent object and then obtain all child objects through the parent object:
-If you know the index of the child node in the parent node, you can directly use [getChild](/apis/core/#Entity-getChild):
+If you know the index of the child node in the parent node, you can directly use [getChild](/en/apis/core/#Entity-getChild):
```typescript
onAwake() {
@@ -202,7 +201,7 @@ onAwake() {
}
```
-If you are unsure about the index of the child node, you can use [findByName](/apis/core/#Entity-findByName) to find it by the node's name. [findByName](/apis/core/#Entity-findByName) not only searches for child nodes but also searches for grandchildren nodes.
+If you don't know the index of the child node, you can use [findByName](/en/apis/core/#Entity-findByName) to find it by the node's name. [findByName](/en/apis/core/#Entity-findByName) will search not only child nodes but also grandchild nodes.
```typescript
onAwake() {
@@ -210,7 +209,7 @@ onAwake() {
}
```
-If there are nodes with the same name, you can use [findByPath](/apis/core/#Entity-findByPath) by passing the path for a step-by-step search. Using this API will also improve the search efficiency to some extent.
+If there are nodes with the same name, you can use [findByPath](/en/apis/core/#Entity-findByPath), passing in a path for a step-by-step search. Using this API also improves search efficiency to some extent.
```typescript
onAwake() {
diff --git a/docs/en/script/communication.md b/docs/en/script/communication.md
index c0ab214fc8..3dd2fd17ff 100644
--- a/docs/en/script/communication.md
+++ b/docs/en/script/communication.md
@@ -5,13 +5,13 @@ type: Script
label: Script
---
-In projects developed using Galacean Engine, it is often necessary to communicate with the external environment, such as sending runtime information from the project to the outside, or receiving certain configuration information from the external environment. In such cases, you can use the event system of Galacean Engine to achieve this functionality.
+Projects developed with Galacean Engine often need to communicate with the external environment, such as sending runtime information of the project to the outside or obtaining configuration information from the external environment. In such cases, you can use the event system of Galacean Engine to implement this functionality.
-## Add Events
+## Adding Events {/*examples*/}
-Galacean Engine provides [EventDispatcher](/apis/core/#EventDispatcher) as the event class, and [Engine](/apis/core/#Engine) inherits from [EventDispatcher](/apis/core/#EventDispatcher). Therefore, we can directly use `engine` in the code as a medium for internal and external communication.
+Galacean Engine provides [EventDispatcher](/en/apis/core/#EventDispatcher) as the event class, and [Engine](/en/apis/core/#Engine) inherits from [EventDispatcher](/en/apis/core/#EventDispatcher). Therefore, we directly use `engine` in the code as the medium for internal and external communication.
-**Add an event using `engine.on`**
+**Add events using `engine.on`**
```ts
import { Script } from "@galacean/engine";
@@ -25,9 +25,9 @@ class MyScript extends Script {
}
```
-**Add an event using `engine.once`**
+**Add events using `engine.once`**
-Events added using `engine.once` will trigger the callback function only once.
+Events added using `engine.once` will only trigger the callback function once.
```ts
import { Script } from "@galacean/engine";
@@ -43,19 +43,19 @@ class MyScript extends Script {
After saving the code, we can see the corresponding events in the event panel.
-## Trigger Events
+## Triggering Events {/*examples*/}
-Calling the `engine.dispatch` method can dispatch events. Dispatching events will execute the corresponding callback functions using the parameters configured in `dispatch`.
+Calling the `engine.dispatch` method can dispatch events. Dispatching events will execute the corresponding callback function using the parameters configured in `dispatch`.
```ts
this.engine.dispatch("Trigger", { eventData: "mydata" });
```
-You can trigger events at any lifecycle of the script. Of course, you can also trigger events using the event panel or configure the parameters to be carried when triggering events.
+You can trigger events in any lifecycle of the script. Of course, you can also use the event panel to trigger events or configure the parameters carried when triggering events.
-## Remove Events
+## Removing Events {/*examples*/}
-Use `this.engine.off` to remove related events.
+You can remove related events using `this.engine.off`.
```ts
// Remove the specific function "fun" that listens to "Trigger".
diff --git a/docs/en/script/create.mdx b/docs/en/script/create.mdx
new file mode 100644
index 0000000000..a53fd4778c
--- /dev/null
+++ b/docs/en/script/create.mdx
@@ -0,0 +1,14 @@
+---
+order: 2
+title: Creating Scripts
+type: Script
+label: Script
+---
+
+[Script components](/en/docs/script) are an important extension capability provided by the engine for developers. In the Galacean editor, scripts are also a type of asset.
+
+## Using Scripts in the Editor
+
+Using scripts in the editor is very convenient. You just need to create a script and then add it to an entity's script component.
+
+
diff --git a/docs/en/script/edit.mdx b/docs/en/script/edit.mdx
new file mode 100644
index 0000000000..60feec35e5
--- /dev/null
+++ b/docs/en/script/edit.mdx
@@ -0,0 +1,77 @@
+---
+order: 3
+title: Edit Script
+type: Script
+label: Script
+---
+
+Galacean Editor provides a powerful code editor with features such as code suggestions, third-party package imports, engine event debugging, script parameter debugging, real-time project preview, and more, helping you quickly edit and debug code.
+
+
+
+| No. | Area | Description |
+| ---- | ------------ | ------------------------------------------------------------ |
+| 1 | File List | View all script files in the project |
+| 2 | Code Editor | Edit script files with features like code highlighting, code suggestions, and code formatting |
+| 3 | Preview Area | Preview the running effect of the current script. The rendering state of this area will refresh in real-time after saving the code |
+| 4 | Package Management Area | Add required [NPM](https://www.npmjs.org/) third-party packages, such as [tween.js](https://www.npmjs.com/package/@tweenjs/tween.js) |
+| 5 | Event Debugging Area | The code editor will automatically search for all events bound to the engine and display them here. You can trigger events here and configure event parameters |
+| 6    | Console | View log information during code runtime |
+
+For more information about the code editor, please check [Code Editor](/en/docs/script/edit).
+
+## Code Editing
+
+After creating a script asset in the scene editor, double-click the script to open the code editor. Scripts in Galacean need to be written in [Typescript](https://www.typescriptlang.org/), and new scripts are created based on built-in templates by default. Additionally, the Galacean code editor is based on Monaco, with shortcuts similar to VSCode. After modifying the script, press `Ctrl/⌘ + S` to save, and the real-time preview area on the right will show the latest scene effects.
+
+> Tip: The Galacean code editor currently supports `.ts`, `.gs`, and `.glsl` file editing.
+
+## File Preview
+
+
+
+1. **File Search** Quickly search for files in the project
+2. **Code Filter** Whether to display only code files ( `.ts`, `.gs`, `.glsl` ) in the file tree
+3. **Built-in Files** Used to display which files are non-editable internal files
+4. **Expand/Hide** Toggle the expansion or hiding of folders
+5. **Code Files** Editable code files will display corresponding file type thumbnails
+
+## Importing Third-Party Packages
+
+The code editor has a built-in engine corresponding to the project, which can automatically provide intelligent suggestions for engine APIs, helping you quickly implement logic based on the engine. However, in most cases, you will need to import Galacean ecosystem packages or other third-party packages to enhance functionality.
+
+
+
+1. **Search Box** Enter the package name in the search box and press Enter to quickly fetch the package version list
+2. **Version Selection** By default, the `latest` version is used
+3. **Import Button** After selecting the package name and version, click the import button to load the type information of the third-party package into the workspace
+4. **Package List** This will list all third-party packages that the current project depends on
+5. **Version Switching** You can switch the version of the imported package here. After switching, the new type information will be loaded into the workspace
+
+
+Try it: Enter `@galacean/engine-toolkit` in the search box, click the "Import" button, and then use `import { OrbitControl } from '@galacean/engine-toolkit'` in the code to import the `OrbitControl` component.
+
+
+## Real-time Preview Area
+
+The Galacean code editor provides a real-time preview feature. After saving the code, the preview area will automatically update, allowing you to quickly see the execution results of the code.
+
+
+
+1. **Drag Button** Hold to drag the simulator. Drag the simulator to the right edge of the screen to dock it on the right panel.
+2. **Statistics Toggle** Click to toggle the display status of scene statistics.
+3. **Open in New Window** Open the project preview page in a new window.
+4. **Script Parameter Editing** If the active script in the current scene has configurable parameters, you can open this panel to adjust the script parameters in real-time. For more details on script parameters, please refer to [Script System](/en/docs/script/attributes).
+5. **Close Button** Click to close the simulator. After closing, a display button appears at the top right corner of the screen; click it to reopen the simulator.
+
+## Event Debugging
+
+In the code editor, we provide an event debugging panel to help you quickly debug events in the scene.
+
+
+
+1. **Event List** The Galacean Editor will automatically collect all event names in the scene and display them here.
+2. **Event Parameter Configuration** You can click this button to configure the parameters carried when the event is triggered. The parameters are written in `JSON` format.
+3. **Event Trigger Button** Click this button to trigger the corresponding event in the scene.
+
+> Note: the script component must be bound to an entity for the event list to be displayed.
diff --git a/docs/en/script/script.md b/docs/en/script/script.md
new file mode 100644
index 0000000000..3015718b70
--- /dev/null
+++ b/docs/en/script/script.md
@@ -0,0 +1,17 @@
+---
+order: 0
+title: Script Overview
+type: Script
+label: Script
+---
+
+In addition to the [built-in components](/en/docs/core/component/), the Galacean engine provides a powerful scripting system. Scripts are the bridge between the engine's capabilities and game logic: custom scripts extend from the [Script](/en/apis/galacean/#Script) base class, letting you extend the engine's functionality and write your own game logic code in the lifecycle hook functions provided by the script components.
+
+In this chapter, you will learn about:
+
+- [Script Class](/en/docs/script/class/): The lifecycle of script callbacks and how to control entities with scripts
+- Script Workflow
+ - [Create Script](/en/docs/script/create/): Declare and bind scripts
+ - [Edit Script](/en/docs/script/edit/): Script editing interface, importing third-party packages, previewing, and event debugging
+ - [Script Attributes](/en/docs/script/attributes/)
+ - [Event Communication](/en/docs/script/communication/)
diff --git a/docs/en/xr/compatibility.md b/docs/en/xr/compatibility.md
index 4ec146bece..865d3f1b28 100644
--- a/docs/en/xr/compatibility.md
+++ b/docs/en/xr/compatibility.md
@@ -5,50 +5,50 @@ type: XR
label: XR
---
-The XR system supports multiple backends (refer to [XR Overview](/en/docs/xr-overall)). Currently, only the WebXR standard is officially supported, so the compatibility of XR interactions is also **limited to device support for WebXR**.
+The XR system supports multiple backends (refer to [XR Overview](/en/docs/xr/overall)). Currently only the WebXR standard is officially supported, so the compatibility of XR interactions is also **limited by the device's support for WebXR**.
-Before using XR capabilities, you can evaluate the runtime environment with [CanIUse](https://caniuse.com/?search=webxr). Below is a summary of current WebXR compatibility.
+Before using XR capabilities, you can refer to [CanIUse](https://caniuse.com/?search=webxr) to evaluate the runtime environment. Below is a summary of the current WebXR compatibility.
## Device Support
### PC
-- PC browsers that support WebXR (this document uses Mac Chrome)
+- PC browsers that support WebXR (this article uses Mac Chrome)
- Install [Immersive Web Emulator](https://chromewebstore.google.com/detail/immersive-web-emulator/cgffilbpcibhmcfbgggfhfolhkfbhmik) or other WebXR simulation plugins on PC Chrome
### Android
-- Terminals and browsers that support WebXR (this document uses an Android device with the Chrome app)
-- Android phones need to install [Google Play Services for AR](https://play.google.com/store/apps/details?id=com.google.ar.core&hl=en_US&pli=1)
+- Terminals and browsers that support WebXR (this article uses an Android device with the mobile Chrome app)
+- Android phones additionally need to install [Google Play Services for AR](https://play.google.com/store/apps/details?id=com.google.ar.core&hl=en_US&pli=1)
-### iOS
+### iOS
-- Safari on iOS devices does not currently support WebXR
+- Safari on iPhones does not currently support WebXR
- Apple Vision Pro supports WebXR
-### Head-mounted Displays
+### Headset Devices
-Depending on the situation, refer to the official website of the head-mounted display for compatibility information. Most browsers in head-mounted displays (browsers with Chromium kernel) support WebXR.
+Depending on the situation, you can refer to the headset's official website for compatibility information. Most browsers in headsets (browsers with Chromium kernel) support WebXR.
## Runtime Compatibility Check
-During runtime, you can determine if the current environment supports `AR` or `VR` with the following code:
+In the runtime, you can use the following code to check if the current environment supports `AR` or `VR`:
```typescript
// Check if AR is supported
xrManager.sessionManager.isSupportedMode(XRSessionMode.AR);
```
-Before adding features, you can check the compatibility of the feature with the following code:
+Before adding features, you can use the following code to check the compatibility of the feature:
```typescript
// Check if image tracking is supported
xrManager.isSupportedFeature(XRImageTracking);
```
-## Android Experimental Features
+## Enabling Experimental Features on Android
-Android supports some experimental features, but they are disabled by default. You can enable them by setting flags: **Open Chrome on Android** -> **Go to chrome://flags** -> **Search for WebXR** -> **Enable WebXR Incubations**
+Android supports some experimental features, but they are turned off by default. You can enable them by setting flags: **Open Chrome on Android** -> **Navigate to chrome://flags** -> **Search for WebXR** -> **Enable WebXR Incubations**
diff --git a/docs/en/xr/overall.md b/docs/en/xr/overall.md
index 88b6e435cb..6a616d8526 100644
--- a/docs/en/xr/overall.md
+++ b/docs/en/xr/overall.md
@@ -5,20 +5,34 @@ type: XR
label: XR
---
-`XR` is a generic term used to describe the concept of Extended Reality (`Extended Reality`), which includes Virtual Reality (VR), Augmented Reality (AR), Mixed Reality (MR), and more.
+`XR` is a general term used to describe the concept of Extended Reality, which includes Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR).
-Galacean has designed XR with a clean and flexible approach:
+## Architecture
-- Cleaner: When XR capabilities are not needed, the package does not contain any XR logic, and its size does not increase at all
-- More flexible: Plug-and-play features make development easier
-- Future-oriented: Multi-backend design for future adaptability to different platforms and interfaces
+Galacean has designed XR to be clean and flexible:
-
+- Clean: When XR capabilities are not needed, the package does not contain any XR logic, and the size does not increase at all.
+- Flexible: Pluggable functionality makes development easier.
+- Future-oriented: Multi-backend design allows for adaptation to different platforms and interfaces in the future.
-In this chapter, you can learn about:
+
-- [Quick XR Interaction Development](/en/docs/xr-start): XR workflow and debugging
-- [XR Manager](/en/docs/xr-manager): Managing [camera](/en/docs/xr-camera), [session](/en/docs/xr-session), [input](/en/docs/xr-input), [features](/en/docs/xr-features), and more
-- [XR Compatibility](/en/docs/xr-compatibility): Introduction to the current WebXR compatibility
+## Module Management
-{ /*examples*/ }
+| Package | Description | Related Documentation |
+| :-- | :-- | --- |
+| [@galacean/engine-xr](https://www.npmjs.com/package/@galacean/engine-xr) | Core architecture logic | [API](/en/apis/galacean) |
+| [@galacean/engine-web-xr](https://www.npmjs.com/package/@galacean/engine-web-xr) | Backend package | [Doc](/en/docs/physics/overall) |
+| [@galacean/engine-toolkit-xr](https://www.npmjs.com/package/@galacean/engine-toolkit-xr) | Advanced tool components | [Doc](/en/docs/xr/toolkit) |
+
+> `@galacean/engine-xr` and `@galacean/engine-web-xr` are dependencies that must be introduced to implement **WebXR**. Compared to these two packages, `@galacean/engine-toolkit-xr` is not mandatory, but its presence can make XR development in the editor much simpler.
+
+> The dependency rules between XR packages follow the [version dependency rules](/en/docs/basics/version/#版本依赖), meaning that the versions of `@galacean/engine-xr` and `@galacean/engine-web-xr` must be consistent with `@galacean/engine`, and the major version of `@galacean/engine-toolkit-xr` must be consistent with `@galacean/engine`.
+
+## Quick Start
+
+In this section, you can:
+
+- Quickly [develop XR interactions](/en/docs/xr/quickStart/develop) and [debug XR interactions](/en/docs/xr/quickStart/debug) without any specialized knowledge.
+- If you want to deeply understand Galacean XR, refer to [XR Core Logic](/en/docs/xr/system/manager).
+- Finally, understand [XR Compatibility](/en/docs/xr/compatibility) to keep overall project risks under control.
diff --git a/docs/en/xr/quickStart/debug.md b/docs/en/xr/quickStart/debug.md
new file mode 100644
index 0000000000..564bf35b77
--- /dev/null
+++ b/docs/en/xr/quickStart/debug.md
@@ -0,0 +1,57 @@
+---
+order: 1
+title: Debugging XR Interaction
+type: XR
+label: XR
+---
+
+This article will mainly introduce how to debug on PC, as well as how to preview and debug on XR devices.
+
+> Unless otherwise specified, the following debugging items are all based on WebXR development.
+
+## PC Debugging
+
+First, prepare the debugging environment. Use the Chrome browser that supports WebXR on the PC, and install the [Immersive Web Emulator](https://chromewebstore.google.com/detail/immersive-web-emulator/cgffilbpcibhmcfbgggfhfolhkfbhmik) plugin.
+
+> For usage of the plugin, refer to the [Immersive Web Emulator repository](https://github.com/meta-quest/immersive-web-emulator).
+
+Once prepared, you can preview the XR project in the editor:
+
+
+
+You can also download the project locally and preview it via the script build:
+
+```bash
+npm install
+npm run https
+```
+
+
+
+> WebXR is only available in a secure environment (HTTPS), so you need to enable HTTPS when building the project for debugging.
+
+## Mobile Debugging
+
+To support mobile debugging, the following conditions must be met:
+
+- The phone supports ARCore, refer to [ARCore supported devices](https://developers.google.com/ar/devices).
+- Install a browser that supports WebXR (mobile Chrome app).
+- Additionally, install [Google Play Services for AR](https://play.google.com/store/apps/details?id=com.google.ar.core&hl=en_US&pli=1).
+
+> `Google Play Services for AR` is an augmented reality platform developed by Google (ARCore). Some phones come with this app pre-installed. If not, you can search for it in the app store. The image below shows the search result in the Xiaomi app store.
+
+
+
+With all the above conditions met, you can preview the locally built project on your phone (ensure that **the phone and computer are on the same local network**):
+
+
+
+### Debugging
+
+Please refer to [Remote Debugging Android Devices](https://developer.chrome.com/docs/devtools/remote-debugging?hl=zh-cn); the same applies to XR devices.
+
+> Ensure that the phone has **`Developer Options`** enabled and allows **`USB Debugging`** before debugging.
+
+## Best Practices
+
+Since XR debugging is relatively cumbersome, we recommend completing most of the work and verification during the PC preview and debugging stage. This can significantly improve development efficiency.
diff --git a/docs/en/xr/quickStart/develop.md b/docs/en/xr/quickStart/develop.md
new file mode 100644
index 0000000000..663821de75
--- /dev/null
+++ b/docs/en/xr/quickStart/develop.md
@@ -0,0 +1,114 @@
+---
+order: 0
+title: Developing XR Interactions
+type: XR
+label: XR
+---
+
+This document describes how to quickly develop XR interactions in both the editor and ProCode scenarios.
+
+## Editor
+
+The process for developing XR interactions in the editor is as follows:
+
+```mermaid
+flowchart LR
+   A[Create Project] --> B[Add XR Node] --> C[Add XR Capabilities] --> D[Preview] --> E[Export]
+```
+
+### Create Project
+
+On the **[Home Page](/en/docs/interface/intro/#%E9%A6%96%E9%A1%B5)** click **Create Project**, then in **[Project Settings](/en/docs/interface/menu/#项目设置)** select `WebXR` as the XR backend.
+
+
+
+### Add XR Node
+
+In the **[Hierarchy Panel](/en/docs/interface/hierarchy/)** add an XR node.
+
+
+
+> Adding an XR node will automatically create and select `origin` and `camera`, so there should be no other `Camera` components in the scene unless intentionally added.
+
+> Multiple XR nodes can be added to the scene, but at any given time, only one XR node will be active.
+
+### Preview
+
+If you have prepared Chrome and the [Immersive Web Emulator](https://chromewebstore.google.com/detail/immersive-web-emulator/cgffilbpcibhmcfbgggfhfolhkfbhmik) plugin as required by [Debugging XR Interaction](/en/docs/xr/quickStart/debug/), you can preview directly.
+
+
+
+### XR Capabilities
+
+To achieve stunning mixed reality effects, additional capabilities are often added to XR interactions.
+
+#### Anchor Tracking
+
+Add the `XR Anchor Manage` component to any active Entity to add anchor tracking capabilities to XR.
+
+| Attribute | Description |
+| :---------- | :-------------------------------------------------------------------------- |
+| Anchor List | List of tracked anchors, determined by Position and RotationQuaternion in real space |
+| Prefab | If a prefab is set, it will be instantiated and attached to the tracked anchor when the anchor is tracked |
+
+
+
+#### Image Tracking
+
+Add the `XR Image Manage` component to any active Entity to add image tracking capabilities to XR.
+
+| Attribute | Description |
+| :--------- | :------------------------------------------------------------------------ |
+| Image List | List of tracked images, add `ReferenceImageAssets` to determine the tracked image information |
+| Prefab | If a prefab is set, it will be instantiated and attached to the tracked image when the image is tracked |
+
+
+
+The tracked image is an asset in the editor. You can upload a tracked image by right-clicking on a blank area of the **[Asset Panel](/en/docs/assets/interface/)** → **Upload** → **XRReferenceImage** → **select the corresponding image**.
+
+| Attribute | Description |
+| :-------- | :---------- |
+| name | The name of the tracked image (unique), which can be used to identify the tracked image |
+| Prefab | If a prefab is set, the prefab will be instantiated and attached to the tracked image when the image is tracked |
+
+> Image tracking cannot be debugged on the editor side. It needs to be exported and previewed on a mobile device for debugging.
+
+#### Plane Tracking
+
+Add the `XR Plane Manage` component to any active Entity to enable plane tracking for XR.
+
+| Attribute | Description |
+| :-------------- | :---------- |
+| Detection Mode | The type of plane detection, including `None`, `Horizontal`, `Vertical`, `EveryThing`. It can determine the type of plane to be tracked. The default is `EveryThing`, but in `WebXR`, it usually detects horizontal planes |
+| Prefab | If a prefab is set, the prefab will be instantiated and attached to the tracked plane when the plane is tracked |
+
+
+
+### Note
+
+Note that `WebXR` requires entering the XR context through a button click on the page. For an XR project, the editor automatically adds such a button during preview to assist developers. After the project is exported, however, the developer needs to add this step: simply add the following code to the `onClick` callback of a `Button`:
+
+```typescript
+// The XR manager
+const xrManager = engine.xrManager;
+// The XR session mode to enter
+const xrMode = XRSessionMode.AR;
+xrManager.sessionManager.isSupportedMode(xrMode).then(
+  () => {
+    // Enter the XR session on click
+    htmlButton.onclick = () => {
+      xrManager.enterXR(xrMode);
+    };
+  },
+  (error) => {
+    // The session mode is not supported
+    console.error(error);
+  }
+);
+```
+
+## Script Development
+
+Before moving on to pure code development, please first read the [XR Managers](/en/docs/xr/system/manager/) documentation. Below is a minimal example of enabling AR interaction:
+
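+The original example here is an interactive playground; the sketch below shows the same flow under a few assumptions: the engine is created with an `xrDevice` (the `WebXRDevice` import path may differ in your project), `cameraEntity` already carries the scene's `Camera` component, and `htmlButton` is the session-entry button described above.
+
+```typescript
+// Import paths are assumptions; adjust them to your project setup
+import { Camera, WebGLEngine, XRSessionMode, XRTrackedInputDevice } from "@galacean/engine";
+import { WebXRDevice } from "@galacean/engine-xr-webxr";
+
+// Create the engine with an XR device
+const engine = await WebGLEngine.create({ canvas: "canvas", xrDevice: new WebXRDevice() });
+const { xrManager } = engine;
+// Bind the virtual camera to the real AR camera
+xrManager.cameraManager.attachCamera(XRTrackedInputDevice.Camera, cameraEntity.getComponent(Camera));
+// WebXR requires a user gesture to enter the XR context
+htmlButton.onclick = () => {
+  xrManager.enterXR(XRSessionMode.AR);
+};
+engine.run();
+```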
+
diff --git a/docs/en/xr/system/camera.md b/docs/en/xr/system/camera.md
new file mode 100644
index 0000000000..f1ace80e4d
--- /dev/null
+++ b/docs/en/xr/system/camera.md
@@ -0,0 +1,39 @@
+---
+order: 4
+title: Camera Manager
+type: XR
+label: XR
+---
+
+The Camera Manager is part of the XRManager instance; you can access it via `xrManager.cameraManager`.
+
+## Properties
+
+| Property | Type | Description |
+| :-- | :-- | :-- |
+| fixedFoveation | number | Sets the fixed foveation of the camera. For more details, refer to [fixedFoveation](https://developer.mozilla.org/en-US/docs/Web/API/XRProjectionLayer/fixedFoveation) |
+
+## Methods
+
+| Method | Description |
+| :----------- | :------------------------------------------- |
+| attachCamera | Binds the virtual world's camera to the real world's camera |
+| detachCamera | Unbinds the virtual world's camera from the real world's camera |
+
+> When the XR session type is AR, the type of camera to bind is `XRTrackedInputDevice.Camera`
+
+> When the XR session type is VR, the types of cameras to bind are `XRTrackedInputDevice.LeftCamera` and `XRTrackedInputDevice.RightCamera`
+
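+A minimal sketch of binding and unbinding a camera in an AR session; the exact signatures are assumptions based on the tables above (the device type first, then the `Camera` component when attaching):
+
+```typescript
+const { cameraManager } = engine.xrManager;
+// Bind the virtual camera to the AR device camera (use LeftCamera/RightCamera for VR)
+cameraManager.attachCamera(XRTrackedInputDevice.Camera, cameraEntity.getComponent(Camera));
+// Optionally lower shading detail toward the edges of the view (0 to 1)
+cameraManager.fixedFoveation = 1;
+// Unbind when the binding is no longer needed
+cameraManager.detachCamera(XRTrackedInputDevice.Camera);
+```
+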
+## Update Process
+
+By fully synchronizing the pose and parameters of the `virtual camera` with those of the `real camera`, the `virtual scene` stays **synchronized** with the `real scene`.
+
+```mermaid
+flowchart TD
+    A[Time slice begins] --> B[Get real camera data]
+    B --> C[Sync pose to the virtual camera]
+    C --> D[Sync viewport to the virtual camera]
+    D --> E[Sync projection matrix to the virtual camera]
+    E --> F[Time slice ends]
+    F --> A
+```
diff --git a/docs/en/xr/system/features.md b/docs/en/xr/system/features.md
new file mode 100644
index 0000000000..3ea0e84234
--- /dev/null
+++ b/docs/en/xr/system/features.md
@@ -0,0 +1,130 @@
+---
+order: 5
+title: XR Capabilities
+type: XR
+label: XR
+---
+
+Galacean XR currently includes the following capabilities:
+
+| Capability | Description |
+| :--------------- | :-------------- |
+| Anchor Tracking  | Track anchor points in real space |
+| Plane Tracking   | Track planes in real space |
+| Image Tracking   | Track images in real space |
+| Hit Test         | Cast a ray against planes in real space |
+
+## Anchor Tracking
+
+| Property | Description |
+| :--------------- | :-------------------------------- |
+| trackingAnchors | (Read-only) Get requested tracking anchors |
+| trackedAnchors | (Read-only) Get tracked anchors |
+
+| Method | Description |
+| :--------------------- | :---------------------------- |
+| addAnchor | Add a specific anchor |
+| removeAnchor | Remove a specific anchor |
+| clearAnchors | Remove all anchors |
+| addChangedListener | Add a function to listen for anchor changes |
+| removeChangedListener | Remove a function that listens for anchor changes |
+
+You can add anchors in the XR space with the following code:
+
+```typescript
+const anchorTracking = xrManager.getFeature(XRAnchorTracking);
+const position = new Vector3();
+const rotation = new Quaternion();
+// Add an anchor
+const anchor = anchorTracking.addAnchor(position, rotation);
+// Remove the anchor
+anchorTracking.removeAnchor(anchor);
+// Listen for anchor changes
+anchorTracking.addChangedListener(
+  (added: readonly XRAnchor[], updated: readonly XRAnchor[], removed: readonly XRAnchor[]) => {
+    // Handle added, updated, and removed anchors here
+ }
+);
+```
+
+## Plane Tracking
+
+| Property | Description |
+| :------------- | :--------------------------------------------- |
+| detectionMode  | (Read-only) Type of plane detection: horizontal, vertical, or all |
+| trackedPlanes | (Read-only) Get tracked planes |
+
+| Method | Description |
+| :--------------------- | :---------------------------- |
+| addChangedListener | Add a function to listen for plane changes |
+| removeChangedListener | Remove a function that listens for plane changes |
+
+> Note that the type of plane tracking needs to be specified when adding the feature.
+
+```typescript
+// Specify the type of plane tracking as everything during initialization
+xrManager.addFeature(XRPlaneTracking, XRPlaneMode.EveryThing);
+```
+
+We can track real-world planes and mark them with transparent grids and coordinate systems:
+
+
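+A minimal sketch of reacting to plane changes; the callback is assumed to have the same `(added, updated, removed)` shape as anchor tracking:
+
+```typescript
+const planeTracking = xrManager.getFeature(XRPlaneTracking);
+// Listen for plane changes
+planeTracking.addChangedListener((added, updated, removed) => {
+  for (const plane of added) {
+    // A new plane was detected; instantiate a marker for it here, for example
+  }
+});
+```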
+
+## Image Tracking
+
+| Property | Description |
+| :-------------- | :------------------------------------------------------ |
+| trackingImages | (Read-only) Array of requested tracking images, including name, source, and size |
+| trackedImages | (Read-only) Get tracked images |
+
+| Method | Description |
+| :--------------------- | :---------------------------- |
+| addChangedListener    | Add a function to listen for image changes |
+| removeChangedListener | Remove a function that listens for image changes |
+
+Note that the image tracking feature requires specifying the images to be tracked in advance. In the engine, the tracked images are represented by the `XRReferenceImage` object:
+
+| Property | Description |
+| :------------- | :---------------------------------------------------------------------------------------------------------- |
+| name | Name of the tracked image |
+| imageSource    | Source of the tracked image, usually an `HTMLImageElement` |
+| physicalWidth  | Physical size of the tracked image in the real world, in meters; for example, `0.08` means the image is `0.08` meters wide in the real world |
+
+> In WebXR, the same image will only be tracked once.
+
+```typescript
+const image = new Image();
+image.onload = () => {
+  // Create the reference image to track
+  const refImage = new XRReferenceImage("test", image, 0.08);
+  // Initialize the image tracking feature with the reference image
+  xrManager.addFeature(XRImageTracking, [refImage]);
+};
+image.src = "<image URL>";
+```
+
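+A minimal sketch of reacting to image-tracking results; the callback is assumed to have the same `(added, updated, removed)` shape as anchor tracking:
+
+```typescript
+const imageTracking = xrManager.getFeature(XRImageTracking);
+// Listen for image tracking changes
+imageTracking.addChangedListener((added, updated, removed) => {
+  for (const image of updated) {
+    // An already-tracked image received a fresh pose this frame
+  }
+});
+```
+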
+The example below can track a real-world image and mark the coordinate system:
+
+
+
+> You can generate a QR code directly from the example above to try it on a mobile device. The tracked image is shown below:
+
+
+
+## Hit Test
+
+| Method | Description |
+| :------------ | :------------------------------------------- |
+| hitTest | Performs collision detection with a plane in real space using a ray |
+| screenHitTest | Performs collision detection with a plane in real space using screen space coordinates |
+
+```typescript
+// Get the first screen touch point
+const pointer = engine.inputManager.pointers[0];
+if (pointer) {
+  const hitTest = xrManager.getFeature(XRHitTest);
+  const { position } = pointer;
+  // Hit test against planes in real space using screen-space coordinates
+  const result = hitTest.screenHitTest(position.x, position.y, TrackableType.Plane);
+}
+```
diff --git a/docs/en/xr/system/input.md b/docs/en/xr/system/input.md
new file mode 100644
index 0000000000..d31da71f31
--- /dev/null
+++ b/docs/en/xr/system/input.md
@@ -0,0 +1,61 @@
+---
+order: 3
+title: Interaction Manager
+type: XR
+label: XR
+---
+
+The Interaction Manager is part of the XRManager instance; you can access it via `xrManager.inputManager`. It manages all input devices, including but not limited to:
+
+- Controllers
+- Headsets
+- Hands
+- ……
+
+## Methods
+
+| Method | Description |
+| :-- | :-- |
+| getTrackedDevice | Get a device by type, specified by `XRTrackedInputDevice` |
+| addTrackedDeviceChangedListener | Add a function to listen for device changes. When a device is added or removed, the callback will be executed with the added and removed device lists as parameters |
+| removeTrackedDeviceChangedListener | Remove the function that listens for device changes |
+
+Currently supported input devices are as follows:
+
+| Enumeration | Description |
+| :----------------------------------- | :----------------------- |
+| XRTrackedInputDevice.Camera | Usually the camera of an AR device |
+| XRTrackedInputDevice.LeftCamera | Usually the left eye of a VR headset |
+| XRTrackedInputDevice.RightCamera | Usually the right eye of a VR headset |
+| XRTrackedInputDevice.Controller      | Usually the remote controller of an AR device |
+| XRTrackedInputDevice.LeftController  | Usually the left controller of a VR device |
+| XRTrackedInputDevice.RightController | Usually the right controller of a VR device |
+
+## Usage
+
+You can listen for device updates with the following code:
+
+```typescript
+const { inputManager } = xrManager;
+inputManager.addTrackedDeviceChangedListener((added: readonly XRInput[], removed: readonly XRInput[]) => {
+  // Handle added and removed devices here
+});
+```
+
+You can get the pose of the left-hand controller with the following code:
+
+```typescript
+const controller = inputManager.getTrackedDevice(XRTrackedInputDevice.LeftController);
+// Pose of the controller grip
+controller.gripPose.position;
+controller.gripPose.rotation;
+controller.gripPose.matrix;
+// Whether the select button was pressed this frame
+controller.isButtonDown(XRInputButton.Select);
+// Whether the select button was released this frame
+controller.isButtonUp(XRInputButton.Select);
+// Whether the select button is held down
+controller.isButtonHeldDown(XRInputButton.Select);
+```
+
+> `XRInputButton.Select` corresponds to the WebXR native `XRInputSourceEventType.selectXXX` event. `XRInputButton.Squeeze` corresponds to the WebXR native `XRInputSourceEventType.squeezeXXX` event.
diff --git a/docs/en/xr/system/session.md b/docs/en/xr/system/session.md
new file mode 100644
index 0000000000..6c1b411764
--- /dev/null
+++ b/docs/en/xr/system/session.md
@@ -0,0 +1,39 @@
+---
+order: 2
+title: Session Manager
+type: XR
+label: XR
+---
+
+The Session Manager is part of the XRManager instance; you can access it via `xrManager.sessionManager`.
+
+## Properties
+
+| Property | Type | Description |
+| :----------------- | :------------- | :------------------------- |
+| mode | XRSessionMode | (Read-only) Get the current session type |
+| state | XRSessionState | (Read-only) Get the current session state |
+| supportedFrameRate | Float32Array | (Read-only) Get the frame rates supported by the hardware |
+| frameRate | number | (Read-only) Get the frame rate at which the hardware is running |
+
+## Methods
+
+| Method | Description |
+| :------------------------- | :-------------------------------------------------------------------------------------- |
+| isSupportedMode            | Check whether a session type is supported, so you can verify support before starting a session. The parameter is `XRSessionMode.AR` or `XRSessionMode.VR`. |
+| addStateChangedListener | Add a listener for session state changes. When the state changes, the callback will be executed and the latest session state will be passed as a parameter. |
+| removeStateChangedListener | Remove the listener for session state changes. |
+| run | Run the session. |
+| stop | Stop the session. |
+
+> An XR session has five states: `None`, `Initializing`, `Initialized`, `Running`, and `Paused`; the transitions are shown in the diagram below. After entering an XR session, developers can run or stop the session at any time; the session state is independent of the engine's own `run` and `pause`.
+
+```mermaid
+flowchart TD
+ A[None] -- request --> B[Initializing]
+ B --> C[Initialized]
+ C -- AutoRun --> D[Running]
+ D <--> E[Paused]
+ D -- exit --> F[None]
+ E -- exit --> F
+```
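+
+A minimal sketch of reacting to session state changes, assuming `XRSessionState` is exported alongside the other XR types:
+
+```typescript
+const { sessionManager } = xrManager;
+// React to session state changes
+sessionManager.addStateChangedListener((state: XRSessionState) => {
+  if (state === XRSessionState.Running) {
+    // Tracking data and XR input are now available
+  } else if (state === XRSessionState.Paused) {
+    // The session was paused, e.g. the user left immersive mode
+  }
+});
+```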