particle system improvements and features #1049
Replies: 16 comments
-
I like the idea of controlling the properties directly, but we should add this after we have everything else finalized.
-
I agree. Otherwise, I think it's a great idea.
-
The description also illustrates another point I made about changing the emitter angle and rotation range properties into vectors. The proposed UI for it is also very simple (approx. an hour to implement all of this?). I ask that you not be so quick to dismiss it as a "nice to have but not a priority". First, a UI will often lead to API design decisions, not the other way around. (Eat your own dog food to discover which parts need work. Do this early and often.) Second, presentation is just as important as the rest of the particle manager. Why have a particle effect example in the first place? Eye candy. Demonstration of capabilities. Third, the process will be identical to maintaining the melonJS engine and melonJS examples; every change to the engine requires changes to the examples, and they are maintained in parallel. So we just maintain the particle manager and its example (the editor) in parallel.
-
I didn't mean to dismiss the idea at all; I am completely on your side with having an awesome particle effect editor interface to demonstrate the capabilities of this awesome engine. I just think it will save us a lot of time if we do some of the other changes first, as they will heavily influence which options the UI has to provide and how they will work. Specifically, changing from the current model to a particle manager which contains multiple emitters is something I think should be done first, as it will have a big impact on how particles work. Also, I am sure that we will need some iterations to make the UI feel right after we have a first prototype, so an hour is IMHO a very optimistic guess for the whole working and shining interface ;) And last but not least, I think it is a good idea to do the fallback interface first (sliders, input fields, check-boxes), because you can then compare whether everything is working as you expect, which in turn makes it easier to develop an experimental interface.
-
The hour estimate was for the widget alone... the one that I mocked up in GIMP (image above). 😉 Adding form controls and styling them will be the brunt of the work, and I share no optimism for the pace at which that can be developed. Also, manual inputs will always be necessary (last point in my first post). The widget provides visualization and natural manipulation, but a number field is the only way to provide exact values (sliders can be inaccurate, and are just a different kind of visualization/manipulation widget). But again, I recommend doing the work in parallel; time will be saved overall by adding multiple emitters and multiple emitter editors to the UI simultaneously. If you procrastinate on the UI, you'll just spend more time manipulating properties in code. The tradeoff may look like a UI this early brings minimal advantage for the added effort, but in the long run it will save a great deal of time for testing and iterating on the internal moving parts. In other words, you will hit diminishing returns far sooner without a decent UI.
-
Well, you are right. With that said, I would like to start implementing the editor then. I will first add some input fields to the example and then open a pull request so you can have a look at it and we can continue the discussion there.
-
You make a good point, and just to be clear: I don't advocate changing the APIs just to match a fancy UI. I think of the process more as an exercise in solving a problem. The problem in this case was having multiple numbers represent range properties for the emitter. The numbers are not really connected to one another except through naming conventions, so it's hard to treat them as anything but individual numbers. When represented as vectors, the problem of describing the range properties is resolved; a vector tightly ties coordinates together. But it's not a simple concept to understand without seeing it in action. That's what led to the widget design and mockup. There you have it: the API designed the UI, not the other way around.
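A minimal sketch of the contrast in plain JavaScript; the property names (`angleMin`/`angleMax`, `angle`) and the `sample` helper are illustrative, not existing melonJS API:

```javascript
// Before: two loosely coupled numbers, tied together only by a
// naming convention ("angleMin"/"angleMax").
var before = { angleMin: 0, angleMax: Math.PI / 4 };

// After: one vector-like value holds the whole range, so the pair
// can never drift apart and can be manipulated by a single widget.
var after = { angle: { min: 0, max: Math.PI / 4 } };

// One generic helper can now sample any range property uniformly.
function sample(range) {
    return range.min + Math.random() * (range.max - range.min);
}
```

The same helper then works unchanged for every range-valued property (angle, rotation, speed, life), which is exactly what the paired min/max numbers make awkward.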
-
Some new ideas I had while working on the editor today:
-
Hi @insidiator, I would like to help implement this item:
It would also be interesting to use a texture atlas to load the images. Any idea how to pass a texture atlas to the particles instead of a single image? Thanks!
-
Hi @ciangames, in order to keep the draw method small, it would probably be best to create different particle classes for the current form and for particles based on renderables. We could also add additional classes for primitive particles based on shape objects or text. Then we can just pass any object that we support into the emitter. If you want to use all the images inside the texture atlas, you would probably iterate over the atlas and call
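A rough sketch of that class split in plain JavaScript; the class names and the shared `update` are illustrative, not existing melonJS code:

```javascript
// Shared behaviour (ageing via update) lives in one base class.
function Particle() {
    this.age = 0;
}
Particle.prototype.update = function (dt) {
    this.age += dt;
};

// Current form: particles drawn from a plain image.
function ImageParticle(image) {
    Particle.call(this);
    this.image = image;
}
ImageParticle.prototype = Object.create(Particle.prototype);
ImageParticle.prototype.draw = function (ctx) {
    // blit this.image at the particle position
};

// New form: particles that delegate drawing to any renderable,
// e.g. a sprite created from a texture atlas region.
function RenderableParticle(renderable) {
    Particle.call(this);
    this.renderable = renderable;
}
RenderableParticle.prototype = Object.create(Particle.prototype);
RenderableParticle.prototype.draw = function (ctx) {
    // delegate drawing to this.renderable
};
```

Only `draw` differs per subclass; the emitter's update loop never needs to know which kind of particle it is holding.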
-
Hi @insidiator Currently the emitter accepts an image parameter for a single image: me.loader.getImage("particle"); To use a texture atlas, we would need something like: game.createSpriteFromName("particle"); We would need to create new classes such as:
Is it something like this? What do you guys think? Thanks!
-
I think the emitter should accept either a single image like now, or an array of images. The required particle class should then be decided inside addParticles when an image is selected. The new classes can be based on the same parent class, because the update function should be the same for all particles.
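A sketch of that dispatch; both helper names and the class-choice rule are hypothetical, standing in for logic that would live inside addParticles:

```javascript
// The emitter property may be one image or an array; normalize it
// and pick one source per particle.
function pickImage(imageOrArray) {
    var images = Array.isArray(imageOrArray) ? imageOrArray : [imageOrArray];
    return images[Math.floor(Math.random() * images.length)];
}

// Choose the particle class per selected source: anything with a
// draw() is treated as a renderable, a plain image gets the
// lightweight image particle.
function classFor(source) {
    return (source && typeof source.draw === "function")
        ? "RenderableParticle"
        : "ImageParticle";
}
```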
-
From my point of view, the emitter should only accept a renderable object. Sure, there will be a small overhead compared to just an image, but even for a single image it is useful, if only because it then becomes possible to get a single sprite out of a packed texture.
-
The fact is that we need to optimize the renderable process anyway, just like we need to optimize the container. There's no point in writing custom code for the sake of speed if it is not going to help other parts of the engine as well. So I agree with @obiot - particles should be any class that extends
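A sketch of that contract; the duck-typed check below is an illustrative stand-in for "any class that extends the renderable base class", and the function name is hypothetical:

```javascript
// The emitter only relies on the particle source providing the two
// methods every renderable has: update() and draw().
function assertRenderable(obj) {
    if (!obj || typeof obj.update !== "function" || typeof obj.draw !== "function") {
        throw new TypeError("particle source must provide update() and draw()");
    }
    return obj;
}
```

Any sprite, shape, or text object satisfying that contract could then be emitted as a particle without the emitter special-casing each type.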
-
Some improvements might land in 1.1.0 (see #531).
-
If it is okay with you, I would like to use this ticket for discussing which features and improvements to the particle system could and should be implemented.
The implementation of a feature should then be discussed in a separate pull request.
So here are some ideas I gathered so far. They are in no particular order.
- **particle effect editor**: The example could be extended to allow the creation of custom effects, while the current effects are just presets. An export button would then allow the user to copy and paste the necessary source code into his game.
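The export idea could be sketched like this; the helper name and the `me.ParticleEmitter` constructor signature shown are assumptions for illustration:

```javascript
// Hypothetical export helper: serialize the editor's current settings
// into a snippet the user can paste into their own game code.
function exportEmitter(settings) {
    return "var emitter = new me.ParticleEmitter(0, 0, " +
        JSON.stringify(settings, null, 4) + ");";
}
```

The editor's presets would then just be stored settings objects, and "export" becomes a single string the user copies out.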