Demo Scene Builder #2412
base: master
Conversation
Looks good to me. If everything is done and you're ready, then I can merge this in the next few days. The only thing I'd suggest adding is keyTriggers or buttons for toggling things like bloom, shadows, SSAO, etc. But if you don't get around to it before I merge it, then I can do that when I eventually work with TestSceneBuilder in the near future. Once this is merged, I'm going to work on finding some free, high-quality PBR models and extend your TestSceneBuilder class to create a demo/example in jme-examples that displays a polished PBR scene with the ability to toggle all of the post processing and walk or fly around inspecting everything. That way jME can have a solid demo of the engine's graphical capabilities to showcase to potential new users.
scene.baseScene();
scene.brightMountainsSun();
scene.hardwareProbe();
scene.brightMountainsSky();
I've noticed that the order of scene filters can have a considerable impact, with FXAAFilter frequently recommended as the final filter. What criteria determined the current insertion order? Was this based on specific examples or an arbitrary choice?
I chose no particular order; this is meant to test that all the different features work, more than anything else. What do you suggest the order should be?
This is a common starting point, but feel free to experiment!

- DirectionalLightShadowFilter (or other shadow filters): Generally, you want to calculate shadows early in the process, as they affect the base lighting of the scene.
- SSAOFilter: Adds subtle shadowing in crevices and occluded areas, enhancing the sense of depth and grounding objects. It typically operates on the depth and normal buffers, so it makes sense to apply it relatively early.
- DepthOfFieldFilter: This filter blurs objects based on their distance from the camera's focal point. It needs depth information, so it comes after the initial rendering, and potentially after SSAO if you want the AO to be blurred as well.
- ToneMapFilter: Tone mapping adjusts the high dynamic range (HDR) lighting of your scene to fit within the low dynamic range (LDR) of your screen. It's often applied before color correction and other stylistic effects.
- ColorOverlayFilter / FadeFilter / FogFilter / PosterizationFilter / CrossHatchFilter / CartoonEdgeFilter: These are more stylistic filters that directly manipulate the color and appearance of the scene. Their order relative to each other depends on the specific look you want to achieve. For instance:
  - Applying a ColorOverlayFilter before CartoonEdgeFilter will result in the edges being drawn on the tinted image.
  - Applying FogFilter earlier will make other effects appear to be seen through the fog.
- BloomFilter: Bloom adds a glow effect around bright areas. It usually works best on a tone-mapped image to correctly identify the bright regions.
- RadialBlurFilter: This filter blurs the image outwards from a center point. Its placement depends on whether you want the blur to affect the already post-processed image.
- FXAAFilter (or other anti-aliasing filters): Anti-aliasing smooths out jagged edges. It's generally applied towards the end of the pipeline to operate on the final rendered image.
- TranslucentBucketFilter: This filter likely handles the rendering of translucent objects. Its placement might depend on how you want translucency to interact with other post-processing effects. Experiment to see what looks best for your specific use case.
- ComposeFilter: This filter is often used to combine the results of multiple render targets or passes. Its position will depend entirely on what you are composing.
- LightScatteringFilter (god rays): These are often applied after the main rendering, and potentially after tone mapping, to interact realistically with bright light sources.
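A minimal sketch of that ordering with jME's stock filters, assuming a SimpleApplication context where `assetManager`, `viewPort`, and a DirectionalLight `sun` are already in scope (shadow map size and split count are placeholder values, not recommendations from this PR):

```java
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.BloomFilter;
import com.jme3.post.filters.FXAAFilter;
import com.jme3.post.filters.ToneMapFilter;
import com.jme3.post.ssao.SSAOFilter;
import com.jme3.shadow.DirectionalLightShadowFilter;

// Filters are applied in the order they are added to the FilterPostProcessor.
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);

// Shadows first: they affect the base lighting of the scene.
DirectionalLightShadowFilter shadows =
        new DirectionalLightShadowFilter(assetManager, 2048, 3);
shadows.setLight(sun);
fpp.addFilter(shadows);

fpp.addFilter(new SSAOFilter());     // early: works on depth/normal buffers
fpp.addFilter(new ToneMapFilter());  // HDR -> LDR before stylistic effects
fpp.addFilter(new BloomFilter(BloomFilter.GlowMode.Scene)); // after tone mapping
fpp.addFilter(new FXAAFilter());     // anti-aliasing last, on the final image

viewPort.addProcessor(fpp);
```

Reordering the `addFilter` calls is all it takes to experiment with a different pipeline.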
Yes, I can't say that I know the best order myself because I don't have experience using all of the filters in jme, but order definitely is important.
I think in some cases, certain filters need to be first/last (for example, I know that TranslucentBucketFilter needs to be one of the last filters added if you want translucency to work with jme's particle emitters and WaterProcessor).
But in other cases (and depending what combination of filters is being used) the order has some room for variation.
I deliberately chose not to do this because which graphical elements are included should be up to whoever is writing the tests using TestSceneBuilder, not whoever is viewing them. One thing I am still concerned about is that the textures look fairly blurry. I think it has to do with mipmapping relative to the scene scale, but I don't know for sure.
Good idea. Let me know if you need any help with that.
/**
 * Simple first person physical character control.
 */
public class FirstPersonCharacter extends BetterCharacterControl implements AnalogListener {
Decouple the FirstPersonCharacter and BetterCharacterControl logic. The current composition leads to an undesirable state where BetterCharacterControl.setEnabled(false) disables physics while AnalogListener continues to process input. Furthermore, for basic movement and jumping, an ActionListener might offer a more intuitive approach than the current AnalogListener.
leads to an undesirable state where BetterCharacterControl.setEnabled(false) disables physics while AnalogListener continues to process input.
This isn't so bad. If the physics control is disabled, setWalkDirection and jump will have no effect.
Decouple the FirstPersonCharacter and BetterCharacterControl logic.
It's impossible to decouple FirstPersonCharacter from BetterCharacterControl because FPC uses functionality provided by BCC. If I don't have FPC inherit from BCC as you're suggesting, I'd have to do some gymnastics to go get the BCC anyway... and there's a chance it won't work. I think the current approach is better because it packages everything up neatly into one control.
While I'm not a big fan of this monolithic approach, as similar solutions tend to be complex to maintain and evolve over time, I agree with the idea of creating tests with high-resolution graphical details. If this architecture is already established, I suggest at least making substantial modifications to the FirstPersonCharacter class to make it more robust. Here is an example (I haven't had a chance to test it yet):

public void character(float radius, float height, float mass, Consumer<FirstPersonCharacter> config) {
Node characterNode = new Node("TestScene_FirstPersonCharacter");
sceneNode.attachChild(characterNode);
BetterCharacterControl bcc = new BetterCharacterControl(radius, height, mass);
characterNode.addControl(bcc);
FirstPersonCharacter fpc = new FirstPersonCharacter(app.getCamera(), app.getInputManager());
if (config != null) {
config.accept(fpc);
}
characterNode.addControl(fpc);
    getOrCreatePhysics().add(bcc); // add the physics control, not the input control
}

public class FirstPersonCharacter extends AbstractControl implements ActionListener {
private static final String FORWARD = "TestScene_Character_Forward";
private static final String BACKWARD = "TestScene_Character_Backward";
private static final String LEFT = "TestScene_Character_Left";
private static final String RIGHT = "TestScene_Character_Right";
private static final String JUMP = "TestScene_Character_Jump";
private Trigger forwardTrigger = new KeyTrigger(KeyInput.KEY_W);
private Trigger backwardTrigger = new KeyTrigger(KeyInput.KEY_S);
private Trigger leftTrigger = new KeyTrigger(KeyInput.KEY_A);
private Trigger rightTrigger = new KeyTrigger(KeyInput.KEY_D);
private Trigger jumpTrigger = new KeyTrigger(KeyInput.KEY_SPACE);
private final Camera cam;
private InputManager inputManager;
private BetterCharacterControl bcc;
private final Vector3f camDir = new Vector3f();
private final Vector3f camLeft = new Vector3f();
private final Vector3f walkDirection = new Vector3f();
private float yOffset = 1.5f;
private float walkSpeed = 10f;
private float strafeSpeed = 7f;
private float jumpDelay = 0f;
private boolean moveForward, moveBackward, leftStrafe, rightStrafe;
public FirstPersonCharacter(Camera cam) {
this.cam = cam;
}
public FirstPersonCharacter(Camera cam, InputManager inputManager) {
this.cam = cam;
registerWithInput(inputManager);
}
@Override
public void setSpatial(Spatial spatial) {
super.setSpatial(spatial);
if (spatial != null) {
bcc = spatial.getControl(BetterCharacterControl.class);
Objects.requireNonNull(bcc, "BetterCharacterControl not found: " + spatial);
}
}
@Override
protected void controlUpdate(float tpf) {
jumpDelay -= tpf;
cam.getDirection(camDir).setY(0);
cam.getLeft(camLeft).setY(0);
walkDirection.set(Vector3f.ZERO);
if (moveForward) {
walkDirection.addLocal(camDir);
} else if (moveBackward) {
walkDirection.addLocal(camDir.negateLocal());
}
if (leftStrafe) {
walkDirection.addLocal(camLeft);
} else if (rightStrafe) {
walkDirection.addLocal(camLeft.negateLocal());
}
walkDirection.normalizeLocal().multLocal(walkSpeed);
bcc.setWalkDirection(walkDirection); // Walk
cam.setLocation(spatial.getWorldTranslation().add(0f, yOffset, 0f));
}
@Override
protected void controlRender(RenderManager rm, ViewPort vp) {
}
@Override
public void onAction(String name, boolean isPressed, float tpf) {
if (!isEnabled()) {
return;
}
if (name.equals(FORWARD)) {
moveForward = isPressed;
} else if (name.equals(BACKWARD)) {
moveBackward = isPressed;
} else if (name.equals(LEFT)) {
leftStrafe = isPressed;
} else if (name.equals(RIGHT)) {
rightStrafe = isPressed;
} else if (name.equals(JUMP) && isPressed) {
jump();
}
}
private void jump() {
if (jumpDelay <= 0f && bcc.isOnGround()) {
bcc.jump();
jumpDelay = 0.05f;
}
}
public final void registerWithInput(InputManager inputManager) {
this.inputManager = inputManager;
inputManager.addMapping(FORWARD, forwardTrigger);
inputManager.addMapping(BACKWARD, backwardTrigger);
inputManager.addMapping(LEFT, leftTrigger);
inputManager.addMapping(RIGHT, rightTrigger);
inputManager.addMapping(JUMP, jumpTrigger);
inputManager.addListener(this, FORWARD, BACKWARD, LEFT, RIGHT, JUMP);
}
public final void unregisterInput() {
if (inputManager != null) {
inputManager.deleteMapping(FORWARD);
inputManager.deleteMapping(BACKWARD);
inputManager.deleteMapping(LEFT);
inputManager.deleteMapping(RIGHT);
inputManager.deleteMapping(JUMP);
inputManager.removeListener(this);
}
}
public float getyOffset() {
return yOffset;
}
public void setyOffset(float yOffset) {
this.yOffset = yOffset;
}
// ... getters/setters
}
TestSceneBuilder is monolithic on purpose. The idea is to make it really easy to create scenes quickly, and this does that very well. It is still fairly well organized; each feature is kept within its dedicated methods (with only a couple exceptions). Keep in mind that TestSceneBuilder isn't meant to be extensible or particularly robust. If you need something a little different than what is offered, it is not that difficult to just add it manually. In fact, if a test were showcasing the BloomFilter, for example, I would expect the test to explicitly create the BloomFilter and FilterPostProcessor setup on its own, and not use TestSceneBuilder's
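As an illustration of that manual setup, a bloom-focused test might do something roughly like this (a sketch, not code from this PR; it assumes `assetManager` and `viewPort` from a SimpleApplication, and the tuning values are placeholders):

```java
import com.jme3.post.FilterPostProcessor;
import com.jme3.post.filters.BloomFilter;

// Explicit BloomFilter setup, independent of TestSceneBuilder.
FilterPostProcessor fpp = new FilterPostProcessor(assetManager);

// GlowMode.Objects makes only materials with a glow color/map bloom;
// GlowMode.Scene would bloom every bright area of the rendered image.
BloomFilter bloom = new BloomFilter(BloomFilter.GlowMode.Objects);
bloom.setBloomIntensity(2f);  // placeholder tuning value
bloom.setExposurePower(5f);   // placeholder tuning value
fpp.addFilter(bloom);

viewPort.addProcessor(fpp);
```

This keeps the test self-documenting: everything the BloomFilter showcase needs is right there, with no builder indirection.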
My thought was to allow the person writing the test to configure each filter and determine which are enabled by default, and then give the user running the test the option to enable/disable the filters that are being used. But I think you're right, it's probably alright to leave that out of the TestSceneBuilder class, and instead leave that up to the person writing each test/example/demo that extends TestSceneBuilder. This also leads me to ask: is that how you would intend someone to use TestSceneBuilder for writing a demo? My plan was to create a new class that extends TestSceneBuilder, and just swap out the model that gets loaded and pick which filters/sky/etc get enabled. Then maybe add some extra bells and whistles (like a few particle effects and point lights) to enhance the demo further. Does that sound like how you intended it to be used?
With that in mind, I think it's also important to make sure that the intention and scope of this class is well documented. I think it is important to view this as a demo/showcase rather than a test, since the goal here is not to test or showcase one single feature or aspect of the engine; instead, the goal is to combine a bunch of features to make the scene as visually impressive as possible and easy to inspect. So I'm also thinking that we should consider renaming "TestSceneBuilder" to "DemoSceneBuilder" (or something similar), since this is not intended for creating targeted engine tests, but is instead intended to be used to demo high quality models and scenes with post processing, without having to rewrite as much of the same boilerplate code to set up post processing and filters every time a new demo is made.
No, sorry for being unclear about that. It's not intended to be extended, but to act more as a set of factory methods for building scenes. You'd declare a new TestSceneBuilder in

TestSceneBuilder scene = new TestSceneBuilder(application);
scene.configure();
scene.baseScene();
// ... call other factory methods as desired ...

If you want to swap out the base scene with something else, you'd have to load the other scene yourself (and not call
I like this idea. DemoSceneBuilder is a much clearer name than TestSceneBuilder.
The final feedback I'd have is to add a few more of jme's filters, such as SSAO, LightScattering, DepthOfField, Fog, Water, TranslucentBucketFilter, and whichever others I may be missing. Then I'd eventually like to have one demo that uses all of these filters at the same time, to serve not only as a graphical demo but also as an example showing how to properly use jme's stock filters together. A demo like that should also help find certain bugs that only occur when using multiple filters/effects/features in one complex app. Those issues typically don't appear in jme's current tests/examples, which are mostly designed to test one thing at a time on a small scale, so having some more complex demos that use multiple features and effects at the same time should help find some bugs that otherwise go unnoticed.
@codex128 I'm also curious to hear your input on another idea I just had (related to my previous suggestion of adding an interface to allow the user to toggle things at runtime while inspecting a demo scene). Instead of coding that into DemoSceneBuilder, what if I instead create another class titled SceneInspectorInterface that could optionally be used by any demo (or any jme app)? This inspector interface would allow the user to inspect each texture layer of the scene, turn a list of filters on/off, and could also allow the user to switch between the different skies and light probes at runtime, similar to how some of the online stores and 3D glTF model viewers allow for full inspection of a scene/model.
Good idea! I think at this point, either a new subproject or repo should be created to exclusively put demos in. It's been long overdue, and I suspect a GUI (such as Lemur) will be extremely helpful for such an interface. I also appreciate your offer to work on it. I've admittedly taken on a bit much lately, so I'm a little swamped right now.
This adds a class for building quality scenes for testing purposes. The emphasis is both on graphical quality and minimal setup. So far it supports these features: