Added post-processing shader support to the engine. Post-processing shaders apply full-screen effects to a scene; examples include full-screen anti-aliasing, vignettes and motion blur, to name but a few. With a post-processing shader the scene is typically rendered to a texture bound to an off-screen framebuffer with appropriate depth and colour attachments; the texture is then applied to a full-screen quad and rendered to the screen. This short clip demonstrates the post-processing shader.
The entity component system (ECS) architecture enables the easy addition of components (behaviours) and systems (which allocate and update components of a specific type in a cache-friendly manner). Components can be attached to scene items to create more advanced scene elements and interactions. The engine supports the pluggable addition of systems and components: both can be defined in a shared library (aka “plugin”), which is then exported for loading into the engine and editor. The editor displays all loaded systems in the Systems tab, and loaded components can be attached to scene elements via the editor’s components panel. As with scene items, the editor reflects on system and component properties so they can be configured either manually (in the editor) or programmatically through script. Like scene elements, both system and component properties are serialised into the scene in a versionable format. Below is a short clip demonstrating the Physics plugin’s Physics System and the RigidBodyObjectComponent exported by the plugin.
The engine and editor support CSG courtesy of the CSG plugin. This short clip demonstrates using the “Union” operation to combine all static meshes in a scene, from which a navigation mesh is then generated. The CSG plugin is actually two plugins: one that plugs into the editor, adding the Union, Intersection and Difference tools, and its engine counterpart, which adds the implementation of the CSG operations.
A short clip demonstrating the engine’s support for terrain generated procedurally at runtime from coherent noise. The terrain is generated as an n × n grid of tiles, each a user-configurable number of world units in extent. The terrain is near-infinite, with tiles generated (and discarded) in parallel at runtime, utilising all available machine cores as the viewpoint moves through the terrain/noise.
A short clip demonstrating scripting support. The engine can be scripted (via Lua) to dynamically create, modify and destroy scenes and scene elements in response to user events (e.g. key presses, the active camera entering a proximity sensor, etc.).
A short clip demonstrating support for duplication, undo and redo.