The Manual

1.Getting started

Welcome to the documentation for the SDK and editor. The best way to learn about the SDK is to browse the demo apps and source code shown on editor startup.

1.1.Installation

Download the SDK & editor from: https://benmorris.itch.io/plugin-based-scene-editor

Linux (Tested on Ubuntu 18.04 & CentOS 8):

  • Either:
    • Right click the .AppImage and set the executable check box or
    • chmod u+x Firefly*.AppImage from the command line in the directory containing the downloaded .AppImage
  • Double click to launch

Windows:

  • Double click the downloaded .msi to install

macOS:

  • Double click the fireflyeditor.dmg
  • Drag the fireflyeditor.app into the Applications folder (linked to in the installer)
  • Launch the fireflyeditor.app from your Applications folder

Android:

Supported architectures: armeabi-v7a, x86, arm64-v8a and x86_64

Supported OpenGL ES version: 3.2

  • Install the Linux version (see above)
  • Install Android Studio and NDK r18
  • (In the editor from v0.41.0+) File -> Publish will publish both a desktop and android-studio project

2.Key Concepts

The engine is ECS (Entity Component System) based and comes bundled with its own custom event system that allows you to connect signals (emitted by senders) to slots (on receivers).

NB. This engine does NOT use any GPL or LGPL libraries internally and so games produced by it have no licensing requirements. The engine comes with its own (pluggable) custom modern OpenGL-based back-end renderer, ECS implementation and signal & slot framework.

A brief description of the key concepts:

  • System (AbstractSystem) – responsible for updating all components of a specific type in a cache-friendly manner
  • Component (AbstractComponent) – is a collection of functions and data that can be attached to a SceneItem and is processed by an associated System
  • SceneItem – is this engine’s “entity” to which Components are attached
  • Signal (AbstractSignal) – an event emitted by a SceneItem or Component
  • Slot (AbstractSlot) – a function on a SceneItem or Component
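The signal & slot mechanism above can be sketched in a few lines. This is a minimal, self-contained illustration of the idea only — it is not the SDK's actual AbstractSignal / AbstractSlot API (see the bundled samples for that):

```cpp
#include <functional>
#include <utility>
#include <vector>

// A minimal signal: stores connected slots and invokes them on Emit().
// Illustrative only -- the SDK's AbstractSignal / AbstractSlot differ.
template <typename... Args>
class Signal
{
public:
    using Slot = std::function<void(Args...)>;

    void Connect(Slot slot) { m_slots.push_back(std::move(slot)); }

    void Emit(Args... args)
    {
        for (auto& slot : m_slots)
            slot(args...);
    }

private:
    std::vector<Slot> m_slots;
};
```

A sender exposes a Signal member, receivers register functions via Connect, and calling Emit notifies every connected receiver.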

For more details see the code samples and demos bundled with the SDK (displayed in the editor on startup).

3.Tutorials

3.1.Code Samples

The editor ships with the following code samples (runnable demos with accompanying code accessible from the editor’s “Start Here -> Templates & Samples Browser” menu).

A brief summary of the samples:

  • example_Model – model loading
  • example_PrefabricatedScene – how to instance a “prefabricated” scene (i.e. a scene created in the editor with animations, models, shaders, scripts, textures, etc.) into a containing scene
  • example_Collisions – collision detection and response
  • example_Collisions_Triggers – collision trigger volumes and collision scripts
  • example_Physics – rigid body physics
  • example_EmptyGame – a simple empty game skeleton
  • example_Animation – key frame based property animation support

3.2.Walkthrough: Creating a game project

Prerequisites:

Typical Workflow:

Creating a game is a three-step process:

  1. Open the editor and create a basic scene (or select one of the template scenes from the Start Here -> Templates & Samples Browser editor menu)
  2. File -> Publish editor menu to publish the scene as a code project and then run CMake on the generated project as per the instructions displayed on generation
    • Windows: when configuring in CMake be sure to set the target architecture to x64 – see image below. NB. (In your IDE) build the code project under either the Release or RelWithDebInfo configuration as the SDK only provides the engine Release binaries.
    • macOS: to fix signing errors use an appropriate code signing identity and profile and set Other Code Signing Flags to --deep (see image below)
  3. Add code and / or editor generated scenes to the code project (code examples are supplied with the editor under the Start Here-> Templates & Samples Browser editor menu)

3.3.Walkthrough: Creating a System / Component

(Available in Editor/SDK version 0.39.0+)

SimulationStarterKit is a plugin-based Entity Component System (ECS) architecture. ECS allows you to add multiple (and varying) behaviours (components) at runtime to entities (an entity being a game object, or “SceneItem” in this SDK’s parlance). All components of a specific type are processed by an associated system in a cache-friendly manner for performance.
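The cache-friendly update described above can be sketched as follows: a system owns a contiguous array of components of one type and updates them in a single linear pass. The types here are invented for the sketch — the SDK's AbstractSystem / AbstractComponent interfaces differ:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical component: just the data its system updates each frame.
struct SpinComponent
{
    float angle = 0.0f;
    float speed = 1.0f;
};

// The system stores components contiguously and updates them in one
// linear pass -- good cache locality, hence "cache-friendly".
class SpinSystem
{
public:
    std::size_t Add(float speed)
    {
        m_components.push_back({0.0f, speed});
        return m_components.size() - 1;
    }

    void Update(float dt)
    {
        for (auto& c : m_components)
            c.angle += c.speed * dt;
    }

    const SpinComponent& Get(std::size_t i) const { return m_components[i]; }

private:
    std::vector<SpinComponent> m_components; // contiguous storage
};
```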

There are two ways to extend the engine with custom systems and components:

  • As plugins as per the video below
  • Explicitly in code as per the example_Components code sample (accessible from the editor’s “Start here -> Templates & Samples Browser” menu)

In this walkthrough we will extend the engine adding a new Component and System plugin using an editor supplied code wizard.

Steps For Creating a System / Component Plugin

IMPORTANT – PLUGIN PATHS

To ensure your plugins are loaded correctly you need to provide the engine with your plugin paths.

You can do this in one of two ways:

  1. In the editor -> Tools -> Settings -> Plugin Paths or
  2. Programmatically as follows:
SDK::SceneSettings::PathList paths;
paths.push_back("my/first/path");
paths.push_back("my/second/path");

SDK::SceneSettings settings;
settings.PluginPaths = paths;
SDK::GetInstance().SetSceneSettings(settings);

Troubleshooting plugins

If your plugin contains bugs that crash the editor, and you have already set the editor’s plugin paths and so can’t open it to unset them, launch the editor with the safemode command line argument and then unset them inside the (now loaded) editor.

With this in mind watch the videos below for platform specific instructions.

Linux:

macOS:

Open the editor

Click the Menu -> Tools -> Development -> New SceneItem / System / Component plugin menu item

Set the:

  • Plugin name to MyPlugin
  • Initial Scene Item name to MySceneItem
  • Path you want to generate code in (should be a writable location)

Click the Generate button

In the wizard’s output pane you should see some output like:

Successfully generated plugin MyPlugin6. Your initial scene item’s uuid is {92e6af43-6776-45a0-9e63-159463e3005e}.

Next, open CMakeGUI and set your source path to:

/Users/benmorris/Dev/Test/MyPlugin6

Launch the CMake GUI (see below) and set the path to your Plugin’s source code as indicated by the wizard output and a directory to build your plugin in.

Hit the Configure button

Fix the error (i.e. CMake can’t locate the SDK) by setting the firefly_DIR variable to the path of the SDK’s CMake package config file (inside the fireflyeditor.app bundle, under the Contents/Resources/CMake directory – see the above image for a typical path)

Hit the Configure button

Hit the Generate button

Hit the Open Project button to launch your project in Xcode (assuming it’s installed)

Hit the build button in Xcode

NB. The generated Xcode project contains a post-build step that will copy your plugin to the editor’s ‘Frameworks’ folder and so is immediately available in the engine / editor.

Relaunch the editor

Go to the System tab and confirm you have a system called MySystem. Also confirm that your component (named MyComponent) is available for adding to game objects by clicking the Component drop-down in the editor’s right-hand pane (beneath the properties pane).

Congratulations, you’ve created a plugin and it is now accessible within the editor / engine, can be attached to scene items and will be updated each frame by your MySystem system.

NB. To debug your plugin in Xcode, set the launching executable for your plugin to the fireflyeditor.app in your /Applications folder (or your install location), add your breakpoints, then build / run as usual.

Next, we will add a property to our component and modify the system to do something with that property.

 

4.Troubleshooting

See the sub-sections below.

4.1.Android


Publishing to Android

In order to build the published game for Android you need:

  • Android Studio
  • Android NDK 18
  • Set the ANDROID_NDK_HOME environment variable to point to your NDK installation. Typically: install Android Studio, then download the NDK from within Android Studio (Tools -> SDK Manager menu); the SDK path is shown there, and the NDK will be beneath it in your file system (open File Explorer to confirm). Paste the NDK location into an ANDROID_NDK_HOME environment variable and restart Android Studio
  • Also, check that the local.properties file in your Android project doesn’t have any invalid paths in it

4.2.Windows

4.3.macOS

Code-signing

Here are the code signing settings I use during development:

4.4.Other

Plugins

My ECS plugins aren’t being loaded

Ensure your plugin paths have been specified either in the editor or programmatically in your game code. For more details see the plugins tutorial.

5.The Editor

5.1.The toolbar

The toolbar is loaded by the editor on startup by scanning the editor directory for plugins that expose specific entry points providing access to tool icons and behaviours, and so is fully customisable through plugins. By default the editor ships with an interaction plugin that provides pick, move, rotate and add tools.

From left to right (beneath the File / Edit / etc menu):

  • View – displays a drop-down of viewpoints (i.e. cameras) currently added to the scene. You can change the currently active camera here. The active viewpoint can also be modified in script and through an action connection described later on in this page.
  • Run! – runs the scene in a standalone executable for testing the scene as it might appear as a (published) standalone simulation. This also enables you to have both the editor and the running scene open simultaneously on a multi-monitor display. The standalone executable’s scene can be reloaded by pressing the F5 key.
  • Pick tool – activates picking. Holding down CTRL whilst picking allows you to pick multiple objects
  • Move tool – activate to move picked objects
  • Rotate tool – activate to rotate picked objects
  • Add tool – activate to add objects to the scene
  • Tool options – this button with an ellipsis to the right of the Add tool can be clicked to open an application modal dialog that displays the options associated with the active tool. The options for a given tool are provided by that tool’s plugin.

NB. These tools are provided through the Interaction plugin. You can add your own icons to this toolbar as described in a walkthrough on this page.

The Scene Tree View

The scene tree (below) represents the scene hierarchy. This is purely a data representation; in reality, scene items are rendered in batches by the engine to reduce state switches, so rendering isn’t necessarily executed in the order seen in the tree.

5.2.Adding objects

  1. Select the Add object tool
  2. Choose Box in the tool options dialog (you can open this dialog by clicking the tool options button on the main toolbar as described above)
  3. Click on the scene (or an object within the scene) to add boxes under the clicked position. If no mesh is found under the mouse click then the newly added object will be created at the scene origin
  4. Verify the boxes have been placed in the positions as expected
  5. Perform undo (CTRL + Z) to remove all added boxes
  6. Perform redo (CTRL + Y) to re-add all boxes
  7. Save scene (File -> Save menu)
  8. New scene (File -> New menu)
  9. Load the scene saved in step 7 (File -> Open menu)
  10. With the scene now loaded hit the “Run!” toolbar button and verify your scene loads in a separate window

5.3.Importing models

(Available in version 0.40.11+)

Scenes can be imported from a wide variety of formats. Once imported scenes can be saved and then instanced in containing scenes. This process is demonstrated in the video below:

5.4.Selecting objects

  1. Activate the picking tool in the toolbar
  2. Left mouse click on an object to select it
  3. To select multiple objects, hold down the Command key (macOS) or CTRL key (Windows) whilst clicking

5.5.Duplicating objects

  1. Load a previously generated scene
  2. Select a scene object using the picking tool
  3. Activate the Move tool
  4. Hit CTRL + D to duplicate the scene object
  5. Move the duplicated scene object with the move widget

5.6.Moving, rotating and scaling objects

  1. Load a previously generated scene
  2. Select a single object in the scene
  3. Activate the Move tool
  4. Move the picked object
  5. Undo the object move (CTRL + Z)
  6. Redo the object move (CTRL + Y)
  7. Activate the pick tool
  8. Select multiple scene objects
  9. Activate the Move tool
  10. Move the picked objects
  11. Perform undo and redo and verify the movement is un-done / re-done as expected

5.7.Snapping movement to a grid


(Available in version 0.40.12+)

Movement can be snapped to a grid. Multiple snap settings can be defined and activated directly within the editor’s main window as demonstrated in the video below:
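Conceptually, snapping rounds each coordinate to the nearest multiple of the grid spacing. A sketch of the usual formula (the editor's actual implementation is not shown in this manual and may differ):

```cpp
#include <cmath>

// Snap a coordinate to the nearest multiple of the grid spacing.
double SnapToGrid(double value, double gridSpacing)
{
    return std::round(value / gridSpacing) * gridSpacing;
}
```

For example, with a grid spacing of 0.5, a dragged coordinate of 2.3 lands on 2.5.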

5.8.Previewing a scene

  1. Select the Play button in the editor toolbar
  2. Verify your scene opens in a separate window
  3. (With the preview window still open) In the editor window activate the add object tool
  4. Place another object in the scene by clicking in the editor’s scene view
  5. Save the scene File -> Save menu
  6. Activate the preview window by mouse clicking in the preview window
  7. Hit F5 to re-load the preview window’s scene
  8. Verify the scene reloaded in the preview window and its appearance reflects the scene in the editor

5.9.Switching viewpoints

You can add multiple viewpoints to a scene. You can specify which viewpoint should be the default (i.e. when the scene is run) by setting the viewpoint’s DefaultCamera property to true.

Types of cameras supported currently are:

  • EditorCamera – this is an implementation of a 3D editor style camera. By default it pans when left mouse button dragging, zooms when using the middle mouse wheel and can have its view direction changed by holding down the SHIFT key whilst left mouse button dragging.
  • FirstPersonCamera – an implementation of a simple first person perspective camera.

To change between viewpoints in the editor:

  1. Beneath the menu bar (i.e. beneath the panel that hosts the main menus File, Edit, Tools etc) there’s a viewpoint drop-down list. Modify the selected camera.
  2. Verify the viewpoint changes
  3. Expand the scene tree until the Editor Camera node is visible
  4. Select the Editor camera and right click choosing the “Make this a child of -> Cameras -> Editor Camera”
  5. Verify the newly added camera appears in the viewpoint drop-down list from step 1.

NB. To allow a user to toggle between different views in your scene you can add an Input capture device (i.e. Keyboard) to the scene and create an Action (as described in “Scripting – Activating a script”) to wire a Keyboard key press event to a Camera’s “Activate” slot. You can assign an action (i.e. key press) to each Camera of interest.

6.Technical Guides

6.1.Debugging

Safe Mode

To disable all scene loading launch the editor with the safemode command line argument.

6.2.Rendering overview

The engine is a hybrid scene graph / ECS architecture.

The rendering algorithm:

  1. Scene graph traversal: A pre-order traversal of the scene graph. For each item in the scene graph:
    1. The scene item’s Prepare() method is called
    2. Typically the Prepare() method enqueues an item onto a render queue (or if the item is a scene manager, an Octree for instance, any further traversal of the Octree’s scene-graph is short-circuited allowing the scene manager to perform frustum / occlusion culling enqueueing only the visible items that exist in the Octree’s sub scene-graph)
  2. System Update: Once the scene graph has been traversed the systems are updated (a system is effectively an array of components of a specific type). A system component can optionally query the visibility of the scene item (to which it is attached) to determine if it should be updated or not.
  3. Rendering: The render queue renders the scene by performing state-based sorting of the render items it contains followed by calling each item’s Render() method

Improvements: The scene graph might be replaced with a rendering system and component at some point in the future, simplifying the above to a two-step process, i.e. system update followed by rendering.
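The state-based sorting in the rendering step can be sketched as follows: render items are ordered by a state key (here shader, then material) so that consecutive items share as much GPU state as possible and state switches are minimised. The types are invented for the sketch, not the engine's actual render-queue classes:

```cpp
#include <algorithm>
#include <string>
#include <vector>

// Illustrative render item, sorted by shader then material so that
// consecutive items share GPU state.
struct RenderItem
{
    int shaderId;
    int materialId;
    std::string name;
};

void SortByState(std::vector<RenderItem>& queue)
{
    std::sort(queue.begin(), queue.end(),
              [](const RenderItem& a, const RenderItem& b)
              {
                  if (a.shaderId != b.shaderId)
                      return a.shaderId < b.shaderId;
                  return a.materialId < b.materialId;
              });
}
```

After sorting, the queue is walked once and each item's Render() method is called; shader and material binds only happen when the key changes.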

6.3.Versioning

The framework is fully versionable, achieved through two mechanisms:

  • A framework version – the version of the core engine itself. Typically, new classes inherit framework classes, adding new fields (data) and methods. If a framework class is modified then the framework version is updated and tested for in the modified serialisation code.
  • A class-specific current version. When you add a new custom class – be it a System, Component or SceneItem deriving from the framework’s AbstractSystem, AbstractComponent or SceneItem – there will come a time when you want to add new fields in a version-friendly manner. To do this, override the GetCurrentVersion method on your class (returning a monotonically increasing value) that you can then test for when reading your data.
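The class-version mechanism can be sketched as follows. Everything here is a hypothetical, self-contained illustration — the SDK's actual serialisation interfaces are not shown in this manual, so the stream format and names are invented:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical versioned component: version 2 added the 'health' field.
struct MyComponentData
{
    static std::uint32_t GetCurrentVersion() { return 2; }

    std::int32_t score  = 0;   // present since version 1
    std::int32_t health = 100; // added in version 2

    void Write(std::vector<std::int32_t>& out) const
    {
        out.push_back(static_cast<std::int32_t>(GetCurrentVersion()));
        out.push_back(score);
        out.push_back(health);
    }

    void Read(const std::vector<std::int32_t>& in)
    {
        std::size_t i = 0;
        const std::int32_t version = in[i++];
        score = in[i++];
        if (version >= 2)   // field only exists in newer files
            health = in[i++];
        else
            health = 100;   // sensible default for version-1 files
    }
};
```

Writing the version first, then testing it when reading, is what lets new builds load data saved by old ones.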

For further implementation details refer to the source code.

7.Geometry Processing

The editor supports an evolving number of convenient geometry processing routines described in more detail in the following sections.

7.1.Boolean Mesh Operations

The editor comes bundled with a CSG plugin that supports the following operations:

  • Union
  • Intersection
  • Difference

These tools can be useful, for example, in combining meshes for generating navigation meshes as follows (also demonstrated in the video below):

  1. Import or add 1 or more meshes to the scene
  2. Activate the selection tool and then Command-click (CTRL-click on Windows) to select multiple meshes
  3. Activate the Union, Intersection or Difference toolbar tool icon
  4. Hit Enter key to apply the operation
  5. A MeshInstance that contains the modified geometry is created
  6. Add a TiledNavMeshComponent to the new MeshInstance to generate a navigation mesh from the updated geometry
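The three operations correspond to standard set operations on the mesh volumes. The idea is easiest to see in one dimension, using intervals as stand-ins for solids. This is conceptual only — the CSG plugin operates on 3D meshes, and the simple Difference below only handles the case where the second solid overlaps the upper end of the first:

```cpp
#include <algorithm>

// A closed interval [lo, hi] standing in for a solid volume.
struct Interval { double lo, hi; bool empty; };

Interval Union(Interval a, Interval b)
{
    // Assumes a and b overlap, as meshes being combined usually do.
    return {std::min(a.lo, b.lo), std::max(a.hi, b.hi), false};
}

Interval Intersection(Interval a, Interval b)
{
    double lo = std::max(a.lo, b.lo);
    double hi = std::min(a.hi, b.hi);
    return {lo, hi, lo > hi}; // empty when the inputs don't overlap
}

Interval Difference(Interval a, Interval b)
{
    // a minus b, for the simple case where b overlaps a's upper end.
    Interval i = Intersection(a, b);
    if (i.empty) return a;
    return {a.lo, i.lo, a.lo > i.lo};
}
```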

8.Scripting

The engine is fully scriptable using Lua. All classes described in the API documentation (Help -> API documentation menu) are scriptable, with the API documentation serving as a reference to the (scriptable) API.

There are two scripting approaches:

  • Adding a Script component to a scene item
  • Creating a global script

 

8.1.Script Component

(Available from version 0.40.2+)

  1. Select an object in the scene
  2. In the Property Pane (right hand side of editor) select the “Script” component in the component drop down
  3. Set the script’s source

A script component has access to the following variables:

  • object – a reference to the object to which this component is attached
  • time – the current frame time
  • sdk – a reference to the SDK object through which all scene objects can be accessed

8.2.Global Scripts

Creating a global script

  1. Create a new scene (File -> New)
  2. Expand the scene tree (left hand tree view) until you can see the “MyScene” node
  3. Click the “MyScene” node
  4. Right click and select “Add child -> Scripting -> Script”
  5. Select the added Script node and verify the scripts source code is visible in the right hand property pane
  6. Right click and select “Actions -> Execute”
  7. Verify the script has executed

Binding a global script to a keyboard event

  1. Continuing on from the Global scripting steps above, select the root “Group” node in the scene tree view (left hand pane)
  2. Right click and select “Add child -> Input -> Keyboard” to add a Keyboard input capture node to the scene
  3. In the Actions pane (see image below), recreate the details in the image below as follows:
    1. Click “Add” to add an action
    2. Click the Sender field and choose “Keyboard”
    3. Click the Signal field and choose “F – key down”
    4. Click the Receiver field and choose “Script”
    5. Click the Slot field and choose “Execute”
  4. Set focus on the scene view by clicking into the scene view
  5. Hit the “F” key to trigger your action (executes script)

9.Lighting & Materials

A surface’s appearance is determined by the interaction of light with the material applied to the surface. Typically a material references a shader with particular values set i.e. ambient, diffuse and specular colours. The material’s shader will be updated in each frame by the engine with the scene’s light data (i.e. the properties of the lights added to the scene graph).

The engine comes bundled with some simple shaders for colouring and lighting surfaces:

  • Ambient_Diffuse_Specular shader – as the name suggests, supports the setting of a surface’s ambient, diffuse and specular colour components
  • Ambient_Diffuse_Specular_Attenuated shader – as above but attenuates the light’s intensity by multiplying it by the inverse square of the light’s distance from the surface
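The attenuation described above follows the inverse-square law. One plausible form of the calculation, sketched here for illustration (the bundled shader's exact formula may differ — the attenuation parameter here mirrors the per-light attenuation value in the LightInfo uniform):

```cpp
// Inverse-square falloff: intensity diminishes with squared distance.
// 'attenuation' is a hypothetical per-light scale factor.
float AttenuatedIntensity(float intensity, float attenuation, float distance)
{
    return intensity * attenuation / (distance * distance);
}
```

Doubling the distance to a light therefore quarters its contribution to the surface.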

To reference scene lighting data within your own custom shaders add the following uniform:

uniform struct LightInfo
{
	vec4  position;      // w = 0 for directional, 1 for positional
	vec3  ambientColor;
	vec3  diffuseColor;
	vec3  specularColor;
	float attenuation;
}lights[8];

When the engine compiles shaders and encounters this uniform it will cache references to it so it can be updated efficiently with the light data you’ve defined in your scene on a per-frame basis.
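For illustration, the per-element uniform names an engine would look up for such an array follow the standard GLSL naming scheme ("lights[0].position", "lights[1].position", and so on). A sketch of building those names — the engine's actual caching code is internal, but each name would typically be passed to glGetUniformLocation once and the returned location cached:

```cpp
#include <string>
#include <vector>

// Build the GLSL uniform names for one field across an array of structs,
// e.g. "lights[0].position" .. "lights[7].position".
std::vector<std::string> UniformNames(const std::string& arrayName,
                                      const std::string& field,
                                      int count)
{
    std::vector<std::string> names;
    for (int i = 0; i < count; ++i)
        names.push_back(arrayName + "[" + std::to_string(i) + "]." + field);
    return names;
}
```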

For a more complete example see the Ambient_Diffuse_Specular_Attenuated source code included with the SDK.

Adding lights to a scene

With one of the above shaders attached to a material (and the material attached to a mesh) you can add lights to your scene either by right clicking the scene graph in the tree view or by clicking the + icon in the tool bar, setting the tool’s option (by clicking the toolbar’s ellipsis button) and choosing “Light”.

The following light types are currently supported:

  • Positional
  • Directional

You can then move the lights around in the scene using the move widget and set (and undo) properties via the light’s property panel.

A short clip demonstrating the effect of light attenuation:

10.Shaders

The engine currently targets GLSL 330 and GLES 330. When providing additional custom shaders for use by both desktop and mobile platforms be sure to provide appropriate shaders for each target platform using the methods below.

The engine ships with some standard shaders and, in addition, supports three methods for adding new shaders:

  1. (From v0.41.5+) Registering custom shaders in your scenes within the editor
  2. Shader discovery from disk (informed by shader search paths that you register in the engine)
  3. Programmatically registering shaders

These are described below.

Method 1: Adding shaders to your scenes in the editor

(Available in v0.41.5+)

Of the three methods this is the easiest as it is done in a couple of clicks.

TODO – Coming soon

Method 2: Shader discovery from disk

The engine supports shader discovery by loading shaders from a special directory:

  • macOS: [your_install_location]/fireflyeditor.app/Contents/Frameworks/Shaders
  • Windows: [your_install_location]/bin/Shaders
  • Linux: supported from version 0.41.4+ – see note below

(Available in v0.41.4+)

In addition, from version 0.41.4 onwards, to ensure custom shaders are loaded correctly you need to provide the engine with any additional custom shader paths to search.

You can do this in one of three ways:

  1. In the editor -> Tools -> Settings -> Shader Paths or
  2. Programmatically by specifying a list of shader directories
  3. Programmatically by registering shaders directly with the engine (see below)

Specifying shader directories

SDK::SceneSettings::PathList paths;
paths.push_back("my/first/path");
paths.push_back("my/second/path");

SDK::SceneSettings settings;
settings.ShaderPaths = paths;
SDK::GetInstance().SetSceneSettings(settings);

Method 3: Registering shaders programmatically

For vertex / fragment shader pairs:

firefly::ShaderManager::GetInstance().RegisterShader(shaderName, vertexShaderText, fragmentShaderText)

For post-processing shader vertex / fragment shader pairs:

firefly::SDK::GetInstance().GetRenderer().RegisterPostProcessShader(shaderName, vertexShaderText, fragmentShaderText)

Shaders are automatically loaded from these locations on startup and can then be created within the editor / engine.

To instantiate these shaders in-editor:

  • Click on the Assets tab
  • Click on the Shaders tab
  • Click Add
  • Choose the shader
  • Set its properties
  • Choose the Assets/Materials tab
  • Create a new Material
  • Set the Material’s shader (to the one you created above)
  • Assign the material to a scene object by selecting the object in the scene and setting its Material property (to the one you created above), or by dragging the material onto one or more selected objects

NB. To instantiate shaders in code see the example_Shaders demo source code (accessible from the demo launcher on editor startup).

10.1.Built-in shader variables

Refer to the Shader Reference (accessible from the Editor’s Start Here-> Shader Reference menu).

10.2.Post processing shaders

(NB. Available from version 0.40.1+)

A post processing shader effect is a special type of shader that renders anything beneath it in the scene graph (usually your scene) to a texture. The texture is then displayed on a full screen rectangle so that the rectangle’s fragments cover the entire display and, in so doing, allow a pixel shader to be applied to every pixel on the display to create a full screen (post processing) effect.
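As a concrete illustration, a minimal grayscale post-processing fragment shader might look like the following. The uniform and varying names here are assumptions made for this sketch — consult the example_PostProcessing sample for the names the engine actually binds:

```glsl
#version 330

// Hypothetical inputs -- see example_PostProcessing for the real names.
uniform sampler2D sceneTexture; // the scene rendered to a texture
in vec2 texCoord;               // from the full screen rectangle
out vec4 fragColor;

void main()
{
    vec3 color = texture(sceneTexture, texCoord).rgb;
    // Standard luminance weights give a perceptual grayscale.
    float grey = dot(color, vec3(0.2126, 0.7152, 0.0722));
    fragColor = vec4(vec3(grey), 1.0);
}
```

Because the shader runs once per display pixel, any per-pixel effect (vignette, blur passes, colour grading) can be substituted for the luminance line.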

From version 0.40.14 the engine supports post-process shader discovery by loading post-processing shaders from a special directory:

  • (macOS) [your_install_location]/fireflyeditor.app/Contents/Frameworks/PostProcess
  • (Windows) [your_install_location]/bin/PostProcess

PostProcess shaders are automatically loaded from these locations on startup and can then be applied within the editor with a single click.

This short video demonstrates how a post processing shader can be applied to a scene within the editor.

NB. For details on how to apply a post processing shader to a scene programmatically see the example_PostProcessing source code and demo in the demo browser displayed on editor startup.

11.Nested Scenes

(Available from v0.40.15+)

Entire scenes can be nested within other containing scenes using the PrefabricatedScene class. The SDK comes with an example_PrefabricatedScene application sample that demonstrates how to do this programmatically.

The typical workflow is:

  • Create a scene in the editor (or programmatically) adding your models, shaders, animations etc
  • Save your scene
  • Instance your saved scene into a containing scene using the PrefabricatedScene utility class either in the editor or programmatically. NB. You can optionally partially load a branch of the nested scene’s graph through its scene root property. This can be useful if your nested scene contains additional geometry and assets to assist in the editing of the scene but are not needed when nesting into a containing scene.

12.Physics

12.1.Rigid Bodies

(Available from v0.40.19+)

The engine supports rigid body simulations via the Physics plugin (ext_Physics.so / .dll / .dylib) as follows:

NB. Physics is disabled whilst editing; to see your physics in action you have to play the scene by clicking the play button in the editor.

  • Attach a RigidBodyComponent to a scene object
  • Determine if you want it to be automatically or manually added to a simulation (by setting its ActivationMode property accordingly)
    • (ActivationMode = Manual) – the rigid body will be added to the simulation when you invoke its Activate method either explicitly in the editor or via script
    • (ActivationMode = Automatic) – the rigid body will be added to the simulation immediately
  • Specify its physical properties (i.e. mass, restitution, initial velocity, initial position etc) either directly or via a script (see the samples that install with the editor for details)
  • Activate the object by invoking its Activate function either by right-clicking the object to which the component is attached and selecting Action -> Activate or in script by obtaining a handle to the object’s RigidBodyComponent and invoking its Activate function programmatically.

For programmatic manipulation of Physics see the following examples provided with the editor and accessible from the Samples & Templates dialog:

  • example_Physics
  • example_Physics_Impulse
  • example_Collision
  • example_Collision_Triggers

12.2.Collision monitoring triggers and scripts

The engine supports the monitoring of collisions, collision response and invoking collision scripts in response to collisions.

The key concepts are:

  • Collision groups
  • Collision masks
  • Collision scripts

For coverage of these concepts see the following tutorial:

13.Text & Annotations

To add text / annotate objects:

  • Select the object you want to annotate

(In versions <= v0.40.30)

NB. Fonts once loaded are serialised into the scene file upon saving.

  • Add a TextComponent
  • Specify the font, color and text
  • Check the “Track Object” checkbox (this will position the text at the object’s location)

(In versions > v0.40.30)

NB. Fonts once loaded are serialised into the scene file upon saving. This improves on previous releases by allowing multiple TextComponents to refer to the same font asset.

  • Load a font asset (Assets -> Fonts)
  • Add a TextComponent
  • Specify the font asset and text
  • Check the “Track Object” checkbox (this will position the text at the object’s location)

14.Transform Hierarchies

Available in v0.40.26+

Transform hierarchies support the grouping and moving / rotating of objects relative to a parent coordinate frame.

To group several objects under a single Transform coordinate frame:

  • (In the scene tree) add a Transform node to the scene (right click -> Add child -> Geometry -> Transform)
  • Drag your objects (in the scene tree) under the Transform node
  • Set your objects’ IgnoreParentTransform property to false
  • (Optional) if you want to display a widget for the transform set its ShowWidget property to true. You can then move and rotate the transform via the transform’s widget by selecting the transform manipulator tools in the editor
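Conceptually, a child's world position is its local position expressed in the parent's frame (unless IgnoreParentTransform opts it out). A minimal sketch using translation-only frames — the engine's real transforms also compose rotation and scale, and these types are invented for the illustration:

```cpp
// Translation-only frame, to keep the sketch short; real transform
// hierarchies also compose rotation and scale.
struct Vec3 { double x, y, z; };

Vec3 ToWorld(const Vec3& parentOrigin, const Vec3& localPos,
             bool ignoreParentTransform)
{
    if (ignoreParentTransform)
        return localPos; // object opts out of the hierarchy
    return {parentOrigin.x + localPos.x,
            parentOrigin.y + localPos.y,
            parentOrigin.z + localPos.z};
}
```

Moving the parent Transform therefore moves every grouped child, while each child keeps its own local offset.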

For an example of transform hierarchies in action see the TransformHierarchy example scene shown on editor startup.

 

15.Texturing

(Available from v0.40.6)

The editor / engine supports texturing. Typical workflow is:

  • Drag texture images into the editor (Assets -> Textures)
  • Assign the texture(s) to a shader that supports texturing (i.e. has texture inputs)
  • Assign the shader to a material
  • Apply the material to a model

The short video below demonstrates the workflow:

15.1.Multitexturing

Follows a similar workflow to texturing (see video below).

16.Sprites

(Available in v0.41.4+)

The Sprite plugin comes with the following:

  • SpriteSystem
  • SpriteComponent

The SpriteComponent can be attached to a scene item to render on-screen sprites.

SpriteComponent exposes the OnClicked() signal that will emit an event when clicked. This can be wired to a slot (for example a Script’s execute() slot) either in code or in the editor’s Action panel.

17.Input handling

Keyboard input handling

  1. Add a KeyboardSceneItem to your scene (scene tree view -> Right Click -> Add Child -> Input -> Keyboard)
  2. Add a script to your scene (scene tree view -> Right Click -> Add Child -> Scripting -> Script)
  3. In the Actions pane connect the Keyboard’s key events to the Script’s execute method

For an example see the example scenes shown in the Templates & Samples browser shown on startup in the editor.

Mouse input handling

  1. Add a MouseSceneItem to your scene (scene tree view -> Right Click -> Add Child -> Input -> Mouse)
  2. Add a script to your scene (scene tree view -> Right Click -> Add Child -> Scripting -> Script)
  3. In the Actions pane connect the Mouse’s events to the Script’s execute method

For an example see the example scenes shown in the Templates & Samples browser shown on startup in the editor.

On-screen input handling

For mobile devices that don’t have physical input devices you can:

  1. Add a SpriteComponent to a scene item
  2. In the Actions pane, connect the SpriteComponent’s OnClicked event to a script’s execute() slot

For more details see the Sprite documentation.

18.Pathfinding & Navigation

(NB. The Navigation System & Components are available in version 0.40.29+; prior versions had (now deprecated) SceneItem-based Navigation)

The engine comes with a path-finding plugin that provides a path-finding system, components and an example application that demonstrates their use.

The key concepts are:

  • Navigation Mesh
  • Navigation Crowd
  • Navigation Agent

Each of these concepts is modelled as an ECS component (in v0.40.29+) that can be attached to scene items.

See the example_Navigation source code accessible from the editor’s Start here -> Templates & Samples Browser menu; alternatively, the short video below demonstrates these concepts.