The Manual

1.Getting started

Welcome to the documentation for the SDK and editor. The best way to learn about the SDK is to browse the demo apps and source code shown on editor startup. You can also find a collection of video tutorials here.

1.1.Getting in touch

If you:

  • Encounter any problems that aren’t addressed in this manual’s Troubleshooting section or
  • Have any suggestions for features or improvements

Then get in touch over the following channels:


Download the SDK & editor from here:

Steam Deck:

  • Launch the app from your Steam client (best enjoyed in Steam Deck’s desktop mode)
  • If you want to publish your apps as code projects you’ll need an IDE (Integrated Development Environment) for coding in. I use Qt Creator, but you can also use CLion. Go to your Steam Deck Desktop Mode start menu, type “Discover”, then install your IDE of choice. You can then open a published game’s CMakeLists.txt file in your IDE and start coding / debugging.


Linux:

  • UPDATE: Ubuntu 22.04 users – you need to `sudo apt install libfuse2` as it is a required AppImage dependency (no longer bundled with Ubuntu)
  • Either:
    • Right click the .AppImage and set the executable check box or
    • Run chmod u+x Firefly*.AppImage from the command line in the directory containing the downloaded .AppImage
  • Double click to launch


Windows:

  • Extract the zip file and then double click editor.bat to launch the editor

macOS (Deprecated):

  • Double click the fireflyeditor.dmg
  • Drag the app into the Applications folder (linked to in the installer)
  • Launch the app from your Applications folder


Android:

Supported architectures: armeabi-v7a, x86, arm64-v8a and x86_64

Supported OpenGL ES version: 3.2

NB. No tooling is required to play scenes on Android; however, if you want to publish your scene to the Play Store you’ll need to:

  • Install Android Studio and NDK r18
  • (In the editor from v0.41.0+) File -> Publish will publish both a desktop and android-studio project

2.Key Concepts

The engine is ECS (Entity Component System) based and comes bundled with its own custom event system that allows you to connect signals (emitted by senders) to slots (on receivers).

NB. This engine does NOT use any GPL or LGPL libraries internally and so games produced by it have no licensing requirements. The engine comes with its own (pluggable) custom modern OpenGL based back end renderer, ECS implementation and signal & slot framework.

A brief description of the key concepts:

  • System (AbstractSystem) – responsible for updating the set of components of a specific type in a cache-friendly manner
  • Component (AbstractComponent) – is a collection of functions and data that can be attached to a SceneItem and is processed by an associated System
  • SceneItem – is this engine’s “entity” to which Components are attached
  • Signal (AbstractSignal) – an event emitted by a SceneItem or Component
  • Slot (AbstractSlot) – a function on a SceneItem or Component

For more details see the code samples and demos bundled with the SDK (displayed in the editor on startup).


3.1.Code Samples

The editor ships with the following code samples (runnable demos with accompanying code accessible from the editor’s “Start Here -> Templates & Samples Browser” menu).

A brief summary of the samples:

  • example_Model – model loading
  • example_PrefabricatedScene – how to instance a “prefabricated” scene (i.e. a prefabricated scene being a scene created in the editor with animations / models / shaders / scripts / textures etc) into a containing scene
  • example_Collisions – collision detection and response
  • example_Collisions_Triggers – collision trigger volumes and collision scripts
  • example_Physics – rigid body physics
  • example_EmptyGame – a simple empty game skeleton
  • example_Animation – key frame based property animation support

3.2.Walkthrough: Creating a game project


Typical Workflow:

Creating a game is a 3 step process:

  1. Open the editor and create a basic scene (or select one of the template scenes from the Start Here -> Templates & Samples Browser editor menu)
  2. File -> Publish editor menu to publish the scene as a code project and then run CMake on the generated project as per the instructions displayed on generation
    • Android: simply open the published game project (an Android-specific project named [YourGame_Android] is generated when publishing) in Android Studio, build and run.
    • Windows: when configuring in CMake be sure to set the target architecture to x64 – see image below. NB. (In your IDE) build the code project under either the Release or RelWithDebInfo configuration as the SDK only provides the engine Release binaries.
    • macOS: to fix signing errors use an appropriate code signing identity and profile and set Other Code Signing Flags to --deep (see image below)
  3. Add code and / or editor generated scenes to the code project (code examples are supplied with the editor under the Start Here-> Templates & Samples Browser editor menu)
Windows CMake settings
Xcode code signing settings

3.3.Walkthrough: Creating a System / Component

(Available in Editor/SDK version 0.39.0+)

SimulationStarterKit is a plugin-based Entity Component System (ECS) architecture. ECS allows you to add multiple (and varying) behaviours (components) at runtime to entities (an entity being a game object or “SceneItem” in this SDK’s parlance). All components of a specific type are processed by an associated system in a cache friendly manner for performance.

There are 2 ways to extend the engine with custom systems and components:

  • As plugins as per the video below
  • Explicitly in code as per the example_Components code sample (accessible from the editor’s “Start here -> Templates & Samples Browser” menu)

In this walkthrough we will extend the engine adding a new Component and System plugin using an editor supplied code wizard.

Steps For Creating a System / Component Plugin


To ensure your plugins are loaded correctly you need to provide the engine with your plugin paths.

You can do this in one of 2 ways:

  1. In the editor -> Tools -> Settings -> Plugin Paths or
  2. Programmatically as follows:
SDK::SceneSettings::PathList paths;
paths.push_back("/path/to/your/plugins"); // hypothetical path – add one entry per plugin directory

SDK::SceneSettings settings;
settings.PluginPaths = paths;

Troubleshooting plugins

If your plugin contains bugs that crash the editor, and you have already set the editor’s plugin paths but can’t open the editor to unset them, then launch the editor with the safemode command line argument and unset them inside the (now loaded) editor.

With this in mind watch the videos below for platform specific instructions.



Open the editor

Click the Menu -> Tools -> Development -> New SceneItem / System / Component plugin menu item

Set the:

  • Plugin name to MyPlugin
  • Initial Scene Item name to MySceneItem
  • Specify the path you want to generate code in (should be a writable location)

Click the Generate button

In the wizard’s output pane you should see some output like:

Successfully generated plugin MyPlugin6. Your initial scene item’s uuid is {92e6af43-6776-45a0-9e63-159463e3005e}.

Next, launch the CMake GUI (see below), set the source path to your plugin’s source code as indicated by the wizard output, and choose a directory to build your plugin in.

Hit the Configure button

Fix the error (i.e. CMake can’t locate the SDK) by setting the firefly_DIR variable to the path of the SDK’s CMake package config file (under the app’s bundle in the Contents/Resources/CMake directory – see the above image for a typical path)

Hit the Configure button

Hit the Generate button

Hit the Open Project button to launch your project in Xcode (assuming it’s installed)

Hit the build button in Xcode

NB. The generated Xcode project contains a post-build step that copies your plugin to the editor’s ‘Frameworks’ folder, so it is immediately available in the engine / editor.

Relaunch the editor

Go to the System tab and confirm you have a system called MySystem. Also confirm that your component is available for adding to game objects by clicking the Component drop down in the editor’s right hand pane (beneath the properties pane; the component name should be MyComponent)

Congratulations, you’ve created a plugin and it is now accessible within the editor / engine, can be attached to scene items and will be updated each frame by your MySystem system.

NB. To debug your plugin in Xcode set the launching executable for your plugin to the editor app in your /Applications folder (or your install location), add your breakpoints, then build / run as usual.

Next, we will add a property to our component and modify the system to do something with that property.



See the sub-sections below.



Publishing to Android

In order to build the published game for Android you need:

  • Android Studio
  • Android NDK 18
  • Define an ANDROID_NDK_HOME environment variable pointing to your NDK installation. Typically: install Android Studio, then download the NDK from within Android Studio (Tools -> SDK Manager menu), where you will see the path to your SDK; your NDK path will be beneath that in your file system (open File Explorer to confirm). Paste the NDK location into an ANDROID_NDK_HOME environment variable and restart Android Studio
  • Also, check that the file in your Android project doesn’t have any invalid paths in it
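On Linux or macOS the variable can be set from a shell profile as below; the NDK path shown is an assumption – substitute the location reported by Android Studio's SDK Manager (on Windows, use the System Properties -> Environment Variables dialog instead):

```shell
# Hypothetical NDK location -- replace with the path shown in Android
# Studio's SDK Manager (the NDK lives beneath the SDK directory).
export ANDROID_NDK_HOME="$HOME/Android/Sdk/ndk-bundle"
echo "ANDROID_NDK_HOME=$ANDROID_NDK_HOME"
```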

Errors and fixes

  • Undefined pthread_atfork – bump APP_PLATFORM and minSdkVersion to be >= 21 (this is fixed in projects generated in the editor from v0.42.3+)


If you experience an application crash you can obtain the crash dump file from your machine at the following location: C:\Users\[your_user_name]\AppData\Local\CrashDumps which you can share over the product’s discord channel or Steam community hub.

If you are running under the SYSTEM account the crash dump location is: C:\Windows\System32\config\systemprofile\AppData\Local\CrashDumps.

If a crash dump hasn’t been generated then create the following registry key (in regedit): HKLM\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps then re-run the app and obtain the crash dump file from one of the locations above.
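If you prefer not to click through regedit, the same key can be created by importing a .reg file such as the sketch below (an empty LocalDumps key enables dump collection with Windows’ default settings):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\Windows Error Reporting\LocalDumps]
```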



Here are the code signing settings I use during development:



My ECS plugins aren’t being loaded

Ensure your plugin paths have been specified either in the editor or programmatically in your game code. For more details see the plugins tutorial.

5.The Editor

5.1.The toolbar

The toolbar is loaded by the editor on startup by scanning the editor directory for plugins that expose specific entry points providing tool icons and behaviours, so it is fully customisable through plugins. By default the editor ships with an interaction plugin that provides pick, move, rotate and add tools.

From left to right (beneath the File / Edit / etc menu):

  • View: Displays a drop down of viewpoints (i.e. cameras) currently added to the scene. You can change the currently active camera here. The active viewpoint can also be modified in script and through an action connection described later on in this page.
  • Snap: Snap to grid settings that can be customised by clicking on the ellipsis button to its right
  • Play tool – Runs the scene in a standalone executable for testing the scene as it might appear as a (published) standalone scene. This also enables you to have both the editor and the running scene open simultaneously on a multi-monitor display. The standalone executable’s scene can be reloaded by pressing the F5 key.
  • Pan tool – Activates panning mode. HINT: To pan relative to an object, first select the object you want to pan relative to using the Pick tool, then re-activate the pan tool and start panning.
  • Pick tool – Activates picking. Holding down CTRL whilst picking allows you to pick multiple objects
  • Move tool – Activate to move picked objects
  • Rotate tool – Activate to rotate picked objects
  • Add tool – Activate to add objects to the scene
  • Tool options – This button with an ellipsis to the right of the Add tool can be clicked to open an application modal dialog that displays the options associated with the active tool. The options are provided by the active tool’s plugin.

NB. These tools are provided through the Interaction plugin. You can add your own icons to this toolbar as described in a walkthrough on this page.

The Scene Tree View

The scene tree (below) represents the scene hierarchy. This is purely a data representation: in reality, scene items are rendered in batches by the engine to reduce state switches, so rendering isn’t necessarily executed in the order seen in the tree. Transform hierarchies are respected, however – any changes in a parent transform are efficiently communicated to child items if they have their IgnoreParentTransforms property set to false.

5.2.Adding objects

  1. Select the Add object tool
  2. Choose Box in the tool options dialog (you can open this dialog by clicking the tool options button on the main toolbar as described above)
  3. Click on the scene (or an object within the scene) to add boxes under the clicked position. If no mesh is found under the mouse click then the newly added object will be created at the scene origin
  4. Verify the boxes have been placed in the positions as expected
  5. Perform undo (CTRL + Z) to remove all added boxes
  6. Perform redo (CTRL + Y) to re-add all boxes
  7. Save scene (File -> Save menu)
  8. New scene (File -> New menu)
  9. Load the scene saved in step 7 (File -> Open menu)
  10. With the scene now loaded hit the “Run!” toolbar button and verify your scene loads in a separate window

5.3.Importing models

(Available in version 0.40.11+)

Scenes can be imported from a wide variety of formats. Once imported scenes can be saved and then instanced in containing scenes. This process is demonstrated in the video below:

5.4.Selecting objects

  1. Activate the picking tool in the toolbar
  2. Left mouse click on an object to select
  3. To select multiple objects hold down either the Command key (macOS) or CTRL key (Windows) whilst clicking

5.5.Duplicating objects

  1. Load a previously generated scene
  2. Select a scene object using the picking tool
  3. Activate the Move tool
  4. Hit CTRL + D to duplicate the scene object
  5. Move the duplicated scene object with the move widget

5.6.Moving, rotating and scaling objects

  1. Load a previously generated scene
  2. Select a single object in the scene
  3. Activate the Move tool
  4. Move the picked object
  5. Undo the object move (CTRL + Z)
  6. Redo the object move (CTRL + Y)
  7. Activate the pick tool
  8. Select multiple scene objects
  9. Activate the Move tool
  10. Move the picked objects
  11. Perform undo and redo and verify the movement is un-done / re-done as expected

5.7.Snapping movement to a grid

(Available in version 0.40.12+)

Movement can be snapped to a grid. Multiple snap settings can be defined and activated directly within the editor’s main window as demonstrated in the video below:

5.8.Previewing a scene

  1. Select the Play button in the editor toolbar
  2. Verify your scene opens in a separate window
  3. (With the preview window still open) In the editor window activate the add object tool
  4. Place another object in the scene by clicking in the editor’s scene view
  5. Save the scene File -> Save menu
  6. Activate the preview window by mouse clicking in the preview window
  7. Hit F5 to re-load the preview window’s scene
  8. Verify the scene reloaded in the preview window and its appearance reflects the scene in the editor

5.9.Switching viewpoints

You can add multiple viewpoints to a scene. You can specify which viewpoint should be the default (i.e. when the scene is run) by setting the viewpoint’s DefaultCamera property to true.

Types of cameras supported currently are:

  • EditorCamera – this is an implementation of a 3D editor style camera. By default it pans when left mouse button dragging, zooms with the middle mouse wheel, and can have its view direction changed by holding down the SHIFT key whilst left mouse button dragging.
  • FirstPersonCamera – an implementation of a simple first person perspective camera.

To change between viewpoints in the editor:

  1. Beneath the menu bar (i.e. beneath the panel that hosts the main menus File, Edit, Tools etc) there’s a viewpoint drop-down list. Modify the selected camera.
  2. Verify the viewpoint changes
  3. Expand the scene tree until the Editor Camera node is visible
  4. Select the Editor camera and right click choosing the “Make this a child of -> Cameras -> Editor Camera”
  5. Verify the newly added camera appears in the viewpoint drop-down list used in step 1.

NB. To allow a user to toggle between different views in your scene you can add an Input capture device (i.e. Keyboard) to the scene and create an Action (as described in “Scripting – Activating a script”) to wire a Keyboard key press event to a Camera’s “Activate” slot. You can assign an action (i.e. key press) to each Camera of interest.

6.Publishing & Packaging

Publish your scene as a code project

  1. You can export your scene either as a standalone executable or a code-project from the editor’s File -> Publish menu
  2. To build and package the generated code project refer to the instructions below for your target platform

Packaging Code Projects

Whilst exporting as a standalone executable requires no action on your behalf, when publishing your game as a code project (for extra flexibility when it comes to customisation) you can generate a package from your code project as follows:

Steam Deck:

Open the generated CMakeLists.txt file in your IDE of choice (I use Qt Creator – you can install it on Steam Deck by going to the Steam Deck’s Desktop Mode start menu, typing “Discover”, and then installing Qt Creator from the Software Centre app displayed).

Linux (for games published in editor v0.43.3+):

This is described in your generated project’s file but, in brief:

  • Prerequisites: (one time only) install the required dev library packages: `sudo apt install build-essential libglu1-mesa-dev freeglut3-dev mesa-common-dev libxext-dev swig cmake mesa-utils libfuse2`
  • cd [your_generated_game_project_dir]
  • build it by running ./
  • package it by running ./
  • your generated AppImage is in the build directory and will be called something like: my-game-x86_64.AppImage
  • to run:
    • Steam Deck: simply run your generated AppImage or right click it and add it to your steam library or send to a friend
    • Linux: simply mark your AppImage as executable (File -> Properties -> Permissions)


Windows:

  • Prerequisites:
  • (In the generated Visual Studio project) right click the PACKAGE project and build – this will generate a zip file package for your game in your game’s build directory that you can then distribute


Android:

  • Prerequisites: Android Studio Electric Eel, Gradle 5.6.4, NDK 22.1.7171670, Android SDK Build Tools 34, Android SDK Platform-Tools 34 – see the README.txt in your generated project for more details
  • Open the generated Android project in Android Studio and build the APK


7.Scripting

The engine is fully scriptable using Lua. All classes described in the API documentation (Help -> API documentation menu) are scriptable, with the API documentation serving as the reference for the (scriptable) API.

There are 2 scripting approaches:

  • Adding a Script component to a scene item
  • Creating a global script


7.1.Script Component

(Available from version 0.40.2+)

  1. Select an object in the scene
  2. In the Property Pane (right hand side of editor) select the “Script” component in the component drop down
  3. Set the script’s source

A script component has access to the following variables:

  • object – a reference to the object to which this component is attached
  • time – the current frame time
  • sdk – a reference to the SDK object through which all scene objects can be accessed
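As a minimal sketch, a script component using the variables above might bob its owner each frame. The Position property name here is an assumption for illustration – check the API documentation for the real property names on your scene items:

```lua
-- 'object' and 'time' are injected by the engine (see the list above).
-- 'Position' is a hypothetical property name used for illustration.
local p = object.Position
p.y = p.y + 0.1 * math.sin(time)  -- bob the object up and down over time
object.Position = p
```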

7.2.Global Scripts

Creating a global script

  1. Create a new scene (File -> New)
  2. Expand the scene tree (left hand tree view) until you can see the “MyScene” node
  3. Click the “MyScene” node
  4. Right click and select “Add child -> Scripting -> Script”
  5. Select the added Script node and verify the script’s source code is visible in the right hand property pane
  6. Right click and select “Actions -> Execute”
  7. Verify the script has executed

Binding a global script to a keyboard event

  1. Continuing on from the Global scripting steps above, select the root “Group” node in the scene tree view (left hand pane)
  2. Right click and select “Add child -> Input -> Keyboard” to add a Keyboard input capture node to the scene
  3. In the Actions pane (see image below), recreate the details in the image below as follows:
    1. Click “Add” to add an action
    2. Click the Sender field and choose “Keyboard”
    3. Click the Signal field and choose “F – key down”
    4. Click the Receiver field and choose “Script”
    5. Click the Slot field and choose “Execute”
  4. Set focus on the scene view by clicking into the scene view
  5. Hit the “F” key to trigger your action (executes script)


8.1.Rigid Bodies

(Available from v0.40.19+)

The engine supports rigid body simulations via the Physics plugin (.so / .dll / .dylib) as follows:

NB. Physics are disabled when editing; to see your physics in action you have to Play the scene by clicking the play button in the editor.

  • Attach a RigidBodyComponent to a scene object
  • Determine if you want it to be automatically or manually added to a simulation (by setting its ActivationMode property accordingly)
    • (ActivationMode = Manual) – the rigid body will be added to the simulation when you invoke its Activate method either explicitly in the editor or via script
    • (ActivationMode = Automatic) – the rigid body will be added to the simulation immediately
  • Specify its physical properties (i.e. mass, restitution, initial velocity, initial position etc) either directly or via a script (see the samples that install with the editor for details)
  • Activate the object by invoking its Activate function either by right-clicking the object to which the component is attached and selecting Action -> Activate or in script by obtaining a handle to the object’s RigidBodyComponent and invoking its Activate function programmatically.
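The manual-activation flow above might look like this in script. The accessor and property names are hypothetical – the example_Physics sample bundled with the editor shows the real API:

```lua
-- Hypothetical accessors for illustration only; see example_Physics
-- for the SDK's actual scripting API.
local body = object:GetComponent("RigidBodyComponent")
body.Mass = 2.0          -- physical properties set before activation
body.Restitution = 0.5
body:Activate()          -- ActivationMode = Manual: adds the body to the simulation
```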

For programmatic manipulation of Physics see the following examples provided with the editor and accessible from the Samples & Templates dialog:

  • example_Physics
  • example_Physics_Impulse
  • example_Collision
  • example_Collision_Triggers

8.2.Rigid Bodies - Cameras

(Available from v0.43.7+)

To add physics to your camera add a RigidBodyComponent component to your camera and set its IsKinematicObject and CollisionResponseEnabled properties to true.

For more details see the RigidBodyCamera.msf sample scene in the examples shown on editor startup.

8.3.Collision monitoring triggers and scripts

The engine supports the monitoring of collisions, collision response and invoking collision scripts in response to collisions.

The key concepts are:

  • Collision groups
  • Collision masks
  • Collision scripts

For coverage of these concepts see the following tutorial:


(Available in v0.42.0+)

Graphical user interfaces can be fully defined in script using the GUIComponent. The GUI framework is a Lua wrapping of Dear ImGui, with the Lua module instead named ext_GUIScript; in other words, to use it simply follow the Dear ImGui examples, replacing the imgui:: namespace with ext_GUIScript.

To add a GUI to your scene:

  1. (In the scene tree view) Right click your scene graph -> Add Child -> Standard -> GhostItem
  2. Select the added item in the scene tree then in the Component panel -> Add Component -> GUIComponent
  3. A sample GUI will appear positioned at 0,0 in your scene
  4. Edit your GUIComponent‘s script as appropriate (for details on how to create GUIs refer to the scripting reference in the editor’s help menu or alternatively refer to the examples in the Templates & Examples browser accessible from the editor’s help menu)
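For example, assuming the binding mirrors Dear ImGui's Begin / Text / Button / End calls as described above (exact binding names may differ – check the scripting reference in the editor's help menu), a minimal GUI script might look like:

```lua
-- Sketch only: follows Dear ImGui's immediate-mode pattern with the
-- imgui:: namespace replaced by the ext_GUIScript module, per the text above.
ext_GUIScript.Begin("My Window")
ext_GUIScript.Text("Hello from a GUIComponent")
if ext_GUIScript.Button("Click me") then
    -- respond to the click here
end
ext_GUIScript.End()
```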



The engine ships with some standard shaders and in addition supports 3 methods for adding new shaders:

  1. (From v0.41.5+) Registering custom shaders in your scenes within the editor
  2. Shader discovery from disk (informed by shader search paths that you register in the engine)
  3. Programmatically registering shaders

These are described below.

Method 1: Adding shaders to your scenes in the editor

(Available in v0.41.5+)

Of the three methods this is the easiest:

  1. Click on Assets
  2. Select the Shaders tab
  3. Click Add
  4. When the Shader Picker dialog is shown click on the “+” to register a new custom shader and supply the shader source code to the Add / Edit shader dialog that is shown (see below)

Method 2: Shader discovery from disk

The engine supports shader discovery by loading shaders from a special directory:

  • macOS [your_install_location] / / Contents / Frameworks / Shaders
  • Windows [your_install_location] / bin / Shaders
  • Linux supported from version 0.41.4+ – see note below

(Available in v0.41.4+)

In addition to the above, for versions >= 0.41.4 custom shaders can be loaded by providing additional custom shader paths for the engine to search.

You can do this in one of 3 ways:

  1. In the editor -> Tools -> Settings -> Shader Paths or
  2. Programmatically by specifying a list of shader search directories

Specifying shader directories

SDK::SceneSettings::PathList paths;
paths.push_back("/path/to/your/shaders"); // hypothetical path – one entry per shader search directory

SDK::SceneSettings settings;
settings.ShaderPaths = paths;

Method 3: Registering shaders programmatically

For vertex / fragment shader pairs:

firefly::ShaderManager::GetInstance().RegisterShader(shaderName, vertexShaderText, fragmentShaderText)

For post-processing shader vertex / fragment shader pairs:

firefly::SDK::GetInstance().GetRenderer().RegisterPostProcessShader(shaderName, vertexShaderText, fragmentShaderText)

Shaders are automatically loaded from these locations on startup and can then be created within the editor / engine.

To instantiate these shaders in-editor:

  1. Click on the Assets tab
  2. Click on the Shaders tab
  3. Click Add
  4. Choose the shader
  5. Set its properties
  6. Choose the Assets/Materials tab
  7. Create a new Material
  8. Set the Material’s shader (to the one you created above)
  9. Assign the shader to a scene object by selecting the object in the scene and then set its Material property (to the one you created above) or by dragging the material onto 1 or more selected objects

NB. To instantiate shaders in code see the example_Shaders demo source code (accessible from the demo launcher on editor startup).

10.1.1.Built-in shader variables

Refer to the Shader Reference (accessible from the Editor’s Start Here-> Shader Reference menu).

10.1.2.Post processing shaders

(NB. Available from version 0.40.1+)

A post processing shader effect is a special type of shader that renders anything beneath it in the scene graph (usually your scene) to a texture. The texture is then displayed on a full screen rectangle so that the rectangle’s fragments cover the entire display and, in so doing, allow a pixel shader to be applied to every pixel on the display to create a full screen (post processing) effect.
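As an illustration, a post-processing fragment shader is an ordinary fragment shader sampling the scene texture. The uniform and input names below are assumptions – consult the Shader Reference (Start Here -> Shader Reference) and the example_PostProcessing sample for the engine's actual built-in names:

```glsl
#version 320 es
precision mediump float;

// Hypothetical names for illustration -- the engine's built-in
// uniform / input names are listed in the Shader Reference.
uniform sampler2D sceneTexture;  // the scene rendered to a texture
in vec2 uv;                      // full-screen rectangle texture coordinate
out vec4 fragColor;

void main() {
    vec4 c = texture(sceneTexture, uv);
    float luma = dot(c.rgb, vec3(0.299, 0.587, 0.114));
    fragColor = vec4(vec3(luma), c.a);  // desaturated full-screen effect
}
```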

From version 0.40.14 the engine supports post-process shader discovery by loading post-processing shaders from a special directory:

  • (macOS) [your_install_location]/
  • (Windows) [your_install_location]/bin/PostProcess

PostProcess shaders are automatically loaded from these locations on startup and can then be applied within the editor with a single click.

This short video demonstrates how a post processing shader can be applied to a scene within the editor.

NB. For details on how to apply a post processing shader to a scene programmatically see the example_PostProcessing source code and demo in the demo browser displayed on editor startup.


10.2.Antialiasing

(Available from v0.43.9+)

FXAA-based antialiasing is available as a post process shader from v0.43.9. For more details refer to the City.msf demo shown on editor startup in the templates list (the realtime settings tab in this demo’s GUI allows you to enable / disable FXAA and so serves as a guide on how to do this).

10.3.Mesh Instancing

(Available in v0.41.8+)

With mesh instancing you can render many thousands of copies of a mesh in a single draw call. Per instance data (such as color, position etc) is specified as data that is mapped to shader inputs that are then available within a shader on a per instance basis.

See the example_Instancing and example_Instancing_Simple shipped with the editor for a complete code example and runnable demo.

10.4.Lighting & Materials

A surface’s appearance is determined by the interaction of light with the material applied to the surface. Typically a material references a shader with particular values set i.e. ambient, diffuse and specular colours. The material’s shader will be updated in each frame by the engine with the scene’s light data (i.e. the properties of the lights added to the scene graph).

The engine comes bundled with some simple shaders for colouring and lighting surfaces:

  • Ambient_Diffuse_Specular shader – as the name suggests, supports the setting of a surface’s ambient, diffuse and specular colour components
  • Ambient_Diffuse_Specular_Attenuated shader – as above but attenuates the light’s intensity by multiplying it by the inverse square of the light’s distance from the surface

To reference scene lighting data within your own custom shaders add the following uniform:

uniform struct LightInfo
{
	vec4  position;      // w = 0 for directional, 1 for positional
	vec3  ambientColor;
	vec3  diffuseColor;
	vec3  specularColor;
	float attenuation;
} Light;   // instance name illustrative – see the bundled shader source for the exact declaration

When the engine compiles shaders and encounters this uniform it will cache references to it so it can be updated efficiently with the light data you’ve defined in your scene on a per-frame basis.

For a more complete example see the Ambient_Diffuse_Specular_Attenuated source code included with the SDK.

Adding lights to a scene

With one of the above shaders attached to a material (and the material attached to a mesh) you can add lights to your scene by either right clicking the scene graph in the tree view, or by clicking the + icon in the tool bar, setting the tool’s option (by clicking the toolbar’s ellipsis button) and choosing “Light”.

The following light types are currently supported:

  • Positional
  • Directional

You can then move the lights around in the scene using the move widget and set (and undo) properties via the light’s property panel.

A short clip demonstrating the effect of light attenuation:


(Available from version 0.44.0+)

To mark a light or other pickable scene item as casting shadows:

  • Select the light in the scene graph and set its “IsShadowCaster” property to true
  • Repeat for each of the items you want to cast shadows
  • Adjust the light’s frustum (i.e. the size of the light volume) by setting the light’s FrustumOrthographic and FrustumNearFar properties (see below for an explanation)

A brief description of light properties that influence shadows:

  • IsShadowCaster
  • FrustumOrthographic – a 4 value vector where x = left, y = right, z = bottom, w = top extents of the light’s frustum
  • FrustumNearFar – a 2 value vector where x = near clip plane and y = far clip plane extents of the light’s frustum

NB. Currently the shaders that apply the generated shadow map to a surface are restricted to the Ambient / Diffuse / Specular shaders (both non-instanced and instanced versions). Support for shadow maps will broaden in future releases.

10.5.Text & Annotations

To add text / annotate objects:

  • Select the object you want to annotate

(In versions <= v0.40.30)

NB. Fonts, once loaded, are serialised into the scene file upon saving.

  • Add a TextComponent
  • Specify the font, color and text
  • Check the “Track Object” checkbox (this will position the text at the object’s location)

(In versions > v0.40.30)

NB. Fonts, once loaded, are serialised into the scene file upon saving. This improves on previous releases by allowing multiple TextComponents to refer to the same font asset.

  • Load a font asset (Assets -> Fonts)
  • Add a TextComponent
  • Specify the font asset and text
  • Check the “Track Object” checkbox (this will position the text at the object’s location)

10.6.Transform Hierarchies

(Available in v0.40.26+)

Transform hierarchies support the grouping and moving / rotating of objects relative to a parent coordinate frame.

To group several objects under a single Transform coordinate frame either:

  1. (In the scene tree) add a Transform node to the scene (right click -> Add child -> Geometry -> Transform)
  2. Drag your objects (in the scene tree) under the Transform node
  3. Set your objects’ IgnoreParentTransform property to false

Or (from version v0.41.6+):

  1. (In the editor) select the objects you want to group in the rendering view
  2. Right click -> Group

To move or rotate the objects as a group:

  1. Display a widget for the (parent) transform (created when grouping objects) by setting its ShowWidget property to true.
  2. Activate the move or rotate tool
  3. Select the transform’s widget (in the scene render view)
  4. Move / rotate the objects as a group

To change the offset of an object in a group (relative to the group’s parent transform):

  1. Select the object (in the scene render view)
  2. Activate the move or rotate tool
  3. Move / rotate the object

For an example of transform hierarchies in action see the TransformHierarchy example scene shown on editor startup.



10.7.Texturing

(Available from v0.40.6)

The editor / engine supports texturing. Typical workflow is:

  • Drag texture images into the editor (Assets -> Textures)
  • Assign the texture(s) to a shader that supports texturing (i.e. has texture inputs)
  • Assign the shader to a material
  • Apply the material to a model

IMPORTANT: Texture assets are only saved if their “IsStockTexture” property is set to false.

The short video below demonstrates the workflow:


Follows a similar workflow to texturing (see video below).

10.8.Nested Scenes

(Available from v0.40.15+)

Entire scenes can be nested within other containing scenes using the PrefabricatedScene class. The SDK comes with an example_PrefabricatedScene application sample that demonstrates how to do this programmatically.

The typical workflow is:

  • Create a scene in the editor (or programmatically) adding your models, shaders, animations etc
  • Save your scene
  • Instance your saved scene into a containing scene using the PrefabricatedScene utility class, either in the editor or programmatically. NB. You can optionally load only a branch of the nested scene’s graph via its scene root property. This is useful if your nested scene contains additional geometry and assets that assist in editing the scene but are not needed when it is nested into a containing scene.


10.9.Sprites

(Available in v0.41.4+)

The Sprite plugin comes with the following:

  • SpriteSystem
  • SpriteComponent

The SpriteComponent can be attached to a scene item to render on-screen sprites.

SpriteComponent exposes the OnClicked() signal that will emit an event when clicked. This can be wired to a slot (for example a Script’s execute() slot) either in code or in the editor’s Action panel.

10.10.Frame rate capping (Vsync)

(Available from version 0.41.6)

The frame rate can be capped to synchronize with the display device’s refresh rate (this is called vsync) on a per scene basis. The setting that controls this is Swap Interval and can be set either:

  • In the editor in Settings -> Rendering -> Swap Interval (0 == max frame rate, 1 == synchronize with refresh rate)
    1. Change the Swap Interval value in Settings
    2. Save your scene File -> Save
    3. Play the scene (by clicking the Play button in the editor)
    4. Observe the swap interval has changed
  • Programmatically via SDK::GetInstance().SetSceneSettings()

NB. The editor’s frame rate is always VSync-ed; only played scenes observe the swap interval value.

Warning: Setting Swap Interval to 0 results in high frame rates but will thrash the CPU and turn your device into a heater (handy in the winter)!

10.11.Ambient Occlusion Map Generation

10.12.Depth Buffer - Visualisation

(Available from version 0.43.0)

To visualise your scene’s depth buffer:

  1. Select your scene’s root node in the scene tree view
  2. Right click -> Make This A Child Of -> PostProcess / postprocess_depthbuffer

11.Input handling

Keyboard input handling

  1. Add a KeyboardSceneItem to your scene (scene tree view -> Right Click -> Add Child -> Input -> Keyboard)
  2. Add a script to your scene (scene tree view -> Right Click -> Add Child -> Scripting -> Script)
  3. In the Actions pane connect the Keyboard’s key events to the Script’s execute method

For an example see the example scenes shown in the Templates & Samples browser shown on startup in the editor.

Mouse input handling

  1. Add a MouseSceneItem to your scene (scene tree view -> Right Click -> Add Child -> Input -> Mouse)
  2. Add a script to your scene (scene tree view -> Right Click -> Add Child -> Scripting -> Script)
  3. In the Actions pane connect the Mouse’s move / button events to the Script’s execute method

For an example see the example scenes shown in the Templates & Samples browser shown on startup in the editor.

Touch-screen input handling

For mobile devices that don’t have physical input devices you can:

  1. Add a FirstPersonCamera to your scene
  2. Set its InputController property to TouchScreen

For more details see the Baked Lighting sample scene shown on startup, as this has a touch-screen controller set on the FirstPersonCamera.

Adding Buttons

(NB. In a future release full UI capability will be added, but for the time being buttons can be implemented as follows):

  1. Add a SpriteComponent to a scene item
  2. In the Actions pane, connect the SpriteComponent’s OnClicked event to a script’s execute() slot (NB. the scripting reference is accessible from the editor’s Start Here menu)

For more details see the Sprite documentation.

12.Pathfinding & Navigation

(NB. The Navigation System & Components are available in version 0.40.29+; prior versions had (now deprecated) SceneItem-based Navigation)

The engine comes with a path-finding plugin that provides a path-finding system, components and an example application that demonstrates their use.

The key concepts are:

  • Navigation Mesh
  • Navigation Crowd
  • Navigation Agent

Each of these concepts is modelled as an ECS component (in v0.40.29+) that can be attached to scene items.

See the example_Navigation source code accessible from the editor’s Start Here -> Templates & Samples Browser menu; alternatively, the short video below demonstrates these concepts.

13.Geometry Processing

The editor supports an evolving number of convenient geometry processing routines described in more detail in the following sections.

13.1.Boolean Mesh Operations

The editor comes bundled with a CSG plugin that supports the following operations:

  • Union
  • Intersection
  • Difference

These tools can be useful, for example, in combining meshes for generating navigation meshes as follows (also demonstrated in the video below):

  1. Import or add one or more meshes to the scene
  2. Activate the selection tool and then command-click to select multiple meshes
  3. Activate the Union, Intersection or Difference tool via its toolbar icon
  4. Hit the Enter key to apply the operation
  5. A MeshInstance that contains the modified geometry is created
  6. Add a TiledNavMeshComponent to the new MeshInstance to generate a navigation mesh from the updated geometry

14.Computer Vision

(Available in v0.43.9+)

The ComputerVision plugin adds utilities that help support the rapid creation of scriptable computer vision workflows that can be published as standalone applications in a few clicks.

The key utilities are:

  • FrameCaptureComponent – This component grabs frames from the selected input camera and converts them to textures for display in the scene’s framebuffer, in addition to making them available for forwarding to a user-specified DNN (deep neural network) model for inference
  • ObjectDetectionScriptComponent – This is a script component that can be added to your scenes to perform custom processing on detected objects. The script receives several arguments, namely the detected object’s class identifier, name and confidence level. You can then implement the script to perform custom actions in response to these detections

NB. The DNNs specified above are compressed and embedded into the scene file for hassle-free transfer / compilation, along with any other scene elements (scripts, GUI etc.) that are referenced in your scene files.

Performing actions upon detection

With the ObjectDetectionScriptComponent attached to the scene and the output image (containing the detections) called “FrameCapture_Detections”, we can filter and write the detections to file by setting the ObjectDetectionScriptComponent’s script property to:

function onObjectDetected(this, time, detections)
    for i = 0, detections:Size() - 1 do
        local detection = detections:At(i)
        local msg = string.format('Detection at time %f for classId: %d (%s) with confidence: %f at {top=%d,left=%d,right=%d,bottom=%d}',
                                  time, detection.classId, detection.className, detection.confidence,
                                  detection.top, detection.left, detection.right, detection.bottom)
        print(msg)  -- log the detection
        if detection.classId == 0 then
            -- we have detected classId 0 i.e. a person, so write the detection image
            -- called "FrameCapture_Detections" (i.e. the output of the object detector) to the /home/test directory
            -- and optionally append the detection time as a suffix to the file name (set to false in the example below)
            ext_CVScript.WriteImageToFile('FrameCapture_Detections', '/home/test', false)
        end
    end
end

This is achieved by the computer vision plugin’s use of OpenCV internally, with initial support for TensorFlow models. For details on converting a TensorFlow model for use with OpenCV see this guide here. A good source of pre-trained TensorFlow models is the TensorFlow Hub.

For examples on the use of the above refer to the ComputerVision.msf sample displayed in the example projects list on editor startup.


15.Audio

(Available from v0.43.9+)

The Audio plugin adds the following utilities for playing audio:

  • AudioSystem
  • AudioClipComponent

Initially, an AudioClipComponent is provided with a path to an audio clip. This is read and then serialised as a binary resource into the scene file in which the AudioClipComponent is attached, for hassle-free sharing of audio resources.

For a demo see the ComputerVision.msf example scene displayed on editor startup, as it utilises the AudioClipComponent to play a sound clip in response to a detected object.


16.Networking

(Available in v0.43.9+)

The Networking plugin adds the following utilities:

  • ClientHost – This class can be added to scenes to handle either client- or server-based socket communications (UDP), i.e. connecting to remote machines, sending and receiving data
  • NetworkScriptComponent – This component can be added to scenes to respond to (and receive information relating to) client / server network events such as peer connection, disconnection and message receipt

17.Web Requests (HTTP/HTTPS)

(Available in v0.43.9+)

Refer to the ComputerVision.msf sample scene displayed in the example scene browser on editor startup for a working example of both HTTPS and email support courtesy of the libCURL based CURL plugin.

18.Technical Notes


18.1.Safe Mode

To disable all scene loading, launch the editor with the “safemode” command-line argument.

18.2.Rendering overview

The engine is a hybrid scene graph / ECS architecture.

The rendering algorithm:

  1. Scene graph traversal: A pre-order traversal of the scene graph. For each item in the scene graph:
    1. The scene item’s Prepare() method is called
    2. Typically the Prepare() method enqueues an item onto a render queue. If the item is a scene manager (an Octree, for instance), any further traversal of its sub scene-graph is short-circuited, allowing the scene manager to perform frustum / occlusion culling and enqueue only the visible items in that sub scene-graph
  2. System Update: Once the scene graph has been traversed the systems are updated (a system is effectively an array of components of a specific type). A system component can optionally query the visibility of the scene item (to which it is attached) to determine if it should be updated or not.
  3. Rendering: The render queue renders the scene by performing state-based sorting of the render items it contains followed by calling each item’s Render() method

Improvements: The scene graph might be replaced with a rendering system and component at some point in the future, simplifying the above to a two-step process, i.e. system update followed by rendering.


The framework is fully versionable, achieved through two mechanisms:

  • A framework version – this is the version of the framework, i.e. the core engine. Typically, new classes inherit framework classes, adding new fields (data) and methods. If a framework class is modified then the framework version is updated and tested for in the modified serialisation code.
  • A class-specific current version – when you add a new custom class, be it a System, Component or SceneItem deriving from the framework’s AbstractSystem, AbstractComponent or SceneItem, there will come a time when you want to add new fields in a version-friendly manner. To do this, override your class’s GetCurrentVersion method to return a monotonically increasing value that you can then test for when reading your data.

For further implementation details refer to the source code.