Wolf Modules

Author: thothonegan
Tags: c++ gamedev wolf_engine


You are writing a library in C++. It's a great library with lots of functionality, but there's one key function that must be called before anything else. Most libraries solve this by creating some init function the host application has to call (SDL_Init, etc.). But if you have a lot of libraries with a lot of these types of functions, it quickly becomes repetitive. What if there's a way we can automate this?

Possible solutions:

Build system - Use shared libraries

If we are able to use/require shared libraries, this generally becomes simple. On every platform, there is usually some form of function that gets called when the shared library is loaded and/or unloaded (e.g. DllMain() on Windows, __attribute__((constructor)) with GNU compilers, etc.). Even if there isn't, we can wrap our loader to automatically call the function we want, e.g. (pseudocode):

    for (module : modulesFoundOnDisk)
        auto modInfo = getFunction("createModuleInfo")();

There are a lot of reasons we might be using static libraries, though. What then?

Code - Static Initialization

Static initialization is the process C++ uses to set up global/static variables before main() is run. For example, if you initialize a global variable by calling a function or a lambda, it'll run before the program does. e.g.

    #include <iostream>

    int i = []() -> int {
        std::cout << "Before main" << std::endl;
        return 0;
    }();

    int main () {
        std::cout << "Main" << std::endl;
        return 0;
    }

By making the function register itself somewhere, we can control when the code we want gets run (see unit testing frameworks for an example of this). So this is exactly what we want, right?

Except for one tiny problem. If you use this in an application, it will work exactly as expected. If you use this in a shared library, it'll run on load (similar to our previous case). If you try this in a static library though, it won't run at all!

The big benefit of static libraries is that they just act like containers for your code. When you link one into an application, it only brings in the parts that are needed. If you have a 100MB static library but only use 1MB of it, it will only pull in the 1MB. And the static initialization trick is pretty much abusing a global which nothing in the library ever uses, so it's first on the chopping block.

So to fix this, you need a call chain from the app to reach that variable. Which means you need some function which people are guaranteed to call, and we're back to the original problem. (e.g. Android NDK apps have to call app_dummy() so the linker doesn't strip some of the non-called helpers: http://blog.beuc.net/posts/Make_sure_glue_isn__39__t_stripped/ ).

So are there any other possibilities?

Code Generation

Since the problem is that we don't want users to have to manually call the functions, why don't we do it ourselves! As long as we know which modules they linked in, we could figure out the calls and generate the code as part of the build system. Though then our problem moves up a layer: how do we know which libraries linked to what? If I include WolfWindow, I get WolfRenderer as part of it, meaning I need to statically initialize both of them. But it's a doable solution! All you have to do is overhaul your entire build system. And of course that's what I ended up doing. So let's talk modules!

Wolf Modules

A Wolf Module is basically a library packaged in a specific way using a series of specialized tools, which guarantees some requirements such as having initialization. There are three major parts to it: what it looks like as a user of modules, the file format on disk if you need to poke around, and how it's implemented underneath.

User Perspective

In Wolf, one of the core classes is Application. It basically represents a program and lets you react to events that happen. Most importantly, it has an init() function which is run on startup, and a free() function which is run when it's destroyed. It also manages the program lifecycle.

A Module in Wolf essentially does the same thing for a library: it consists of v_init() and v_free() functions (using the proper naming conventions, unlike init : blog post on that later). On load or static initialization, it's guaranteed to call v_init(), and likewise v_free() on free, allowing the library to automatically prepare itself. Just like an application: subclass WolfModule, write a few macros, and you're magically a module with your own WolfCore_[name]_createModuleInfo function.

If you're using an older Wolf application, you also have to add a few new macros to your application class so it knows to register itself, and start using wolf_begin_application instead of catalyst_begin_executable, which will tell Catalyst (the build system) to generate the extra information needed. Once that's done, it'll magically call every static module's setup and teardown before/after your normal init()! No more WolfRenderer::registerResources() nonsense required.


File Format on Disk

Note the filesystem layout isn't guaranteed to be completely stable (I might add more folders later), but it's working so far. If you look into a module, you'll see something that doesn't really look like a normal library file:

├── Debug
├── Headers
│   └── WolfAnalytics
│       ├── Logger.hpp
│       ├── Manager.hpp
│       ├── Module.hpp
│       └── WolfAnalytics.hpp
├── Library
│   └── x86_64-linux
│       └── Debug
│           └── libWolfAnalytics-static.a
├── Licenses
└── module.json

  • Debug : Contains debug information. At the moment, only Windows puts its PDB files here.

  • Headers : Contains all of the headers needed to use the library. Catalyst automatically adds this as an include directory. Generally you have a top-level folder named the same as the module, which has all your headers (including the meta header ModuleName/ModuleName.hpp and the module header Module.hpp).

  • Library : Contains all of the variations of the library. All of them are in the format Library/{catalystarch}-{catalysttarget}/{buildtype}/{moduleName}[-static].{ext}. The specifics of the name are based on the platform and what the linker expects. Basically, linking 'moduleName' should automatically pick the right library from this directory.

  • Licenses : Right now it's just a place where a text version of the license (if any) can go. Eventually I would like it to have a control file which will know to automatically copy the license into the app directory, or things like that. Look at Unreal's license system for a similar idea.

  • module.json : Might eventually become module.wexpr. This file contains information about the module, including which modules it depends on, any other system libraries it needs to link to, and so on: basically all the information the build system needs to be able to use the module. e.g.

    {
        "name" : "WolfAnalytics",
        "dependent_modules" : [ "WolfCore" ]
    }

Eventually I want to add:

  • CMake (or Catalyst) : CMake/Catalyst macros which are automatically included when loaded
  • Resources : Resources that can be copied into the resulting product. Wolf would automatically add it as a resource source.

At the moment, modules are completely a build-time construct, but eventually they'll have a stripped form which allows them to be packaged with the application (such as if you use a module in its shared form). Luckily all of these details are handled by Catalyst.

Internals of the module system

There are quite a lot of gory guts here, and pieces of it I will explain more later. The core of the system from a build perspective, though, is Catalyst. Catalyst is essentially a build system which sits on top of CMake, adding a ton of extra functionality. For example, instead of the normal add_executable command, it has multiple variants which automatically add functionality, such as catalyst_begin_module / catalyst_end_module, which create a module in the correct format. It also supports catalyst_module_use instead of target_link_libraries, which both links the module and finds/links all child modules. It even adds all of the include directories properly, so you don't end up with extra include dirs you won't use.
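As a rough sketch, a module's build script might look something like this. The command names are the ones described above, but the exact argument forms are my assumption, not Catalyst's real API:

```cmake
# Hypothetical Catalyst build script for a module -- argument forms
# are guesses based on the commands described in this post.
catalyst_begin_module (WolfAnalytics)

    # Instead of target_link_libraries: links WolfCore, plus every module
    # WolfCore itself depends on, and adds their include directories.
    catalyst_module_use (WolfAnalytics WolfCore)

catalyst_end_module ()
```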

Lastly, there is wolf_add_application, which tells Catalyst it needs to generate some extra code. When you use it, Catalyst will create a source file which lists every module you statically linked, and notifies Wolf's module system that they were loaded statically, including all dependencies. It looks something like:

    // --- GENERATED BY Catalyst ---

    #include <WolfConsole/Console.hpp>

    using WolfConsole::console;

    WOLFCORE_MODULE_DECLARE(WolfConsoleDriverVFS, WolfConsoleDriverVFS::Module)
    // ... etc

    void p_CATALYST_Endless_Application_registerStaticModules (); // extern

    void p_CATALYST_Endless_Application_registerStaticModules ()
    {
        auto res = WolfCore::ModuleManager::manager()->loadModulesStatically({
            // ... etc
        });

        if (res.isError())
        {
            console.error() << "Unable to load static modules Catalyst had selected." << WolfConsole::EndLine;
        }
    }
The application macros, which are part of your normal Application.hpp file, then automatically wrap this function into your Application, which guarantees it's all set up before init() runs. Eventually, when shared libraries are added, they'll run automatically as part of the same system (though loaded dynamically), which will lead to the ultimate goal of a lot of this: dynamically reloadable C++ code! But that's a topic for another day.

So that's Wolf Modules! They basically allow a library to be more than a bundle of code with separate headers: the build system knows a lot more about them, and the runtime side allows the module to expose more information too, along with a guaranteed lifecycle that's easy to use. And no more 'if you use this library, you have to call this first' crap!