ML for Games Course documentation

A deep dive on the NPC-Playground


The Tech Stack

To create this demo, the teams used three main tools:

  • Cubzh: the cross-platform UGC (User Generated Content) game engine.

  • Gigax: the engine for smart NPCs.

  • Hugging Face Spaces: the most convenient online environment to host and iterate on game concepts in an open-source fashion.

What is Cubzh?

Cubzh is a cross-platform UGC game engine that aims to provide an open-source alternative to Roblox.

It offers a rich gaming environment where users can create their own game experiences and play with friends.


In Cubzh, you can:

  • Create your own worlds, items, and avatars.

  • Build fast, using community-made voxel items (over 25K in the library so far) and open-source Lua modules.

  • Code games using a simple yet powerful Lua scripting API.

Cubzh is in public Alpha. You can download and play Cubzh for free on Desktop via Steam or the Epic Games Store, on Mobile via Apple’s App Store or the Google Play Store, or even play directly from your browser.

In this demo, Cubzh serves as the game engine running directly within a Hugging Face Space; users can easily clone it to experiment with custom scripts and NPC personas.

What is Gigax?

Gigax is the platform game developers use to run LLM-powered NPCs at scale.

Gigax has fine-tuned large language models for NPC interactions, using the “function calling” principle.

It’s easier to think about this in terms of input/output flow:

  • As input, the model reads a text description of a 3D scene, alongside a description of the recent events and a list of the NPC’s available actions (e.g., <say>, <jump>, <attack>, etc.).

  • The model then outputs one of these actions with parameters that refer to 3D entities present in the scene, e.g., say NPC1 "Hello, Captain!".
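The input/output flow above can be sketched in Python. Note that the prompt layout, action names, and output format below are illustrative assumptions for the sake of the example, not Gigax's actual prompt or wire format:

```python
import re


def build_prompt(scene, events, actions):
    """Assemble the model input: a scene description, recent events,
    and the NPC's available actions (layout is hypothetical)."""
    return (
        f"Scene: {scene}\n"
        f"Recent events: {'; '.join(events)}\n"
        f"Available actions: {', '.join(actions)}\n"
        "Next action:"
    )


def parse_action(output):
    """Parse a model output such as: say NPC1 "Hello, Captain!"
    into an (action, target, argument) triple."""
    match = re.match(r'(\w+)\s+(\w+)(?:\s+"([^"]*)")?', output.strip())
    if not match:
        raise ValueError(f"Unrecognized action: {output!r}")
    return match.group(1), match.group(2), match.group(3)


prompt = build_prompt(
    scene="A pirate ship deck at dawn",
    events=["The captain arrived on deck"],
    actions=["<say>", "<jump>", "<attack>"],
)
# The prompt would be sent to the fine-tuned model; here we parse
# a hard-coded response instead of calling a real model.
action, target, argument = parse_action('say NPC1 "Hello, Captain!"')
```

Constraining the model to a small action grammar like this is what makes the outputs easy to validate and execute inside the game engine.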


Gigax has open-sourced their stack! You can clone their inference stack on GitHub. For this demo, their models are hosted in the cloud, but you can download them yourself from the 🤗 Hub.
