
Godot Llama

An asset by oceancucumber

Supported Engine Version: 4.5
Version String: 1.0.0
License Version: MIT
Support Level: community
Modified Date: 2 days ago

Godot Llama brings llama.cpp into Godot 4.x as a native GDExtension so you can run local GGUF models directly in your game or tool.

Included API classes:

LlamaModel for model loading, tokenization/detokenization, vocab, and metadata
LlamaContext for context creation, generation, streaming signals, cancellation, and perf stats
LlamaSampler for configurable sampling behavior
LlamaAsyncWorker for non-blocking generation workflows
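A minimal sketch of how these classes might fit together for synchronous generation. The method and property names below (load_model, generate, temperature, etc.) are assumptions for illustration only; the actual API surface is not documented on this page, so check the bundled demo scene for the real signatures.

```gdscript
# Hypothetical usage sketch -- method and property names are assumptions,
# not the documented API. See res://addons/godot_llama/demo/demo.tscn.
extends Node

func _ready() -> void:
	var model := LlamaModel.new()
	# Load a local GGUF file (the path here is an example).
	if model.load_model("res://models/example.gguf") != OK:
		push_error("Failed to load model")
		return

	var ctx := LlamaContext.new()
	ctx.model = model  # assumed property linking a context to a loaded model

	var sampler := LlamaSampler.new()
	sampler.temperature = 0.8  # mirrors the generation controls listed below
	sampler.top_p = 0.95

	# Synchronous generation: blocks the calling thread until complete.
	var reply: String = ctx.generate("Name three llama facts.", sampler)
	print(reply)
```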

Features:

Synchronous and token-streaming text generation
Common generation controls (temperature, top_k, top_p, min_p, penalties, stop sequences, seed, max tokens)
Context state helpers (save/load state to memory or file, clear KV cache)
Demo scene included at demo.tscn
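The streaming and state features above could be wired up roughly as follows. The signal names ("token_generated", "generation_finished") and the state-helper methods are guesses for illustration, since only the feature names are documented here.

```gdscript
# Hypothetical streaming sketch -- signal and method names are assumptions.
func stream_example(ctx: LlamaContext) -> void:
	# The extension advertises token-streaming signals; these exact names
	# are placeholders, not the documented API.
	ctx.token_generated.connect(func(token: String): print(token))
	ctx.generation_finished.connect(func(): print("\n[done]"))
	ctx.generate_async("Write a haiku about llamas.")

	# The listed context state helpers (save/load state, clear KV cache)
	# might look something like this:
	var state: PackedByteArray = ctx.save_state()
	ctx.clear_kv_cache()
	ctx.load_state(state)
```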

Requirements:

Godot 4.5
Local GGUF model files
Platform binaries in addons/godot_llama/bin/ (provided by release artifacts)

Quick start:

1. Copy addons/godot_llama/ into your Godot project.
2. Open res://addons/godot_llama/demo/demo.tscn.
3. Select a .gguf model, create a context, and generate.
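For generation that does not stall the main loop, LlamaAsyncWorker is the listed entry point. Its interface is not documented on this page, so the property, method, and signal names below are hypothetical placeholders:

```gdscript
# Hypothetical non-blocking sketch using LlamaAsyncWorker -- all member
# names here are assumptions, not the documented API.
func async_example(ctx: LlamaContext) -> void:
	var worker := LlamaAsyncWorker.new()
	worker.context = ctx  # assumed property binding the worker to a context
	worker.finished.connect(func(text: String): print(text))
	worker.request("Summarize this scene in one sentence.")
	# The game loop keeps running while generation proceeds off-thread.
```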

