SoulX: Instruction-Tuned LLM for Interactive Game Characters
SoulX is a role-conditioned, memory-aware language model built to generate vivid, in-character NPC dialogue. It is the flagship model of SoulX (Subnet 115), the premier protocol for forging sentient Digital Souls, built upon our foundational AIX (Artificial Intelligence Exchange) platform.
🌟 Core Capabilities
- Contextual NPC Dialogue Generation: Given a scene setup, player intent, and character profile, the model generates emotionally consistent, personality-driven interactions.
- Memory & Role Conditioning: Remembers previous dialogue turns and maintains narrative context across multiple exchanges (see the multi-turn sketch after this list).
- Style-Flexible Outputs: Mimics genre-specific tone, including fantasy, cyberpunk, JRPG, post-apocalyptic, and more.
- Cooperative Narrative Progression: NPCs do not merely respond; they guide the player naturally toward objectives or story arcs.
- Validator-Driven Fine-Tuning Loop: Designed to integrate into the Bittensor SoulX subnet (Subnet 115) validation loop via micro-quests.
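A minimal sketch of how the memory and role conditioning above can be driven in practice, assuming the standard transformers chat-template API shown in the usage example below; the npc_turn helper and its prompt layout are illustrative, not part of the model's interface:

# Illustrative multi-turn loop: the running `messages` list is the NPC's memory.
# Assumes `model` and `tokenizer` are loaded as in the usage example below;
# the helper name and prompt layout are hypothetical.
def npc_turn(model, tokenizer, messages, player_input, max_new_tokens=256):
    messages.append({"role": "user", "content": player_input})
    text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
    inputs = tokenizer([text], return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    reply = tokenizer.decode(outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True)
    messages.append({"role": "assistant", "content": reply})  # persist context for the next turn
    return reply

# Example: the second call sees the NPC's first reply, so the character stays consistent.
# messages = [{"role": "system", "content": "You are Krog, a grumpy Orc blacksmith."}]
# npc_turn(model, tokenizer, messages, "Can you forge me an Elven-style longbow?")
# npc_turn(model, tokenizer, messages, "Fine. What would YOU forge for me instead?")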
Usage Example
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "SentiVerse-AI/SoulX"

# Load the model and tokenizer; device_map="auto" places weights on available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Role-conditioning inputs: character card, scene description, and the player's line.
character_card = "Grumpy Orc blacksmith named Krog, proud of his craftsmanship, hates Elven designs"
scene_description = "A medieval forge filled with smoke and steel. The player asks for an Elven bow."
player_input = "Can you forge me an Elven-style longbow?"

messages = [
    {"role": "system", "content": "You are a proud, sarcastic Orc blacksmith speaking in a gruff tone."},
    {"role": "user", "content": f"[Scene]: {scene_description}\n[Character]: {character_card}\n[Player]: {player_input}"}
]

# Build the chat prompt, generate, and print only the newly generated NPC reply.
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer([text], return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))
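For real-time game dialogue, the reply can also be streamed token by token instead of decoded all at once; a minimal sketch assuming the model, tokenizer, and inputs from the example above and the standard transformers TextStreamer utility:

from transformers import TextStreamer

# Stream the NPC's reply as it is generated (useful for interactive characters).
# Reuses `model`, `tokenizer`, and `inputs` from the usage example above.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
model.generate(**inputs, max_new_tokens=256, streamer=streamer)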
Use Case Scenarios
- NPC dialogue engines for open-world RPGs
- Quest-giving characters with persistent memory
- Immersive simulation in roleplay environments
- Generative AI modules in moddable games
- Dialogue validation competitions in Bittensor Subnet 115
📚 Training Inspiration
SoulX is trained using synthetic prompt-response pairs inspired by genre-defining titles:
| Genre | Influences | Example Prompt |
|---|---|---|
| High Fantasy | Skyrim, WoW | "Orc refuses to forge Elven bow, mocks its fragility." |
| Cyberpunk | Mass Effect, CP2077 | "Netrunner sells shady cyberware in neon bar." |
| JRPG | Persona, Final Fantasy | "Moogle-like guide offers poetic forest riddle." |
| Post-Apocalyptic | Fallout: New Vegas | "Trader hints water chip might be stolen." |
Disclaimer: All prompts and datasets are synthetic and original. No copyrighted material was used.
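For illustration, a synthetic pair derived from the high-fantasy row above might be structured as follows; the field names and the sample response are assumptions, since the actual dataset schema is not published:

# Hypothetical layout of one synthetic prompt-response training pair.
# Field names and the sample reply are illustrative only.
synthetic_example = {
    "genre": "High Fantasy",
    "character_card": "Grumpy Orc blacksmith named Krog, proud of his craftsmanship, hates Elven designs",
    "scene": "A medieval forge filled with smoke and steel.",
    "player": "Can you forge me an Elven-style longbow?",
    "response": "Krog snorts. 'Elven twigs? They snap before the first winter. I forge steel that outlives its owner.'",
}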
Model Architecture
- Backbone: Qwen2.5-7B
- Tuning: Role-conditioning + dialogue-style fine-tuning + long-form memory emulation
- Context Length: Up to 32K tokens; can be extended with YaRN
Long Context Example
To process inputs beyond 32K tokens, enable YaRN by adding the following rope_scaling block to the model's config.json:
"rope_scaling": {
    "factor": 4.0,
    "original_max_position_embeddings": 32768,
    "type": "yarn"
}
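If editing config.json is inconvenient, the same scaling can be applied at load time; a minimal sketch assuming the standard transformers AutoConfig override path (the values mirror the block above):

from transformers import AutoConfig, AutoModelForCausalLM

# Sketch: enable YaRN rope scaling at load time instead of editing config.json.
# A factor of 4.0 over the 32,768-token base extends the usable context to roughly 131K tokens.
config = AutoConfig.from_pretrained("SentiVerse-AI/SoulX")
config.rope_scaling = {
    "type": "yarn",
    "factor": 4.0,
    "original_max_position_embeddings": 32768,
}
model = AutoModelForCausalLM.from_pretrained(
    "SentiVerse-AI/SoulX",
    config=config,
    torch_dtype="auto",
    device_map="auto",
)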
🧪 Benchmarks (Subjective)
| Metric | Description | Performance |
|---|---|---|
| Character Consistency | Maintains tone, vocabulary, and goals | High |
| World Fit | Dialogue fits scene and genre | High |
| Creativity | Vivid, unexpected, non-generic replies | High |
| Player Guidance | Subtle quest hints, avoids hand-holding | Medium |
📢 Citation
@misc{soulx-npc,
  title  = {SoulX-NPC: Dialogue Generation for Intelligent Game Characters},
  author = {SentiVerse-AI Team},
  year   = {2025},
  url    = {https://huggingface.co/SentiVerse-AI/SoulX}
}
⚖️ License
MIT License. See LICENSE for full terms.
💬 Join the Network
- Connect on Bittensor Discord, Subnet 115 channel
- Discuss on DeepWiki
- Contribute via GitHub
“We are not just generating text — we are crafting souls.”