# dreamer-mini-v1

## Overview
This is a test model.
## Technical notes

- Base: `openai/gpt-oss-20b` (bf16)
- Steering: rank-1 delta on Q/K/V across 24 layers (RMSNorm-aware)
- Concept vector: `concept_vec_v15k.pt`, shape [24, 6, 2880], gain=0.5
- Checkpoint: single baked weights (no LoRA/adapters; knowledge ≈ base)
- Data used: neutral_examples=86376, pairs_used=14396
- Source files: `narukijima/dreamer` → `D_instruction_pairs_en.jsonl`, `D_instruction_pairs_ja.jsonl`
- Inference: use the base tokenizer & chat template
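A rank-1 delta baked into a weight matrix amounts to adding a scaled outer product of two vectors. The sketch below is only an illustration of that operation; the function name, the exact update rule, and the toy shapes are assumptions, since the card states only "rank-1 delta on Q/K/V across 24 layers" with gain=0.5.

```python
import torch

def bake_rank1_delta(W: torch.Tensor, u: torch.Tensor,
                     v: torch.Tensor, gain: float = 0.5) -> torch.Tensor:
    """Return W + gain * u v^T.

    Shapes: W is [out, in], u is [out], v is [in].
    The added term has rank 1, so the base weights are perturbed
    along a single direction only.
    """
    return W + gain * torch.outer(u, v)

# Toy example: a 4x3 weight updated along one (u, v) direction.
W = torch.zeros(4, 3)
u = torch.tensor([1.0, 0.0, 0.0, 0.0])
v = torch.tensor([0.0, 1.0, 0.0])
W2 = bake_rank1_delta(W, u, v, gain=0.5)
print(W2[0, 1].item())  # 0.5
```

Because the update is rank-1, the steered checkpoint differs from the base by a single direction per projection, which is why it can ship as plain baked weights with no adapter files.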
## Quick inference

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

M = "narukijima/dreamer-mini-v1"
tok = AutoTokenizer.from_pretrained(M, trust_remote_code=True)
mdl = AutoModelForCausalLM.from_pretrained(
    M, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)

# Build the prompt with the base model's chat template.
msgs = [{"role": "user", "content": "test"}]
p = tok.apply_chat_template(msgs, tokenize=False, add_generation_prompt=True)

out = mdl.generate(
    **tok(p, return_tensors="pt").to(mdl.device),
    max_new_tokens=64, do_sample=True, temperature=0.7,
)
print(tok.decode(out[0], skip_special_tokens=True))
```