---
language:
- en
- multilingual
base_model: Tomlim/myt5-large
model_type: myt5
model_creator: Nekochu
pretty_name: myt5-large-Stable-Diffusion-Prompt
library_name: transformers
pipeline_tag: text-generation
tags:
- stable-diffusion
- prompt-generation
- text-generation
- encoder-decoder
- structured-output
- human-curated
- finetune
datasets:
- Nekochu/discord-unstable-diffusion-SD-prompts
license: mit
prompt_template: >-
  ### Instruction: Create stable diffusion metadata based on the given english
  description. {prompt}

  ### Response: {output}
widget:
- text: >
    ### Instruction:
    Create stable diffusion metadata based on the given english description. a
    futuristic city
    ### Response:
  example_title: Cyberpunk City
- text: >
    ### Instruction:
    Create stable diffusion metadata based on the given english description. a
    dragon in a mystical forest
    ### Response:
  example_title: Fantasy Dragon
---
# myt5-large-Stable-Diffusion-Prompt

🎨 Stable Diffusion · Text2Text · Architecture: myT5 (1.2B) · trained on 400k samples
## Run the Model

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained("Nekochu/myt5-large-SD-prompts", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("Nekochu/myt5-large-SD-prompts")

prompt = "### Instruction:\nCreate stable diffusion metadata based on the given english description. a futuristic city\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt", max_length=256, truncation=True).to(model.device)
outputs = model.generate(**inputs, max_length=256, num_beams=5, early_stopping=True)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```
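To generate metadata for other descriptions, the instruction-style prompt can be assembled with a small helper. This is a sketch; the function name `build_prompt` is my own, not part of the repository:

```python
def build_prompt(description: str) -> str:
    # Assemble the "### Instruction: ... ### Response:" prompt format
    # the model was fine-tuned on (see prompt_template in the metadata).
    return (
        "### Instruction:\n"
        "Create stable diffusion metadata based on the given english description. "
        f"{description}\n\n"
        "### Response:\n"
    )

print(build_prompt("a dragon in a mystical forest"))
```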
## Example Outputs

| Example | Rating | Generated metadata |
|---|---|---|
| Cyberpunk City | SFW | Nikon Z9 200mm f_8 ISO 160, (giant rifle structure), flawless ornate architecture, cyberpunk, neon lights, busy street, realistic, ray tracing, hasselblad |
| Fantasy Dragon | SFW | masterpiece, best quality, cinematic lighting, 1girl, solo, |
| Anime Succubus | NSFW | masterpiece, best quality, highly detailed background, intricate, 1girl, (full-face blush, aroused:1.3), long hair, medium breasts, nipples |
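The generated metadata is a comma-separated tag list, but weighted groups such as `(full-face blush, aroused:1.3)` contain commas of their own, so a naive `split(",")` would break them apart. A minimal parenthesis-aware splitter (my own sketch, not part of the repository):

```python
def split_tags(metadata: str) -> list[str]:
    # Split on commas, but only at parenthesis depth 0 so that
    # weighted groups like "(full-face blush, aroused:1.3)" stay intact.
    tags, buf, depth = [], [], 0
    for ch in metadata:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth = max(0, depth - 1)
        if ch == "," and depth == 0:
            tag = "".join(buf).strip()
            if tag:
                tags.append(tag)
            buf = []
        else:
            buf.append(ch)
    tag = "".join(buf).strip()
    if tag:
        tags.append(tag)
    return tags

print(split_tags("masterpiece, 1girl, (full-face blush, aroused:1.3), long hair"))
```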
## Experimental Model

- Structured output tasks perform best.
## Training Evolution & Alternative Attempts

| Base model | Tokenizer | Outcome |
|---|---|---|
| google/byt5 | byte-level | tokenizer failed |
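For context, byT5 operates directly on UTF-8 bytes: each token id is a byte value offset by its special tokens (3 in byT5), so sequence length equals byte length. A minimal sketch of that id mapping (not the actual tokenizer class):

```python
# byT5-style byte-level ids: UTF-8 byte value + 3 (offset for pad/eos/unk).
text = "a futuristic city"
ids = [b + 3 for b in text.encode("utf-8")]
print(len(ids))  # one id per byte
```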