---
license: apache-2.0
base_model:
- Qwen/Qwen2.5-Coder-7B-Instruct
---
## Model Information

This model is the reasoning model for the Text2SQL task introduced in [Think2SQL: Reinforce LLM Reasoning Capabilities for Text2SQL](https://arxiv.org/abs/2504.15077).
## Intended use

The best performance is obtained with the System and User prompts shown below. The model is intended to be used with three inputs: the question, the evidence, and the database schema.
Starting with `transformers >= 4.43.0`, you can run conversational inference using the Transformers `pipeline` abstraction or by leveraging the Auto classes with the `generate()` function (a sketch using the Auto classes follows the pipeline example below).

Make sure to update your transformers installation via `pip install --upgrade transformers`.
```python
import torch
import transformers

model_id = "simone-papicchio/Think2SQL-7B"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

system_message = (
    "You are a helpful AI Assistant that provides well-reasoned and detailed responses. "
    "You first think about the reasoning process as an internal monologue and then provide the user with the answer. "
    "Respond in the following format: <think>\n...\n</think>\n<answer>\n...\n</answer>"
).strip()

user_message = (
    "Answer the following question with the SQL code. Use the piece of evidence and base your answer on the database schema. "
    "Given the question, the evidence and the database schema, return in the <answer> tags only the SQL script that addresses the question.\n"
    "Question:\n{question}\n\n"
    "Evidence:\n{evidence}\n\n"
    "Database Schema:\n{schema}\n\n"
    "Return only the SQL script enclosed in <answer> tags."
).strip()

# Replace these placeholders with your actual inputs.
question = "<natural-language question>"
evidence = "<evidence / external knowledge>"
schema = "<database schema>"

messages = [
    {"role": "system", "content": system_message},
    {"role": "user", "content": user_message.format(question=question, evidence=evidence, schema=schema)},
]

outputs = pipeline(
    messages,
    max_new_tokens=256,  # increase if the reasoning trace gets truncated
)
print(outputs[0]["generated_text"][-1])
```
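The reply follows the `<think>\n...\n</think>\n<answer>\n...\n</answer>` format requested in the system prompt. Below is a minimal sketch for pulling the SQL script out of the `<answer>` tags; it assumes the chat-style `outputs` produced by the pipeline call above.

```python
import re

# With chat-style input, generated_text is the full conversation;
# the last message is the assistant reply.
reply = outputs[0]["generated_text"][-1]["content"]

# Extract the SQL between the <answer> tags; fall back to the raw reply if the tags are missing.
match = re.search(r"<answer>\s*(.*?)\s*</answer>", reply, flags=re.DOTALL)
sql = match.group(1) if match else reply
print(sql)
```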
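If you prefer the Auto classes with `generate()`, here is a minimal sketch under the assumption that the same `messages` list (system prompt plus formatted user prompt) from the pipeline example is reused.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "simone-papicchio/Think2SQL-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Reuses the `messages` list built in the pipeline example above.
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=1024)

# Decode only the newly generated tokens (the reasoning trace and the answer).
response = tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(response)
```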
## Citation

```bibtex
@misc{papicchio2025think2sqlreinforcellmreasoning,
      title={Think2SQL: Reinforce LLM Reasoning Capabilities for Text2SQL},
      author={Simone Papicchio and Simone Rossi and Luca Cagliero and Paolo Papotti},
      year={2025},
      eprint={2504.15077},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2504.15077},
}
```