🦙 Shiv's Custom Local LLM (Ollama Ready)

Welcome to ollama-model-shiv, a custom-built local language model designed to run completely offline using Ollama. This repo packages everything — from Modelfile to model blobs — ready for Docker-based deployment and local inference.


Model link: https://huggingface.co/shiv1119/ollama-model-shiv

📁 Directory Structure

OllamaModelBuild/
├── model_test.py          # Sample Python script to interact with the model
├── .gitattributes         # Git LFS or text handling config
└── ollama_clean/
    ├── docker-compose.yml # Docker config to run Ollama with this model
    ├── Modelfile          # Ollama build instructions
    └── models/            # Model weights & metadata
        ├── blobs/         # Binary blobs of the model
        └── manifests/     # Manifests describing the model structure


🚀 Features

  • ✅ 100% Offline — No internet or API key needed
  • 🐋 Docker-ready with docker-compose.yml
  • ⚡ Works with Ollama CLI
  • 🔁 Full model reproducibility via blobs and manifests
  • 🧠 Based on LLaMA 2 / open-weight LLM architecture

🛠️ Getting Started

🔧 1. Install Prerequisites
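The original section lists no specific steps; a typical check (assumption: Docker plus Docker Compose are required, Python 3 only for model_test.py) looks like:

```shell
# Check for Docker (required)
docker --version || echo "Docker not found: install from https://docs.docker.com/get-docker/"

# Check for Docker Compose (required; either the plugin or the standalone binary)
docker compose version || docker-compose --version || echo "Docker Compose not found"

# Check for Python 3 (optional, only needed for model_test.py)
python3 --version || echo "Python 3 not found"
```

Each check falls back to a hint instead of failing, so the script can run on a bare machine.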

🐋 2. Run the Model Using Docker

In the root of the project:

cd ollama_clean
docker-compose up --build
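A minimal compose file for this setup might look roughly like the following (a sketch, not the exact `ollama_clean/docker-compose.yml` in this repo; the image name and volume paths are assumptions):

```yaml
# Hypothetical sketch of ollama_clean/docker-compose.yml
services:
  ollama:
    image: ollama/ollama            # official Ollama image
    ports:
      - "11434:11434"               # Ollama's default API port
    volumes:
      - ./models:/root/.ollama/models   # mount the packaged blobs/ and manifests/
```

Mounting the local `models/` directory is what lets the container serve the custom blobs without downloading anything.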

This builds and runs the model container locally with your custom blobs and Modelfile.

🧪 Test the Model (Optional)

docker-compose exec ollama ollama run tinyllama "Hello"

Use the included Python script to test interaction:

python model_test.py
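A minimal `model_test.py` might look like this (a sketch, assuming the Ollama HTTP API at http://localhost:11434 and the `tinyllama` model tag; the function names here are illustrative):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def query(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(query("tinyllama", "Hello"))
```

With `stream` set to `False` the server returns one JSON object rather than a stream of chunks, which keeps the client a single request/response pair.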

Customize it to query your local Ollama model running at http://localhost:11434.

🧰 Model Components

Modelfile: Blueprint for Ollama to build the model

blobs/: Raw model weights

manifests/: Metadata describing model format/version

docker-compose.yml: Encapsulates build/run config
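A Modelfile is a short build recipe for Ollama; a minimal example (illustrative only, not the exact file shipped in this repo) looks like:

```
# Hypothetical Modelfile sketch
FROM tinyllama                      # base model (illustrative)
PARAMETER temperature 0.7           # sampling temperature
SYSTEM You are a helpful assistant.
```

Running `ollama create <name> -f Modelfile` resolves the `FROM` line, applies the parameters, and writes the result into `blobs/` and `manifests/`.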

🧠 About Ollama

Ollama makes it simple to run LLMs locally on your own machine — private, fast, and API-free.

📦 Repo Purpose

This repository was created to:

Host a working local LLM solution

Enable offline inference

Serve as a template for packaging custom models with Ollama

📜 License

This repo is for educational/research purposes only. Please ensure you comply with the licenses of any base models used (e.g., LLaMA 2, Mistral).

🙌 Credits

Crafted with ❤️ by Shiv Nandan Verma
