---
tags:
- kernel
license: apache-2.0
---
# Optimizer

Optimizer is a Python package that provides:

- PyTorch implementations of recent optimizer algorithms
- Support for parallelism techniques for efficient large-scale training
## Currently implemented

- Muon
## Usage
```python
import torch
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP

from kernels import get_kernel

optimizer = get_kernel("motif-technologies/optimizer")

model = ...  # your model here
fsdp_model = FSDP(model)

optim = optimizer.Muon(
    fsdp_model.parameters(),
    lr=0.01,
    momentum=0.9,
    weight_decay=1e-4,
)
```
## Pre-commit Hooks
This project uses pre-commit to automatically check and format code before commits.
### Setup
Install pre-commit:

```shell
pip install pre-commit
```

Install the git hooks:

```shell
pre-commit install
```
Once installed, the configured hooks will run automatically on each commit.
### Included Hooks
The following tools are run via pre-commit:
- yapf – Python code formatter
- typos – Spell checker for common typos
- isort – Organizes and sorts Python imports
- clang-format – Formats C++/CUDA code (`--style=file`)
- pymarkdown – Lints and auto-fixes Markdown files
- actionlint – Validates GitHub Actions workflows
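A hook list like the one above is declared in a `.pre-commit-config.yaml` at the repository root. The fragment below is a hypothetical sketch covering two of the hooks; the actual repository URLs, pinned `rev` versions, and arguments used by this project may differ.

```yaml
# Hypothetical sketch of .pre-commit-config.yaml; revs are examples,
# not this project's pinned versions.
repos:
  - repo: https://github.com/google/yapf
    rev: v0.40.2
    hooks:
      - id: yapf
  - repo: https://github.com/PyCQA/isort
    rev: 5.13.2
    hooks:
      - id: isort
```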
### Usage
Run all checks on the entire codebase:

```shell
pre-commit run --all-files
```

Run a specific hook (example: isort):

```shell
pre-commit run isort --all-files
```