FIT: Flexible and Interpretable Training Library

FIT is a lightweight machine learning library built from scratch in Python with NumPy. It provides a PyTorch-like API with automatic differentiation and neural network components.

Features

  • Automatic Differentiation: Gradients are computed automatically for building and training neural networks (see the sketch after this list)
  • Neural Network Components: Linear layers, activations (ReLU, Softmax), BatchNorm, Dropout
  • Optimizers: SGD, SGD with momentum, Adam
  • Training Utilities: Training loop, learning rate schedulers, gradient clipping
  • Monitoring: Training metrics tracking and visualization
  • Model I/O: Save and load your trained models
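
A minimal autodiff sketch, assuming Tensor overloads the basic arithmetic operators and exposes a PyTorch-style backward()/grad interface (the README does not spell this out, so treat these attribute names as assumptions):

from fit.core.tensor import Tensor

# Build a tiny expression and backpropagate through it
x = Tensor([2.0])
y = x * x + x        # y = x^2 + x
y.backward()         # reverse-mode autodiff populates gradients

print(x.grad)        # dy/dx = 2*x + 1, so roughly [5.0]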

Installation

Using pip (recommended)

pip install git+https://github.com/Klus3kk/fit.git

From source

git clone https://github.com/Klus3kk/fit.git
cd fit
pip install -e .

Using Docker

docker-compose up fit-ml

Quick Start

from fit.core.tensor import Tensor
from fit.nn.sequential import Sequential
from fit.nn.linear import Linear
from fit.nn.activations import ReLU, Softmax
from fit.train.loss import CrossEntropyLoss
from fit.train.optim import Adam
from fit.train.trainer import Trainer

# Create a simple model for binary classification
model = Sequential(
    Linear(2, 4),
    ReLU(),
    Linear(4, 2),
    Softmax()
)

# Prepare data
X = Tensor([[0, 0], [0, 1], [1, 0], [1, 1]])  # XOR problem
y = Tensor([0, 1, 1, 0])

# Define loss function and optimizer
loss_fn = CrossEntropyLoss()
optimizer = Adam(model.parameters(), lr=0.01)

# Create trainer and train
trainer = Trainer(model, loss_fn, optimizer)
trainer.fit(X, y, epochs=100)

# Make predictions
predictions = model(X)
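
To turn the probability outputs into class labels, one option is NumPy's argmax; this assumes the Tensor's values are reachable through a .data attribute holding the underlying NumPy array, which may differ in the actual API:

import numpy as np

# Pick the most probable class per sample (.data assumed to be the raw NumPy array)
pred_labels = np.argmax(predictions.data, axis=1)
print(pred_labels)   # ideally [0, 1, 1, 0] for the XOR targets after training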
