This repository is for the **CSIRO - Image2Biomass Prediction** Kaggle competition, a collaboration between CSIRO, Meat & Livestock Australia (MLA), and Google Australia.
- **Objective:** Build a model that estimates pasture biomass from top-view images of Australian pastures.
- **Prize Pool:** $75,000 USD
- **Timeline:**
- **Evaluation Metric:** Globally weighted coefficient of determination (R²)
- **Competition URL:** https://www.kaggle.com/competitions/csiro-biomass
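A weighted R² can be illustrated in plain Python. This is only a sketch: Kaggle's exact "globally weighted" formulation (e.g. whether all five targets are pooled before scoring) may differ.

```python
def weighted_r2(y_true, y_pred, weights):
    """Weighted coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(w * y for w, y in zip(weights, y_true)) / sum(weights)
    ss_res = sum(w * (yt - yp) ** 2 for w, yt, yp in zip(weights, y_true, y_pred))
    ss_tot = sum(w * (yt - mean) ** 2 for w, yt in zip(weights, y_true))
    return 1 - ss_res / ss_tot

print(weighted_r2([1, 2, 3], [1, 2, 3], [1, 1, 1]))  # perfect fit -> 1.0
```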
Predict 5 biomass measurements (in grams) from pasture images:

- `Dry_Green_g` - Dry green matter
- `Dry_Dead_g` - Dry dead matter
- `Dry_Clover_g` - Dry clover matter
- `GDM_g` - Green dry matter
- `Dry_Total_g` - Total dry matter (highest weight in loss: 0.5)

**Loss Weights:** `[0.1, 0.1, 0.1, 0.2, 0.5]` - `Dry_Total_g` is weighted highest.
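The weighted loss can be sketched in plain Python; the actual loss in `src/train.py` presumably operates on torch tensors, so this is just an illustration of the weighting:

```python
# Weights in target order: Dry_Green, Dry_Dead, Dry_Clover, GDM, Dry_Total.
WEIGHTS = [0.1, 0.1, 0.1, 0.2, 0.5]

def weighted_mse(pred, target, weights=WEIGHTS):
    """Squared error per target, weighted so Dry_Total_g dominates."""
    return sum(w * (p - t) ** 2 for w, p, t in zip(weights, pred, target))

# An error of 2 g on Dry_Total_g alone contributes 0.5 * 2**2:
print(weighted_mse([10, 10, 10, 10, 10], [10, 10, 10, 10, 8]))  # 2.0
```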
```
image2biomass/
├── src/
│   ├── train.py              # Main training script with W&B integration
│   ├── models.py             # UnifiedModel class for flexible backbones
│   ├── utils.py              # Model configs and helper functions
│   ├── image_processing.py   # BiomassDataset and transforms
│   ├── data_preprocessing.py # Data wrangling utilities
│   ├── submission.py         # Inference script for Kaggle submission
│   └── eda.ipynb             # Exploratory data analysis
├── data/
│   ├── train/                # Training images (357 images, 2000x1000px)
│   ├── test/                 # Test images for submission
│   ├── train.csv             # Raw training data (long format)
│   ├── y_train.csv           # Processed targets (wide format)
│   └── sample_submission.csv
├── models/                   # Saved model checkpoints (.pt files)
├── scripts/                  # Cloud setup scripts
├── results/                  # EDA visualizations
├── config.yaml               # Default training configuration
├── sweep.yaml                # W&B hyperparameter sweep config
└── pyproject.toml            # Dependencies (uv/pip)
```
```bash
# Using uv (recommended)
uv sync

# Or pip
pip install -e .
```
Dependencies: PyTorch 2.9+, timm 1.0+, wandb, pandas, matplotlib
```bash
# Standard training with config.yaml
python src/train.py

# Hyperparameter sweep with W&B
python src/train.py --sweep
```
The `src/submission.py` script loads a traced model and generates predictions. Update `MODEL_PATH` to point to your trained model.
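A minimal sketch of that inference flow, assuming the checkpoint is a TorchScript (traced) model; the path and input handling below are illustrative, not the actual contents of `src/submission.py`:

```python
import torch

MODEL_PATH = "models/model.pt"  # hypothetical path; point this at your checkpoint

def predict(model_path: str, images: torch.Tensor) -> torch.Tensor:
    """Load a traced model and return a (batch, 5) tensor of biomass predictions."""
    model = torch.jit.load(model_path, map_location="cpu")
    model.eval()
    with torch.no_grad():
        return model(images)
```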
Uses `UnifiedModel` from `src/models.py`, a flexible wrapper around any timm backbone:
```
Input Image (2000x1000)
  -> Crop center (500-1500 px)
  -> Resize to model-specific size (224-518px)
  -> timm backbone (feature extraction)
  -> Regression head (Linear -> ReLU -> Dropout -> Linear -> ReLU)
  -> 5 biomass predictions
```
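The regression-head stage can be sketched in PyTorch. The hidden size, dropout rate, and feature dimension below are illustrative assumptions; the real `UnifiedModel` may use different values:

```python
import torch
import torch.nn as nn

class RegressionHead(nn.Module):
    """Linear -> ReLU -> Dropout -> Linear -> ReLU head, as in the pipeline above.

    `hidden` and `p_drop` are hypothetical defaults, not the repo's values.
    """
    def __init__(self, in_features: int, hidden: int = 512,
                 n_targets: int = 5, p_drop: float = 0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Dropout(p_drop),
            nn.Linear(hidden, n_targets),
            nn.ReLU(),  # final ReLU keeps biomass predictions non-negative
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

head = RegressionHead(in_features=192)  # 192 = vit_tiny feature dimension
feats = torch.randn(4, 192)             # stand-in for backbone features
print(head(feats).shape)                # torch.Size([4, 5])
```

The trailing ReLU is a natural choice here because all five targets are non-negative masses in grams.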
Defined in the `MODEL_CONFIGS` dict in `src/utils.py`:
```yaml
lr: 0.0001
n_epochs: 1
train_split: 0.8
batch_size: 4
bf16: true     # Mixed precision (CUDA/MPS only)
patience: 5    # Early stopping patience
model_class: UnifiedModel
model_name: vit_tiny_patch16_224
```
Bayesian optimization over model architectures with Hyperband early termination.
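Such a sweep can be sketched as a W&B config dict (the Python equivalent of `sweep.yaml`); the metric name, parameter names, and ranges here are illustrative assumptions, not the repo's actual choices:

```python
# Hypothetical W&B sweep configuration mirroring the description above.
sweep_config = {
    "method": "bayes",  # Bayesian optimization over hyperparameters
    "metric": {"name": "val_loss", "goal": "minimize"},
    "early_terminate": {"type": "hyperband", "min_iter": 1},
    "parameters": {
        "model_name": {"values": ["vit_tiny_patch16_224", "convnext_tiny"]},
        "lr": {"distribution": "log_uniform_values", "min": 1e-5, "max": 1e-3},
    },
}
# Registered with e.g.: wandb.sweep(sweep_config, project="image2biomass")
```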
Training data: 357 unique images with 5 targets each
Image preprocessing: Center crop (500-1500px) -> Resize -> Normalize
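As an illustration of the crop step, here is a hypothetical helper (not taken from `src/image_processing.py`) that keeps pixel columns 500-1500 of a 2000px-wide image:

```python
def center_crop_columns(img_rows, left=500, right=1500):
    """Keep pixel columns [left, right) of each row.

    img_rows: image as a list of rows (each row a list of pixel values).
    The 500-1500 bounds mirror the center crop described above.
    """
    return [row[left:right] for row in img_rows]

# A fake 2000x1000 image: 1000 rows of 2000 zero-valued pixels.
img = [[0] * 2000 for _ in range(1000)]
cropped = center_crop_columns(img)
print(len(cropped), len(cropped[0]))  # 1000 1000
```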
Uses Weights & Biases (wandb):
| File | Purpose |
|---|---|
| `src/train.py` | Main training function |
| `src/models.py` | `UnifiedModel` class |
| `src/utils.py` | `MODEL_CONFIGS` dict |
| `src/image_processing.py` | `BiomassDataset` class |