IBM-GPT-5.4-Coder-1B

This model is a fully fine-tuned derivative of ibm-granite/granite-4.0-1b.

Training setup:

  • Full model fine-tuning
  • No adapter methods (no LoRA or QLoRA)
  • Dual-GPU DDP training
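The card does not ship the training script, so as a rough illustration of the DDP pattern listed above, here is a minimal single-process sketch. A toy linear layer stands in for the 1B-parameter model, and the gloo backend is used so it runs on CPU; the actual dual-GPU run would launch via torchrun --nproc_per_node=2 with the nccl backend. All names here are illustrative, not taken from the real training code.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process stand-in for the dual-GPU setup: torchrun normally
# sets MASTER_ADDR/MASTER_PORT and spawns one process per GPU.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
dist.init_process_group("gloo", rank=0, world_size=1)

# Toy module in place of the full 1B-parameter model being fine-tuned.
model = torch.nn.Linear(8, 8)
ddp_model = DDP(model)  # DDP wraps the module and syncs gradients across ranks
optimizer = torch.optim.AdamW(ddp_model.parameters(), lr=1e-5)

# One full fine-tuning step: every parameter receives gradients
# (no frozen layers, no adapters).
x = torch.randn(4, 8)
loss = ddp_model(x).pow(2).mean()
loss.backward()
optimizer.step()

dist.destroy_process_group()
```

With two ranks, DDP all-reduces gradients during `backward()`, so each rank applies the same averaged update — which is why full fine-tuning needs every parameter resident on each GPU, unlike adapter methods.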
Model details:

  • Format: Safetensors
  • Model size: 2B params
  • Tensor type: BF16
Model tree for Oxy29/IBM-GPT5.4-Coder-1B:

  • Finetuned from ibm-granite/granite-4.0-1b
  • Quantizations: 2 models

Datasets used to train Oxy29/IBM-GPT5.4-Coder-1B