This is a small, self-contained C11 MNIST trainer: a three-layer fully connected
network with a forward pass, softmax output, and SGD backpropagation. The training
loop runs end-to-end against the IDX files under assets/; the log below is from a sample run.
```text
sample 10000 loss = 1.566644
sample 20000 loss = 0.032247
sample 30000 loss = 0.081100
sample 40000 loss = 0.212121
sample 50000 loss = 0.000302
sample 60000 loss = 0.006562
epoch 0:
avg loss = 0.471291
accuracy = 86.97%
sample 10000 loss = 0.000480
sample 20000 loss = 1.143894
sample 30000 loss = 0.001678
sample 40000 loss = 0.000514
sample 50000 loss = 0.000205
sample 60000 loss = 0.005218
epoch 1:
avg loss = 0.245554
accuracy = 92.99%
sample 10000 loss = 0.001167
sample 20000 loss = 0.001660
sample 30000 loss = 0.080382
sample 40000 loss = 0.219283
sample 50000 loss = 0.145265
sample 60000 loss = 0.004101
epoch 2:
avg loss = 0.197328
accuracy = 94.31%
Final sample check:
loss = 0.000201
prediction = 7 (label = 7)
```