gradc is a small, explicit reverse-mode automatic differentiation engine written from scratch. The project focuses on correctness, clarity, and mechanical sympathy rather than convenience-driven abstractions.
The core engine represents computation as an explicit directed acyclic graph, with gradient propagation performed via a topologically ordered backward pass. Memory ownership, graph lifetime, and gradient accumulation are handled deterministically, without hidden allocations or runtime magic.
gradc is intended as:
- a learning vehicle for understanding modern ML systems,
- a foundation for differentiable physics and rendering,
- and a reference implementation that prioritizes transparency over scale.
The design avoids framework-level assumptions and exposes the full computation graph explicitly, making it suitable for experimentation, instrumentation, and extension into domain-specific differentiable systems.