A lightweight C++ neural network library inspired by Andrej Karpathy's micrograd. This implementation provides a minimalistic yet powerful neural network framework with automatic differentiation capabilities.
- Automatic differentiation engine
- Neural network primitives:
  - Neurons with configurable inputs
  - Layers with multiple neurons
  - Multi-layer perceptron (MLP) architecture
- Modern C++ implementation (C++17)
- Header-only library
- Efficient memory management using smart pointers
- C++17 compatible compiler
- CMake (for building)
```sh
mkdir build
cd build
cmake ..
make
```
```cpp
#include "core/Neuron.h"

// Create a simple neural network
size_t inputs = 3;
std::vector<size_t> layer_sizes = {4, 2, 1}; // 4 neurons in the first layer, 2 in the second, 1 in the output
MultiLayerPerceptron mlp(inputs, layer_sizes);

// Create input values
std::vector<ValuePtr> input_values = {
    create_value(1.0),
    create_value(0.5),
    create_value(-1.0)
};

// Forward pass
auto output = mlp(input_values);

// Backpropagate
output[0]->backpropagate();

// Visualize
output[0]->visualize("neural_network");
```
- Value Class
  - Represents a value in the computation graph
  - Handles automatic differentiation
- Neuron Class
  - Basic computational unit
  - Contains weights and bias
  - Applies ReLU activation function
- Layer Class
  - Collection of neurons
  - Handles forward propagation through neurons
- MultiLayerPerceptron Class
  - Complete neural network
  - Manages multiple layers
  - Provides forward propagation through the entire network
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE file for details.
- Inspired by Andrej Karpathy's micrograd
- Built with modern C++ features for efficiency and ease of use
- Add backward propagation examples
- Implement additional activation functions
- Add serialization support
- Include more comprehensive testing
- Add benchmarking suite
- Improve documentation with more examples
- Optimize performance and add more features to the network