A comprehensive collection of Multilayer Perceptron (MLP) implementations in C and Python for building neural networks from scratch.
This repository contains various implementations of Multilayer Perceptrons (MLPs) in both C and Python. It serves as a resource for understanding neural network fundamentals by providing implementations that range from basic MLPs built from scratch to integrations with modern deep learning frameworks like TensorFlow and Keras.
The project focuses on:
- Building neural networks from first principles
- Implementing forward and backward propagation algorithms
- Working with common datasets (MNIST, Iris)
- Comparing different activation functions and architectures
- Transferring weights between different implementations
- `C/mnist_read.cpp` & `C/mnist_read.h`: Utilities for reading the MNIST dataset in C (the IDX file format they parse is sketched after this list)
- `C/mnist.c`: MNIST dataset handling in C
- `C/rsp_mlp.c`: MLP implementation in C with support for various activation functions
- `C/nloh_json.cpp` & `C/nloh_json.h`: JSON utilities for weight serialization/deserialization
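For context, the raw MNIST files that the C reader parses use the IDX binary format: a big-endian header (magic number, item count, and, for image files, row and column counts) followed by the raw bytes. A compact Python equivalent of an image reader, shown for illustration only:

```python
import struct
import numpy as np

def read_idx_images(path):
    """Read an MNIST IDX image file (e.g. train-images-idx3-ubyte)."""
    with open(path, 'rb') as f:
        # Header: magic number 2051, image count, rows, cols (big-endian uint32)
        magic, n, rows, cols = struct.unpack('>IIII', f.read(16))
        assert magic == 2051, 'not an IDX image file'
        # Remaining bytes are the pixels, one unsigned byte each
        return np.frombuffer(f.read(), dtype=np.uint8).reshape(n, rows, cols)
```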
- `Python/mlp.py`: Pure Python implementation of an MLP with backpropagation (a minimal sketch of the idea follows this list)
- `Python/mlp_gen.py`: MLP generator utilities
- `Python/mnist_changed.py`: Modified MNIST dataset loader
- `Python/plot_mnist_image.py`: Utilities for visualizing MNIST images
- `Python/tf_mnist.py` & `Python/tf_mnist_esti.py`: TensorFlow implementations for MNIST
- `Python/keras_mlp_mnist.py`: Keras implementation of an MLP for MNIST
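To see the whole forward/backward cycle in one place, here is a minimal NumPy sketch of an MLP trained with backpropagation. It is illustrative only and is not the repository's `mlp.py`; XOR is used as a stand-in dataset and all names are local to the snippet:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny 2-4-1 network on XOR, a stand-in dataset for illustration
rng = np.random.default_rng(0)
W1, b1 = rng.normal(0.0, 1.0, (4, 2)), np.zeros((4, 1))
W2, b2 = rng.normal(0.0, 1.0, (1, 4)), np.zeros((1, 1))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float).T  # (2, 4)
Y = np.array([[0, 1, 1, 0]], dtype=float)                      # (1, 4)

lr = 0.5
for epoch in range(5000):
    # Forward propagation
    A1 = sigmoid(W1 @ X + b1)
    A2 = sigmoid(W2 @ A1 + b2)

    # Backward propagation (squared error; sigmoid' = a * (1 - a))
    dZ2 = (A2 - Y) * A2 * (1 - A2)
    dW2, db2 = dZ2 @ A1.T, dZ2.sum(axis=1, keepdims=True)
    dZ1 = (W2.T @ dZ2) * A1 * (1 - A1)
    dW1, db1 = dZ1 @ X.T, dZ1.sum(axis=1, keepdims=True)

    # Gradient descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(A2.round(2))  # should approach [[0. 1. 1. 0.]]
```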
- `Python/keras/keras_get_mnistWeights.py`: Extract weights from Keras MNIST models (see the sketch after this list)
- `Python/keras/keras_mnist_fforward.py`: Forward propagation with Keras for MNIST
- `Python/keras/keras_train_mnist.py`: Training MNIST models with Keras
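The usual recipe for getting weights out of a trained Keras model is `model.get_weights()`, which returns alternating kernel and bias arrays for `Dense` layers. A sketch of dumping them to JSON for the C side to consume; the file names and JSON schema here are illustrative and the repository's may differ:

```python
import json
from tensorflow.keras.models import load_model

# Load a previously trained model (file name is illustrative)
model = load_model('mnist_mlp.h5')

# get_weights() returns [W1, b1, W2, b2, ...] as NumPy arrays
weights = model.get_weights()
layers = []
for W, b in zip(weights[0::2], weights[1::2]):
    layers.append({'weights': W.tolist(), 'bias': b.tolist()})

with open('mnist_weights.json', 'w') as f:
    json.dump({'layers': layers}, f)
```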
- `Python/custom estimator.py`: Custom TensorFlow estimator implementation
- `Python/pre-made estimator.py`: Using pre-made TensorFlow estimators (sketched below)
- `Python/estimator_backend.py`: Backend utilities for TensorFlow estimators
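As a point of reference, a pre-made estimator needs little more than feature columns and layer sizes. The sketch below uses the TensorFlow 1.x-era `tf.estimator.DNNClassifier` with random stand-in data; note that the Estimator API is deprecated in current TensorFlow releases:

```python
import numpy as np
import tensorflow as tf

# Four numeric features, as in the Iris dataset
feature_columns = [tf.feature_column.numeric_column('x', shape=[4])]

classifier = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[10, 10],  # two hidden layers with 10 neurons each
    n_classes=3)

# Random stand-in data; a real run would feed the Iris features and labels
train_x = np.random.rand(120, 4).astype(np.float32)
train_y = np.random.randint(0, 3, size=120)

def input_fn():
    ds = tf.data.Dataset.from_tensor_slices(({'x': train_x}, train_y))
    return ds.shuffle(120).batch(32).repeat()

classifier.train(input_fn=input_fn, steps=200)
```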
- Activation Functions: Sigmoid, Tanh, ReLU, Leaky ReLU (all four are sketched after this list)
- Weight Initialization: Random and pre-trained weight loading
- Forward Propagation: Implementations in both C and Python
- Backward Propagation: Training algorithms with gradient descent
- Softmax Output: For classification tasks
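For reference, these four activations and their derivatives look as follows in NumPy; the `relu`/`relu_derivative` pair is the kind of thing passed to `back_propagation` in the Iris example further down. The repository's own definitions may differ in detail:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(a):
    return a * (1.0 - a)  # expects a = sigmoid(z)

def tanh(z):
    return np.tanh(z)

def tanh_derivative(a):
    return 1.0 - a ** 2  # expects a = tanh(z)

def relu(z):
    return np.maximum(0.0, z)

def relu_derivative(z):
    return (z > 0).astype(float)

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)

def leaky_relu_derivative(z, alpha=0.01):
    return np.where(z > 0, 1.0, alpha)
```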
- MNIST: Handwritten digit recognition (28x28 pixel images, 10 classes)
- Iris: Flower classification (4 features, 3 classes); a loading sketch follows this list
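One way to get the Iris data into the 4-feature, 3-class one-hot shape described above is via scikit-learn, which is not in the requirements list and is used here purely for illustration; how the repository's own loaders arrange `train_dataset`/`test_dataset` is defined in its code:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

iris = load_iris()
X, y = iris.data, iris.target   # X: (150, 4) features, y: class labels 0..2

Y = np.eye(3)[y]                # one-hot encode the 3 classes

X_train, X_test, Y_train, Y_test = train_test_split(
    X, Y, test_size=0.2, random_state=0)
```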
```python
# Initialize and train a network on the Iris dataset
network = init_network(4, 3, 1, 10, 10)  # 4 inputs, 3 outputs, bias, 2 hidden layers with 10 neurons each
trained_network = back_propagation(train_dataset, test_dataset, 0.0001, 1000, relu, relu_derivative, 10, 10)
```
```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Train a simple MLP on MNIST using Keras
# (x_train/x_test are flattened 784-dim images, y_train/y_test one-hot labels)
model = Sequential()
model.add(Dense(256, activation='relu', input_shape=(784,)))
model.add(Dense(256, activation='relu'))
model.add(Dense(10, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(x_train, y_train, batch_size=128, epochs=20, validation_data=(x_test, y_test))
```
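Continuing the snippet above, the trained model can then be evaluated and saved so its weights can later be extracted (for instance along the lines of `keras_get_mnistWeights.py`; the file name below is illustrative):

```python
loss, accuracy = model.evaluate(x_test, y_test, verbose=0)
print(f'Test accuracy: {accuracy:.4f}')
model.save('mnist_mlp.h5')  # reload later with tensorflow.keras.models.load_model
```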
```c
// Load weights from a JSON file and run forward propagation
pw = getWeights(PY_MLP);
initialize_bias_PYMLP(pw);
initialize_PYMLPweights(pw);
forward_propagation(x_testset[i]);
```
- TensorFlow
- Keras
- NumPy
- Matplotlib (for visualization)
- Standard C libraries
- JSON parsing library (included)
- Clone the repository
- For Python implementations:
  - Install the required Python packages: `pip install tensorflow keras numpy matplotlib`
  - Run the examples from the `Python` directory
- For C implementations:
  - Compile the C files with your preferred compiler (a sample invocation is sketched below)
  - Make sure the MNIST dataset files are in the location specified in the header files
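A possible build line, assuming the C and C++ sources under `C/` are compiled together with a C++ compiler; adjust file names, flags, and include paths to your setup:

```sh
g++ -O2 -o mlp C/rsp_mlp.c C/mnist.c C/mnist_read.cpp C/nloh_json.cpp -lm
```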