# ClassifAI

This repository contains a machine learning project focused on recognizing flower species. It consists of two main components:

- A Python trainer application that trains a Convolutional Neural Network on a dataset of images of five different flower species and exports it in TensorFlow Lite format.

- A client Android application that uses the trained model to recognize flower species, either from static images or from a live camera feed.


## Dependencies
To run this project, you need:
* **Python 3+** - optional, only needed for training the model; a trained model is already included in this repository, so you can use it as is without re-training it from scratch.
* **TensorFlow Lite library** - for running on-device inference with the trained TFLite model inside the Android app, which can be developed in Java or Kotlin.
* **Android Studio IDE** - for developing, testing and compiling the Android app.

## Python Model Training
<p style="text-align: center; background: #000000;">
  <img src="./documents/screenshots/model_quant.tflite.svg"/>
</p>

The Python script `deep_neural_network_classifier.py` handles training a deep neural network classifier using the [TensorFlow](https://www.tensorflow.org/api_docs/python/tf/) and [Keras](https://keras.io/api/) APIs.

It downloads and extracts the Flower Photos dataset from TensorFlow, consisting of 3670 photos of 5 flower species: daisies, dandelions, roses, sunflowers and tulips.

The images are split 80/10/10 into training, validation and test sets, respectively.
Several data augmentation techniques are applied to the training set, including random rotations, flips and zooms. This expands the effective size and diversity of the training dataset.
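
The exact code lives in `deep_neural_network_classifier.py`; the following is a minimal sketch of the download, split and augmentation steps using the standard `tf.keras` utilities. The input resolution, seed and augmentation factors here are illustrative assumptions, not the script's exact values.

```python
import tensorflow as tf

# Download and extract the Flower Photos dataset (official TensorFlow mirror).
data_dir = tf.keras.utils.get_file(
    "flower_photos",
    origin="https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz",
    untar=True)

IMG_SIZE = (180, 180)  # assumed input resolution
BATCH = 32

# 80% training; the remaining 20% is halved into validation and test below.
train_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir, validation_split=0.2, subset="training",
    seed=123, image_size=IMG_SIZE, batch_size=BATCH)
holdout_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir, validation_split=0.2, subset="validation",
    seed=123, image_size=IMG_SIZE, batch_size=BATCH)
holdout_batches = tf.data.experimental.cardinality(holdout_ds)
val_ds = holdout_ds.take(holdout_batches // 2)   # 10% validation
test_ds = holdout_ds.skip(holdout_batches // 2)  # 10% test

# Augmentation: random flips, rotations and zooms, applied to training data only.
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.RandomZoom(0.1),
])
train_ds = train_ds.map(lambda x, y: (augment(x, training=True), y))
```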

The model architecture is a Convolutional Neural Network (CNN) with the following layers (a code sketch follows the list):

- 3 [Convolution2D](https://keras.io/api/layers/convolution_layers/convolution2d) layers for feature extraction
- 3 [MaxPooling2D](https://keras.io/api/layers/pooling_layers/max_pooling2d) layers for spatial dimensionality reduction
- 1 [Dropout](https://keras.io/api/layers/regularization_layers/dropout) layer to prevent overfitting
- 1 [Flatten](https://keras.io/api/layers/reshaping_layers/flatten) layer
- 1 [Dense](https://keras.io/api/layers/core_layers/dense) hidden layer with 128 units
- 1 output layer with 5 nodes (one per flower class)
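
A minimal `tf.keras` sketch of that stack (the filter counts, dropout rate and the `Rescaling` preprocessing layer are illustrative assumptions, not taken from the script):

```python
num_classes = 5

model = tf.keras.Sequential([
    # Assumed preprocessing step (not part of the layer list above).
    tf.keras.layers.Rescaling(1. / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(num_classes),  # 5 logits, one per flower class
])
```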

<p style="text-align: center;">
  <img src="./documents/screenshots/model_quant_begin.png"/>
</p>


The model is trained for 15 epochs with a batch size of 32, using the [Adam optimizer](https://arxiv.org/abs/1412.6980) and the [Sparse Categorical Crossentropy](https://www.tensorflow.org/api_docs/python/tf/keras/losses/SparseCategoricalCrossentropy) loss function.
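
A hedged sketch of the corresponding compile-and-fit calls; `from_logits=True` matches an output layer without a softmax activation, as in the architecture sketch above:

```python
model.compile(
    optimizer="adam",  # Adam with its default learning rate (assumption)
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"])

# The batch size of 32 was fixed when the datasets were built above.
history = model.fit(train_ds, validation_data=val_ds, epochs=15)
```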

Model training and validation plots are generated to visualize accuracy and loss metrics.
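
One plausible way to produce those plots from the `history` object returned by `model.fit` (a sketch, not necessarily the script's exact plotting code):

```python
import matplotlib.pyplot as plt

epochs = range(1, len(history.history["accuracy"]) + 1)

plt.figure(figsize=(10, 4))
plt.subplot(1, 2, 1)
plt.plot(epochs, history.history["accuracy"], label="training")
plt.plot(epochs, history.history["val_accuracy"], label="validation")
plt.title("Accuracy")
plt.legend()

plt.subplot(1, 2, 2)
plt.plot(epochs, history.history["loss"], label="training")
plt.plot(epochs, history.history["val_loss"], label="validation")
plt.title("Loss")
plt.legend()
plt.show()
```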

The trained model is exported to [TensorFlow Lite format](https://www.tensorflow.org/lite/api_docs), applying [full integer quantization](https://www.tensorflow.org/lite/performance/post_training_integer_quant) to shrink the model and suit the limited processing power of mobile devices. Metadata, including the category labels, is also embedded.
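
A minimal sketch of the conversion step, assuming calibration with a representative dataset and `uint8` input/output (the metadata embedding is done separately with the TFLite support tooling and is omitted here):

```python
def representative_data_gen():
    # A sample of training batches calibrates the int8 activation ranges.
    for images, _ in train_ds.take(100):
        yield [tf.cast(images, tf.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8   # assumption; int8 is also common
converter.inference_output_type = tf.uint8

with open("model_quant.tflite", "wb") as f:
    f.write(converter.convert())
```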

## Android Recognition App

<p style="text-align: center;">
  <img src="./documents/screenshots/UI_002.png" style="margin: 5px; width: 200px;"/>
  <img src="./documents/screenshots/UI_003.png" style="margin: 5px; width: 200px;"/>
  <img src="./documents/screenshots/UI_004.png" style="margin: 5px; width: 200px;"/>
</p>

The Android app provides real-time flower recognition using the TFLite model trained with the Python counterpart application.

The main `RealtimeRecogActivity` activity shows a live camera preview and runs analysis on each frame.

The `ImageAnalyzer` class handles frame processing, invoking the `Classifier` to run inference using the TFLite Interpreter API.
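
The Android code drives the Interpreter through its Java/Kotlin API; the same inference flow is shown below as a hedged Python sketch against the exported `model_quant.tflite`. The input shape and `uint8` dtype are assumptions; the embedded metadata records the real ones.

```python
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model_quant.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Stand-in for a camera frame, already resized to the model's input shape.
frame = np.zeros(inp["shape"], dtype=inp["dtype"])

interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])[0]

labels = ["daisy", "dandelion", "rose", "sunflower", "tulip"]
print(labels[int(np.argmax(scores))], scores.max())
```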

Recognition results with confidence scores are displayed on screen. Photos of recognized flowers can be captured and saved to device storage.

The app demonstrates how an offline, on-device ML model can enable real-time intelligent features within the computational constraints typical of mobile devices.

<p style="text-align: center;">
  <img src="./documents/screenshots/Activity Diagram.png"/>
</p>

## License

This project is licensed under the GNU General Public License v3.0. See the [LICENSE](./documents/LICENSE.md) file for details.