neurama

A lightweight and efficient implementation of neural networks in C++, with support for the SGD and Adam optimizers.

Features

  • Layered Architecture: Supports multi-layer networks with customizable input and output sizes.
  • Activation Functions: Provides ReLU, sigmoid, tanh, leaky ReLU, ELU, softplus, and softmax activations.
  • Weight Initialization: Offers random and Xavier initialization methods for setting initial weights.
  • Optimizers: Includes Stochastic Gradient Descent (SGD) and Adam optimizers for training.
  • Loss Functions: Supports Mean Squared Error (MSE), Mean Absolute Error (MAE), and Binary Cross-Entropy for measuring prediction error.

Getting Started

To incorporate Neurama into your project:

  1. Include the Header: Add #include "neurama.h" to your source file.
  2. Create a Neural Network: Instantiate a NeuralNetwork object.
  3. Add Layers: Use the addLayer method to define layers, specifying input size, output size, activation function, and initialization method.
  4. Train the Network: Provide training data and call the trainModel function, specifying the number of epochs and learning rate.
  5. Make Predictions: Use the forward method to obtain outputs for new inputs.

Example

/*
Activation functions:

    ReLU,      // 0
    sigmoid,   // 1
    tanh,      // 2
    leakyReLU, // 3
    elu,       // 4
    softplus,  // 5
    softmax    // 6 - NEW!

Optimizers:

    SGD,
    ADAM

Loss functions:

    MSE,               // Mean Squared Error
    MAE,               // Mean Absolute Error
    BinaryCrossEntropy // Binary Cross-Entropy
*/


#include "neurama.h"
#include <vector>
#include <iostream>

int main() {
    // 🚀 Build the neural network: 2 -> 4 -> 1
    NeuralNetwork nn;
    nn.addLayer(2, 4, actMethod::ReLU, InitMethod::XAVIER);
    nn.addLayer(4, 1, actMethod::sigmoid, InitMethod::XAVIER);

    // 🔍 Training data for the XOR problem
    std::vector<std::vector<double>> trainInputs = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
    std::vector<std::vector<double>> trainTargets = { {0}, {1}, {1}, {0} };

    // 💡 Sample input used to visualize progress during training
    std::vector<double> sampleInput = {0, 1};

    int epochs = 100;
    float learning_rate = 0.01;

    // ⚙️ Train the network
    std::cout << "Starting training... 🔥" << std::endl;
    nn.trainModel(trainInputs, trainTargets, epochs, learning_rate, sampleInput, OptimizerType::ADAM);
    std::cout << "Training complete! ✅" << std::endl;

    // 🧪 Test the network on the training data
    std::cout << "\nTest results:" << std::endl;
    nn.testModel(trainInputs, trainTargets);

    return 0;
}

Documentation

For detailed information on Neurama's classes, functions, and customization options, refer to the Neurama Documentation.

Contributions

Contributions are welcome! To contribute:

  1. Fork the repository.
  2. Create a new branch for your feature or fix.
  3. Commit your changes.
  4. Submit a pull request detailing your modifications.

License

Neurama is licensed under the Mozilla Public License. See the LICENSE file for more information.

Acknowledgments

Neurama draws inspiration from various open-source neural network libraries, including:

  • MiniDNN: A header-only C++ library for deep neural networks.
  • OpenNN: An open-source neural networks library for machine learning.
  • FANN: Fast Artificial Neural Network Library.
  • TensorFlow: An end-to-end open-source machine learning platform.

These projects have influenced Neurama's design and functionality.
