EasyNN: Demystifying Neural Networks

(EasyNN logo)

Unlock the world of neural networks with EasyNN—a user-friendly C++ library that transforms theoretical understanding into practical implementation.

View the Project on GitHub: azadwasan/neuralnetwork

Implementing Neural Networks in C++

Andrew Ng has made neural networks an extremely approachable subject for the masses through his intuitive explanations of the complex processes and mathematical equations needed to understand and use them, in his course at https://www.coursera.org/learn/machine-learning-course/. This inspired me to develop EasyNN, an implementation of neural networks from scratch in C++, to demonstrate that neural networks are not only easy to understand and use, as Andrew Ng has shown, but also easy to implement from scratch.

EasyNN is built on the idea that making your own neural networks from scratch isn’t as tough as it sounds. So, instead of getting lost in theory, we’re diving into implementation details, assuming a few things:

We will not be discussing neural network concepts in depth, as Andrew has already explained them very well; they are covered only to the extent needed for the implementation. Since Coursera's terms and conditions do not allow sharing the original course material, we reproduce the same equations shown in the lectures for the discussion and implement the code accordingly.

Linear Regression

(Equation figures: the linear regression hypothesis and the linear regression cost function.)
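
The figures above reproduce the course equations as images. For reference, and assuming the standard notation from the lectures (hypothesis h, parameters theta, m training examples), the linear regression model and its cost function are:

```latex
% Hypothesis: a linear model over the input features, with x_0 = 1
h_\theta(x) = \theta_0 x_0 + \theta_1 x_1 + \dots + \theta_n x_n = \theta^{T} x

% Cost function: mean squared error over the m training examples
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta\left(x^{(i)}\right) - y^{(i)} \right)^{2}
```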

Logistic Regression

(Equation figures: classification using logistic regression and the logistic regression cost function.)
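
As above, the classification figures correspond to the standard logistic regression equations: the sigmoid hypothesis squashes the linear combination of parameters and features into the interval (0, 1), and the cost becomes the cross-entropy loss:

```latex
% Sigmoid (logistic) hypothesis
h_\theta(x) = g\left(\theta^{T} x\right), \qquad g(z) = \frac{1}{1 + e^{-z}}

% Logistic regression cost function over m training examples
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta\left(x^{(i)}\right)
          + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta\left(x^{(i)}\right)\right) \right]
```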

Gradient Descent

(Equation figures: the gradient descent update rule and gradient descent evaluation.)
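
Gradient descent repeatedly subtracts the learning rate times the partial derivative of the cost from every parameter, updating all parameters simultaneously. The following is a minimal, self-contained C++ sketch of batch gradient descent for linear regression; it is illustrative only, the function and variable names are my own, and it is not taken from the EasyNN sources.

```cpp
#include <vector>
#include <cstddef>

// One run of batch gradient descent for linear regression.
// X: m x n feature matrix (without the bias column), y: m targets,
// theta: n + 1 parameters (theta[0] is the bias), alpha: learning rate.
std::vector<double> gradientDescent(const std::vector<std::vector<double>>& X,
                                    const std::vector<double>& y,
                                    std::vector<double> theta,
                                    double alpha,
                                    std::size_t iterations) {
    const std::size_t m = X.size();
    const std::size_t n = theta.size();

    for (std::size_t it = 0; it < iterations; ++it) {
        std::vector<double> gradient(n, 0.0);

        for (std::size_t i = 0; i < m; ++i) {
            // Hypothesis h_theta(x) = theta^T x, with an implicit x_0 = 1.
            double h = theta[0];
            for (std::size_t j = 1; j < n; ++j) {
                h += theta[j] * X[i][j - 1];
            }
            const double error = h - y[i];

            // Accumulate sum(error * x_j) for each parameter.
            gradient[0] += error;
            for (std::size_t j = 1; j < n; ++j) {
                gradient[j] += error * X[i][j - 1];
            }
        }

        // Simultaneous update: theta_j := theta_j - alpha * (1/m) * gradient_j.
        for (std::size_t j = 0; j < n; ++j) {
            theta[j] -= alpha * gradient[j] / static_cast<double>(m);
        }
    }
    return theta;
}
```

With the features scaled, theta initialised to zeros, and a small learning rate such as 0.01, the cost J(theta) should decrease on every iteration, which is presumably what the evaluation figure above refers to.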

Neural Networks

Back Propagation
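
Back propagation computes the gradient of the cost by pushing an error term backwards through the layers. Assuming the notation used in the lectures (L layers, activations a, weighted inputs z, weights Theta, and the sigmoid activation g), the standard equations are:

```latex
% Error at the output layer
\delta^{(L)} = a^{(L)} - y

% Error propagated back through hidden layer l
% (\odot denotes the element-wise product, written ".*" in the lectures)
\delta^{(l)} = \left(\Theta^{(l)}\right)^{T} \delta^{(l+1)} \odot g'\left(z^{(l)}\right),
\qquad g'\left(z^{(l)}\right) = a^{(l)} \odot \left(1 - a^{(l)}\right)

% Gradient of the cost with respect to each weight (regularisation omitted)
\frac{\partial J(\Theta)}{\partial \Theta^{(l)}_{ij}} = a^{(l)}_{j} \, \delta^{(l+1)}_{i}
```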