Toynet

A simple implementation of a neural network with no dependencies


Keywords
simple, neural, network, machine, learning, artificial, no, dependencies, scratch, from
License
MIT
Install
pip install Toynet==0.2

Documentation

ToyNet

A Simple Python Neural Network
Please consider visiting my YouTube channel.

Introduction:

An implementation of a 3-layered, fully connected, feed-forward neural network.
Almost all articles and posts about creating a neural network from scratch use libraries like TensorFlow, Keras, etc., or at least NumPy. This basic library has no dependencies apart from a Python 3 interpreter.
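To give a rough idea of what such a network computes, here is a minimal sketch of a 3-layered feed-forward pass written with plain Python lists. This is only an illustration of the idea, not ToyNet's internal code, and biases are omitted for brevity:

import math

def sigmoid(x):
	# squash any real number into the range (0, 1)
	return 1.0 / (1.0 + math.exp(-x))

def feed_forward(inputs, weights_ih, weights_ho):
	# weights_ih: input-to-hidden weights, one row per hidden node
	# weights_ho: hidden-to-output weights, one row per output node
	hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in weights_ih]
	return [sigmoid(sum(w * h for w, h in zip(row, hidden))) for row in weights_ho]

# 2 inputs -> 2 hidden nodes -> 1 output node, with made-up weights
print(feed_forward([1, 0], [[0.5, -0.3], [0.8, 0.1]], [[1.2, -0.7]]))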
Great resources for learning about neural networks:

3Blue1Brown Playlist

Make Your Own Neural Network by Tariq Rashid

Pros:

It can be used for standard classification tasks, and training is done with the backpropagation algorithm. It is also well suited for use as the toy neural networks in NEAT (neuroevolution).

Cons:

The training process isn't very fast because it uses standard Python lists.

Documentation:

Let's create this neural network:

# import the toynet library
from toynet import *

# NeuralNetwork(number of input nodes, number of hidden nodes, number of output nodes)
model = NeuralNetwork(2,2,1)

Let's use the neural network created above to solve the XOR problem.

If you don't know about the XOR problem, visit this link. (In short, XOR outputs 1 when exactly one of its two inputs is 1, and 0 otherwise.)

import random
from toynet import *

random.seed(0)

# create the model
model = NeuralNetwork(2,2,1)

# print a summary of the model
model.summary()

# set the learning rate (0.1 is the default)
model.setLearningRate(0.1)

# XOR dataset: input pairs and their target outputs
inputs = [[0,0],[0,1],[1,1],[1,0]]
targets = [[0],[1],[0],[1]]

input_samples = 10000

for i in range(input_samples):
	# pick a random index into the dataset
	index = random.randint(0,len(inputs)-1)

	# train the model on the randomly chosen sample
	# e.g. if index == 1, the following line is the same as
	# model.train([0,1], [1])
	model.train(inputs[index],targets[index])

# use save_model(model) to save the trained model
# use load_model(model) to load a previously saved model, e.g. newModel = load_model(model)


# use the trained model to predict the output for an input
print(model.predict([0,0]))


The output of the above code is [0.09879658143110515], which is very close to 0 (the actual answer).
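To see how well the model has learned the whole truth table, you can predict every sample in the dataset. This snippet assumes it is run right after the training script above, reusing the model, inputs and targets variables:

# check all four XOR cases with the trained model
for sample, target in zip(inputs, targets):
	prediction = model.predict(sample)[0]
	print(sample, "->", round(prediction, 3), "expected:", target[0])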

It might have taken fewer training samples with more hidden nodes and a different learning rate.

The activation function used by this library is the sigmoid. In the future I may improve it and add more.
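For reference, a plain-Python version of the sigmoid, together with the derivative form that backpropagation typically uses, would look like the following. Again, this is an illustration rather than the library's internal code:

import math

def sigmoid(x):
	# sigmoid activation: 1 / (1 + e^(-x)), output always between 0 and 1
	return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(y):
	# derivative written in terms of the sigmoid's output y = sigmoid(x),
	# the form usually used during backpropagation
	return y * (1 - y)

print(sigmoid(0))                      # 0.5
print(sigmoid_derivative(sigmoid(0)))  # 0.25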