With the recent popularity of deep learning, artificial neural networks have become very relevant in the world of AI.
Artificial neural networks are not a new concept; they struggled for a long time before being recognized as a powerful tool. If you review their history, you will find several big challenges the technology had to overcome to exist as we know it today.
As powerful as they are, the implementation can be pretty simple thanks to high-level libraries like Keras.
I will not explain every piece or concept related to artificial neural networks. Instead, I will focus on showing how to work with Keras.
Start by creating a model. In this case we will use a sequential model, which simply means that we are going to stack the layers one after another:
```python
from keras.models import Sequential

model = Sequential()
```
```python
from keras.layers import Dense

model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))
```
There is a wide range of activation functions that you can try. If none of them work for you, you can always provide your own.
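To illustrate that last point: in Keras, an activation can be any callable that maps tensors to tensors, so you can pass a function of your own instead of a built-in name. A minimal sketch, where `my_activation` is an illustrative name (an `x * sigmoid(x)` function built from `keras.activations.sigmoid`), not a Keras built-in:

```python
from keras import activations
from keras.models import Sequential
from keras.layers import Dense

# A custom activation is just a callable from tensors to tensors.
# This x * sigmoid(x) function is an illustrative example.
def my_activation(x):
    return x * activations.sigmoid(x)

model = Sequential()
model.add(Dense(units=64, activation=my_activation, input_dim=100))
```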
After we have our architecture ready, we must configure how the network will learn. We will do this by “compiling” the model and passing information like the loss function and the optimizer function:
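A minimal sketch of that step; the specific loss (`categorical_crossentropy`) and optimizer (`sgd`) are example choices, and the model is rebuilt here so the snippet runs on its own:

```python
from keras.models import Sequential
from keras.layers import Dense

# Same architecture as above, rebuilt so this snippet is self-contained.
model = Sequential()
model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))

# The loss and optimizer here are example choices; pick what fits your task.
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])
```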
And that’s it. We have an artificial neural network ready to be trained. If we have our data ready for this, we can just train it like this:
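A minimal sketch of the training call, using random dummy data in place of a real dataset (the epoch count and batch size are arbitrary example values):

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical

# Rebuild and compile the model so this snippet is self-contained.
model = Sequential()
model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='sgd',
              metrics=['accuracy'])

# Dummy data standing in for a real training set:
# 32 samples, 100 features, 10 one-hot encoded classes.
x_train = np.random.random((32, 100))
y_train = to_categorical(np.random.randint(10, size=(32,)), num_classes=10)

history = model.fit(x_train, y_train, epochs=5, batch_size=32)
```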
Training is the most expensive process when working with artificial neural networks, so we don’t want to repeat it every time. You can save your trained network like this:
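A sketch of the save call; the filename `my_network.h5` is just an example, and the model is rebuilt here so the snippet runs on its own (in practice you would call this on your trained model):

```python
from keras.models import Sequential
from keras.layers import Dense

# Rebuild the model so this snippet is self-contained; in practice you
# would save the model you just trained.
model = Sequential()
model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='sgd')

# 'my_network.h5' is an example filename.
model.save('my_network.h5')
```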
An HDF5 file will be generated containing all the information required to reload your network at a later time:
```python
from keras.models import load_model

model = load_model(filepath)
```
Keras provides a high-level, nicely documented interface for interacting with your artificial neural network. If you are using another library to work with your networks, give Keras a try and compare the APIs.