Basic neural networks #660
Conversation
Since my last comment, I have added well-known networks from the literature. These networks are also customizable. For example, in … The implementations mostly rely on …
@camillebrianceau @ravih18
In the last commits, I changed the config classes to match the new networks. I also changed the factory function.
LGTM!
* add customizable networks (MLP, ConvEncoder, ConvDecoder, CNN, Generator, AutoEncoder, VAE)
* add SOTA networks (ResNet, DenseNet, SE-ResNet, UNet, Attention-UNet, Vision Transformer)
* update config classes
* update factory function
In this PR, I suggest the basic neural network architectures that could be trained via the command line. @camillebrianceau @ravih18 @msolal @HuguesRoy could you please have a look at it?
The implemented networks are the following:
- `MLP`: a Multi Layer Perceptron (or Fully Connected Network), where linear, activation, normalization and dropout layers can be customized.
- `ConvEncoder`: a Fully Convolutional Encoder, where convolutional, pooling, activation, normalization and dropout layers can be customized.
- `ConvDecoder`: a Fully Convolutional Decoder, very similar to `ConvEncoder`, but convolutions are replaced by transposed convolutions and pooling layers by unpooling layers.
- `CNN`: a regressor/classifier with first a convolutional part and then a fully connected part. It is a simple aggregation of a `ConvEncoder` and an `MLP`, so it is entirely customizable.
- `Generator`: the symmetrical network of `CNN`, made with the aggregation of an `MLP` and a `ConvDecoder`.
- `AutoEncoder`: a symmetrical autoencoder, built from the aggregation of a `CNN` (encoder) and the corresponding `Generator` (decoder).
- `VAE`: a Variational AutoEncoder, built on the same idea as `AutoEncoder`.

Please look at the docstrings for some examples.
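To illustrate the aggregation idea described above, here is a minimal PyTorch sketch (not the PR's actual code; all class names and parameters here are hypothetical) of a `CNN` built as a convolutional encoder followed by an MLP head:

```python
# Hypothetical sketch of the ConvEncoder + MLP aggregation pattern.
# This is NOT the PR's implementation, just an illustration of the idea.
import torch
import torch.nn as nn


class ConvEncoder(nn.Module):
    """Fully convolutional encoder with a customizable channel sequence."""

    def __init__(self, in_channels, channels):
        super().__init__()
        layers = []
        for out_channels in channels:
            layers += [
                nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(),
                nn.MaxPool2d(2),
            ]
            in_channels = out_channels
        self.encoder = nn.Sequential(*layers)

    def forward(self, x):
        return self.encoder(x)


class MLP(nn.Module):
    """Multi layer perceptron with customizable hidden layer sizes."""

    def __init__(self, in_features, hidden, out_features):
        super().__init__()
        layers = []
        for h in hidden:
            layers += [nn.Linear(in_features, h), nn.ReLU(), nn.Dropout(0.1)]
            in_features = h
        layers.append(nn.Linear(in_features, out_features))
        self.mlp = nn.Sequential(*layers)

    def forward(self, x):
        return self.mlp(x)


class CNN(nn.Module):
    """Classifier/regressor: a ConvEncoder followed by an MLP head."""

    def __init__(self, in_channels, channels, hidden, n_outputs):
        super().__init__()
        self.encoder = ConvEncoder(in_channels, channels)
        # Global pooling makes the MLP input size independent of image size.
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.mlp = MLP(channels[-1], hidden, n_outputs)

    def forward(self, x):
        features = self.pool(self.encoder(x)).flatten(1)
        return self.mlp(features)


model = CNN(in_channels=1, channels=[8, 16], hidden=[32], n_outputs=2)
out = model(torch.randn(4, 1, 28, 28))
print(out.shape)  # torch.Size([4, 2])
```

Because each sub-network exposes its own parameters, the aggregated `CNN` stays entirely customizable; a `Generator` would follow the same pattern with an `MLP` feeding a `ConvDecoder`.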
All the architectures are tested with unittests.
Feel free to comment on the choices I made. To me, the goal is to have sufficiently flexible architectures that enable users to tune their parameters, without going into overly complicated neural networks, which could instead be implemented via the API.
Also feel free to comment on my choice of network names.