Capsule Networks, also known as Capsule Neural Networks or CapsNets, are a machine learning architecture designed to better model hierarchical relationships, and they have had a notable impact on deep learning, especially in computer vision. The idea was first introduced by Geoffrey Hinton et al. in the 2011 paper Transforming Autoencoders. In November 2017, Sara Sabour, Nicholas Frosst, and Geoffrey Hinton published Dynamic Routing Between Capsules, which introduced a CapsNet architecture that reached state-of-the-art performance on MNIST (the famous dataset of handwritten digit images) and outperformed CNNs on MultiMNIST (a variant with overlapping pairs of different digits).
A capsule is a group of neurons whose activity vector represents the instantiation parameters of a specific type of entity, such as an object or an object part. The length of the activity vector represents the probability that the entity exists, and its orientation represents the instantiation parameters (its pose).
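This length-as-probability interpretation is enforced by the "squash" nonlinearity from Dynamic Routing Between Capsules, which shrinks a capsule's raw vector to a length in [0, 1) while preserving its direction. A minimal NumPy sketch (the 3-D example vector is purely illustrative):

```python
import numpy as np

def squash(s, eps=1e-8):
    """Squash a capsule's raw activity vector s to length in [0, 1).

    Short vectors shrink toward zero (entity likely absent); long
    vectors approach unit length (entity likely present). Direction
    is preserved, encoding the instantiation (pose) parameters.
    """
    norm_sq = np.sum(s ** 2)
    norm = np.sqrt(norm_sq)
    # scale factor norm_sq / (1 + norm_sq) is always < 1
    return (norm_sq / (1.0 + norm_sq)) * (s / (norm + eps))

raw = np.array([2.0, 1.0, 2.0])   # hypothetical 3-D capsule output
v = squash(raw)
prob = np.linalg.norm(v)          # length acts as existence probability
```

Here `raw` has length 3, so `v` has length 9/10 = 0.9: a fairly confident detection whose pose is still readable from the vector's direction.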
In simpler terms, a CapsNet is composed of many capsules, each a small group of neurons that learns to detect a particular entity within a given region of the image. Because a capsule's vector output preserves pose information that pooling layers would discard, CapsNets can often generalize from less training data, retain more information between layers, and expose a hierarchy of the features they detect.
A CapsNet is organized in multiple layers, much like a regular neural network. The capsules in the lowest layer, called primary capsules, each receive a small region of the image as input and try to detect the presence and pose of a particular pattern. Capsules in higher layers, called routing capsules, detect larger and more complex objects. In this way, capsules provide a new building block for deep learning that better models hierarchical relationships within a neural network's internal knowledge representation.