Deep learning algorithms


In recent years, end-to-end learning algorithms have revolutionized many areas of research, such as computer vision [1], natural language processing [2], game playing [3], and robotics. Deep-learning techniques have achieved state-of-the-art results in many of these tasks, thanks to their ability to learn both the features/filters and the classification rule directly from data.

The algorithms developed in this line of research will focus on enhancing deep-learning architectures and improving their learning capabilities, in terms of invariant (rotation, translation, warping, scaling) feature extraction [4], computational efficiency and parallelization [5], faster network training [6, 7], and connecting images to sequences.
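As an illustration of the first of these directions, the following is a minimal sketch, assuming a PyTorch implementation, of a spatial-transformer-style module in the spirit of [4]: a small localization network predicts an affine transform that re-aligns the input before further processing, which is one way to pursue rotation/translation/scale-invariant feature extraction. The class name, layer sizes, and input shapes are placeholders chosen for illustration, not part of the project code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialTransformer(nn.Module):
    """Sketch of a spatial-transformer-style module (cf. ref. [4]).

    A small localization network regresses the 6 parameters of a 2-D affine
    transform, which is then used to resample the input so that downstream
    layers see a canonically aligned view of the image.
    """

    def __init__(self, in_channels=1):
        super().__init__()
        # Localization network: predicts the affine parameters.
        self.loc = nn.Sequential(
            nn.Conv2d(in_channels, 8, kernel_size=7), nn.MaxPool2d(2), nn.ReLU(),
            nn.Conv2d(8, 10, kernel_size=5), nn.MaxPool2d(2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(10, 6),
        )
        # Initialize the final layer to the identity transform,
        # so training starts from "do nothing".
        self.loc[-1].weight.data.zero_()
        self.loc[-1].bias.data.copy_(
            torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float)
        )

    def forward(self, x):
        theta = self.loc(x).view(-1, 2, 3)                     # (N, 2, 3) affine params
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        return F.grid_sample(x, grid, align_corners=False)     # resampled input


if __name__ == "__main__":
    stn = SpatialTransformer(in_channels=1)
    frames = torch.randn(4, 1, 64, 64)   # e.g. low-resolution grayscale frames
    print(stn(frames).shape)             # torch.Size([4, 1, 64, 64])
```

In practice such a module is placed in front of a standard convolutional classifier and trained end-to-end with it, so the transform parameters are learned from the task loss rather than supervised directly.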

These algorithms will be applied to real computer vision problems in the field of neuroscience, in collaboration with the Princeton Neuroscience Institute. These range from detection and tracking of rodents in low-resolution videos, image segmentation and limb detection, and motion estimation of whiskers using high-speed cameras, to in vivo calcium image segmentation of neuronal network activity in rodents [8].
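As a toy illustration of the last of these applications, the sketch below segments candidate active regions in a calcium-imaging movie by thresholding per-pixel dF/F responses; it assumes NumPy/SciPy, and the function name and thresholds are hypothetical. A real pipeline for data such as that in [8] also involves motion correction, denoising, and source extraction, all omitted here.

```python
import numpy as np
from scipy import ndimage


def segment_active_regions(movie, baseline_frames=100, z_thresh=3.0):
    """Naive activity-based segmentation of a calcium-imaging movie.

    movie: array of shape (frames, height, width).
    Computes a per-pixel dF/F trace against a baseline period, then labels
    connected groups of pixels whose peak response exceeds a z-score threshold.
    """
    baseline = movie[:baseline_frames].mean(axis=0)
    dff = (movie - baseline) / (baseline + 1e-6)        # per-pixel dF/F
    peak = dff.max(axis=0)                              # strongest response per pixel
    z = (peak - peak.mean()) / (peak.std() + 1e-6)
    labels, n_regions = ndimage.label(z > z_thresh)     # connected active blobs
    return labels, n_regions


if __name__ == "__main__":
    movie = np.random.rand(500, 128, 128).astype(np.float32)  # synthetic stand-in
    labels, n = segment_active_regions(movie)
    print(f"found {n} candidate regions")
```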

  1. Krizhevsky, A., Sutskever, I., & Hinton, G. E. (2012). ImageNet classification with deep convolutional neural networks. In Advances in Neural Information Processing Systems.
  2. Sutskever, I., Vinyals, O., & Le, Q. V. (2014). Sequence to sequence learning with neural networks. In Advances in neural information processing systems (pp. 3104-3112).
  3. Silver, D., Huang, A., Maddison, C. J., Guez, A., Sifre, L., Van Den Driessche, G., … & Dieleman, S. (2016). Mastering the game of Go with deep neural networks and tree search. Nature, 529(7587), 484-489.
  4. Jaderberg, M., Simonyan, K., & Zisserman, A. (2015). Spatial transformer networks. In Advances in Neural Information Processing Systems (pp. 2017-2025).
  5. Jaderberg, M., Czarnecki, W. M., Osindero, S., Vinyals, O., Graves, A., & Kavukcuoglu, K. (2016). Decoupled neural interfaces using synthetic gradients. arXiv preprint arXiv:1608.05343.
  6. Ha, D., Dai, A., & Le, Q. V. (2016). HyperNetworks. arXiv preprint arXiv:1609.09106.
  7. Bakhtiary, A. H., Lapedriza, A., & Masip, D. (2015). Speeding Up Neural Networks for Large Scale Classification using WTA Hashing. arXiv preprint arXiv:1504.07488.
  8. Grewe, B. F., Langer, D., Kasper, H., Kampa, B. M., & Helmchen, F. (2010). High-speed in vivo calcium imaging reveals neuronal network activity with near-millisecond precision. Nature Methods, 7(5), 399-405.