
Publication


Featured research published by Gil Keren.


International Symposium on Neural Networks | 2016

Convolutional RNN: An enhanced model for extracting features from sequential data.

Gil Keren; Björn W. Schuller

Traditional convolutional layers extract features from patches of data by applying a non-linearity to an affine function of the input. We propose a model that enhances this feature extraction process for the case of sequential data, by feeding patches of the data into a recurrent neural network and using the outputs or hidden states of the recurrent units to compute the extracted features. In doing so, we exploit the fact that a window containing a few frames of the sequential data is itself a sequence, and this additional structure may encapsulate valuable information. In addition, we allow for more steps of computation in the feature extraction process, which is potentially beneficial, as an affine function followed by a non-linearity can yield overly simple features. Using our convolutional recurrent layers, we obtain an improvement in performance on two audio classification tasks, compared to traditional convolutional layers.
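The feature extraction described in the abstract can be sketched in a minimal NumPy form. This is an illustrative reconstruction, not the authors' implementation: a shared vanilla (tanh) RNN is run over each sliding window of frames, and its final hidden state serves as the extracted feature for that window; the function name, parameter names, and window/hop sizes are all assumptions for the example.

```python
import numpy as np

def conv_rnn_layer(x, W_in, W_h, b, win=5, hop=1):
    """Illustrative convolutional recurrent layer: run a shared vanilla RNN
    over each window of `win` consecutive frames and return the final hidden
    state of each window as that window's feature vector."""
    T, _ = x.shape
    H = W_h.shape[0]
    feats = []
    for start in range(0, T - win + 1, hop):
        h = np.zeros(H)
        for t in range(start, start + win):        # recur over frames in the window
            h = np.tanh(x[t] @ W_in + h @ W_h + b)
        feats.append(h)                            # final hidden state = feature
    return np.stack(feats)                         # (n_windows, H)

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 40))                     # 100 frames, 40-dim features
W_in = 0.1 * rng.normal(size=(40, 64))
W_h = 0.1 * rng.normal(size=(64, 64))
b = np.zeros(64)
out = conv_rnn_layer(x, W_in, W_h, b)
print(out.shape)  # (96, 64)
```

A traditional convolutional layer would compute each window's feature with a single affine map plus non-linearity; here each window instead gets `win` recurrent steps of computation, which is the paper's central idea.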


Conference of the International Speech Communication Association | 2016

Convolutional Neural Networks with Data Augmentation for Classifying Speakers' Native Language.

Gil Keren; Jun Deng; Jouni Pohjalainen; Björn W. Schuller

We use a feedforward Convolutional Neural Network to classify speakers' native language for the INTERSPEECH 2016 Computational Paralinguistics Challenge Native Language Sub-Challenge, using no specialized features for computational paralinguistics tasks, but only MFCCs with their first- and second-order deltas. In addition, we augment the training data by replacing the original examples with shorter overlapping samples extracted from them, thus multiplying the number of training examples by almost 40. With the augmented training dataset and enhancements to the neural network models such as Batch Normalization, Dropout, and the Maxout activation function, we improved upon the challenge baseline by a large margin on both the development and the test set.
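The augmentation step described above can be sketched as follows. This is a minimal illustration under assumed names and sizes (the actual window and hop lengths used in the paper are not given here): each long training example is replaced by shorter overlapping slices, multiplying the number of examples.

```python
import numpy as np

def augment_overlapping(example, sample_len, hop):
    """Replace one long example (frames x features) with shorter
    overlapping slices of `sample_len` frames, advancing by `hop` frames."""
    T = example.shape[0]
    return [example[s:s + sample_len]
            for s in range(0, T - sample_len + 1, hop)]

utterance = np.zeros((450, 13))   # hypothetical: 450 MFCC frames, 13 coefficients
samples = augment_overlapping(utterance, sample_len=50, hop=10)
print(len(samples))  # 41 -> roughly a 40-fold increase, as in the abstract
```

Each slice inherits the label of the original example, so the network sees many more (shorter) labeled training instances.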


Archive | 2018

Deep learning for multisensorial and multimodal interaction

Gil Keren; Amr El-Desoky Mousa; Olivier Pietquin; Stefanos Zafeiriou; Björn W. Schuller


International Conference on Multimedia and Expo | 2017

End-to-end learning for dimensional emotion recognition from physiological signals

Gil Keren; Tobias Kirschstein; Erik Marchi; Fabien Ringeval; Björn W. Schuller


National Conference on Artificial Intelligence | 2016

Tunable Sensitivity to Large Errors in Neural Network Training.

Gil Keren; Sivan Sabato; Björn W. Schuller


arXiv: Machine Learning | 2018

Weakly Supervised One-Shot Detection with Attention Siamese Networks.

Gil Keren; Maximilian Schmitt; Thomas Kehrenberg; Björn W. Schuller


Archive | 2018

The Principle of Logit Separation

Gil Keren; Sivan Sabato; Björn W. Schuller


IEEE Access | 2018

Calibrated Prediction Intervals for Neural Network Regressors

Gil Keren; Nicholas Cummins; Björn W. Schuller


Acta Acustica United With Acustica | 2018

Emotion Recognition in Speech with Latent Discriminative Representations Learning

Jing Han; Zixing Zhang; Gil Keren; Björn W. Schuller


arXiv: Machine Learning | 2017

Fast Single-Class Classification and the Principle of Logit Separation.

Gil Keren; Sivan Sabato; Björn W. Schuller

Collaboration


Dive into Gil Keren's collaborations.

Top Co-Authors

Jing Han

University of Augsburg


Jun Deng

University of Passau
