Jugurta Montalvão
Universidade Federal de Sergipe
Publication
Featured research published by Jugurta Montalvão.
IEEE International Telecommunications Symposium | 2006
Jugurta Montalvão; Carlos Augusto S. Almeida; Eduardo O. Freire
The effect of parametric equalization of time interval histograms (key down-down intervals) on the performance of keystroke-based user verification algorithms is analyzed. Four algorithms are used throughout this analysis: a classic one for static (structured) texts; a second one, also proposed in the literature, for both static and arbitrary (free) text; a new one for arbitrary-text verification; and a recently proposed algorithm in which keystroke timing is addressed indirectly in order to compare user dynamics. The algorithms' performances are presented before and after time interval histogram equalization, and the results corroborate the hypothesis that the nonlinear memoryless time interval transform proposed here, despite its simplicity, can be a useful and almost costless building block in keystroke-based biometric systems.
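As a rough illustration of the kind of memoryless interval transform discussed above, the sketch below equalizes key down-down intervals through an empirical CDF estimated on reference data. The function name and the non-parametric choice are assumptions made for illustration; the paper's transform is parametric and may differ.

```python
import numpy as np

def equalize_intervals(train_intervals, intervals):
    """Memoryless nonlinear equalization of key down-down intervals.

    Maps each interval through the empirical CDF estimated on a
    reference set, so the transformed values are roughly uniform
    in [0, 1]. Illustrative non-parametric stand-in for the
    parametric equalization discussed in the paper.
    """
    ref = np.sort(np.asarray(train_intervals, dtype=float))
    # Empirical CDF value of each interval: fraction of reference
    # intervals that are smaller or equal.
    return np.searchsorted(ref, intervals, side="right") / len(ref)

# Hypothetical usage: intervals in milliseconds from enrollment and a probe.
enroll = [120, 95, 210, 180, 140, 330, 150]
probe = [130, 100, 250]
print(equalize_intervals(enroll, probe))
```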
Pattern Recognition Letters | 2015
Jugurta Montalvão; Eduardo O. Freire; Murilo A. Bezerra; Rodolfo Garcia
Highlights: Keystroke pattern stabilization strongly affects verification performance. Each sequence of symbols induces a rhythmic profile. Rhythm stabilization seems to be a learning/forgetting process. Enrollment with post-habituation samples improves verification rates. The EER is consistently lowered as the number of strokes increases.
Rhythmic patterns in passwords are addressed through biometric verification tests. Experimental results are obtained on three publicly available databases, and the experiments are guided by two questions. Q1: How does a subject develop a stable rhythmic signature associated with a new password? Q2: How does the number of symbols affect biometric performance? Measurements show that even when subjects are instructed to train themselves before sample acquisition, a clear habituation phenomenon is noticeable at the beginning of the first sessions, both for the password .tie5Roanl and the passphrase greyc laboratory, with significant consequences for biometric verification performance. As for Q2, all experiments show that error rates are consistently lowered as password length increases. Additionally, a marginal but potentially useful observation is the stabilization of patterns around a rhythmic profile that seems to be induced by the corresponding sequence of symbols; its consequences are also addressed.
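The EER figures quoted in these verification experiments can be approximated from genuine and impostor score sets as sketched below. The helper name and the simple threshold sweep are illustrative assumptions, not the evaluation code used in the paper.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Rough EER estimate from genuine and impostor distance scores.

    Lower scores mean a better match. The EER is approximated at the
    threshold where false rejection and false acceptance rates meet,
    by minimizing the larger of the two over candidate thresholds.
    """
    genuine = np.asarray(genuine, dtype=float)
    impostor = np.asarray(impostor, dtype=float)
    best = 1.0
    for t in np.sort(np.concatenate([genuine, impostor])):
        frr = np.mean(genuine > t)     # genuine samples rejected
        far = np.mean(impostor <= t)   # impostor samples accepted
        best = min(best, max(frr, far))
    return best

# Hypothetical distances for one user's genuine and impostor attempts.
print(equal_error_rate([0.20, 0.30, 0.25], [0.60, 0.40, 0.80]))
```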
Signal Processing | 2001
Jugurta Montalvão; Bernadette Dorizzi; João Cesar M. Mota
A brief overview of Bayesian equalizers is given. Since all realistic implementations of Bayesian equalizers are approximations of the optimal one (in the sense of minimum decision error probability), we review some classical implementations found in the literature and study the performance degradation induced by the use of finite memory Bayesian equalizers instead of growing memory structures. Finally, we also point out some new ideas and results concerning sub-optimal Bayesian structures.
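As a concrete reference point for the finite memory structures mentioned above, here is a minimal symbol-by-symbol Bayesian decision for BPSK over a known FIR channel. The function and its parameters are illustrative assumptions; it enumerates all channel states exhaustively, whereas realistic implementations approximate this optimal rule.

```python
import numpy as np
from itertools import product

def bayesian_equalizer_decision(r, h, delay, sigma2):
    """Symbol-by-symbol Bayesian (MAP) decision for BPSK over an FIR channel.

    r      : received window (length m = equalizer memory)
    h      : channel impulse response (length L)
    delay  : index, within the influencing symbol block, of the decided symbol
    sigma2 : noise variance

    Enumerates every BPSK block that influences the m received samples
    (2**(m+L-1) hypotheses), computes the noiseless channel states, and
    accumulates Gaussian likelihoods per hypothesis on the delayed symbol.
    A finite memory approximation of the optimal Bayesian equalizer.
    """
    r = np.asarray(r, dtype=float)
    m, L = len(r), len(h)
    num = den = 0.0
    for s in product([-1.0, 1.0], repeat=m + L - 1):
        state = np.convolve(s, h, mode="valid")      # m noiseless outputs
        like = np.exp(-np.sum((r - state) ** 2) / (2.0 * sigma2))
        if s[delay] > 0:
            num += like
        else:
            den += like
    return 1.0 if num >= den else -1.0
```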
International Workshop on Machine Learning for Signal Processing | 2013
Denis G. Fantinato; Daniel G. Silva; Everton Z. Nadalin; Romis Attux; João Marcos Travassos Romano; Aline Neves; Jugurta Montalvão
The efforts of Yeredor, Gutch, Gruber and Theis have established a theory of blind source separation (BSS) over finite fields that can be applied to linear and instantaneous mixing models. In this work, the problem is treated for the case of convolutive mixtures, for which the process of BSS must be understood in terms of space-time processing. A method based on minimum entropy and deflation is proposed, and structural conditions for perfect signal recovery are defined, establishing interesting points of contact with canonical MIMO equalization. Simulation results support the applicability of the proposed algorithm and also reveal the important role of efficient entropy estimation when the complexity of the mixing system is increased.
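A minimal sketch of the minimum entropy/deflation idea over GF(2) is given below, assuming a brute-force search over binary space-time (XOR) combinations of delayed mixtures. The function names and the exhaustive search are illustrative assumptions, not the authors' algorithm, and only scale to very small problems.

```python
import numpy as np
from itertools import product

def empirical_entropy(x):
    """Plug-in Shannon entropy (bits) of a discrete sequence."""
    _, counts = np.unique(x, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def min_entropy_component(mixtures, max_lag=2):
    """Deflation step sketch for convolutive BSS over GF(2).

    mixtures : (n, T) array of 0/1 integers (the observed mixtures)
    Searches every binary space-time combination (XOR of delayed
    versions of the mixtures) and keeps the one of minimum entropy,
    which under the minimum-entropy principle should expose one source.
    """
    n, T = mixtures.shape
    best, best_h = None, np.inf
    for w in list(product([0, 1], repeat=n * max_lag))[1:]:  # skip all-zero
        y = np.zeros(T - max_lag, dtype=int)
        for i in range(n):
            for d in range(max_lag):
                if w[i * max_lag + d]:
                    y ^= mixtures[i, d:T - max_lag + d]
        h = empirical_entropy(y)
        if h < best_h:
            best, best_h = y, h
    return best, best_h
```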
Pattern Recognition Letters | 2012
Jugurta Montalvão; Marcos Renato Rodrigues Araujo
We hypothesize that spectral masking may account for most of the gains in robustness against noise obtained with the ensemble interval histogram (EIH) and zero crossing with peak amplitude (ZCPA) representations, compared to Mel-frequency cepstral coefficients (MFCCs). To test this hypothesis, we compare two MFCC implementations whose only difference is spectral masking. The comparison involves biometric speaker verification tasks using two publicly available databases. The results confirm the superiority of MFCC with masking, corroborating our hypothesis that masking is a key aspect of improved robustness in feature extraction.
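One crude way to emulate the masking step in question is to floor spectral components that are weak relative to their local neighbourhood before the mel filterbank, as in the sketch below. The window width and relative threshold are assumptions for illustration; the exact masking model used in the paper may differ.

```python
import numpy as np

def apply_spectral_masking(power_spectrum, rel_threshold=0.05, width=5):
    """Crude simultaneous-masking step applied before a mel filterbank.

    Components weaker than rel_threshold times the strongest component
    in a local window of +/- width bins are set to zero, loosely mimicking
    the masking behaviour that EIH/ZCPA front ends exhibit implicitly.
    Illustrative only.
    """
    s = np.asarray(power_spectrum, dtype=float)
    masked = s.copy()
    for k in range(len(s)):
        lo, hi = max(0, k - width), min(len(s), k + width + 1)
        if s[k] < rel_threshold * s[lo:hi].max():
            masked[k] = 0.0
    return masked
```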
Signal Processing | 2013
Daniel G. Silva; Everton Z. Nadalin; Jugurta Montalvão; Romis Attux
In 2007, a theory of ICA over finite fields emerged and an algorithm based on pairwise comparison of mixtures, called MEXICO, was developed to deal with this new problem. In this letter, we propose improvements in the method that, according to simulations in GF(2) and GF(3) scenarios, lead to a faster convergence and better separation results, increasing the application possibilities of the new theory in the context of large databases.
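The pairwise comparison idea behind MEXICO can be sketched as an entropy-reducing sweep over XOR combinations of mixtures, as below for GF(2). This is a simplified illustration with assumed function names; it omits the improvements proposed in the letter.

```python
import numpy as np

def entropy_bits(x):
    """Plug-in Shannon entropy (bits) of a discrete sequence."""
    _, c = np.unique(x, return_counts=True)
    p = c / c.sum()
    return -np.sum(p * np.log2(p))

def mexico_like_separation(X, max_sweeps=10):
    """Pairwise, entropy-reducing separation over GF(2).

    X : (n, T) array of 0/1 integers (the observed mixtures)
    For every ordered pair (i, j), if XOR-ing row j into row i lowers
    the empirical entropy of row i, the update is kept. Sweeps are
    repeated until no pair improves.
    """
    Y = X.copy()
    for _ in range(max_sweeps):
        improved = False
        for i in range(Y.shape[0]):
            for j in range(Y.shape[0]):
                if i == j:
                    continue
                cand = Y[i] ^ Y[j]
                if entropy_bits(cand) < entropy_bits(Y[i]):
                    Y[i] = cand
                    improved = True
        if not improved:
            break
    return Y
```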
Signal Processing | 2015
Daniel G. Silva; Jugurta Montalvão; Romis Attux; Luis Coradine
This work proposes a new approach to the blind inversion of Wiener systems. A Wiener system is composed of a linear time-invariant (LTI) sub-system followed by a memoryless nonlinear function. The goal is to recover the input signal knowing only the output of the Wiener system, and the straightforward scheme to achieve this is the Hammerstein system: a memoryless nonlinear mapping followed by an LTI sub-system applied to the output signal of the Wiener system. If the input of the Wiener system is originally iid and some mild conditions are satisfied, the inversion is possible. Based on this statement and on the limitations of relevant previous works, a solution is proposed combining (i) immune-inspired optimization algorithms, (ii) information theory and (iii) IIR filters, yielding a robust scheme with a relatively reduced risk of local convergence. Experimental results indicate a similar or superior performance of the new approach in comparison with two other blind methodologies.
Highlights: This work proposes an immune-inspired algorithm to perform blind inversion of Wiener systems. The method employs mutual information-based criteria and IIR filters to adapt the inverse structure. The experimental results indicate a superior or equivalent performance of the new technique in comparison with gradient-based search or a kurtosis-based criterion.
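To make the Hammerstein inversion structure concrete, the sketch below applies an inverse static nonlinearity followed by an IIR filter to a toy Wiener system. In the blind setting of the paper these components would be adapted (e.g. by the immune-inspired, mutual-information-based search); here they are simply given, and the toy channel and nonlinearity are assumptions for illustration.

```python
import numpy as np
from scipy.signal import lfilter

def hammerstein_inverse(y, inv_nl, b, a):
    """Hammerstein inverse of a Wiener system: static nonlinearity, then IIR.

    inv_nl : callable approximating the inverse of the Wiener system's
             memoryless nonlinearity
    b, a   : numerator/denominator of the IIR filter inverting the LTI part
    """
    z = inv_nl(y)            # undo the memoryless nonlinearity
    return lfilter(b, a, z)  # undo the linear sub-system

# Toy Wiener system: LTI h(z) = 1 + 0.5 z^-1 followed by tanh.
rng = np.random.default_rng(0)
s = rng.choice([-1.0, 1.0], size=1000)         # iid input
y = np.tanh(lfilter([1.0, 0.5], [1.0], s))     # observed Wiener output

# Known inverse for the toy case: arctanh, then the IIR 1 / (1 + 0.5 z^-1).
s_hat = hammerstein_inverse(y, np.arctanh, [1.0], [1.0, 0.5])
print(np.max(np.abs(s - s_hat)))               # should be close to zero
```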
ISSNIP Biosignals and Biorobotics Conference: Biosignals and Robotics for Better and Safer Living | 2013
Jânio Canuto; Bernadette Dorizzi; Jugurta Montalvão
In this paper, we propose to use the minimum jerk principle for representing on-line signatures. We briefly describe the minimum jerk model and the automatic procedure proposed for applying it to signatures. Results on the MCYT-100 signature database are analysed in terms of reconstruction error, residuals and stability. These results show that, despite its simplicity, the proposed model agrees with previous works on signature modelling and entropy-based signature categorization. Finally, possible applications and future works are suggested.
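The minimum jerk profile underlying this kind of representation has a well-known closed form (the fifth-order blend 10t^3 - 15t^4 + 6t^5 with zero velocity and acceleration at both ends). The sketch below builds one such segment between two pen positions; the function name and the idea of chaining segments between via-points are illustrative assumptions, not the paper's full procedure.

```python
import numpy as np

def minimum_jerk_segment(p0, p1, n_samples):
    """Minimum-jerk position profile between two pen positions.

    Uses the classic fifth-order polynomial blend 10*t^3 - 15*t^4 + 6*t^5,
    which gives zero velocity and acceleration at both endpoints. A stroke
    can be approximated by chaining such segments between via-points.
    """
    tau = np.linspace(0.0, 1.0, n_samples)
    blend = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return np.outer(1 - blend, p0) + np.outer(blend, p1)

# Hypothetical: two via-points of an on-line signature, 50 samples between them.
segment = minimum_jerk_segment(np.array([0.0, 0.0]), np.array([3.0, 1.0]), 50)
```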
International Workshop on Machine Learning for Signal Processing | 2012
Daniel G. Silva; Everton Z. Nadalin; Romis Attux; Jugurta Montalvão
The theory of ICA over finite fields, established in the last five years, gave rise to a corpus of different separation strategies, which includes an algorithm based on the pairwise comparison of mixtures, called MEXICO. In this work, we propose an alternative version of the MEXICO algorithm, with modifications that - as shown by the results obtained for a number of representative scenarios - lead to performance improvements in terms of the computational effort required to reach a certain performance level, especially for an elevated number of sources. This parsimony can be relevant to enhance the applicability of the new ICA theory to data mining in the context of large discrete-valued databases.
International Conference on Biometrics Theory, Applications and Systems | 2016
Aythami Morales; Julian Fierrez; Marta Gomez-Barrero; Javier Ortega-Garcia; Roberto Daza; John V. Monaco; Jugurta Montalvão; Jânio Coutinho Canuto; Anjith George
This paper presents the first Keystroke Biometrics Ongoing evaluation platform and a Competition (KBOC) organized to promote reproducible research and establish a baseline in person authentication using keystroke biometrics. The ongoing evaluation tool has been developed using the BEAT platform and includes keystroke sequences (fixed-text) from 300 users acquired in 4 different sessions. In addition, the results of a parallel offline competition based on the same data and evaluation protocol are presented. The results reported have achieved EERs as low as 5.32%, which represent a challenging baseline for keystroke recognition technologies to be evaluated on the new publicly available KBOC benchmark.