Lance C. Pérez
University of Nebraska–Lincoln
Publication
Featured research published by Lance C. Pérez.
IEEE Transactions on Circuits and Systems for Video Technology | 2013
Jędrzej Kowalczuk; Eric T. Psota; Lance C. Pérez
High-quality real-time stereo matching has the potential to enable various computer vision applications, including semi-automated robotic surgery, teleimmersion, and 3-D video surveillance. A novel real-time stereo matching method is presented that uses a two-pass approximation of adaptive support-weight aggregation and a low-complexity iterative disparity refinement technique. Through an evaluation of computationally efficient approaches to adaptive support-weight cost aggregation, it is shown that the two-pass method produces an accurate approximation of the support weights while greatly reducing the complexity of aggregation. The refinement technique, constructed using a probabilistic framework, incorporates an additive term into matching cost minimization and facilitates iterative processing to improve the accuracy of the disparity map. The method has been implemented on massively parallel, high-performance graphics hardware using the Compute Unified Device Architecture (CUDA) computing engine. Results show that the proposed method is the most accurate among all of the real-time stereo matching methods listed on the Middlebury stereo benchmark.
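The two-pass idea can be illustrated with a short sketch. The NumPy code below is a minimal, unoptimized illustration only (aggregate the cost volume first along rows, then along columns, with 1-D support weights driven by color similarity and spatial proximity); the weight parameters, window radius, and the CUDA-level optimizations of the published method are not reproduced here.

```python
import numpy as np

def asw_weights_1d(guide_line, center, gamma_c=10.0, gamma_p=10.0):
    """1-D adaptive support weights combining color similarity and
    spatial proximity (gamma_c and gamma_p are illustrative parameters)."""
    color_diff = np.abs(guide_line - guide_line[center]).sum(axis=-1)
    spatial = np.abs(np.arange(len(guide_line)) - center)
    return np.exp(-color_diff / gamma_c - spatial / gamma_p)

def two_pass_aggregation(cost_volume, guide, radius=5):
    """Approximate 2-D adaptive support-weight aggregation with a
    horizontal pass followed by a vertical pass over the cost volume.

    cost_volume: (H, W, D) float matching costs for D candidate disparities
    guide:       (H, W, 3) color image that drives the support weights
    """
    H, W, D = cost_volume.shape
    horiz = np.empty_like(cost_volume)
    for y in range(H):
        for x in range(W):
            lo, hi = max(0, x - radius), min(W, x + radius + 1)
            w = asw_weights_1d(guide[y, lo:hi].astype(float), x - lo)
            horiz[y, x] = (w[:, None] * cost_volume[y, lo:hi]).sum(0) / w.sum()
    out = np.empty_like(cost_volume)
    for x in range(W):
        for y in range(H):
            lo, hi = max(0, y - radius), min(H, y + radius + 1)
            w = asw_weights_1d(guide[lo:hi, x].astype(float), y - lo)
            out[y, x] = (w[:, None] * horiz[lo:hi, x]).sum(0) / w.sum()
    return out

# Winner-take-all disparities after aggregation (the refinement step is not shown):
# disparity = two_pass_aggregation(cost_volume, guide).argmin(axis=2)
```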
IEEE Communications Letters | 1999
Christian Schlegel; Lance C. Pérez
Turbo-codes have been hailed as the ultimate step toward achieving the capacity limit Shannon established some 50 years ago. We look at the performance of turbo-codes with respect to various information-theoretic error bounds. This comparison suggests that, if block (frame) error rates are considered, careful interleaver design is necessary to ensure an error performance within a fraction of a decibel of the theoretical limit for large block sizes, while random interleavers perform well for block sizes smaller than about 2K. If the bit error performance is considered, interleaver design seems to have only a minor effect, and the codes perform close to the limit for all block sizes considered.
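For reference, the capacity limit referred to here can be computed directly for the unconstrained-input AWGN channel; the short sketch below evaluates the minimum Eb/N0 at a given code rate (the binary-input limit used in some turbo-code comparisons is slightly higher and requires numerical integration).

```python
import math

def shannon_limit_db(rate):
    """Minimum Eb/N0 (in dB) for reliable communication at a given code
    rate over the real-valued, unconstrained-input AWGN channel.

    Derived from C = 0.5 * log2(1 + 2*R*Eb/N0) with C set equal to R,
    which gives Eb/N0 = (2**(2R) - 1) / (2R).
    """
    ebno = (2 ** (2 * rate) - 1) / (2 * rate)
    return 10 * math.log10(ebno)

print(shannon_limit_db(0.5))    # 0.0 dB for a rate-1/2 code
print(shannon_limit_db(1e-6))   # approaches -1.59 dB (the ultimate limit) as R -> 0
```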
2008 5th International Symposium on Turbo Codes and Related Topics | 2008
Nathan Axvig; Deanna Dreher; Katherine Morrison; Eric T. Psota; Lance C. Pérez; Judy L. Walker
Simulations have shown that the outputs of min-sum (MS) decoding generally behave in one of two ways: either the output vector eventually stabilizes at a codeword or it eventually cycles through a finite set of vectors that may include both codewords and non-codewords. The latter behavior has significantly contributed to the difficulty in studying the performance of this decoder. To overcome this problem, a new decoder, average min-sum (AMS), is proposed; this decoder outputs the average of the MS output vectors over a finite set of iterations. Simulations comparing MS, AMS, linear programming (LP) decoding, and maximum likelihood (ML) decoding are presented, illustrating the relative performances of each of these decoders. In general, MS and AMS have comparable word error rates; however, in the simulation of a code with large block length, AMS has a significantly lower bit error rate. Finally, AMS pseudocodewords are introduced and their relationship to graph cover and LP pseudocodewords is explored, with particular focus on the AMS pseudocodewords of regular LDPC codes and cycle codes.
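A compact sketch of the AMS idea described above is given below: standard min-sum message passing over a parity-check matrix, with the final decision taken on the average of the soft-output vectors over a trailing window of iterations. The dense-matrix layout, window length, and flooding schedule are illustrative choices, not the authors' implementation.

```python
import numpy as np

def average_min_sum(H, llr, iters=50, avg_window=20):
    """Average min-sum (AMS) decoding sketch.

    H:   (m, n) binary parity-check matrix (0/1 entries, check degree >= 2)
    llr: (n,)  channel log-likelihood ratios (positive favors bit 0)
    """
    m, n = H.shape
    mask = H.astype(bool)
    V = np.where(mask, llr, 0.0)          # variable-to-check messages
    outputs = []
    for _ in range(iters):
        # check-node update: extrinsic sign product and minimum magnitude
        sign = np.where(mask, np.sign(V) + (V == 0), 1.0)
        row_sign = sign.prod(axis=1, keepdims=True)
        mag = np.where(mask, np.abs(V), np.inf)
        min1 = mag.min(axis=1, keepdims=True)
        tmp = mag.copy()
        tmp[np.arange(m), mag.argmin(axis=1)] = np.inf
        min2 = tmp.min(axis=1, keepdims=True)   # used where the edge holds the minimum
        C = np.where(mask,
                     row_sign * sign * np.where(mag == min1, min2, min1),
                     0.0)
        # variable-node update and per-iteration soft output
        total = llr + C.sum(axis=0)
        V = np.where(mask, total - C, 0.0)
        outputs.append(total)
    avg = np.mean(outputs[-avg_window:], axis=0)   # AMS: average of MS soft outputs
    return (avg < 0).astype(int)                   # hard decisions

# Example with the (7,4) Hamming code parity-check matrix and noisy LLRs:
# H = np.array([[1,1,0,1,1,0,0], [1,0,1,1,0,1,0], [0,1,1,1,0,0,1]])
# llr = np.array([2.1, -0.3, 1.7, 0.9, -1.2, 2.5, 0.4])
# print(average_min_sum(H, llr))
```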
IEEE Transactions on Information Theory | 1994
Steven S. Pietrobon; Gottfried Ungerboeck; Lance C. Pérez; Daniel J. Costello
A general parity-check equation is presented that defines rotationally invariant trellis codes of rate k/(k+1) for two-dimensional signal sets. This parity-check equation is used to find rate k/(k+1) codes for 4PSK, 8PSK, 16PSK, and QAM signal sets by systematic code searches. The MPSK codes exhibit smaller free Euclidean distances than nonrotationally invariant linear codes with the same number of states. However, since the nonlinear codes have a smaller number of nearest neighbors, their performance at moderate signal-to-noise ratios is close to that of the best linear codes. The rotationally invariant QAM codes with 8, 32, 64, and 256 states achieve the same free Euclidean distance as the best linear codes. Transparency of user information under phase rotations is accomplished either by conventional differential encoding and decoding, or by integrating this function directly into the code trellis.
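The conventional differential encoding mentioned at the end of the abstract is easy to illustrate; the sketch below operates on 4PSK phase indices and shows that a constant phase rotation of the received sequence affects only the first decoded symbol. The parity-check construction itself is not reproduced here.

```python
import numpy as np

def differential_encode(symbols, M=4):
    """Encode information onto phase differences: each transmitted phase
    index is the running sum (mod M) of the input symbols."""
    out, state = [], 0
    for s in symbols:
        state = (state + s) % M
        out.append(state)
    return np.array(out)

def differential_decode(received, M=4):
    """Recover the phase differences; a constant rotation of the whole
    sequence (e.g. a k*90-degree carrier ambiguity for 4PSK) cancels out."""
    prev = np.concatenate(([0], received[:-1]))
    return (received - prev) % M

data = np.array([1, 3, 0, 2, 1])
tx = differential_encode(data)
rotated = (tx + 1) % 4                # constant 90-degree phase rotation
print(differential_decode(rotated))   # only the first symbol is affected; the rest match data
```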
IEEE International Conference on RFID | 2012
Jason L. Brchan; Lianlin Zhao; Jiaqing Wu; Robert E. Williams; Lance C. Pérez
This paper introduces a real-time localization system (RTLS) that uses multiple efficient propagation models to compensate for the limitations of the received signal strength technique. The RTLS is implemented on an active RFID system and uses received signal strength measurements and reference tags for ranging. The RTLS is implemented purely in software that post-processes the received signal strength data from the reader and does not require any additional hardware or any modifications to the RFID reader or tags. The proposed algorithm using multiple propagation models improves the performance of the RTLS. Two-dimensional localization results are given for a four-reader system covering a 4.5 by 5.5 meter room. Both single-tag and two-tag configurations of the tagged object are evaluated. The results demonstrate that tag multiplicity, i.e., attaching two tags to the target object, improves the performance of the system by reducing inaccurate received signal strength measurements caused by poor tag orientation. Experimental results show that the proposed system achieves a localization accuracy within 1 meter in over 50 percent of the experiments and outperforms other comparable systems. Ongoing work extending the system to three-dimensional localization is also discussed and results are presented.
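The paper's specific propagation models and reference-tag calibration are not reproduced here; as a rough illustration of the pipeline such a system builds on, the sketch below converts RSSI to range with a single log-distance path-loss model (the parameters rssi_ref and n are placeholders) and then multilaterates over four readers with linear least squares.

```python
import numpy as np

def rssi_to_distance(rssi, rssi_ref=-40.0, n=2.0, d_ref=1.0):
    """Invert a log-distance path-loss model to estimate range from RSSI.
    rssi_ref is the RSSI at reference distance d_ref and n is the
    path-loss exponent; both would be calibrated per environment."""
    return d_ref * 10 ** ((rssi_ref - rssi) / (10 * n))

def locate(readers, distances):
    """2-D linear least-squares multilateration from reader positions
    and estimated ranges."""
    readers = np.asarray(readers, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0 = readers[0]
    A = 2 * (readers[1:] - readers[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + (readers[1:] ** 2).sum(axis=1) - x0 ** 2 - y0 ** 2)
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Four readers at the corners of a 4.5 m x 5.5 m room, one RSSI reading each
readers = [(0.0, 0.0), (4.5, 0.0), (0.0, 5.5), (4.5, 5.5)]
ranges = [rssi_to_distance(r) for r in (-46.0, -50.0, -49.0, -52.0)]
print(locate(readers, ranges))   # estimated (x, y) tag position in meters
```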
Communications of The ACM | 2010
Stephen Cooper; Lance C. Pérez; Daphne Y. Rainey
Enhancing student learning and understanding by combining theories of learning with the computer's unique attributes.
IEEE Transactions on Information Theory | 2009
Nathan Axvig; Deanna Dreher; Katherine Morrison; Eric T. Psota; Lance C. Pérez; Judy L. Walker
The role of pseudocodewords in causing non-codeword outputs in linear programming decoding, graph cover decoding, and iterative message-passing decoding is investigated. The three main types of pseudocodewords in the literature (linear programming pseudocodewords, graph cover pseudocodewords, and computation tree pseudocodewords) are reviewed and connections between them are explored. Some discrepancies in the literature on minimal and irreducible pseudocodewords are highlighted and clarified, and the minimal degree cover necessary to realize a pseudocodeword is found. Additionally, some conditions for the existence of connected realizations of graph cover pseudocodewords are given. This allows for further analysis of when graph cover pseudocodewords induce computation tree pseudocodewords. Finally, an example is offered that shows that existing theories on the distinction between graph cover pseudocodewords and computation tree pseudocodewords are incomplete.
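For reference, the linear programming decoder and its pseudocodewords can be stated in the standard form due to Feldman and to Vontobel and Koetter; the notation below is generic rather than the paper's.

```latex
% Fundamental polytope of a parity-check matrix H with check neighborhoods N(j):
\[
  P(H) \;=\; \bigcap_{j} \operatorname{conv}\!\Bigl(
      \bigl\{\, x \in \{0,1\}^{n} : \textstyle\sum_{i \in N(j)} x_i \equiv 0 \pmod 2 \,\bigr\}\Bigr).
\]
% LP decoding minimizes the channel cost over this relaxation:
\[
  \hat{f} \;=\; \arg\min_{f \in P(H)} \sum_{i=1}^{n} \gamma_i f_i,
  \qquad
  \gamma_i \;=\; \log \frac{\Pr(y_i \mid x_i = 0)}{\Pr(y_i \mid x_i = 1)} .
\]
% A codeword of an M-cover of the Tanner graph induces a graph cover
% pseudocodeword p = (p_1, ..., p_n), where p_i counts the ones among the
% M copies of variable node i; the normalized vector p / M lies in P(H).
```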
Science | 2010
Kent Foster; K. B. Bergin; A. F. McKenna; D. L. Millard; Lance C. Pérez; J. T. Prival; Daphne Y. Rainey; H. M. Sevian; E. A. VanderPutten; J. E. Hamos
Schoolteachers and higher-education faculty can benefit one another in improving teaching and student learning. As leaders in higher education, industry, and government (1) bemoan the limited academic success of students in science, technology, engineering, and mathematics (STEM), many practices of academe impede the ability of college and university faculty to address the issues. Consistent with barriers to community-engaged scholarship in general (2), STEM faculty engagement in elementary and secondary schools (K–12) can be undermined, for example, by (i) low status accorded to STEM education research and publications, (ii) a zero-sum view of faculty time allocation (e.g., K–12 engagement means time away from work more highly rewarded during promotion, tenure, and merit review), and (iii) bureaucracies that hinder collaboration between STEM faculty and K–12 teachers and administrators (3).
Proceedings of the 16th Annual Conference on Innovation and Technology in Computer Science Education (ITiCSE), Working Group Reports | 2011
Lance C. Pérez; Stephen Cooper; Elizabeth K. Hawthorne; Susanne Wetzel; Joel Brynielsson; Asım Gençer Gökce; John Impagliazzo; Youry Khmelevsky; Karl J. Klee; Margaret Leary; Amelia Philips; Norbert Pohlmann; Blair Taylor; Shambhu J. Upadhyaya
The 2011 ITiCSE working group on information assurance (IA) education examined undergraduate curricula at the two- and four-year levels, both within and outside the United States (US). A broad set of two-year IA degree programs was examined in order to get a sense of the similarities and differences between them. A broad set of four-year IA degree programs was also examined to explore their similarities and differences. A comparison between the two-year and four-year degree programs revealed that the common challenge of articulation between two- and four-year programs exists in IA as well. The challenge of articulation was explored in some depth in order to understand what remedies might be available. Finally, a number of IA programs at international institutions were examined in order to gain insight into differences between US and non-US IA programs.
International Conference on Computer Vision | 2015
Eric T. Psota; Jędrzej Kowalczuk; Mateusz Mittek; Lance C. Pérez
A new method is introduced for stereo matching that operates on minimum spanning trees (MSTs) generated from the images. Disparity maps are represented as a collection of hidden states on MSTs, and each MST is modeled as a hidden Markov tree. An efficient recursive message-passing scheme designed to operate on hidden Markov trees, known as the upward-downward algorithm, is used to compute the maximum a posteriori (MAP) disparity estimate at each pixel. The messages processed by the upward-downward algorithm involve two types of probabilities: the probability of a pixel having a particular disparity given a set of per-pixel matching costs, and the probability of a disparity transition between a pair of connected pixels given their similarity. The distributions of these probabilities are modeled from a collection of images with ground truth disparities. Performance evaluation using the Middlebury stereo benchmark version 3 demonstrates that the proposed method ranks second and third in terms of overall accuracy when evaluated on the training and test image sets, respectively.
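The upward-downward computation itself is standard and can be sketched compactly; below is a minimal Python version of per-node posterior maximization on a hidden Markov tree, with the unary likelihoods and transition table standing in for the learned per-pixel and disparity-transition probabilities (the MST construction, the learned models, and the normalization details of the published method are not reproduced).

```python
import numpy as np

def upward_downward_map(parent, unary, transition):
    """Per-node MAP states on a hidden Markov tree via an upward pass of
    messages toward the root followed by a downward pass back out.

    parent:     length-N list, parent[i] = parent of node i (-1 for the root)
    unary:      (N, D) per-node state likelihoods (e.g. from matching costs)
    transition: (D, D) transition[a, b] = P(child state b | parent state a)
    """
    N, D = unary.shape
    children = [[] for _ in range(N)]
    root = 0
    for i, p in enumerate(parent):
        if p >= 0:
            children[p].append(i)
        else:
            root = i

    # pre-order traversal: every parent appears before its children
    order, stack = [], [root]
    while stack:
        v = stack.pop()
        order.append(v)
        stack.extend(children[v])

    up = np.ones((N, D))    # message from node v to its parent
    for v in reversed(order):
        belief = unary[v].copy()
        for c in children[v]:
            belief *= up[c]
        if parent[v] >= 0:
            msg = transition @ belief
            up[v] = msg / msg.sum()

    down = np.ones((N, D))  # message from the parent of v to v
    for v in order:
        base = unary[v] * down[v]
        for c in children[v]:
            others = base.copy()
            for c2 in children[v]:
                if c2 != c:
                    others *= up[c2]
            msg = transition.T @ others
            down[c] = msg / msg.sum()

    # posterior marginals and per-node MAP estimates
    post = unary * down
    for v in range(N):
        for c in children[v]:
            post[v] *= up[c]
    return post.argmax(axis=1)

# Tiny example: a 4-node tree (node 0 is the root) with 3 possible states
parent = [-1, 0, 0, 1]
unary = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4],
                  [0.2, 0.2, 0.6]])
transition = np.full((3, 3), 0.1) + 0.7 * np.eye(3)
print(upward_downward_map(parent, unary, transition))
```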