Publications


Featured research published by Ronald Mellado Miller.


Archives of Clinical Neuropsychology | 2014

Using Likelihood Ratios to Detect Invalid Performance with Performance Validity Measures

John E. Meyers; Ronald Mellado Miller; Lisa M. Thompson; Adam M. Scalese; Bonnie C. Allred; Zachary W. Rupp; Zacharias P. Dupaix; Amy Junghyun Lee

Larrabee (2008) applied chained likelihood ratios to selected performance validity measures (PVMs) to identify non-valid performances on neuropsychological tests. He presented a method of combining different PVMs with different sensitivities and specificities into an overall probability of non-validity. We applied his methodology to a set of 11 PVMs using a sample of 255 subjects. The results show that, with various combinations of two or three PVMs, the chained likelihood ratio method can identify invalid performance with high reliability. This study advances the ability of clinicians to chain various PVMs together and calculate the probability that a set of data is invalid.
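The chained likelihood ratio approach can be illustrated with a short calculation. The sketch below is a minimal, hypothetical example of the general method (not the paper's data, weights, or cutoffs): each failed PVM contributes a positive likelihood ratio, LR+ = sensitivity / (1 - specificity), the ratios are multiplied together, and the product converts a prior probability of invalid performance into a posterior probability via the odds form of Bayes' theorem.

```python
# Minimal sketch of chaining likelihood ratios across failed PVMs.
# The sensitivity/specificity values below are illustrative placeholders,
# not figures from the study.

def chained_posterior(prior_prob, pvm_results):
    """pvm_results: list of (sensitivity, specificity) for each failed PVM."""
    odds = prior_prob / (1.0 - prior_prob)           # prior odds of invalidity
    for sensitivity, specificity in pvm_results:
        lr_plus = sensitivity / (1.0 - specificity)  # LR+ for a failed PVM
        odds *= lr_plus                              # chain by multiplying LRs
    return odds / (1.0 + odds)                       # convert back to a probability

# Example: base rate 0.30, two failed PVMs with hypothetical operating points.
print(chained_posterior(0.30, [(0.55, 0.90), (0.60, 0.92)]))
```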


Applied Neuropsychology | 2013

A Validated Seven-Subtest Short Form for the WAIS-IV

John E. Meyers; Margaret M. Zellinger; Tim Kockler; Mark T. Wagner; Ronald Mellado Miller

This study presents a short form of the Wechsler Adult Intelligence Scale-Fourth Edition (WAIS-IV; Wechsler, 2008) using the subtests (Block Design, Similarities, Digit Span, Arithmetic, Information, Coding, and Picture Completion) suggested by Ward (1990). These seven subtests were used to predict the full-form WAIS-IV Full-Scale IQ, as well as the Verbal Comprehension, Perceptual Reasoning, Working Memory, and Processing Speed Index scores. Two different data sets were used: the first consisted of 70 subjects and the second consisted of 32 subjects. The first data set was used to create a linear regression equation, and the second was used to validate the results and compare them with the prorated score method from the WAIS-IV manual. The prorated estimates correlated significantly with their counterparts and proved to be a better method of estimating the Full-Scale IQ and most of the index scores, but the regression equation was better at predicting the Processing Speed Index. The current study is consistent with the Ward (1990) and Pilgrim, Meyers, Bayless, & Whetstone (1999) studies and represents a reliable and valid way of assessing intellectual functioning in an abbreviated format.
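As a rough illustration of the regression-based short form described above, the sketch below fits an ordinary least squares equation predicting Full-Scale IQ from the seven subtest scaled scores. The data are simulated and the resulting coefficients are placeholders, not the study's published equation.

```python
# Sketch: predict FSIQ from seven subtest scaled scores with OLS regression.
import numpy as np

rng = np.random.default_rng(0)
n = 70                                                 # size of the derivation sample in the study
X = rng.integers(1, 20, size=(n, 7)).astype(float)     # seven subtest scaled scores (simulated)
y = 100 + (X - 10).sum(axis=1) * 1.5 + rng.normal(0, 3, n)  # simulated full-form FSIQ

# Ordinary least squares: FSIQ_hat = b0 + b1*BD + b2*SI + ... + b7*PC
A = np.column_stack([np.ones(n), X])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_fsiq(subtest_scores):
    """Predict FSIQ from the seven subtest scaled scores."""
    return coefs[0] + np.dot(coefs[1:], subtest_scores)

print(round(predict_fsiq([10, 11, 9, 12, 10, 8, 11]), 1))
```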


Applied Neuropsychology | 2014

An adaptation of the MMPI-2 Meyers Index for the MMPI-2-RF.

John E. Meyers; Ronald Mellado Miller; Nathan A. Haws; Jason L. Murphy-Tafiti; Thomas D. Curtis; Zachary W. Rupp; Taylor A. Smart; Lisa M. Thompson

Using an overall sample of 278 individuals who had taken the Minnesota Multiphasic Personality Inventory-Second Edition (MMPI-2) and who had clear diagnostic information available in their medical records, the Meyers Index (MI) for the MMPI-2 (Meyers, Millis, & Volkert, 2002) was calculated for each individual, along with a new version of the MI created for the MMPI-2 Restructured Form (MMPI-2-RF). The MI is a method of combining multiple MMPI-2 validity scales into a single weighted index to assess exaggerated self-report on the MMPI-2. The new index (MI-r) is intended to provide the same type of global assessment of validity for the MMPI-2-RF. The MI and the MI-r were compared at both the individual and group levels and were found to correlate well (r = .87). Diagnostic groups of litigants and nonlitigants with traumatic brain injury, chronic pain, and posttraumatic stress disorder were also examined, and the performance of the MI and the MI-r was similar. Similarly, the pass/fail agreement rate between the two scales was 93%. The results indicate that the MI and MI-r perform very similarly and are good methods of assessing the overall validity of MMPI-2 and MMPI-2-RF test performance.
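To make the kind of comparison reported above concrete, the sketch below computes two hypothetical weighted validity indices for the same simulated respondents and then reports their correlation and pass/fail agreement rate. The scale scores, weights, and cutoffs are illustrative assumptions, not the published Meyers Index values.

```python
# Sketch: compare two weighted validity indices by correlation and pass/fail agreement.
import numpy as np

rng = np.random.default_rng(1)
n = 278                                    # sample size reported in the study
scales = rng.normal(50, 10, size=(n, 5))   # simulated T-scores on five validity scales

weights_a = np.array([1.0, 1.0, 2.0, 1.0, 0.5])   # hypothetical weights for index A
weights_b = np.array([1.0, 0.5, 2.0, 1.5, 0.5])   # hypothetical weights for index B

index_a = scales @ weights_a
index_b = scales @ weights_b

r = np.corrcoef(index_a, index_b)[0, 1]            # correlation between the two indices
cut_a, cut_b = np.percentile(index_a, 90), np.percentile(index_b, 90)
fail_a, fail_b = index_a >= cut_a, index_b >= cut_b
agreement = np.mean(fail_a == fail_b)              # pass/fail agreement rate

print(f"r = {r:.2f}, pass/fail agreement = {agreement:.0%}")
```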


Applied Neuropsychology | 2015

Normative Data for the Neurobehavioral Symptom Inventory

John E. Meyers; James English; Ronald Mellado Miller; Amy Junghyun Lee

The demographically diverse populations served by large health care systems (Veterans Affairs, Department of Defense, Medicare, Medicaid) are routinely screened with the Neurobehavioral Symptom Inventory (NSI). The extent to which a patient's report of symptoms, either initially or across time, is affected by demographic variables (gender, ethnicity, age, or education) has not been investigated despite widespread use of the NSI. In practice, the effectiveness of this tool might be improved with demographically based norms. A large data set of normal community-dwelling individuals was collected using the NSI, with an emphasis on recruiting individuals from diverse ethnic backgrounds. It was hypothesized that ethnic/cultural background would have an impact on NSI scores. The results provide normative data for the NSI applicable to a wide variety of individuals of various ages and ethnic backgrounds. An analysis of variance indicated no significant difference in NSI responses based on ethnic/cultural background; however, age and gender were found to contribute significantly to the variance associated with symptom endorsement. The NSI appears to be a reliable measure of self-reported postconcussive symptoms, and age is a variable associated with differential symptom endorsement. Follow-up studies are needed to provide a measure of the sensitivity and specificity of this measure.
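As an illustration of the kind of analysis reported above, the sketch below runs an analysis of variance on simulated NSI total scores with ethnicity, gender, and age as predictors. The column names, model specification, and data are assumptions for demonstration only.

```python
# Sketch: ANOVA testing whether NSI total scores differ by ethnicity, gender, and age.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "nsi_total": rng.normal(12, 8, n).clip(0),           # simulated NSI totals
    "ethnicity": rng.choice(["A", "B", "C", "D"], n),     # simulated group labels
    "gender":    rng.choice(["F", "M"], n),
    "age":       rng.integers(18, 80, n),
})

model = smf.ols("nsi_total ~ C(ethnicity) + C(gender) + age", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))                    # F-tests for each factor
```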


Applied Neuropsychology | 2013

Are Self-Ratings of Functional Difficulties Objective or Subjective?

Ronald Mellado Miller; Nathan A. Haws; Jason L. Murphy-Tafiti; Carlyn D. Hubner; Thomas D. Curtis; Zachary W. Rupp; Taylor A. Smart; Lisa M. Thompson

In this study, we compared objective neuropsychological data using the Meyers Neuropsychological Battery (MNB; Meyers & Rohling, 2004) and self-report measures of emotional distress using the Symptom Checklist 90-Revised (SCL-90-R; Derogatis, 1994) with self-ratings of functional difficulties as measured by the Patient Competency Rating Scale (PCRS; Prigatano, 1986). The results showed a high correlation between the PCRS and scales on the SCL-90-R (r = .65), whereas the correlation with the overall test battery mean of the MNB was quite small (r = .18). Our results indicate that self-report of cognitive difficulties is more related to current emotional distress than to objective measures. Therefore, any diagnostic considerations that rely on self-report need to be tempered by considerations of current emotional status. This has implications for posttraumatic stress disorder and other diagnoses that rely on self-report as a source of diagnostic information.


Applied Neuropsychology | 2014

Emotional Distress Affects Attention and Concentration: The Difference Between Mountains and Valleys

John E. Meyers; Chad E. Grills; Margaret M. Zellinger; Ronald Mellado Miller

The current study tests the hypothesis that the “mountains and valleys pattern” (MVP) observed within the Attention and Concentration domain of the Meyers Neuropsychological Battery reflects the interference of emotional distress/anxiety with the patient's cognitive test performance. First, the MVP was objectively quantified using a formula that took into account both increased and decreased scores, rather than canceling them out through averaging. Using a total sample of 787 subjects, the Minnesota Multiphasic Personality Inventory-Second Edition Restructured Form (MMPI-2-RF) profile scores of cases with and without this pattern were then compared using an extensive database, followed by a smaller, matched-groups design. The presence of the MVP was related to MMPI-2-RF test performance. In particular, the pattern was related to emotional distress/anxiety scales but not to scales reflecting neurological or cognitive complaints. The degree of emotional distress experienced may affect attention and concentration test performance in a way that sometimes heightens focus and at other times disrupts focus. The MVP may be used to assess the effects of emotional distress on the consistency of an individual patient's attention and concentration test performance.
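The abstract does not give the MVP formula itself, so the sketch below only illustrates the general idea of quantifying scatter so that elevated and depressed scores both count rather than cancel. It uses the mean absolute deviation of subtest scores from their own mean, which is an assumption, not the published formula.

```python
# Sketch: a scatter index in which highs and lows both contribute (assumed, not the paper's MVP formula).
import numpy as np

def scatter_index(scores):
    """Mean absolute deviation; high values suggest a 'mountains and valleys' profile."""
    scores = np.asarray(scores, dtype=float)
    return np.mean(np.abs(scores - scores.mean()))

flat_profile = [10, 10, 11, 9, 10, 10]   # little scatter
mvp_profile  = [14, 6, 13, 5, 12, 7]     # alternating highs and lows
print(scatter_index(flat_profile), scatter_index(mvp_profile))
```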


International Conference on Engineering Psychology and Cognitive Ergonomics | 2009

Cognitive Workload as a Predictor of Student Pilot Performance

Nathan F. Tilton; Ronald Mellado Miller

This study examined the relationship between cognitive task load and performance in a civilian pilot training program. The NASA Task Load Index was found to be indicative of training success, with the most successful pilot trainees showing the highest cognitive task load and the least successful showing the lowest. The implications of this finding are discussed, as is its relation to possible advantages of military pilot trainees over their civilian counterparts.


International Conference on Engineering Psychology and Cognitive Ergonomics | 2007

Sequential analyses of error rate: a theoretical view

Ronald Mellado Miller; Richard J. Sauque

Though error rate is a ubiquitous measure of human performance, typically reported as an overall error rate or percentage, a number of predictive variables are lost when the errors made are summed or averaged. In this paper, we present a sequential analysis of error rate, in which the pattern of errors is analyzed. By examining such concepts as the number of transitions from incorrect responses (I) to correct responses (C), or IC transitions, as well as a concept called I-length, which refers to the number of consecutive incorrect responses that precede a correct response, valid ordinal predictions of persistence in the face of continuous failure can be made. This paper develops this theoretical construct in the hope that utilizing such data will improve the analysis and predictive quality of error rate data.
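The two sequential measures described above can be computed directly from a trial-by-trial record of correct (C) and incorrect (I) responses. The sketch below is an illustrative implementation under that coding assumption; the function names are not from the paper.

```python
# Sketch: count IC transitions and I-lengths from a C/I-coded response sequence.

def ic_transitions(responses):
    """Count transitions from an incorrect response to a correct response."""
    return sum(1 for prev, curr in zip(responses, responses[1:])
               if prev == "I" and curr == "C")

def i_lengths(responses):
    """Lengths of runs of incorrect responses that end with a correct response."""
    lengths, run = [], 0
    for r in responses:
        if r == "I":
            run += 1
        else:                      # a correct response ends the current I-run
            if run > 0:
                lengths.append(run)
            run = 0
    return lengths

trials = list("CCICIICICCCIIIC")
print(ic_transitions(trials))      # 4
print(i_lengths(trials))           # [1, 2, 1, 3]
```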


Learning and Motivation | 2004

Serial learning in rats: A test of three hypotheses

E. J. Capaldi; Ronald Mellado Miller


Applied Neuropsychology | 2014

Using Pattern Analysis Matching to Differentiate TBI and PTSD in a Military Sample

John E. Meyers; Ronald Mellado Miller; Alexa R. R. Tuita

Collaboration


Dive into Ronald Mellado Miller's collaborations.

Top Co-Authors

John E. Meyers, University of South Dakota
Zachary W. Rupp, Brigham Young University–Hawaii
Amy Junghyun Lee, Brigham Young University–Hawaii
Jason L. Murphy-Tafiti, Brigham Young University–Hawaii
Nathan A. Haws, Brigham Young University–Hawaii
Taylor A. Smart, Brigham Young University–Hawaii
Thomas D. Curtis, Brigham Young University–Hawaii