Theingi Thway
Amgen
Publication
Featured research published by Theingi Thway.
Journal of Bone and Mineral Research | 2010
Xiaodong Li; Kelly Warmington; Qing-Tian Niu; Franklin J. Asuncion; Mauricio Barrero; Mario Grisanti; Denise Dwyer; Brian Stouch; Theingi Thway; Marina Stolina; Michael S. Ominsky; Paul J. Kostenuik; William Scott Simonet; Chris Paszty; Hua Zhu Ke
The purpose of this study was to evaluate the effects of sclerostin inhibition by treatment with a sclerostin antibody (Scl‐AbII) on bone formation, bone mass, and bone strength in an aged, gonad‐intact male rat model. Sixteen‐month‐old male Sprague‐Dawley rats were injected subcutaneously with vehicle or Scl‐AbII at 5 or 25 mg/kg twice per week for 5 weeks (9–10/group). In vivo dual‐energy X‐ray absorptiometry (DXA) analysis showed that there was a marked increase in areal bone mineral density of the lumbar vertebrae (L1 to L5) and long bones (femur and tibia) in both the 5 and 25 mg/kg Scl‐AbII‐treated groups compared with baseline or vehicle controls at 3 and 5 weeks after treatment. Ex vivo micro–computed tomographic (µCT) analysis demonstrated improved trabecular and cortical architecture at the fifth lumbar vertebral body (L5), femoral diaphysis (FD), and femoral neck (FN) in both Scl‐AbII dose groups compared with vehicle controls. The increased cortical and trabecular bone mass was associated with a significantly higher maximal load of L5, FD, and FN in the high‐dose group. Bone‐formation parameters (ie, mineralizing surface, mineral apposition rate, and bone‐formation rate) at the proximal tibial metaphysis and tibial shaft were markedly greater on trabecular, periosteal, and endocortical surfaces in both Scl‐AbII dose groups compared with controls. These results indicate that sclerostin inhibition by treatment with a sclerostin antibody increased bone formation, bone mass, and bone strength in aged male rats and, furthermore, suggest that pharmacologic inhibition of sclerostin may represent a promising anabolic therapy for low bone mass in aged men.
Journal of Pharmaceutical and Biomedical Analysis | 2009
Chad A. Ray; Vimal Patel; Judy Shih; Chris Macaraeg; Yuling Wu; Theingi Thway; Mark Ma; Jean W. Lee; Binodh DeSilva
Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOE) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge in implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were coating concentration, detection antibody concentration, and streptavidin-HRP concentration. The inter-plate factors included incubation times for each step. The objective was to maximize the log signal-to-blank ratio (logS/B) of the low standard. The optimal conditions were determined using JMP 7.0. To verify the validity of the predictions, the predicted logS/B was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20%, which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and a hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins.
The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method development time.
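The workflow described above — a face-centered central composite design over three coded factors, a quadratic response-surface model for logS/B, and a search for the optimum — can be sketched as follows. This is a minimal illustration, not the authors' JMP analysis; the factor effects and noise level are invented for demonstration.

```python
import itertools
import numpy as np

def face_centered_ccd():
    """Face-centered central composite design for 3 coded factors (-1, 0, +1)."""
    factorial = list(itertools.product([-1, 1], repeat=3))  # 8 corner runs
    axial = []
    for i in range(3):                                      # 6 face-center runs
        for a in (-1, 1):
            pt = [0, 0, 0]
            pt[i] = a
            axial.append(tuple(pt))
    center = [(0, 0, 0)]                                    # 1 center run
    return np.array(factorial + axial + center, dtype=float)

def quadratic_terms(X):
    """Expand coded factors into a full quadratic model matrix (10 terms)."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([
        np.ones(len(X)),            # intercept
        x1, x2, x3,                 # main effects
        x1 * x2, x1 * x3, x2 * x3,  # two-factor interactions
        x1**2, x2**2, x3**2,        # curvature
    ])

# Hypothetical logS/B response measured at each design point.
design = face_centered_ccd()
rng = np.random.default_rng(0)
true_beta = np.array([2.0, 0.3, 0.2, 0.1, 0.05, 0.0, 0.0, -0.15, -0.1, -0.05])
log_sb = quadratic_terms(design) @ true_beta + rng.normal(0, 0.02, len(design))

# Fit the response-surface model and search a fine grid for the maximum.
beta, *_ = np.linalg.lstsq(quadratic_terms(design), log_sb, rcond=None)
grid = np.array(list(itertools.product(np.linspace(-1, 1, 21), repeat=3)))
best = grid[np.argmax(quadratic_terms(grid) @ beta)]
```

With 15 runs the design supports all 10 quadratic terms, which is how a CCD identifies interactions and curvature with far fewer experiments than a full grid.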
Bioanalysis | 2015
Lakshmi Amaravadi; An Song; Heather Myler; Theingi Thway; Susan Kirshner; Viswanath Devanarayan; Yan G. Ni; Fabio Garofolo; Herbert Birnboeck; Susan Richards; Shalini Gupta; Linlin Luo; Clare Kingsley; Laura Salazar-Fontana; Stephanie Fraser; Boris Gorovits; John Allinson; Troy E. Barger; Shannon D Chilewski; Marianne Scheel Fjording; Sam Haidar; Rafiqul Islam; Birgit Jaitner; John Kamerud; Noriko Katori; Corinna Krinos-Fiorotti; David Lanham; Mark Ma; Jim McNally; Alyssa Morimoto
The 2015 9th Workshop on Recent Issues in Bioanalysis (9th WRIB) took place in Miami, Florida with participation of 600 professionals from pharmaceutical and biopharmaceutical companies, biotechnology companies, contract research organizations and regulatory agencies worldwide. WRIB was once again a 5-day, week-long event - A Full Immersion Bioanalytical Week - specifically designed to facilitate sharing, reviewing, discussing and agreeing on approaches to address the most current issues of interest in bioanalysis. The topics covered included both small and large molecules, and involved LCMS, hybrid LBA/LCMS and LBA approaches, with a focus on biomarkers and immunogenicity. This 2015 White Paper encompasses recommendations emerging from the extensive discussions held during the workshop, and aims to provide the bioanalytical community with key information and practical solutions on the topics and issues addressed, in an effort to enable advances in scientific excellence, improved quality and better regulatory compliance. Due to its length, the 2015 edition of this comprehensive White Paper has been divided into three parts. Part 3 discusses the recommendations for large molecule bioanalysis using LBA, biomarkers and immunogenicity. Part 1 (small molecule bioanalysis using LCMS) and Part 2 (hybrid LBA/LCMS and regulatory inputs from major global health authorities) have been published in volume 7, issues 22 and 23 of Bioanalysis, respectively.
Journal of Pharmaceutical and Biomedical Analysis | 2010
Theingi Thway; Chris Macaraeg; Dominador Calamba; Vimal Patel; Jennifer Tsoi; Mark Ma; Jean Lee; Binodh DeSilva
Development and validation of ligand binding methods that can measure therapeutic antibodies (TA) accurately and precisely are essential for bioanalysis that supports regulated pharmacokinetic (PK) and toxicokinetic (TK) studies. Non-bead (planar) electrochemiluminescence (ECL) methods are known to have high sensitivity and a wide assay range and are therefore potentially useful in supporting research studies in the early phases of development as well as for diagnostic fields and multiplex biomarker applications. Here, we demonstrate the applications for using ECL for regulated studies associated with protein drug development. Three planar ECL methods were developed, validated, and implemented to quantify three different TAs to support PK/TK studies. An automated liquid handler was used for the preparation of standards, quality controls, and validation samples to minimize assay variability. Robustness and ruggedness were tested during pre-study validations. During method optimization, the potential assay ranges were 3 log orders. To improve assay accuracy and precision, assay ranges in all 3 methods were truncated by at least 50% at the upper end before proceeding to pre-study validations. All 3 methods had assay ranges of about 2 logs during pre-study validations. The inter-assay accuracy and precision during pre-study validations were <6% and 8%, respectively. The total error of the assays was <15% for both in-study and pre-study validations in all 3 methods. With the incorporation of a robotic workstation we concluded that performance in all 3 planar ECL methods was extremely precise and accurate during pre-study and in-study validations, enabling >90% assay success during sample analyses. Although there were limitations in the assay ranges, the strength of this technology in assay accuracy, precision, and reproducibility can be beneficial for macromolecule analyses in support of PK and TK studies in a regulated environment.
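The accuracy, precision, and total-error figures quoted above follow the usual ligand-binding-assay conventions: %bias for inter-assay accuracy, %CV for precision, and their sum for total error. A minimal sketch of that bookkeeping, with the function name and QC replicate values invented for illustration:

```python
import statistics

def total_error(measured, nominal):
    """Accuracy (%bias), precision (%CV), and total error for one QC level.

    Total error is taken as |%bias| + %CV, following the common
    ligand-binding-assay convention of summing inaccuracy and imprecision.
    """
    mean = statistics.mean(measured)
    bias = 100 * (mean - nominal) / nominal       # inaccuracy vs nominal
    cv = 100 * statistics.stdev(measured) / mean  # inter-replicate imprecision
    return bias, cv, abs(bias) + cv

# Hypothetical QC replicates at a 100 ng/mL nominal level.
bias, cv, te = total_error([95, 102, 98, 105, 100], 100)
```

A method meeting the <15% total-error criterion described above would show `te < 15` at every QC level across pre-study and in-study runs.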
Journal of Pharmaceutical and Biomedical Analysis | 2009
Theingi Thway; Mark Ma; Jean Lee; Bethlyn Sloey; Steven Yu; Yow-Ming C. Wang; Binodh DeSilva; Tom Graves
A case study of experimental and statistical approaches for cross-validating and examining the equivalence of two ligand binding assay (LBA) methods that were employed in pharmacokinetic (PK) studies is presented. The impact of changes in methodology based on the intended use of the methods was assessed. The cross-validation process included an experimental plan, sample size selection, and statistical analysis with a predefined criterion of method equivalence. The two methods were deemed equivalent if the 90% confidence interval for the ratio of mean concentrations fell within the equivalence interval (0.80-1.25). Statistical consideration of method imprecision was used to choose the number of incurred samples (collected from study animals) and conformance samples (spiked controls) for the equivalence tests. The difference of log-transformed mean concentrations and its 90% confidence interval were computed for the two methods using analysis of variance. The mean concentration ratios of the two methods for the incurred and spiked conformance samples were 1.63 and 1.57, respectively. The 90% confidence limits were 1.55-1.72 for the incurred samples and 1.54-1.60 for the spiked conformance samples; therefore, the 90% confidence interval was not contained within the (0.80-1.25) equivalence interval. When the PK parameters of two studies using each of these two methods were compared, we determined that the therapeutic exposure, AUC(0-168) and C(max), from Study A/Method 1 was approximately twice that of Study B/Method 2. We concluded that the two methods were not statistically equivalent and that the magnitude of the difference was reflected in the PK parameters of the studies using each method. This paper demonstrates the need for method cross-validation whenever there is a switch in bioanalytical methods, illustrates statistical approaches for designing cross-validation experiments and assessing the results, and shows how to interpret the impact on PK data.
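The equivalence logic — a 90% confidence interval on the ratio of log-transformed mean concentrations, compared against the 0.80-1.25 interval — can be sketched as below. This is a simplified two-sample version using a fixed normal quantile rather than the paper's ANOVA; the function name and concentrations are illustrative.

```python
import math
import statistics

def equivalence_test(conc_a, conc_b, lower=0.80, upper=1.25):
    """Two-method equivalence on log-transformed concentrations.

    Returns the geometric mean ratio (method A / method B), its ~90%
    confidence interval, and whether the whole interval lies inside
    [lower, upper], i.e. whether the methods are deemed equivalent.
    """
    la = [math.log(c) for c in conc_a]
    lb = [math.log(c) for c in conc_b]
    diff = statistics.mean(la) - statistics.mean(lb)
    # Standard error of the difference in log-means (two-sample, unpaired).
    se = math.sqrt(statistics.variance(la) / len(la) +
                   statistics.variance(lb) / len(lb))
    t = 1.645  # normal quantile for a 90% two-sided interval; use t for small n
    lo, hi = math.exp(diff - t * se), math.exp(diff + t * se)
    return math.exp(diff), (lo, hi), lower <= lo and hi <= upper

# Hypothetical case where Method 1 reads ~1.6x Method 2, as in the study:
ratio, ci, equivalent = equivalence_test([120, 130, 125, 118, 128],
                                         [75, 80, 78, 74, 79])
```

With a ~1.6 ratio the confidence interval sits entirely above 1.25, so the methods fail equivalence, mirroring the outcome reported above.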
Bioanalysis | 2010
Theingi Thway; Chris Macaraeg; Dominador Calamba; Laura Brunner; Michael Eschenberg; Ramak Pourvasei; Liana Zhang; Mark Ma; Binodh DeSilva
BACKGROUND Incurred sample reanalysis (ISR) is the most recent in-study validation parameter that regulatory agencies have mandated to ensure reproducibility of bioanalytical methods supporting pharmacokinetic/toxicokinetic and clinical studies. The present analysis describes five representative case studies for macromolecule therapeutics. METHOD A single ISR acceptance criterion (agreement within 30% of the average or the original concentration) and a modified Bland-Altman (BA) approach were used to assess the accuracy and precision of ISR results. General concordance between the two criteria was examined using simulation studies. RESULTS All five methods met the ISR criterion. The results indicated that thorough method development and pre-study validation were prerequisites for a successful ISR. The overall agreement between the original and reanalyzed results as determined by BA was within 20%. Simulation studies indicated concordance between the ISR criterion and BA in 95% of cases. Dilution factors had no significant impact on the ISR, even for C(max) samples where 1:100 or higher dilutions were used. CONCLUSION The current ISR acceptance criterion for macromolecules is scientifically and statistically meaningful for methods with a total error of 25% or less.
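The acceptance criterion above — each repeat result within 30% of the average of the original and reanalyzed values, with at least two-thirds of samples passing — reduces to a short check. A minimal sketch; the function name and sample concentrations are invented for illustration.

```python
def isr_pass(original, reanalyzed, limit=0.30, fraction=2/3):
    """Incurred-sample-reanalysis check for large-molecule methods.

    A sample passes when |original - repeat| is within `limit` (30%)
    of the mean of the two values; the run passes when at least
    `fraction` (two-thirds) of the samples pass.
    """
    flags = []
    for o, r in zip(original, reanalyzed):
        mean = (o + r) / 2
        flags.append(abs(o - r) / mean <= limit)
    return sum(flags) / len(flags) >= fraction, flags

# Hypothetical original vs repeat concentrations (ng/mL):
run_ok, flags = isr_pass([100, 200, 50, 400], [110, 260, 52, 390])
```

Note that the denominator is the mean of the two results, so a single wildly discrepant pair fails that sample but, by design, does not by itself fail the run.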
Bioanalysis | 2014
Theingi Thway; Hossein Salimi-Moosavi
BACKGROUND The accuracy of highly sensitive biomarker methods is often confounded by the presence of various circulating endogenous factors in samples, causing matrix effects. METHOD This article outlines two different biomarker methods: a hepcidin enzyme-linked immunosorbent assay (ELISA), for which an orthogonal comparison of the ELISA to liquid chromatography-tandem mass spectrometry was performed to examine the potential matrix effect, and a sclerostin ELISA used to evaluate the matrix effect. A fit-for-purpose assessment of matrix effect/selectivity was also performed for each method. RESULTS Although the potential interfering effects of the endogenous hepcidin variants (prohepcidin and clipped forms) showed that these proteins had >30% immunoreactivity in the ELISA, the hepcidin ELISA preferentially measures full-length hepcidin when the molar ratio of full-length to variants remains >1. The correlation of ELISA to liquid chromatography-tandem mass spectrometry results showed full-length hepcidin to be the major form in diseased populations. CONCLUSION This article demonstrates the utility of a fit-for-purpose approach to assess the validity of biomarker methods in evaluating the interconnected parameters of matrix effects, sensitivity and selectivity.
AAPS Journal | 2011
Theingi Thway; Michael Eschenberg; Dominador Calamba; Chris Macaraeg; Mark Ma; Binodh DeSilva
Incurred sample reanalysis (ISR) is recommended by regulatory agencies to demonstrate reproducibility of validated methods and provide confidence that methods used in pharmacokinetic and toxicokinetic assessments give reproducible results. For macromolecules to pass ISR, regulatory recommendations require that two thirds of ISR samples be within 30% of the average of original and reanalyzed values. A modified Bland–Altman (mBA) analysis was used to evaluate whether total error (TE), the sum of precision and accuracy, was predictive of a method’s passing ISR and to identify potential contributing parameters for ISR success. Simulated studies determined minimum precision requirements for methods to have successful ISR and evaluated the relationship between precision and the probability of a method’s passing ISR acceptance criteria. The present analysis evaluated ISRs conducted for 37 studies involving ligand-binding assays (LBAs), with TEs ranging from 15% to 30%. An mBA approach was used to assess accuracy and precision of ISR, each with a threshold of 30%. All ISR studies met current regulatory criteria; using mBA, all studies met the accuracy threshold of 30% or less, but two studies (5%) failed to meet the 30% precision threshold. Simulation results showed that when an LBA has ≤15% imprecision, the ISR criteria for both the regulatory recommendation and mBA would be met in 99.9% of studies. Approximately 71% of samples are expected to be within 1.5 times the method imprecision. Therefore, precision appears to be a critical parameter in LBA reproducibility and may also be useful in identifying methods that have difficulty passing ISR.
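The link between assay imprecision and the probability of passing ISR can be explored with a small Monte Carlo simulation in the spirit of the one described above. The lognormal error model, sample count, and concentration range here are assumptions for illustration, not the study's actual design.

```python
import numpy as np

def simulate_isr_pass_rate(cv, n_samples=30, n_studies=2000, limit=0.30,
                           fraction=2/3, seed=1):
    """Monte Carlo estimate of the probability that an ISR run passes.

    Each measurement is the true concentration times a lognormal
    measurement error with coefficient of variation `cv`; a run passes
    when at least two-thirds of samples agree within 30% of their mean.
    """
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(np.log(1 + cv**2))  # lognormal sigma matching the CV
    passes = 0
    for _ in range(n_studies):
        true = rng.uniform(10, 1000, n_samples)
        orig = true * rng.lognormal(0, sigma, n_samples)
        rean = true * rng.lognormal(0, sigma, n_samples)
        within = np.abs(orig - rean) / ((orig + rean) / 2) <= limit
        passes += within.mean() >= fraction
    return passes / n_studies
```

Under this model an assay with ~15% imprecision passes almost every run, while pass rates collapse as imprecision approaches 30%, consistent with the conclusion that precision is the critical parameter.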
AAPS Journal | 2015
Jeffrey J. Talbot; Dominador Calamba; Melody Y. Pai; Mark Ma; Theingi Thway
Decisions about efficacy and safety of therapeutic proteins (TP) designed to target soluble ligands are made in part by their ex vivo quantification. Ligand binding assays (LBAs) are critical tools in measuring serum TP levels in pharmacokinetic, toxicokinetic, and pharmacodynamic studies. This study evaluated the impact of reagent antibody affinities, assay incubation times, and analytical platform on free or total TP quantitation. An ELISA-based LBA that measures a monoclonal anti-sclerostin antibody (TPx) was used as the model system. To determine whether the method measures free or total TPx, the effects of Kon, Koff, and KD were determined. An 8:1 molar ratio of sclerostin (Scl) to TPx, compared with the 1:1 molar ratio produced by rabbit polyclonal antibodies to TPx, was required to achieve IC50, a measure of TPx interference effectiveness, making it unclear whether the ELISA truly measured free TPx. Kinetic analysis revealed that Scl had a rapid dissociation rate (Koff) from TPx and that the capture and detection antibodies had significantly higher binding affinities (KD) for TPx. These kinetic limitations, along with long ELISA incubation times, led to the higher molar ratio (8:1) required for achieving 50% inhibition of TPx. However, a microfluidic platform with the same reagent pairs required shorter incubations to achieve a lower Scl IC50 molar ratio (1:1). The findings from this study provide the bioanalytical community with a deeper understanding of how reagent and platform selection for LBAs can affect what a particular method measures, either free or total TP concentrations.
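The free-versus-total distinction rests on 1:1 binding equilibrium: given total drug, total ligand, and KD (= Koff/Kon), the free drug concentration follows from the standard quadratic mass-balance equation. A generic sketch of that calculation (not the paper's kinetic analysis; units and values are illustrative):

```python
import math

def free_drug(total_drug, total_ligand, kd):
    """Free therapeutic-protein concentration at 1:1 binding equilibrium.

    Solves C**2 - (A + B + KD)*C + A*B = 0 for the drug-ligand complex C,
    where A = total drug and B = total ligand, then returns A - C.
    All concentrations and KD must be in the same units (e.g. nM).
    """
    s = total_drug + total_ligand + kd
    complex_ = (s - math.sqrt(s * s - 4 * total_drug * total_ligand)) / 2
    return total_drug - complex_

# Hypothetical example: 10 nM drug, 5 nM ligand.
tight = free_drug(10, 5, 0.01)  # tight binder: ligand nearly saturated
weak = free_drug(10, 5, 100)    # weak binder: little complex forms
```

A tight binder leaves roughly the stoichiometric excess free, while a weak binder (or a rapid Koff during a long incubation) leaves most drug unbound, which is why assay conditions can shift a method between reading free and total TP.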
AAPS Journal | 2012
Sheldon S. Leung; Joel Usansky; Robert Lynde; Theingi Thway; Robert Hendricks; David Rusnak
With the ever-increasing speed with which data are generated and the continual implementation of new instrumentation, maximizing the efficiency of data management for ligand-binding assays (LBAs) remains a growing challenge. At the American Association of Pharmaceutical Scientists Workshop on the 21st Century Bioanalytical Laboratory: Maximizing Quality and Efficiency through Innovation, the attendees recognized the need to address the issue of data management. The eSolutions team, made up of end users and vendors, was formed with this challenge in mind, under the 21st Century Laboratory initiative. The eSolutions team has identified the need for a fully automated data interchange process as the first step to optimizing data management. This will require at least two major advances. First, for instruments capable of capturing raw data and metadata, a common open-source data standard is critical. Software independent of instruments will need to be compatible with the common data standard. Second, vendors must ensure that instruments and data analysis systems that process LBA data are enabled for direct, bidirectional, file-less transfer of data between laboratory information management systems (LIMS) and instrument systems. In many laboratories conducting assays, LBA data systems and laboratory instruments are often islands of information, separated by an ocean of non-communication. Multiple applications within the laboratories generate reams of data that are stored in separate, non-connected silos of files. Data translation is required to share the data stored within these files with a LIMS or other analytical software. At best, this can be resolved using information technology (IT) resources. In the worst case, the translation is performed manually, leading to tedious, error-prone tasks that require additional quality control (QC) oversight.
In order to meet US FDA 21 CFR Part 11 (Electronic Records; Electronic Signatures, ERES) regulations, data transfer through a file-based process requires assurance that the data imported into the LIMS are the same data exported from the data acquisition instrument. Therefore, one way to increase laboratory productivity would be through the use of a file-less automated process for data interchange tasks. Automation offers the performance of tasks in a secure, repeatable, and consistent manner without human intervention. Additional benefits of implementing an automated data interchange include: facilitation of results and metadata transfer from source laboratory instruments (e.g., a plate reader) to a data processing system (e.g., a LIMS); the ability to automate laboratory business processes, where format-consistent data can be programmatically parsed and read by computer software; consistency of data formats between versions of the same software; and data comparability between software applications, even from different vendors, avoiding the business risk of having to maintain legacy systems in order to retain the ability to read stored proprietary raw data in the future. While many LBA laboratories clearly see a need for an automated data interchange process, these laboratories must raise awareness of this requirement by supporting the eSolutions team in establishing the process, as well as supporting those vendors who choose to participate in the effort. Active vendor participation is absolutely necessary for success. There is a compelling reason for instrument and software vendors to participate in the creation of an automated data interchange process and to support its adoption: innovative vendors of LIMS and analytical software will benefit from not having to keep up with the ever-changing landscape of data formats.
The automated data interchange process would also allow new instrument vendors to comply more easily with the data management requirements of LBA laboratories, thereby increasing the likelihood that innovative technologies will be adopted. For their part, LBA laboratories should be willing to accept an increase in application costs to account for costs incurred during adoption of the process, and should patronize vendors that adopt the automated data interchange process. In this article, we provide perspectives on the benefits to the LBA community (vendors and users) and on what is required to establish an automated data interchange process for LBA data.