Publications


Featured research published by Cenk Undey.


IFAC Proceedings Volumes | 2009

Applied Advanced Process Analytics in Biopharmaceutical Manufacturing: Challenges and Prospects in Real-time Monitoring and Control

Cenk Undey; Sinem Ertunç; Thomas Mistretta; Manuj Pathak

Biopharmaceutical manufacturing processes are inherently complex due to their nonlinear bioprocess dynamics, variability in batch operations and manufacturing schedules, the raw materials involved, and automatic process control. A typical processed lot generates large amounts of data that need to be analyzed and interpreted for process troubleshooting and continuous improvement purposes, in addition to product release. Real-time multivariate batch process modeling, monitoring, and control approaches are elaborated with industrial examples from commercial manufacturing processes. Examples and opportunities in cell culture (e.g., bioreactor applications) and purification (e.g., large-scale chromatography) operations are summarized. The impact of Process Analytical Technologies (PAT), soft-sensor development, first-principles modeling applications, and commercial-scale examples are presented.
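To make the multivariate batch monitoring idea above concrete, here is a minimal sketch, not the authors' implementation, of how a set of reference batches could be used to compute Hotelling's T² and squared prediction error (SPE) statistics for a new batch via PCA. The data shapes, component count, and percentile-based control limits are illustrative assumptions.

```python
# Minimal sketch of PCA-based multivariate batch monitoring (illustrative only;
# not the authors' implementation). Assumes batch-wise unfolded data:
# rows = historical "good" batches, columns = variables x time points.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_ref = rng.normal(size=(30, 200))        # 30 reference batches, 200 unfolded measurements (synthetic)

scaler = StandardScaler().fit(X_ref)
X_scaled = scaler.transform(X_ref)

pca = PCA(n_components=3).fit(X_scaled)   # number of components is an assumption
T_ref = pca.transform(X_scaled)

# Control limits from the reference set (empirical percentiles stand in for the
# usual F- and chi-square-based limits).
t2_ref = np.sum((T_ref / T_ref.std(axis=0, ddof=1)) ** 2, axis=1)
spe_ref = np.sum((X_scaled - pca.inverse_transform(T_ref)) ** 2, axis=1)
t2_limit, spe_limit = np.percentile(t2_ref, 95), np.percentile(spe_ref, 95)

def monitor_batch(x_new):
    """Return (T2, SPE) for one new unfolded batch vector."""
    x = scaler.transform(x_new.reshape(1, -1))
    t = pca.transform(x)
    t2 = float(np.sum((t / T_ref.std(axis=0, ddof=1)) ** 2))
    spe = float(np.sum((x - pca.inverse_transform(t)) ** 2))
    return t2, spe

t2, spe = monitor_batch(rng.normal(size=200))
print(f"T2={t2:.1f} (limit {t2_limit:.1f}), SPE={spe:.1f} (limit {spe_limit:.1f})")
```

In practice, batch-wise unfolding of real trajectories, alignment of batches of unequal length, and statistically derived control limits would replace the synthetic data and percentiles used here.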


Archive | 2015

Automation and High-Throughput Technologies in Biopharmaceutical Drug Product Development with QbD Approaches

Vladimir I. Razinkov; Jerry Becker; Cenk Undey; Erwin Freund; Feroz Jameel

Quality by Design (QbD) of biopharmaceuticals assumes that the desired properties of the final drug product can be predicted with the help of various analytical and characterization methods during the development process. This should be translated into a sound, scientific understanding of drug behavior to design a stability space with clearly defined parameters that produce acceptable product quality attributes. In the QbD framework, the combination and interaction of multiple input variables that have been demonstrated to provide assurance of quality is called the design space. Approaches for identifying the design space in the biopharmaceutical industry generally rely on statistical design of experiments (DOEs), which can be resource intensive (Horvath et al., Mol Biotechnol 45(3):203–206, 2010; Ng and Rajagopalan, Quality by design for biopharmaceuticals: perspectives and case studies, Wiley, Hoboken, 2009). When the number of variables is large, performing a full factorial DOE may become prohibitively expensive; automated high-throughput methods have proven useful for exploring the many conditions defined by such a large number of variables. In this chapter, we describe the use of QbD for applications of automation and high-throughput technology in biopharmaceutical processes, from early molecular assessment to the late stages of commercial drug development. High-throughput techniques reinforce the QbD approach not only by providing a large pool of data points using minimal resources but also by producing comparable, structured results that allow thorough statistical analysis.
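As a small illustration of why a full factorial DOE becomes expensive as factors accumulate, and hence why automated high-throughput screening is attractive, the snippet below enumerates a hypothetical formulation design. The factor names and levels are assumptions for illustration, not taken from the chapter.

```python
# Illustrative sketch (not from the chapter): a full factorial DOE grows
# multiplicatively with the number of factors, which is why high-throughput
# automation becomes attractive when many formulation variables are screened.
from itertools import product

# Hypothetical formulation factors and levels (assumed for illustration).
factors = {
    "pH": [5.0, 5.5, 6.0],
    "buffer_mM": [10, 20],
    "surfactant_pct": [0.0, 0.01, 0.02],
    "protein_mg_ml": [50, 100, 150],
}

design = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(f"Full factorial runs: {len(design)}")   # 3 * 2 * 3 * 3 = 54 conditions
print(design[0])
```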


Archive | 2015

Application of QbD Elements in the Development and Scale-Up of a Commercial Filtration Process

Feroz Jameel; Cenk Undey; Paul M. Kovach; Jart Tanglertpaibul

Filtration is commonly employed in biopharmaceutical manufacturing as an alternative to terminal sterilization and is one of the critical unit operations. Understanding filter characteristics and the factors that contribute to their performance is central to the selection of the filter and the design of the process. Quality by Design (QbD) is a systematic approach to development that enables product and process understanding and process control, based on sound science and quality risk management. This chapter describes a mock case study illustrating how QbD elements could be applied in the selection of the optimal filter(s) and the development of a filtration process that is well understood, well controlled, safe, and robust.


Archive | 2018

High Performance Agent-Based Modeling to Simulate Mammalian Cell Culture Bioreactor

Robert Jackson; Elif Seyma Bayrak; Tony Wang; Myra Coufal; Cenk Undey; Ali Cinar

Agent-based modeling (ABM) is a novel modeling approach for addressing the complexity of systems that comprise heterogeneous interacting individuals. ABM is naturally hybrid: it can integrate quantitative and qualitative knowledge, deal with multiple levels of action, and it performs best when combined with conventional modeling approaches. In this study, a hybrid agent-based platform is developed using high-performance computing to simulate a mammalian cell culture bioreactor. This platform enables communication between the agent-based model and first-principles models to account for quantitative changes in nutrient and metabolite concentrations and their distribution in the bioreactor. The model can predict viable cell density and cell cycle distributions, along with important nutrients and metabolites such as glucose and lactate. Integrating this cell culture agent-based model with high-performance computing leverages parallel processing, allowing the ABM program to run faster, more efficiently, and with a higher capacity for the number of cells that can be modeled. The model was validated against bench-scale bioreactor experiments and showed good agreement with experimental data.
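A drastically simplified sketch of the agent-based idea follows: each agent is a cell that consumes glucose, produces lactate, and divides or dies, while bulk concentrations follow simple balances. This toy model, with assumed rate constants, only illustrates the ABM concept, not the hybrid high-performance platform described in the paper.

```python
# Toy agent-based sketch of a cell culture bioreactor (illustrative assumptions only;
# far simpler than the hybrid high-performance platform described above).
import random

random.seed(1)

GLUCOSE_UPTAKE = 1e-3      # mmol per cell-agent per step (assumed)
LACTATE_YIELD = 1.5        # lactate produced per unit glucose consumed (assumed)
DIVIDE_PROB, DEATH_PROB = 0.04, 0.01

class Cell:
    def __init__(self):
        self.alive = True

def step(cells, glucose, lactate):
    """Advance the culture one time step: uptake, division, and death per agent."""
    new_cells = []
    for cell in cells:
        if glucose <= 0:
            if random.random() < 5 * DEATH_PROB:   # starvation raises death rate
                cell.alive = False
            continue
        glucose -= GLUCOSE_UPTAKE
        lactate += LACTATE_YIELD * GLUCOSE_UPTAKE
        if random.random() < DIVIDE_PROB:
            new_cells.append(Cell())
        elif random.random() < DEATH_PROB:
            cell.alive = False
    cells = [c for c in cells if c.alive] + new_cells
    return cells, glucose, lactate

cells, glucose, lactate = [Cell() for _ in range(1000)], 30.0, 0.0
for t in range(100):
    cells, glucose, lactate = step(cells, glucose, lactate)
print(f"viable cells={len(cells)}, glucose={glucose:.2f}, lactate={lactate:.2f}")
```

The published platform couples agents to first-principles mass balances and runs on parallel hardware; the loop above stands in for both pieces only to show the structure of an agent update.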


Biotechnology and Bioengineering | 2018

Advances in industrial biopharmaceutical batch process monitoring: Machine-learning methods for small data problems

Aditya Tulsyan; Christopher Garvin; Cenk Undey

Biopharmaceutical manufacturing comprises multiple distinct processing steps that require effective and efficient monitoring of many variables simultaneously in real time. State-of-the-art real-time multivariate statistical batch process monitoring (BPM) platforms have been in use in recent years to ensure comprehensive monitoring is in place as a complementary tool for continued process verification to detect weak signals. This article addresses a longstanding, industry-wide problem in BPM, referred to as the "Low-N" problem, wherein a product has a limited production history. The current best industrial practice to address the Low-N problem is to switch from a multivariate to a univariate BPM until sufficient product history is available to build and deploy a multivariate BPM platform. Every batch run without a robust multivariate BPM platform poses a risk of not detecting potential weak signals developing in the process that might have an impact on process and product performance. In this article, we propose an approach to solve the Low-N problem by generating an arbitrarily large number of in silico batches through a combination of hardware exploitation and machine-learning methods. To the best of the authors' knowledge, this is the first article to provide a solution to the Low-N problem in biopharmaceutical manufacturing using machine-learning methods. Several industrial case studies from bulk drug substance manufacturing are presented to demonstrate the efficacy of the proposed approach for BPM under various Low-N scenarios.
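The article's machine-learning approach to generating in silico batches is not detailed here. As a much simpler illustration of the augmentation idea, the sketch below fits a regularized multivariate Gaussian to a handful of historical batch summary vectors and samples an arbitrarily large synthetic set; all numbers and shapes are assumptions.

```python
# Simplistic illustration of the "in silico batch" idea (not the authors' method):
# fit a distribution to a small set of historical batch summary vectors and sample
# an arbitrarily large number of synthetic batches to support a multivariate model.
import numpy as np

rng = np.random.default_rng(42)

# Assumed: only 5 historical batches, each summarized by 8 process variables ("Low-N").
X_hist = rng.normal(loc=[7.0, 37.0, 30.0, 5.0, 1.2, 0.4, 10.0, 95.0],
                    scale=[0.05, 0.2, 1.0, 0.3, 0.1, 0.05, 0.5, 1.0],
                    size=(5, 8))

mu = X_hist.mean(axis=0)
cov = np.cov(X_hist, rowvar=False) + 1e-6 * np.eye(8)   # regularize the small-sample covariance

# Generate 1000 in silico batches consistent with the observed mean/covariance structure.
X_synth = rng.multivariate_normal(mu, cov, size=1000)
print(X_synth.shape)          # (1000, 8)
```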


Biotechnology Progress | 2014

Multivariate statistical monitoring as applied to clean-in-place (CIP) and steam-in-place (SIP) operations in biopharmaceutical manufacturing

Kevin Roy; Cenk Undey; Thomas Mistretta; Gregory Naugle; Manbir S. Sodhi

Multivariate statistical process monitoring (MSPM) is becoming increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add delays, and corrective actions may require additional setup time. Moreover, this conventional approach does not take interactive effects of process variables into account, and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued-verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations.
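For a sense of how real-time MSPM could be applied to a CIP or SIP cycle, the following sketch (an assumption-laden illustration, not the published implementation) computes a per-time-point Hotelling's T² for a few correlated variables against reference-cycle statistics and flags excursions.

```python
# Minimal sketch (illustrative, not the published implementation) of real-time
# multivariate monitoring of a CIP cycle: at each time point a Hotelling T^2 is
# computed for the correlated variables (e.g., flow, conductivity, temperature)
# against the reference-cycle mean and covariance at that same time point.
import numpy as np

rng = np.random.default_rng(7)
n_ref, n_time, n_var = 25, 120, 3   # assumed: 25 reference cycles, 120 time points, 3 variables

ref = rng.normal(size=(n_ref, n_time, n_var))     # synthetic reference cycles
mean_t = ref.mean(axis=0)                         # per-time-point means
cov_t = np.array([np.cov(ref[:, t, :], rowvar=False) for t in range(n_time)])

def t2_at(t, x_t):
    """Hotelling T^2 of one new observation vector x_t at time index t."""
    diff = x_t - mean_t[t]
    return float(diff @ np.linalg.inv(cov_t[t] + 1e-9 * np.eye(n_var)) @ diff)

# Simulate a running cycle and flag time points exceeding a (placeholder) limit.
limit = 15.0
for t in range(n_time):
    x_t = rng.normal(size=n_var)
    score = t2_at(t, x_t)
    if score > limit:
        print(f"t={t}: T2={score:.1f} exceeds limit {limit}")
```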


Journal of Process Control | 2010

Applied advanced process analytics in biopharmaceutical manufacturing: Challenges and prospects in real-time monitoring and control

Cenk Undey; Sinem Ertunç; Thomas Mistretta; Bryan Looze


IFAC-PapersOnLine | 2015

Computational Modeling of Fed-Batch Cell Culture Bioreactor: Hybrid Agent-Based Approach

Elif Seyma Bayrak; Tony Wang; Ali Cinar; Cenk Undey


Quality by Design for Biopharmaceuticals: Principles and Case Studies | 2008

PAT Tools for Biologics: Considerations and Challenges

Michael Molony; Cenk Undey


IFAC-PapersOnLine | 2016

In Silico Cell Cycle Predictor for Mammalian Cell Culture Bioreactor Using Agent-Based Modeling Approach

Elif Seyma Bayrak; Tony Wang; Matt Jerums; Myra Coufal; Chetan T. Goudar; Ali Cinar; Cenk Undey
