Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where George Stephanopoulos is active.

Publication


Featured research published by George Stephanopoulos.


Computers & Chemical Engineering | 1990

Representation of process trends—Part I. A formal representation framework

J.T.-Y. Cheung; George Stephanopoulos

Abstract A formal methodology capable of transforming time-records of process variables into meaningful and explicit descriptions of process trends in real time is presented in this paper. The representation is based on the formal definition of “temporal episodes,” which provide descriptions of trends by constructing “histories of episodes” that make explicit all the important “domain landmark values” specified by the user as well as the “geometric landmark values” generic to trends. Model inaccuracies are confined by establishing specific measures of the information lost in the descriptions and keeping them below those of the process models. Despite the simplicity of the representation primitives, it is shown that the representation can provide complete, correct, robust and very compact models of process trend histories. In contrast to purely qualitative descriptions, the new representation contains sufficient quantitative information to allow for powerful inferences about the trends of unmeasured variables.
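
The episode-based idea can be suggested with a small sketch. This is a simplified illustration, not the paper's formalism: it segments a sampled signal into maximal intervals over which the signs of the first and second differences are constant, which loosely corresponds to the temporal-episode primitives; the tolerance and the test signal are arbitrary assumptions.

```python
import numpy as np

def qualitative_sign(x, tol=1e-6):
    """Map values to {-1, 0, +1}, treating |x| < tol as zero (tol is an assumed tolerance)."""
    s = np.zeros_like(x)
    s[x > tol] = 1.0
    s[x < -tol] = -1.0
    return s

def segment_into_episodes(t, y, tol=1e-6):
    """Split a sampled signal into episodes of constant qualitative state.

    An episode here is a maximal interval over which the signs of the first
    and second differences of y do not change; this is a simplified stand-in
    for the temporal-episode primitives described in the paper.
    Returns (t_start, t_end, sign_dy, sign_d2y) tuples.
    """
    dy = qualitative_sign(np.gradient(y, t), tol)
    d2y = qualitative_sign(np.gradient(np.gradient(y, t), t), tol)
    episodes, start = [], 0
    for i in range(1, len(y)):
        if dy[i] != dy[start] or d2y[i] != d2y[start]:
            episodes.append((t[start], t[i], dy[start], d2y[start]))
            start = i
    episodes.append((t[start], t[-1], dy[start], d2y[start]))
    return episodes

# Example: a linear ramp followed by a saturating exponential approach
t = np.linspace(0.0, 10.0, 200)
y = np.where(t < 5.0, t, 5.0 + 2.0 * (1.0 - np.exp(-(t - 5.0))))
for episode in segment_into_episodes(t, y):
    print(episode)
```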


Computers & Chemical Engineering | 1996

Intelligent systems in process engineering: a review

George Stephanopoulos; Chonghun Han

Abstract The purpose of this review is three-fold. First, sketch the directions that research and industrial applications of “intelligent systems” have taken in several areas of process engineering. Second, identify the emerging trends in each area, as well as the common threads that cut across several domains of inquiry. Third, stipulate research and development themes of significant importance for the future evolution of “intelligent systems” in process engineering. The paper covers the following seven areas: diagnosis of process operations; monitoring and analysis of process trends; intelligent control; heuristics and logic in planning and scheduling of process operations; modeling languages, simulation, and reasoning; intelligence in scientific computing; and knowledge-based engineering design. Certain trends seem to be common and will (in all likelihood) characterize the nature of the future deployment of “intelligent systems”. These trends are: (1) specialization to narrowly defined classes of problems; (2) integration of multiple knowledge representations, so that all of the relevant knowledge is captured and utilized; (3) integration of processing methodologies, which tends to blur the past sharp distinctions between AI-based techniques and those from operations research, systems and control theory, and probability and statistics; and (4) a rapidly expanding range of industrial applications with a significant increase in the scope of engineering tasks and the size of problems.


Computers & Chemical Engineering | 1990

MODEL.LA. A modeling language for process engineering—I. The formal framework

George Stephanopoulos; G. Henning; H. Leone

Abstract A modeling language (MODEL.LA.) has been constructed for the interactive or automatic definition of models of processing systems. It is based on six modeling elements and eleven semantic relationships obeying basic axioms of transitivity, monotonicity, commutativity and merging. Its syntax can be described by an extended BNF (Backus-Naur Form). The structure of process models is depicted by specific digraphs, which are symbolically constructed by algorithmic procedures driven by the context of the modeling activity. MODEL.LA. can generate models of processing systems: (a) at various levels of abstraction; (b) capturing qualitative, semiquantitative and quantitative knowledge; and (c) with complete documentation of the modeling context (assumptions, simplifications, process engineering task). Its object-oriented modularity makes it extensible and easily maintainable. Although a large part of MODEL.LA. is domain-independent, its vocabulary and syntax are specific to process engineering activities such as process development, design, control and operations.

Recognizing the limitations of previous procedural attempts, the language is based on an object-oriented, declarative approach. It has been designed to be capable of: (i) expressing all points of interest assumed to be needed in modeling processing systems; (ii) representing processing systems at any level of detail; (iii) automatically generating the set of basic mathematical relationships that describe the model components; and (iv) offering explicit documentation of all the assumptions that give rise to a particular model. It can be viewed as a very high-level, special-purpose language that moves the user several levels away from the underlying programming language (e.g. LISP, as in this case, Pascal or C). MODEL.LA. complies with all the requirements that were proposed for its design. Its basic strengths are its modularity and its inherent capability of controlling complexity by breaking complex systems down into smaller, less complex pieces. In fact, the specification of a model class involves the subsequent specification and characterization of all of its components, an idea applied recursively throughout the definition process and very similar to what we usually do when we describe processing systems in natural language. Another important feature of the language is its extensibility. At the simplest level, the ability to characterize a processing system can be extended by incorporating a richer terminal vocabulary. Its modularity also makes it possible to incorporate new blocks in the definition of a class, blocks that may describe aspects not presently considered. If the incorporation of new modeling points of view requires the definition of new classes of modeling elements, this can be done in a straightforward manner.
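
The hierarchical, declarative flavor of such a modeling language can be hinted at with a small object-oriented sketch. This is not MODEL.LA.'s actual vocabulary or syntax; the class, the "is-composed-of" relationship and the flash-drum example are invented purely to illustrate recursive decomposition of a processing system into modeling elements linked by named semantic relationships.

```python
from dataclasses import dataclass, field

@dataclass
class ModelingElement:
    """A node in a declarative process model (illustrative only, not MODEL.LA. syntax)."""
    name: str
    relations: dict = field(default_factory=dict)  # relationship name -> list of related elements

    def relate(self, relationship, other):
        self.relations.setdefault(relationship, []).append(other)

def print_decomposition(element, relationship="is-composed-of", depth=0):
    """Walk one semantic relationship recursively and print the resulting digraph as a tree."""
    print("  " * depth + element.name)
    for child in element.relations.get(relationship, []):
        print_decomposition(child, relationship, depth + 1)

# A tiny hierarchical model: a section decomposed into units, units into ports
section = ModelingElement("flash-separation-section")
flash = ModelingElement("flash-drum")
valve = ModelingElement("feed-valve")
section.relate("is-composed-of", flash)
section.relate("is-composed-of", valve)
flash.relate("is-composed-of", ModelingElement("vapor-outlet-port"))
flash.relate("is-composed-of", ModelingElement("liquid-outlet-port"))

print_decomposition(section)
```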


Biotechnology and Bioengineering | 1999

Apoptosis in batch cultures of Chinese Hamster Ovary cells

J. Goswami; Anthony J. Sinskey; H. Steller; George Stephanopoulos; Daniel I. C. Wang

One of the main problems in the culture of Chinese Hamster Ovary (CHO) cells continues to be the inability to maintain the viability of the cultures over an extended period of time. The rapid decline in viability at the end of the culture is exacerbated by the absence of serum. In trying to reduce the extent of death in these cultures, we first tried to determine the mode of death. We found that more than 80% of the cells in a standard serum-free batch culture of CHO cells in suspension died via apoptosis, as evidenced by condensed chromatin and the appearance of a characteristic DNA ladder. Furthermore, when protein synthesis was inhibited using cycloheximide, the cells underwent rapid apoptosis, indicating that death proteins were present in greater abundance than survival proteins in our CHO cells. Cell lysate from CHO cells showed evidence of cysteine protease (caspase) activity. Caspases of the Interleukin-1-beta-Converting Enzyme (ICE) family, e.g., CPP32 and Mch-1, have been implicated in the apoptotic process. Surprisingly, a caspase peptide inhibitor, N-benzyloxycarbonyl-Val-Ala-Asp-fluoro-methyl-ketone (z-VAD.fmk), was unable to substantially extend the life of a serum-free batch culture of CHO cells. In addition, z-VAD.fmk was only marginally able to extend viability in response to withdrawal of the growth and survival factors insulin and transferrin. In both instances, z-VAD.fmk was able to prevent cleavage of caspase substrates but not to protect cells from death. However, we found that bcl-2 expression was able to significantly extend viabilities in CHO batch culture. Bcl-2 expression also substantially extended the viability of cultures in response to insulin and transferrin withdrawal. These results provide interesting insights into the pathways of death in CHO cells.


Bioinformatics | 2002

Determination of minimum sample size and discriminatory expression patterns in microarray data

Daehee Hwang; William A. Schmitt; George Stephanopoulos; Gregory Stephanopoulos

MOTIVATION: Transcriptional profiling using microarrays can reveal important information about cellular and tissue expression phenotypes, but these measurements are costly and time consuming. Additionally, tissue sample availability poses further constraints on the number of arrays that can be analyzed in connection with a particular disease or state of interest. It is therefore important to provide a method for the determination of the minimum number of microarrays required to separate, with statistical reliability, distinct disease states or other physiological differences.

RESULTS: Power analysis was applied to estimate the minimum sample size required for two-class and multi-class discrimination. The power analysis algorithm calculates the appropriate sample size for discrimination of phenotypic subtypes in a reduced dimensional space obtained by Fisher discriminant analysis (FDA). This approach was tested by applying the algorithm to existing data sets for estimation of the minimum sample size required for drawing certain conclusions on multi-class distinction with statistical reliability. It was confirmed that when the minimum number of samples estimated from power analysis is used, group means in the FDA discrimination space are statistically different.

CONTACT: [email protected]
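
As a rough illustration of the sample-size question, the sketch below computes the minimum number of samples per class for a two-class comparison with a two-sample t-test using statsmodels; the effect size, significance level and power are assumed values, and this does not reproduce the paper's multi-class procedure in the FDA-reduced space.

```python
# Minimal two-class sketch of sample-size estimation by power analysis.
# The assumed effect size stands in for the separation between class means
# that the paper estimates in the FDA discrimination space.
from statsmodels.stats.power import TTestIndPower

effect_size = 1.0   # assumed difference between class means, in units of pooled SD
alpha = 0.05        # significance level
power = 0.9         # desired probability of detecting the difference

analysis = TTestIndPower()
n_per_class = analysis.solve_power(effect_size=effect_size, alpha=alpha,
                                   power=power, ratio=1.0, alternative="two-sided")
print(f"minimum samples per class: {n_per_class:.1f}")
```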


Computers & Chemical Engineering | 1994

Representation of process trends. III: Multiscale extraction of trends from process data

Bhavik R. Bakshi; George Stephanopoulos

Abstract This paper presents a formal methodology for the analysis of process signals and the automatic extraction of temporal features contained in a record of measured data. It is based on the multiscale analysis of the measured signals using wavelets, which allows the extraction of significant temporal features that are localized in the frequency domain, from segments of the record of measured data (i.e. localized in the time domain). The paper provides a concise framework for the multiscale extraction and description of temporal process trends. The resulting algorithms are analytically sound, computationally very efficient and can be easily integrated with a large variety of methods for the interpretation of process trends and the automatic learning of relationships between causes and symptoms in a dynamic environment. A series of examples illustrate the characteristics of the approach and outline its use in various settings for the solution of industrial problems.
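
A minimal sketch of the multiscale idea, using PyWavelets on a synthetic signal: decompose with a discrete wavelet transform, soft-threshold the detail coefficients, and reconstruct a coarse trend that retains localized features such as a step. The wavelet family, decomposition depth, threshold and test signal are all illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
import pywt

# Synthetic noisy process signal: a slow ramp plus a step, corrupted by noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512)
signal = 2.0 * t + np.where(t > 0.6, 1.5, 0.0) + 0.2 * rng.standard_normal(t.size)

# Multiscale decomposition; 'db4' and 5 levels are arbitrary illustrative choices
coeffs = pywt.wavedec(signal, "db4", level=5)

# Universal threshold using the (here known) noise standard deviation of 0.2;
# soft-thresholding the detail coefficients removes fine-scale noise while
# keeping localized features such as the step
threshold = 0.2 * np.sqrt(2.0 * np.log(signal.size))
denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]

trend = pywt.waverec(denoised, "db4")
print("residual std:", np.std(trend[: signal.size] - signal))
```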


Computers & Chemical Engineering | 1987

Design-kit: An object-oriented environment for process engineering

George Stephanopoulos; J. Johnston; T. Kriticos; R. Lakshmanan; Michael L. Mavrovouniotis; C. Siletti

Abstract This paper outlines the structure and implementational features of the DESIGN-KIT, a software support environment developed to aid process engineering activities such as: synthesis of process flowsheets, configuration of control loops for complete plants, planning and scheduling of plant-wide operations and operational analysis. Based on object-oriented and data-driven programming styles, the paper discusses how the DESIGN-KIT is constructed to provide a rich repertory of declarative and procedural knowledge for the development of analytic- or design-oriented knowledge-based expert systems. A series of illustrations describe the construction of knowledge bases, graphic interface support, equation-oriented simulation and design, order-of-magnitude analysis, reasoning strategies and other facilities of the DESIGN-KIT.
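
To hint at the data-driven programming style such an environment builds on, here is a minimal forward-chaining sketch; the facts and rules below are invented for illustration and bear no relation to DESIGN-KIT's actual knowledge representation.

```python
def forward_chain(facts, rules):
    """Fire rules whose conditions are satisfied until no new facts appear (data-driven inference)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and set(conditions) <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Invented operational rules, written as (conditions, conclusion) pairs
rules = [
    ({"reflux-ratio-low", "purity-off-spec"}, "increase-reflux"),
    ({"increase-reflux", "reboiler-at-max-duty"}, "reduce-feed-rate"),
]
facts = forward_chain({"reflux-ratio-low", "purity-off-spec", "reboiler-at-max-duty"}, rules)
print(facts)
```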


Computers & Chemical Engineering | 1994

Representation of process trends—IV. Induction of real-time patterns from operating data for diagnosis and supervisory control

Bhavik R. Bakshi; George Stephanopoulos

A methodology for pattern-based supervisory control and fault diagnosis is presented, based on the multiscale extraction of trends from process data described in Part III of this series (Bakshi and Stephanopoulos, Computers Chem. Engng 17, 1993). An explicit mapping is learned between the features extracted at multiple scales and the corresponding process conditions, using the technique of induction by decision trees. Simple rules may be derived from the induced decision tree to relate the relevant qualitative or quantitative features in the measured process data to process conditions. These rules are often physically interpretable and provide physical insight into the process. Industrial case studies from fine chemicals manufacturing, reactive crystallization and fed-batch fermentation are used to illustrate the characteristics of the pattern-based learning methodology and its application to process supervision and diagnosis.
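
A sketch of the induction step with scikit-learn: fit a decision tree that maps feature vectors to condition labels and print it as rules. The "multiscale features", labels and data below are synthetic placeholders; in the paper the features come from the wavelet-based trend extraction of Part III.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic "multiscale features" (e.g. coarse-scale slope, fine-scale energy)
# labelled with invented process conditions
rng = np.random.default_rng(1)
n = 200
coarse_slope = rng.normal(0.0, 1.0, n)
fine_energy = rng.normal(1.0, 0.3, n)
labels = np.where(coarse_slope > 0.5, "ramp-up",
                  np.where(fine_energy > 1.3, "noisy-sensor", "normal"))

X = np.column_stack([coarse_slope, fine_energy])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, labels)

# The induced tree reads as simple, physically interpretable rules
print(export_text(tree, feature_names=["coarse_slope", "fine_energy"]))
```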


Transportation Research | 1977

Oversaturated signal systems with queue length constraints—I: Single intersection

Panos G. Michalopoulos; George Stephanopoulos

Abstract Optimal control of oversaturated signals is one of the major concerns of traffic engineering practice today. Although some control strategies have been proposed in the past, due to their limitations they have not been used in practice. One of their major deficiencies is that the effects of queue length constraints on the signal operation have been inadequately investigated. Thus, the optimal control strategy with state variable constraints is studied in this paper. The optimal policy minimizing intersection delay subject to queue length constraints is to switch the signals as soon as the queues are at their limits so that the input and output flows are balanced. Cycle length can remain constant if only one queue length constraint is imposed but it must be free to vary if constraints are imposed on more than one queue. The conditions under which the problem is impossible are stated. A numerical solution method is proposed for determining the optimal control for the case in which the intersection demand is predictable for the entire control period.
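
The stated switching policy can be illustrated with a small discrete-time simulation of a single two-phase intersection: give green to one approach and switch as soon as the queue on the red approach reaches its limit. The arrival rates, saturation flow, queue limits and horizon are invented numbers, and the sketch ignores lost time at switches.

```python
# Minimal discrete-time sketch of the switching policy for one two-phase intersection.
arrivals = (0.5, 0.4)      # vehicles per second on approaches 1 and 2 (assumed)
saturation = 1.0           # discharge rate of the approach with green (assumed)
q_max = (40.0, 30.0)       # queue length constraints, in vehicles (assumed)
dt, horizon = 1.0, 600.0   # time step and control period, in seconds

queues = [20.0, 20.0]      # initial queues (assumed)
green = 0                  # approach currently receiving green
t, switches = 0.0, []
while t < horizon:
    for i in (0, 1):
        out = saturation if i == green else 0.0
        queues[i] = max(0.0, queues[i] + (arrivals[i] - out) * dt)
    red = 1 - green
    if queues[red] >= q_max[red]:   # switch as soon as the red queue hits its limit
        green, switches = red, switches + [t]
    t += dt

print("switch times (s):", switches[:5], "..." if len(switches) > 5 else "")
print("final queues:", [round(q, 1) for q in queues])
```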


Chemical Engineering Science | 1981

Rectification of process measurement data in the presence of gross errors

Jose A. Romagnoli; George Stephanopoulos

Abstract A systematic strategy is developed for locating the source of gross, biased measurement errors in a chemical process and rectifying them. The proposed strategy proceeds in three levels: (a) a structural analysis of the balance equations identifies subsets of balances with measurements that are suspected to possess gross errors; (b) a sequential analysis of the balance equations with suspect measurements further reduces the size of the problem, using statistical criteria; and (c) finally, a sequential analysis of the suspect measurements appearing in the reduced set of balances leads to the identification of the source of the gross errors. The proposed strategy (i) reduces the size of the data reconciliation problem significantly, even for large-scale chemical processes, (ii) is computationally simple and (iii) conforms with the general process of variable monitoring in a chemical plant. Numerical examples are presented to clarify the elements of the procedure and demonstrate their value and effectiveness in dealing with realistic situations.
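
The reconciliation step itself (though not the paper's structural and sequential strategy for locating the biased sensor) can be sketched with the classical constrained least-squares projection and a global chi-square test; the three-stream splitter, noise levels and injected bias below are invented for illustration.

```python
import numpy as np
from scipy import stats

# Least-squares reconciliation of flow measurements subject to linear mass
# balances A @ x = 0, followed by a global chi-square test. The flowsheet is
# an invented splitter: stream 1 = stream 2 + stream 3, with a bias on stream 3.
A = np.array([[1.0, -1.0, -1.0]])          # balance: x1 - x2 - x3 = 0
true_flows = np.array([100.0, 60.0, 40.0])
sigma = np.array([1.0, 1.0, 1.0])
V = np.diag(sigma**2)                       # measurement covariance

rng = np.random.default_rng(2)
y = true_flows + rng.normal(0.0, sigma)
y[2] += 8.0                                 # gross (biased) error on stream 3

# Reconciled estimates: project the measurements onto the balance constraints
W = A @ V @ A.T
x_hat = y - V @ A.T @ np.linalg.solve(W, A @ y)

# Global test: with random noise only, this statistic is chi-square distributed
# with as many degrees of freedom as there are balance equations
r = A @ y
chi2_stat = float(r @ np.linalg.solve(W, r))
p_value = 1.0 - stats.chi2.cdf(chi2_stat, df=A.shape[0])

print("reconciled flows:", np.round(x_hat, 2))
print(f"global test statistic = {chi2_stat:.2f}, p-value = {p_value:.4f}")
```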

Collaboration


Dive into George Stephanopoulos's collaborations.

Top Co-Authors

Gregory Stephanopoulos

California Institute of Technology

Daehee Hwang

Daegu Gyeongbuk Institute of Science and Technology

Gregory C. Rutledge

Massachusetts Institute of Technology

Paul I. Barton

Massachusetts Institute of Technology

Chonghun Han

Massachusetts Institute of Technology

Alexandros Koulouris

Massachusetts Institute of Technology
