Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jesper Jørgensen is active.

Publications


Featured research published by Jesper Jørgensen.


BMC Medical Imaging | 2008

Tumor volume in subcutaneous mouse xenografts measured by microCT is more accurate and reproducible than determined by 18F-FDG-microPET or external caliper

Mette Munk Jensen; Jesper Jørgensen; Tina Binderup; Andreas Kjær

Background: In animal studies, tumor size is used to assess responses to anticancer therapy. The current standard for volumetric measurement of xenografted tumors is external caliper, a method often affected by error. The aim of the present study was to evaluate whether microCT gives more accurate and reproducible measures of tumor size in mice compared with caliper measurements. Furthermore, we evaluated the accuracy of tumor volume determined from 18F-fluorodeoxyglucose (18F-FDG) PET. Methods: Subcutaneously implanted human breast adenocarcinoma cells in NMRI nude mice served as tumor model. Tumor volume (n = 20) was determined in vivo by external caliper, microCT and 18F-FDG-PET, and subsequently the reference volume was determined ex vivo. Intra-observer reproducibility of the microCT and caliper methods was determined by acquiring 10 repeated volume measurements. Volumes of a group of tumors (n = 10) were determined independently by two observers to assess inter-observer variation. Results: Tumor volumes measured by microCT, PET and caliper all correlated with the reference volume. No significant bias of microCT measurements compared with the reference was found, whereas both PET and caliper showed systematic bias compared to the reference volume. Coefficients of variation for intra-observer variation were 7% and 14% for microCT and caliper measurements, respectively. Regression coefficients between observers were 0.97 for microCT and 0.91 for caliper measurements. Conclusion: MicroCT was more accurate than both caliper and 18F-FDG-PET for in vivo volumetric measurements of subcutaneous tumors in mice. 18F-FDG-PET was considered unsuitable for determination of tumor size. External caliper measurements were inaccurate and encumbered with a significant, size-dependent bias. MicroCT was also the most reproducible of the methods.
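
A minimal sketch (not from the paper; the numbers and variable names below are illustrative only) of how reproducibility figures of this kind are typically computed: an intra-observer coefficient of variation from repeated volume measurements, and an inter-observer regression from paired measurements.

```python
import numpy as np

# Hypothetical repeated microCT volume measurements (mm^3) of one tumour
# by a single observer; the study acquired 10 repeated measurements.
repeats = np.array([152, 148, 155, 150, 149, 153, 151, 147, 154, 150], dtype=float)

# Intra-observer coefficient of variation: sample SD relative to the mean.
cv = repeats.std(ddof=1) / repeats.mean()
print(f"intra-observer CV: {cv:.1%}")

# Hypothetical paired volumes from two observers for the same 10 tumours.
observer_a = np.array([120, 210, 95, 310, 180, 260, 140, 330, 200, 90], dtype=float)
observer_b = np.array([118, 215, 99, 300, 176, 270, 138, 325, 205, 93], dtype=float)

# Inter-observer agreement summarised as the slope of a least-squares fit;
# whether the paper's "regression coefficient" is this slope or Pearson's r
# is our assumption here.
slope, intercept = np.polyfit(observer_a, observer_b, 1)
print(f"inter-observer regression slope: {slope:.2f}")
```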


Journal of Logic Programming | 1999

Conjunctive partial deduction: foundations, control, algorithms, and experiments

Danny De Schreye; Robert Glück; Jesper Jørgensen; Michael Leuschel; Bern Martens; Morten Heine Sørensen

Partial deduction in the Lloyd–Shepherdson framework cannot achieve certain optimisations which are possible by unfold/fold transformations. We introduce conjunctive partial deduction, an extension of partial deduction accommodating such optimisations, e.g., tupling and deforestation. We first present a framework for conjunctive partial deduction, extending the Lloyd–Shepherdson framework by considering conjunctions of atoms (instead of individual atoms) for specialisation and renaming. Correctness results are given for the framework with respect to computed answer semantics, least Herbrand model semantics, and finite failure semantics. Maintaining the well-known distinction between local and global control, we describe a basic algorithm for conjunctive partial deduction, and refine it into a concrete algorithm for which we prove termination. The problem of finding suitable renamings which remove redundant arguments turns out to be important, so we give an independent technique for this. A fully automatic implementation has been undertaken, which always terminates. Differences between the abstract semantics and Prolog's left-to-right execution motivate deviations from the abstract technique in the actual implementation, which we discuss. The implementation has been tested on an extensive set of benchmarks which demonstrate that conjunctive partial deduction indeed pays off, surpassing conventional partial deduction on a range of small to medium-size programs, while remaining manageable in an automatic and terminating system.
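
The tupling and deforestation optimisations mentioned in the abstract are easiest to see outside the logic-programming setting. The sketch below is a hand-worked functional analogue in Python (ours, not the paper's formalism): two traversals over the same list, corresponding to a conjunction of atoms sharing an argument, are fused into a single traversal returning a tuple.

```python
# Naive version: two independent traversals of the same list, analogous to a
# conjunction of two atoms sharing an argument.
def total(xs):
    return xs[0] + total(xs[1:]) if xs else 0

def count(xs):
    return 1 + count(xs[1:]) if xs else 0

def average_naive(xs):
    return total(xs) / count(xs)

# "Tupled" version: the kind of result a conjunctive specialiser aims for,
# where both computations proceed in a single traversal.
def total_and_count(xs):
    if not xs:
        return 0, 0
    t, c = total_and_count(xs[1:])
    return xs[0] + t, c + 1

def average_tupled(xs):
    t, c = total_and_count(xs)
    return t / c

assert average_naive([1, 2, 3, 4]) == average_tupled([1, 2, 3, 4]) == 2.5
```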


Journal of Functional Programming | 1993

Efficient analyses for realistic off-line partial evaluation

Anders Bondorf; Jesper Jørgensen

Based on Henglein's efficient binding-time analysis for the lambda calculus (with constants and ‘fix’) (Henglein, 1991), we develop three efficient analyses for use in the preprocessing phase of Similix, a self-applicable partial evaluator for a higher-order subset of Scheme. The analyses developed in this paper are almost-linear in the size of the analysed program. (1) A flow analysis determines possible value flow between lambda-abstractions and function applications and between constructor applications and selector/predicate applications. The flow analysis is not particularly biased towards partial evaluation; the analysis corresponds to the closure analysis of Bondorf (1991b). (2) A (monovariant) binding-time analysis distinguishes static from dynamic values; the analysis treats both higher-order functions and partially static data structures. (3) A new is-used analysis, not present in Bondorf (1991b), finds a non-minimal binding-time annotation which is ‘safe’ in a certain way: a first-order value may only become static if its result is ‘needed’ during specialization; this ‘poor man's generalization’ (Holst, 1988) increases termination of specialization. The three analyses are performed sequentially in the above-mentioned order, since each depends on results from the previous analyses. The input to all three analyses is a set of constraints generated from the program being analysed. The constraints are solved efficiently by a normalizing union/find-based algorithm in almost-linear time. Whenever possible, the constraint sets are partitioned into subsets which are solved in a specific order; this simplifies constraint normalization. The framework elegantly allows expressing both forwards and backwards components of analyses. In particular, the new is-used analysis is of a backwards nature. The three constraint normalization algorithms are proved correct (soundness, completeness, termination, existence of a best solution). The analyses have been implemented and integrated in the Similix system. The new analyses are indeed much more efficient than those of Bondorf (1991b); the almost-linear complexity of the new analyses is confirmed by the implementation.
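
A minimal sketch, assuming nothing about the actual Similix code, of the union/find-based constraint normalisation the abstract describes: binding-time variables are merged by equality constraints, and a "dynamic" mark is propagated to class representatives. Only path compression is shown here; a real implementation would also use union by rank to obtain the almost-linear bound.

```python
class BindingTimeSolver:
    """Toy normaliser for constraints of the forms x = y (equal binding
    times) and dynamic(x); not the actual Similix data structures."""

    def __init__(self):
        self.parent = {}
        self.dynamic = set()   # class representatives known to be dynamic

    def find(self, x):
        self.parent.setdefault(x, x)
        if self.parent[x] != x:
            self.parent[x] = self.find(self.parent[x])  # path compression
        return self.parent[x]

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        self.parent[rx] = ry
        if rx in self.dynamic:      # propagate the mark to the new root
            self.dynamic.add(ry)

    def mark_dynamic(self, x):
        self.dynamic.add(self.find(x))

    def is_static(self, x):
        return self.find(x) not in self.dynamic


solver = BindingTimeSolver()
solver.union("arg_of_f", "formal_x")     # value flows from call site to parameter
solver.mark_dynamic("arg_of_f")          # the argument is unknown at specialisation time
assert not solver.is_static("formal_x")  # so the parameter must be dynamic too
```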


Symposium on Principles of Programming Languages | 1994

Formally optimal boxing

Fritz Henglein; Jesper Jørgensen

An important implementation decision in polymorphically typed functional programming languages is whether to represent data in boxed or unboxed form and when to transform them from one representation to the other. Using a language with explicit representation types and boxing/unboxing operations, we axiomatize equationally the set of all explicitly boxed versions, called completions, of a given source program. In a two-stage process we give some of the equations a rewriting interpretation that captures eliminating boxing/unboxing operations without relying on a specific implementation or even semantics of the underlying language. The resulting reduction systems operate on congruence classes of completions defined by the remaining equations E, which can be understood as moving boxing/unboxing operations along data flow paths in the source program. We call a completion eopt formally optimal if every other completion for the same program (and at the same representation type) reduces to eopt under this two-stage reduction. We show that every source program has formally optimal completions, which are unique modulo E. This is accomplished by first “polarizing” the equations in E and orienting them to obtain two canonical (confluent and strongly normalizing) rewriting systems. The completions produced by Leroy's and Poulsen's algorithms are generally not formally optimal in our sense. The rewriting systems have been implemented and applied to some simple Standard ML programs. Our results show that the number of boxing and unboxing operations is also in practice substantially reduced in comparison to Leroy's completions. This analysis is intended to be integrated into Tofte's region-based implementation of Standard ML currently underway at DIKU.
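
A toy illustration (ours, not the paper's calculus or its rewriting systems) of the basic cancellation that boxing removal performs: a completion is a source term decorated with box/unbox coercions, and an unbox applied directly to a box reduces away.

```python
from dataclasses import dataclass

# A toy term language: a source variable wrapped in box/unbox coercions.
@dataclass
class Var:
    name: str

@dataclass
class Box:
    body: object

@dataclass
class Unbox:
    body: object

def simplify(term):
    """Cancel unbox(box(e)) -> e bottom-up; a stand-in for the paper's
    reduction on completions, not its actual equational theory."""
    if isinstance(term, Box):
        return Box(simplify(term.body))
    if isinstance(term, Unbox):
        inner = simplify(term.body)
        return inner.body if isinstance(inner, Box) else Unbox(inner)
    return term

print(simplify(Unbox(Box(Unbox(Box(Var("x")))))))  # Var(name='x')
```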


Symposium on Principles of Programming Languages | 1992

Generating a compiler for a lazy language by partial evaluation

Jesper Jørgensen

Compiler generation is often emphasized as being the most important application of partial evaluation. But most of the larger practical applications have, to the best of our knowledge, been outside this field. In particular, no one has generated compilers for anything but small languages. This paper describes a large application of partial evaluation in which a realistic compiler was generated for a strongly typed lazy functional language. The language, called BAWL, was modeled after the language in Bird and Wadler [BW88] and is a combinator language with pattern matching, guarded alternatives, local definitions and list comprehensions. The paper describes the most important techniques used, especially the binding-time improvements needed in order to get small and efficient target programs. Finally, the performance of the compiler is compared with two compilers for similar languages: Miranda and LML.


Higher-Order and Symbolic Computation / Lisp and Symbolic Computation | 1997

An Automatic Program Generator for Multi-Level Specialization

Robert Glück; Jesper Jørgensen

Program specialization can divide a computation into several computation stages. This paper investigates the theoretical limitations and practical problems of standard specialization tools, presents multi-level specialization, and demonstrates that, in combination with the cogen approach, it is far more practical than previously supposed. The program generator which we designed and implemented for a higher-order functional language converts programs into very compact multi-level generating extensions that guarantee fast successive specialization. Experimental results show a remarkable reduction of generation time and generator size compared to previous attempts of multi-level specialization by self-application. Our approach to multi-level specialization seems well-suited for applications where generation time and program size are critical.
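
The generating-extension idea behind the cogen approach can be shown with the textbook power example (a sketch in Python, not the paper's higher-order functional language or its cogen): instead of running a specialiser on a general power function, a generating extension takes the static exponent and directly emits the residual, specialised program.

```python
def power_gen(n):
    """Generating extension for the power function: given the static
    exponent n, emit source code specialised to that exponent. A cogen
    would produce generators like this one automatically."""
    body = "1"
    for _ in range(n):
        body = f"x * ({body})"
    return f"def power_{n}(x):\n    return {body}\n"

code = power_gen(3)
print(code)              # the residual (specialised) program as text
namespace = {}
exec(code, namespace)    # "load" the residual program
assert namespace["power_3"](2) == 8
```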


The Journal of Nuclear Medicine | 2012

Quantitative PET of Human Urokinase-Type Plasminogen Activator Receptor with 64Cu-DOTA-AE105: Implications for Visualizing Cancer Invasion

Morten Persson; Jacob Madsen; Søren Dinesen Østergaard; Mette Munk Jensen; Jesper Jørgensen; Karina Juhl; Charlotte Lehmann; Michael Ploug; Andreas Kjær

Expression levels of the urokinase-type plasminogen activator receptor (uPAR) represent an established biomarker for poor prognosis in a variety of human cancers. The objective of the present study was to explore whether noninvasive PET can be used to perform a quantitative assessment of expression levels of uPAR across different human cancer xenograft models in mice and to illustrate the clinical potential of uPAR PET in future settings for individualized therapy. Methods: To accomplish our objective, a linear, high-affinity uPAR peptide antagonist, AE105, was conjugated with DOTA and labeled with 64Cu (64Cu-DOTA-AE105). Small-animal PET was performed in 3 human cancer xenograft mouse models expressing different levels of human uPAR, and the tumor uptake was correlated with the uPAR expression level determined by uPAR enzyme-linked immunosorbent assay. The tumor uptake pattern of this tracer was furthermore compared with 18F-FDG uptake, and finally the correlation between sensitivity toward 5-fluorouracil therapy and uPAR expression level was investigated. Results: The uPAR-targeting PET tracer was produced in high purity and with high specific radioactivity. A significant correlation between tumor uptake of 64Cu-DOTA-AE105 and uPAR expression was found (R2 = 0.73; P < 0.0001) across the 3 cancer xenografts, thus providing a strong argument for specificity. A significantly different uptake pattern of 64Cu-DOTA-AE105, compared with that of 18F-FDG, was observed, thus emphasizing the additional information that can be obtained on tumor biology using 64Cu-DOTA-AE105 PET. Furthermore, a significant correlation between baseline uPAR expression and sensitivity toward 5-fluorouracil was revealed, illustrating the potential of uPAR PET in a clinical setting. Conclusion: Our results clearly demonstrate that the peptide-based PET tracer 64Cu-DOTA-AE105 enables the noninvasive quantification of uPAR expression in tumors in vivo, thus emphasizing its potential use in a clinical setting to detect invasive cancer foci and for individualized cancer therapy.
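
A minimal sketch (made-up numbers, not the study's data) of the kind of correlation analysis described above: tracer uptake plotted against uPAR levels measured by ELISA across xenograft-bearing animals, summarised as R².

```python
import numpy as np

# Hypothetical tumour uptake of 64Cu-DOTA-AE105 (arbitrary units; the study
# quantified uptake from PET images) and uPAR levels measured by ELISA for
# animals drawn from three xenograft models.
uptake = np.array([2.1, 2.4, 3.0, 3.3, 4.8, 5.1, 5.4, 6.0])
upar   = np.array([0.8, 1.0, 1.5, 1.6, 2.9, 3.1, 3.4, 3.9])

r = np.corrcoef(uptake, upar)[0, 1]
print(f"R^2 = {r**2:.2f}")   # the paper reports R^2 = 0.73 on its own data
```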


Nuclear Medicine and Biology | 2013

High tumor uptake of 64Cu: Implications for molecular imaging of tumor characteristics with copper-based PET tracers

Jesper Jørgensen; Morten Persson; Jacob Madsen; Andreas Kjær

Introduction: The use of copper-based positron emission tomography (PET) tracers in cancer studies is increasing. However, as copper has previously been found in high concentrations in human tumor tissue in vivo, instability of PET tracers could result in tumor accumulation of non-tracer-bound radioactive copper that may influence PET measurements. Here we determine the degree of 64Cu uptake in five commonly used human cancer xenograft models in mice. Additionally, we compare copper accumulation in tumor tissue to gene expression of human copper transporter 1 (CTR1). Methods: Small animal PET scans were performed on five different human cancer xenograft mouse models 1 h and 22 h post injection (p.i.) of 64CuCl2. Regions of interest (ROIs) were drawn on tumor tissue and sections of various organs on all images. Quantitative real-time PCR (qPCR) gene expression measurements of CTR1 were performed on tumor samples obtained after the 22 h scan. Results: A relatively high tumor uptake of 64Cu was seen in four out of five tumor types, and an increase in 64Cu accumulation was seen in three out of five tumor types between 1 h and 22 h p.i. No relationship was found between tumor uptake of 64Cu and gene expression of CTR1. Conclusions: The relatively high, time- and tumor-type-dependent 64Cu uptake demonstrated here in five different human cancer xenograft models in mice emphasizes the importance of validating tracer uptake and indicates that high in vivo stability of copper-based PET tracers is of particular importance, because non-tracer-bound copper can accumulate in tumor tissue to a level that could potentially lead to misinterpretation of PET data.


PLOS ONE | 2012

18F-FDG PET Imaging of Murine Atherosclerosis: Association with Gene Expression of Key Molecular Markers

Anne Mette Fisker Hag; Sune Pedersen; Christina Christoffersen; Tina Binderup; Mette Munk Jensen; Jesper Jørgensen; Dorthe Skovgaard; Rasmus Sejersten Ripa; Andreas Kjær

Aim: To study whether 18F-FDG can be used for in vivo imaging of atherogenesis by examining the correlation between 18F-FDG uptake and gene expression of key molecular markers of atherosclerosis in apoE−/− mice. Methods: Nine groups of apoE−/− mice were given normal chow or a high-fat diet. At different time-points, 18F-FDG PET/contrast-enhanced CT scans were performed on dedicated animal scanners. After the scans, animals were euthanized, aortas were removed and gamma counted, RNA was extracted from the tissue, and gene expression of chemokine (C-X-C motif) ligand 1 (CXCL-1), monocyte chemoattractant protein (MCP)-1, vascular cell adhesion molecule (VCAM)-1, cluster of differentiation molecule (CD)-68, osteopontin (OPN), lectin-like oxidized LDL receptor (LOX)-1, hypoxia-inducible factor (HIF)-1α, HIF-2α, vascular endothelial growth factor A (VEGF), and tissue factor (TF) was measured by means of qPCR. Results: The uptake of 18F-FDG increased over time in the groups of mice receiving the high-fat diet, as measured by PET and ex vivo gamma counting. The gene expression of all examined markers of atherosclerosis correlated significantly with 18F-FDG uptake. The strongest correlation was seen with TF and CD68 (p<0.001). A multivariate analysis showed CD68, OPN, TF, and VCAM-1 to be the most important contributors to the uptake of 18F-FDG; together they could explain 60% of the 18F-FDG uptake. Conclusion: We have demonstrated that 18F-FDG can be used to follow the progression of atherosclerosis in apoE−/− mice. The gene expression of ten molecular markers representing different molecular processes important for atherosclerosis was shown to correlate with the uptake of 18F-FDG. In particular, the gene expressions of CD68, OPN, TF, and VCAM-1 were strong predictors of the uptake.
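
A sketch with simulated numbers (nothing here is the study's data) of the multivariate step described above: regressing 18F-FDG uptake on several gene-expression markers by ordinary least squares and reporting the fraction of variance explained, analogous to the 60% figure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated per-aorta data: 18F-FDG uptake and expression of four markers
# (standing in for CD68, OPN, TF and VCAM-1).
n = 30
genes = rng.normal(size=(n, 4))
uptake = genes @ np.array([0.6, 0.4, 0.5, 0.3]) + rng.normal(scale=0.8, size=n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), genes])
coef, *_ = np.linalg.lstsq(X, uptake, rcond=None)
fitted = X @ coef

# Fraction of the variance in uptake explained by the markers (R^2).
ss_res = np.sum((uptake - fitted) ** 2)
ss_tot = np.sum((uptake - uptake.mean()) ** 2)
print(f"R^2 = {1 - ss_res / ss_tot:.2f}")
```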


Theory and Practice of Logic Programming | 2004

Offline specialisation in Prolog using a hand-written compiler generator

Michael Leuschel; Jesper Jørgensen; Wim Vanhoof; Maurice Bruynooghe

The so-called “cogen approach” to program specialisation, writing a compiler generator instead of a specialiser, has been used with considerable success in partial evaluation of both functional and imperative languages. This paper demonstrates that the cogen approach is also applicable to the specialisation of logic programs (called partial deduction) and leads to effective specialisers. Moreover, using good binding-time annotations, the speed-ups of the specialised programs are comparable to the speed-ups obtained with online specialisers. The paper first develops a generic approach to offline partial deduction and then a specific offline partial deduction method, leading to the offline system LIX for pure logic programs. While this is a usable specialiser by itself, it is used to develop the cogen system LOGEN. Given a program, a specification of which inputs will be static, and an annotation specifying which calls should be unfolded, LOGEN generates a specialised specialiser for the program at hand. Running this specialiser with particular values for the static inputs results in the specialised program. While this requires two steps instead of one, the efficiency of the specialisation process is improved in situations where the same program is specialised multiple times. The paper also presents and evaluates an automatic binding-time analysis that is able to derive the annotations. While the derived annotations are still suboptimal compared to hand-crafted ones, they enable non-expert users to use the LOGEN system in a fully automated way. Finally, LOGEN is extended so as to directly support a large part of Prolog's declarative and non-declarative features and so as to be able to perform so-called mixline specialisations.

Collaboration


Dive into Jesper Jørgensen's collaborations.

Top Co-Authors

Andreas Kjær, University of Copenhagen
Anders Elias Hansen, Technical University of Denmark
Robert Glück, University of Copenhagen
Fan Li, University of Copenhagen
Jacob Madsen, Copenhagen University Hospital