Publication


Featured research published by Jesús Marco de Lucas.


Physical Review Letters | 2011

Measurement of the Inclusive Jet Cross Section in pp Collisions at √s = 7 TeV

S. Chatrchyan; Javier Andres Brochero Cifuentes; José Ibán Cabrillo Bartolomé; Alicia Calderón Tazón; Shan Huei Chuang; Jordi Duarte Campderros; Marta Felcini; Marcos Fernández García; Gervasio Gómez Gramuglio; Francisco Javier González Sánchez; Clara Jordá Lope; Patricia Lobelle Pardo; María Amparo López Virto; Jesús Marco de Lucas; Rafael José Marco de Lucas; Celso Martinez Rivero; Francisco Matorras Weinig; Francisca Javiela Munoz Sanchez; Jonatan Piedra Gomez; Teresa Rodrigo Anoro; Ana Y. Rodríguez Marrero; Alberto Ruiz Jimeno; L. Scodellaro; Mar Sobron Sanudo; Iván Vila Álvarez; Rocio Vilar Cortabitarte

The inclusive jet cross section is measured in pp collisions with a center-of-mass energy of 7 TeV at the LHC using the CMS experiment. The data sample corresponds to an integrated luminosity of 34 pb−1. The measurement is made for jet transverse momenta in the range 18–1100 GeV and for absolute values of rapidity less than 3. The measured cross section extends to the highest values of jet pT ever observed and, within the experimental and theoretical uncertainties, is generally in agreement with next-to-leading-order perturbative QCD predictions. (arXiv:1106.0208v1 [hep-ex], 1 Jun 2011)
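For orientation, the quantity reported in measurements of this kind is the double-differential cross section. The following is the standard textbook form, not the paper's specific binning or correction scheme:

    \frac{d^2\sigma}{dp_T\,dy} = \frac{N_\text{jets}}{\varepsilon\,\mathcal{L}_\text{int}\,\Delta p_T\,\Delta y}

where N_jets is the number of jets observed in a given (pT, y) bin, ε the selection efficiency and unfolding correction, L_int the integrated luminosity (34 pb−1 here), and ΔpT and Δy the bin widths.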


Frontiers in Genetics | 2015

Future opportunities and trends for e-infrastructures and life sciences: going beyond the grid to enable life science data analysis

Afonso M.S. Duarte; Fotis E. Psomopoulos; Christophe Blanchet; Alexandre M. J. J. Bonvin; Manuel Corpas; Alain Franc; Rafael C. Jimenez; Jesús Marco de Lucas; Tommi Nyrönen; Gergely Sipos; Stephanie Suhr

With the increasingly rapid growth of data in the life sciences, we are witnessing a major transition in the way research is conducted, from hypothesis-driven studies to data-driven simulations of whole systems. Such approaches necessitate the use of large-scale computational resources and e-infrastructures, such as the European Grid Infrastructure (EGI). EGI, one of the key enablers of the digital European Research Area, is a federation of resource providers set up to deliver sustainable, integrated and secure computing services to European researchers and their international partners. Here we review the state of the art of Grid/Cloud computing in EU research as viewed from within the field of life sciences, focusing on key infrastructures and projects within the life sciences community. Rather than focusing purely on the technical aspects underlying the currently provided solutions, we outline the design aspects and key characteristics that can be identified across major research approaches. Overall, we aim to provide significant insights into the road ahead by establishing ever-strengthening connections between EGI as a whole and the life sciences community.


Journal of Physics: Conference Series | 2012

Integrating PROOF Analysis in Cloud and Batch Clusters

Ana Yaiza Rodríguez-Marrero; Isidro González Caballero; Alberto Cuesta Noriega; Enol Fernández-del-Castillo; Álvaro López García; Jesús Marco de Lucas; Francisco Matorras Weinig

High Energy Physics (HEP) analyses are becoming more complex and demanding due to the large amount of data collected by the current experiments. The Parallel ROOT Facility (PROOF) provides researchers with an interactive tool to speed up the analysis of huge volumes of data by exploiting parallel processing on both multicore machines and computing clusters. The typical PROOF deployment scenario is a permanent set of cores configured to run the PROOF daemons. However, this approach is incapable of adapting to the dynamic nature of interactive usage. Several initiatives seek to improve the use of computing resources by integrating PROOF with a batch system, such as PROOF on Demand (PoD) or PROOF Cluster. These solutions are currently in production at Universidad de Oviedo and IFCA, and are positively evaluated by users. Although they are able to adapt to the computing needs of users, they must comply with the specific configuration, OS and software installed on the batch nodes. Furthermore, they share the machines with other workloads, which may cause disruptions in the interactive service for users. These limitations make PROOF a typical use case for cloud computing. In this work we take advantage of the Cloud Infrastructure at IFCA to provide a dynamic PROOF environment where users can control the software configuration of the machines. The PROOF Analysis Framework (PAF) facilitates the development of new analyses and offers transparent access to PROOF resources. Several performance measurements are presented for the different scenarios (PoD, SGE and Cloud), showing a speed improvement closely correlated with the number of cores used.
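The interactive workflow described above can be illustrated with a minimal PyROOT sketch, assuming ROOT built with PROOF support and a running PoD server; the tree name ("Events"), the file pattern, and the selector ("MySelector.C") are hypothetical placeholders, not taken from the paper:

    # Minimal PyROOT sketch of an interactive PROOF-on-Demand session
    # (illustrative only; tree name, files and selector are placeholders).
    import ROOT

    # Connect to the PoD-provisioned PROOF master; "pod://" is the PoD URL scheme.
    proof = ROOT.TProof.Open("pod://")

    # Build the dataset: a chain of ROOT files, each holding a TTree "Events".
    chain = ROOT.TChain("Events")
    chain.Add("data/*.root")

    # Route processing through PROOF so the workers analyse events in parallel,
    # then run a TSelector (compiled on the fly with ACLiC via the trailing '+').
    chain.SetProof()
    chain.Process("MySelector.C+")

The same session works unchanged on a static PROOF cluster or a cloud-provisioned one; only the URL passed to TProof.Open differs, which is what makes the batch and cloud scenarios directly comparable.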


Software: Practice and Experience | 2018

Resource provisioning in Science Clouds: Requirements and challenges

Álvaro López García; Enol Fernández-del-Castillo; Pablo Orviz Fernández; Isabel Campos Plasencia; Jesús Marco de Lucas

Cloud computing has permeated the information technology industry in the last few years and is now emerging in scientific environments. Science user communities demand a broad range of computing power to satisfy the needs of high-performance applications, traditionally served by local clusters, high-performance computing systems, and computing grids. Different computational models call for different workloads, and the cloud is already considered a promising paradigm. The scheduling and allocation of resources is always a challenging matter in any form of computation, and clouds are no exception. Science applications have unique features that differentiate their workloads; hence, their requirements have to be taken into consideration when building a Science Cloud. This paper discusses the main scheduling and resource allocation challenges for any Infrastructure as a Service provider supporting scientific applications.
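As a minimal illustration of one such challenge, the sketch below implements first-fit placement of virtual-machine requests onto hosts. This is a generic textbook strategy, not the paper's proposal, and all host names and capacities are invented:

    # First-fit VM placement: a toy version of the IaaS scheduling decisions
    # discussed above (generic strategy; names and capacities are invented).
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Host:
        name: str
        free_cores: int
        free_ram_gb: int

    def first_fit(cores: int, ram_gb: int, hosts: list) -> Optional[Host]:
        """Place a VM on the first host with enough capacity, or return None."""
        for host in hosts:
            if host.free_cores >= cores and host.free_ram_gb >= ram_gb:
                host.free_cores -= cores
                host.free_ram_gb -= ram_gb
                return host
        return None  # no capacity left: the request must queue or be rejected

    hosts = [Host("node01", 16, 64), Host("node02", 8, 32)]
    placement = first_fit(4, 16, hosts)
    print(placement.name if placement else "queued")

Scientific workloads complicate even this simple picture: long-running batch jobs, tightly coupled parallel tasks, and interactive sessions value different trade-offs between placement latency and utilization, which is the tension the paper examines.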


Computing Frontiers | 2017

Big Data Analytics on Large-Scale Scientific Datasets in the INDIGO-DataCloud Project

Sandro Fiore; Cosimo Palazzo; Alessandro D'Anca; Donatello Elia; Elisa Londero; Cristina Knapic; Stephen Monna; Nicola Mario Marcucci; Fernando Aguilar; Marcin Płóciennik; Jesús Marco de Lucas; Giovanni Aloisio

In the context of the EU H2020 INDIGO-DataCloud project, several use cases on large-scale scientific data analysis involving different research communities have been implemented. All of them require the availability of large amounts of data, consisting of either the output of simulations or observational data from sensors, and need scientific (big) data solutions to run data analysis experiments. More specifically, the paper presents the case studies related to the following research communities: (i) the European Multidisciplinary Seafloor and water column Observatory (INGV-EMSO), (ii) the Large Binocular Telescope, (iii) LifeWatch, and (iv) the European Network for Earth System Modelling (ENES).


Physics Letters B | 2015

Study of W boson production in pPb collisions at √sNN = 5.02 TeV

Vladimir Khachatryan; Javier Andres Brochero Cifuentes; José Ibán Cabrillo Bartolomé; Alicia Calderón Tazón; Jorge Duarte Campderros; Marcos Fernández García; Gervasio Gómez Gramuglio; A. Graziano; María Amparo López Virto; Jesús Marco de Lucas; Rafael José Marco de Lucas; Celso Martinez Rivero; Francisco Matorras Weinig; Francisca Javiela Munoz Sanchez; Jonatan Piedra Gomez; Teresa Rodrigo Anoro; Ana Y. Rodríguez Marrero; Alberto Ruiz Jimeno; L. Scodellaro; Iván Vila Álvarez; Rocio Vilar Cortabitarte

The first study of W boson production in pPb collisions is presented, for bosons decaying to a muon or electron, and a neutrino. The measurements are based on a data sample corresponding to an integrated luminosity of 34.6 nb−1 at a nucleon-nucleon centre-of-mass energy of √sNN = 5.02 TeV, collected by the CMS experiment. The W boson differential cross sections, lepton charge asymmetry, and forward-backward asymmetries are measured for leptons of transverse momentum exceeding 25 GeV/c, and as a function of the lepton pseudorapidity in the |ηlab| < 2.4 range. Deviations from the expectations based on currently available parton distribution functions are observed, showing the need for including W boson data in nuclear parton distribution global fits.
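For reference, the lepton charge asymmetry mentioned above is conventionally built from the yields of positively and negatively charged leptons. This is the standard definition used in W production studies; the paper's exact observable and binning are not reproduced here:

    A(\eta_\text{lab}) = \frac{dN(\ell^+)/d\eta_\text{lab} - dN(\ell^-)/d\eta_\text{lab}}
                              {dN(\ell^+)/d\eta_\text{lab} + dN(\ell^-)/d\eta_\text{lab}}

In pPb collisions this asymmetry is sensitive to the relative u and d quark content of the nucleus, which is why deviations from free-proton parton distribution functions are informative.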


Archive | 2012

Integrating a Multisensor Mobile System in the Grid Infrastructure

Ignacio Coterillo; Maria Campo; Jesús Marco de Lucas; José Monteoliva; Agustín Monteoliva; A. Monná; Milan Prica; A. Del Linz

This document presents a mobile floating autonomous platform supporting an extensive set of sensors, with depth-profiling capabilities and ease of deployment anywhere within a water reservoir in a short time, and its integration in the existing Grid infrastructure. The platform equipment is remotely operable and provides real-time access to the measured data. Through the infrastructure and middleware provided by the DORII project, a remote operator can not only control the platform system but also use the produced data in simulations and analyses as soon as it is recorded.

The instrumentation is installed on a mobile floating autonomous platform containing a power system (a pair of solar panels and a wind turbine, along with a series of deep-cycle batteries), a set of surface sensors, and a set of water-quality sensors integrated in a compact underwater cage, connected by a steel cable to a winch system on the platform surface, providing water-column profiling capabilities.

All the underwater sensors (plus one of the surface sensors) are locally controlled by a datalogger device that can store up to 14 days' worth of sampling data, constituting a first layer of data backup. An onboard computer serves as datalogger for the rest of the sensors and as a second backup layer. Communication with the land station is managed via a WiFi point-to-point link, using standard consumer directional antennas on both sides of the link. With this configuration, distances of over 20 km are possible.

In the land station, a server hosts both the primary database where the platform computer stores all sensor data (replicated at a second location for additional redundancy) and the Instrument Element (IE) middleware that integrates the whole monitoring system in the Grid infrastructure and provides remote access to the data and control capabilities through the Virtual Control Room (VCR). From within the VCR, an operator can not only control and configure the behaviour of the sensors and access the recorded data in real time, but also employ this data in analysis jobs executing in the high-availability Grid environment.
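The layered data path just described can be sketched as follows. This is a hedged illustration with invented names (sampling period, table schema, sensor readings), not the project's actual software:

    # Toy sketch of the platform's two-layer data path (all names invented):
    # a local ring buffer sized for ~14 days acts as the first backup layer,
    # and each sample is also forwarded to the land-station primary database.
    import sqlite3
    import time
    from collections import deque

    SAMPLE_PERIOD_S = 60                      # assumed one-minute sampling
    BUFFER_LEN = 14 * 24 * 60                 # ~14 days of one-minute samples

    local_buffer = deque(maxlen=BUFFER_LEN)   # datalogger-style ring buffer

    def read_sensors() -> dict:
        """Placeholder for the real sensor drivers."""
        return {"ts": time.time(), "temp_c": 12.3, "o2_mg_l": 8.1}

    db = sqlite3.connect("land_station.db")   # stands in for the primary database
    db.execute("CREATE TABLE IF NOT EXISTS samples (ts REAL, temp_c REAL, o2_mg_l REAL)")

    for _ in range(3):                        # a few cycles for illustration
        sample = read_sensors()
        local_buffer.append(sample)           # first layer: local backup
        db.execute("INSERT INTO samples VALUES (?, ?, ?)",
                   (sample["ts"], sample["temp_c"], sample["o2_mg_l"]))
        db.commit()                           # second layer: land-station copy
        # a real deployment would sleep SAMPLE_PERIOD_S between cycles

The two layers serve the failure modes named in the abstract: the ring buffer covers WiFi-link outages of up to two weeks, while the replicated land-station database covers loss of the platform itself.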


Journal of Physics: Conference Series | 2006

A Grided World

Jesús Marco de Lucas

A summary of the status of and perspectives on the use of Grid technology for e-Science is presented. In particular, the perspectives for High Energy Physics are described, including the experience in the LHC Computing Grid (LCG) and EGEE projects. Special emphasis is placed on collaboration and interactive use, showing the results of the CrossGrid project and the possibilities opened by the new Interactive European Grid project.


Physical Review D | 2011

Measurement of the Bs0 Production Cross Section with Bs0 → J/ψϕ Decays in pp Collisions at √s = 7 TeV

S. Chatrchyan; Javier Andres Brochero Cifuentes; José Ibán Cabrillo Bartolomé; Alicia Calderón Tazón; Shan Huei Chuang; Jordi Duarte Campderros; Marta Felcini; Marcos Fernández García; Gervasio Gómez Gramuglio; Francisco Javier González Sánchez; Clara Jordá Lope; Patricia Lobelle Pardo; María Amparo López Virto; Jesús Marco de Lucas; Rafael José Marco de Lucas; Celso Martinez Rivero; Francisco Matorras Weinig; Francisca Javiela Munoz Sanchez; Jonatan Piedra Gomez; Teresa Rodrigo Anoro; Ana Y. Rodríguez Marrero; Alberto Ruiz Jimeno; L. Scodellaro; Mar Sobron Sanudo; Iván Vila Álvarez; Rocio Vilar Cortabitarte


arXiv: Distributed, Parallel, and Cluster Computing | 2017

INDIGO-DataCloud: A data and computing platform to facilitate seamless access to e-infrastructures.

Davide Salomoni; Luciano Gaido; Isabel Campos Plasencia; Jesús Marco de Lucas; P. Solagna; Jorge Gomes; Ludek Matyska; P. Fuhrman; Marcus Hardt; Giacinto Donvito; L. Dutka; Marcin Płóciennik; R. Barbera; Cristina Aiftimiei; Ignacio Blanquer; Andrea Ceccanti; M. David; Álvaro López García; Pablo Orviz Fernández; Zdenek Sustr; M. Viljoen; Fernando Aguilar; L.C. Alves; A. Bonvin; Riccardo Bruno; Davor Davidovic; Marco Fargetta; Y. Chen; Sandro Fiore; Z. Kurkcuoglu

Collaboration


Dive into Jesús Marco de Lucas's collaborations.

Top Co-Authors


Gervasio Gómez Gramuglio

Joint Institute for Nuclear Research
