A Science Gateway for Exploring the X-Ray Transient and Variable Sky Using EGI Federated Cloud
Daniele D’Agostino a,∗, Luca Roverelli a, Gabriele Zereik a, Giuseppe La Rocca b, Andrea De Luca c,e, Ruben Salvaterra c, Andrea Belfiore c, Gianni Lisini d, Giovanni Novara d,c, Andrea Tiengo d,c,e

a CNR-Istituto di Matematica Applicata e Tecnologie Informatiche “Enrico Magenes”, via dei Marini 6, 16149 Genova, Italy
b EGI Foundation, Science Park 140, 1098 XG Amsterdam, The Netherlands
c INAF-Istituto di Astrofisica Spaziale e Fisica Cosmica Milano, via E. Bassini 15, 20133 Milano, Italy
d Scuola Universitaria Superiore IUSS Pavia, piazza della Vittoria 15, 27100 Pavia, Italy
e Istituto Nazionale di Fisica Nucleare, Sezione di Pavia, via A. Bassi 6, 27100 Pavia, Italy

∗ Corresponding author
Abstract
Modern soft X-ray observatories can yield unique insights into time domain astrophysics, and a huge amount of information is stored - and largely unexploited - in data archives. Like a treasure hunt, the EXTraS project harvested the hitherto unexplored temporal domain information buried in the serendipitous data collected by the European Photon Imaging Camera instrument onboard the ESA XMM-Newton mission in 16 years of observations. All results have been released to the scientific community, together with new software analysis tools. This paper presents the architecture of the EXTraS science gateway, whose goal is to provide the software to the scientific community through a Web-based portal using the EGI Federated Cloud infrastructure. The main focus is on the lightweight software architecture of the portal and on the technological insights for an effective use of the EGI ecosystem.
Keywords:
Science Gateways; Microservices; Astrophysics
1. Introduction
Almost all astrophysical objects, from stars in the surroundings of the solar system to supermassive black holes in the nuclei of very distant galaxies, display a distinctive variability as a function of time, their flux and spectral shape changing on a range of time scales. This is especially true in the high-energy range of the electromagnetic spectrum. The X-ray and gamma-ray sky is extremely dynamic, and new classes of objects, some of them completely unexpected, have been discovered in the last decades thanks to their peculiar variability. Most of the variable phenomena have been discovered with large field-of-view instruments operating at hard X-ray/gamma-ray energies, which, constantly observing large portions of the sky, can also detect relatively rare events. At soft X-rays (0.1-10 keV), wide-field instruments are much less sensitive than narrow-field telescopes with focusing optics. In particular, the European Photon Imaging Camera (EPIC, [1, 2]) instrument onboard the European Space Agency mission XMM-Newton is the most powerful tool to study the variability of faint X-ray sources, thanks to its unprecedented combination of large effective area, good angular, spectral and temporal resolution, and pretty large field of view.

Preprint submitted to Future Generation Computer Systems, November 18, 2019

Seventeen years after its launch, EPIC is still fully operational and its immensely rich archive of data, the XMM-Newton Science Archive (XSA), keeps growing. Large efforts are ongoing to explore the serendipitous content of XMM data. Indeed, the catalog of serendipitous sources extracted from EPIC observations, dubbed 3XMM, is the largest and most sensitive compilation of X-ray sources ever produced, listing more than 500,000 detections over about 800 square degrees of the sky [3]. A further 20,000 sources have been identified in the so-called XMM Slew Survey (XSS) [4], using data collected while the telescope is moving from one target to the next. These data present a shallower sensitivity, but they cover more than 70% of the sky. Time-domain information on such a large sample of sources remains, however, largely unexplored.

The EU-funded FP7 Exploring the X-ray Transient and variable Sky (EXTraS) project [5] investigated and extracted the serendipitous content of the EPIC database in the time domain. In particular, EXTraS extended 3XMM by designing and implementing four main lines of analysis: i) a systematic study of aperiodic, short-term variability of 3XMM sources on all possible time scales (from the duration of an observation to the instrument time resolution, typically 73 ms and 2.6 s); ii) a systematic search for short, weak transient sources that are above the detection threshold just for a small interval of time; iii) a systematic study of long-term variability (i.e. variability between different observations), thanks to the large number of overlapping observations performed in 16 years; iv) a systematic search for periodicities. As the most sensitive search for variability ever performed, EXTraS raises new questions in high-energy astrophysics [6] and may serve as a pathfinder for future missions.
Therefore the EXTraS results, together with new software tools related to the four lines of analysis, have been released to the whole community. The software is of particular importance for enhancing the discovery potential of the XMM-Newton mission [7], especially because the mission is still ongoing and fully operational, and therefore collects new data each day.

In this paper we present the architecture of the science gateway developed in the project, named the EXTraS portal, that presently allows the analysis of transient X-ray sources. The portal is the result of the joint effort of the two communities of the project, astrophysics and ICT, and the main contribution is represented by the description of the microservice-based approach we are following, which is based on a previous experience in deploying a science gateway for the hydrometeorological scientific community, the DRIHM portal [8, 9]. The first release of the EXTraS portal has been presented in [10], while in this paper we describe an improved version, based on the re-design of many components, that relies on the EGI Federated Cloud [11] for executing analysis tasks.

The paper is organized as follows. Section 2 discusses related works, Section 3 presents the EXTraS portal architecture, while Section 4 presents the EGI computational infrastructure. Section 5 describes the service for the analysis of transient X-ray sources, while Section 6 concludes.
2. Related Works
The common strategies for providing software tools to the scientific community are the following. The first, basic solution is to make available an installer or an archive containing all the files required to compile and run the analysis tool. This approach has been adopted for some important tools of the Astrophysics community, such as the Science Analysis System (SAS), a collection of scripts and libraries specifically designed for the XMM-Newton observations. A second solution is to provide the software by exporting the corresponding workflows, which can thus be executed using a Workflow Management System. This solution has also been adopted by the Astrophysics community. The EXTraS portal is available at http://portal.extras-fp7.eu.
3. The EXTraS Portal Architecture
The EXTraS portal has been developed as a set of modules of PortalTS, a modern Web portal under development as an independent project at CNR-IMATI for the refactoring of the DRIHM portal. The module is a Single Page Web Application leveraging other ready-to-use components and APIs exposed by PortalTS to enable user management and security and to persist user data. In order to simplify the access to the FedCloud infrastructure, we also developed another component that exposes a simple API to perform the submission and the monitoring of jobs in the Cloud. This component has been developed as a stand-alone Web Service, with its own database. For this reason, it represents a microservice. The architecture of the EXTraS portal and its main components are shown in Figure 1.

Figure 1: The architecture of the EXTraS science gateway.
PortalTS
PortalTS is an original Web portal (https://portalts.it/) developed in TypeScript using the NodeJS and Express frameworks. It is composed of reusable modules and implements standard features available for a Web site (e.g. user management and registration) along with other features (such as a simple API for data persistence) that enable fast development of custom modules. A module is a component that implements and exposes a feature, but can also use features exposed by other modules. It is a very general component representing, for example, a set of web pages, a web service, or a web app (aka Single Page Application). PortalTS comes with a set of built-in modules that are stable and maintained, making them usable in a production environment.

The User Management module defines an API for a complete authentication system, including user registration, login, and administration pages. It also implements role and group concepts, at the basis of the authorization mechanism for pages, modules and other entities.

The Persistence API module defines the interface to store, retrieve and manage heterogeneous data on the MongoDB database (accessed via http://mongoosejs.com/). It exposes both a RESTful and an internal API that can be directly used by other modules, such as the CMS described below. The RESTful API is very important since it allows data to be stored directly from a web app built upon this module; one example is represented by the EXTraS-specific modules described below. The Persistence API defines entities and collections. A collection is a set of entities, while an entity represents possibly heterogeneous data stored with some additional metadata, like creation and update time, the owner and authorized users. An entity can belong to only one collection. The Persistence API relies on the User Management module to ensure security and user authentication on the data.
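The entity/collection model of the Persistence API can be sketched with a minimal in-memory stand-in; the class and method names below are hypothetical illustrations and do not reproduce the actual PortalTS interfaces.

```typescript
// Illustrative sketch of the Persistence API concepts: collections hold
// entities, each entity carries metadata (owner, timestamps) and belongs
// to exactly one collection. Not the actual PortalTS code.

type Entity<T> = {
  id: number;
  collection: string;
  owner: string;
  createdAt: Date;
  updatedAt: Date;
  data: T;
};

class PersistenceApi {
  private entities: Entity<unknown>[] = [];
  private nextId = 1;

  // Store arbitrary data in a named collection on behalf of a user.
  store<T>(collection: string, owner: string, data: T): Entity<T> {
    const now = new Date();
    const entity: Entity<T> = {
      id: this.nextId++,
      collection,
      owner,
      createdAt: now,
      updatedAt: now,
      data,
    };
    this.entities.push(entity);
    return entity;
  }

  // Return only entities owned by the requesting user - a simplified
  // stand-in for the portal's owner/group access policy.
  retrieve(collection: string, user: string): Entity<unknown>[] {
    return this.entities.filter(
      (e) => e.collection === collection && e.owner === user
    );
  }
}
```

In the real portal the same operations are reached either through the RESTful API (e.g. from the AngularJS web apps) or through the internal API used by other modules.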
By default, an entity is only accessible by the owner, but its read and write access policy can be changed using a group-based policy. The Persistence API module is fundamental since it allows storing and retrieving persistent data without any effort, enabling a ready-to-use persistence layer. Moreover, this layer is integrated with an AngularJS library that implements all the methods necessary to give a quicker and simpler access to the persistent data.

The Content Management System (CMS) module defines some web pages for user login and registration, and it allows the creation of user-defined web pages and menus. Each web page or menu element can be publicly available or accessible by a particular group, since the CMS module uses the Persistence API module. Images and files can be managed using the Repository module, and then exploited by the web pages. There are also some further basic modules, like the Theme module, which defines the web page header and footer to define a standard look and feel of a portal instance, and the Logging module for storing the requests received by all the modules, together with possible errors and exceptions.

The EXTraS Portal Modules
The Jobs Management module represents the home page of the EXTraS portal, and is shown in Figure 2. It provides users with the possibility to create, submit and manage the different analysis experiments based on the software developed within the EXTraS project. In particular, it presents all the submitted or configured analyses, providing the possibility to create a new analysis starting from an existing configuration or to share results with other users. This module is based on AngularJS and is a complete web app, without any server-side code. It uses the Persistence API to store and retrieve experiment data, and it activates the other portal modules corresponding to the different operations available. With respect to the previous version of the portal, in fact, the Jobs Management module has been redesigned to cooperate with two other modules, the Workflow Configuration module and the FedCloud Submission Handler module.

The Workflow Configuration module is responsible for interacting with the user for the creation and configuration of experiments based on the EXTraS software. In the portal every analysis corresponds to a single application, therefore there is no need to explicitly create and manage workflows. For this reason a user interacts directly with an analysis-specific UI, as shown in Figure 3 for the Transient Analysis. Its main aim is to collect the parameter values and to create a namelist that will be provided to the FedCloud Submission Handler for the actual execution of the job. A key aspect is represented by the fact that all the UIs are defined as JSON files (a small example corresponding to the Transient Analysis UI is shown in Figure 4) stored in the portal repository and accessed via the Persistence API.
The UI corresponding to the requested analysis is therefore dynamically created at runtime, without the need to write any line of code and with the major advantage that each update to the parameter list is completely transparent to this module.

The FedCloud Submission Handler module manages the submission of jobs by interacting with a dedicated microservice, described in the following section, and provides a full view of the job status, results and logs. In particular, the actual submission requires the user to specify as input for the analysis one or more observation identifiers (OBSID) among those contained in the XSA, as shown in Figure 5. Each of them corresponds to a job, therefore an analysis configuration can result in multiple jobs executed at a time. During the execution the user can monitor the status of the job by means of the real-time log information the software tool provides. When a job terminates, the FedCloud Submission Handler provides the possibility to retrieve results and also log information, as shown in Figure 6. All the information related to a job (e.g. the configuration parameters, the logs, the results, the ownership/sharing information and possible comments) is stored on the portal database via the Persistence API until it is deleted by the user who owns it. Note that the FedCloud Submission Handler module can also be called by external systems via an API. In perspective, the EXTraS portal will provide the analysis software as services.

The EXTraS portal provides two further key features: the ability to share an analysis (i.e. the namelist and possibly the results) and the support for interaction and discussion (in terms of comments) among the scientists sharing it. Sharing a job means not only that the experiment results are visible to other users, but also that the configuration is shared and can be used as a starting point for re-submitting the experiment on a new set of data.
Thus, a job execution can be replicated by other users, who can, for example, validate the experiment results or explore the behavior by changing one or a few parameter values.
Json-GUI
Json-GUI (https://github.com/portalTS/json-gui) is a front-end library, developed as a set of reusable AngularJS directives, that allows the dynamic generation of full-featured form-based web interfaces including validation and constraints. It can be considered a companion tool with respect to PortalTS, but it can be exploited in any AngularJS/Javascript web application. The defining characteristic of this library is that it simplifies and automates the design and the implementation of a standard Web interface, reducing the development time. Starting from a formal JSON configuration describing a list of inputs, this module is able to build a form frame interface, with standard-type but also personalized validation and constraints.

Figure 2: The Jobs Management module interface.

In detail, the library provides 7 basic parameter types: integer, float, datetime, text, select, geo-referenced domains and a generic file upload. For each parameter, the module allows defining different constraints with different error messages, where each constraint can be constituted by one or more conditions. In such a way it is possible to explicitly support all the HTML 5 input types. Moreover, within each condition, the user can compare the value of the parameter with the value of another one, and/or with a static value. Figure 4 shows how it is possible to define an interface starting from a definition of the mandatory selection parameter TIME_INTERVAL_SELECTION. We also assume the previous definition of a parameter named
TIME_INTERVAL_SELECTION_BAYESIAN. The former value depends on the latter one, because the “dependencies” property defines a relation between them, together with a set of conditions with the “isValid” property that have to be verified. If a condition is not verified, the interface raises an error message.

Json-GUI is of particular importance when developers have to deploy in science gateways software modules subject to frequent updates. As stated before, each UI is in fact dynamically built by the Workflow Configuration module starting from the JSON file corresponding to the kind of analysis a user wants to perform. This means that there is no need to write any line of code, while still giving the possibility to specify personalized behavior and accepted values for each parameter. The ability to modify the interface by simply changing a configuration file allows a faster development cycle.

The choice of building such a module from scratch, instead of using already existing ones, derives from the need to address some specific requirements. In particular, the way the module is built lets any non-IT scientist easily design an advanced configuration interface, freeing the IT developer from editing the source code any time a scientist decides to update the parameter list or any other interface element. In fact, Json-GUI comes with some high-level features well suited for the scientific context, which give scientists the possibility to easily define high-level validation and constraints between configuration parameters.
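A minimal sketch of how such a dependency-based check might work is shown below. The property names and the specific constraint are hypothetical, loosely patterned on the TIME_INTERVAL_SELECTION example of Figure 4; they do not reproduce the actual Json-GUI schema.

```typescript
// Illustrative sketch (not the real Json-GUI schema): a parameter
// definition carrying a dependency on another parameter, and a tiny
// checker that evaluates each condition against the current form values.

type Condition = {
  otherParam: string;
  isValid: (own: string, other: string) => boolean;
  errorMessage: string;
};

type ParamDef = {
  name: string;
  type: "select" | "text" | "integer" | "float";
  mandatory: boolean;
  dependencies?: Condition[];
};

// Return the list of validation errors for one parameter.
function checkParam(def: ParamDef, values: Record<string, string>): string[] {
  const errors: string[] = [];
  const own = values[def.name] ?? "";
  if (def.mandatory && own === "") errors.push(`${def.name} is required`);
  for (const dep of def.dependencies ?? []) {
    const other = values[dep.otherParam] ?? "";
    if (!dep.isValid(own, other)) errors.push(dep.errorMessage);
  }
  return errors;
}

// Hypothetical rule: the Bayesian time-interval selection excludes the
// fixed-duration one (the real constraint lives in the JSON file of Fig. 4).
const timeIntervalSelection: ParamDef = {
  name: "TIME_INTERVAL_SELECTION",
  type: "select",
  mandatory: true,
  dependencies: [
    {
      otherParam: "TIME_INTERVAL_SELECTION_BAYESIAN",
      isValid: (own, other) => !(own === "yes" && other === "yes"),
      errorMessage: "Choose either fixed or Bayesian interval selection",
    },
  ],
};
```

In the actual library the conditions are declared in JSON rather than as functions, and the AngularJS directives render the form and surface the error messages automatically.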
4. The EGI Computational Infrastructure
The EXTraS science gateway relies on the EGI Cloud Compute service to provide scientists with the possibility to run the EXTraS analysis tools on all the observation data available through the XMM-Newton Science Archive (XSA). To allow users from the Astrophysics community to analyse their own data using EXTraS pipelines and to guarantee reliable services to end users, the project has signed an agreement (SLA) with three cloud providers of the EGI Federation and created the extras-fp7.eu
VO.

Figure 3: The Transient Analysis UI shown by the Workflow Configuration module.

In detail, the INFN-CATANIA-STACK resource center provides 10 virtual CPU cores, 40 GB of memory and a scratch storage of 0.6 TB; the RECAS-BARI resource center provides 10 virtual CPU cores, 40 GB of memory and a scratch storage of 1 TB; the CYFRONET-CLOUD resource center provides 40 virtual CPU cores, 160 GB of memory and a scratch storage of 100 GB. Presently, we are able to run up to 30 analyses at a time [26] by exploiting an IaaS infrastructure. The resulting computational infrastructure is depicted on the left side of Figure 1 and is composed of four components: the EGI Applications Database (AppDB), the e-Token server, the CERN Virtual Machine File System infrastructure (CVMFS) and the cloud providers of the EGI Federation (FedCloud) that actually support the execution of the EXTraS analysis tools. The usage of these components is managed by a microservice developed as part of the EXTraS portal.
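As a concrete illustration, the per-site SLA resources listed above, together with a round-robin choice among the sites (the selection strategy the submission microservice presently applies, as described later), can be sketched as follows; the code is illustrative only, not the portal's actual implementation.

```typescript
// The three SLA sites and their resources as listed in the text.
const sites = ["INFN-CATANIA-STACK", "RECAS-BARI", "CYFRONET-CLOUD"] as const;
type Site = (typeof sites)[number];

const sla: Record<Site, { vcpus: number; memoryGB: number; scratch: string }> = {
  "INFN-CATANIA-STACK": { vcpus: 10, memoryGB: 40, scratch: "0.6 TB" },
  "RECAS-BARI":         { vcpus: 10, memoryGB: 40, scratch: "1 TB" },
  "CYFRONET-CLOUD":     { vcpus: 40, memoryGB: 160, scratch: "100 GB" },
};

// Simple round-robin selection: each new job goes to the next site in turn.
let next = 0;
function pickSite(): Site {
  const site = sites[next % sites.length];
  next += 1;
  return site;
}
```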
AppDB
The EGI Applications Database is a central marketplace containing computing tools ready to be used on top of the EGI Grid and/or Cloud infrastructure. In particular, the tools can be applications, for which a description and information on the virtual organizations (VOs) that can exploit them on a set of computational resources are provided, or virtual appliances, i.e. pre-configured virtual machine images that can be instantiated by members of one or more VOs using the Cloud infrastructure of the providers supporting those VOs.

Figure 4: An example of parameter and consistency check definition with Json-GUI.

In the first release of the EXTraS portal we adopted the approach of preparing a dedicated virtual appliance for each analysis tool. However, we had to modify this approach on the basis of the characteristics of the EXTraS software tools. They rely in fact on the Science Analysis System (SAS), a collection of software to analyze the data collected by the XMM-Newton observatory, and HEASoft (https://heasarc.nasa.gov/lheasoft/), a collection of software to manipulate FITS files and to analyze data for high-energy astronomy. Both these packages are rather complex to install and set up and, more importantly, require about 9 GB of disk space, including the calibration files required by SAS. Considering that every change in the EXTraS software requires re-uploading the entire virtual appliance to the AppDB, and that most of the Cloud providers do not allow virtual appliances with a disk larger than 10 GB, we decided to move all the EXTraS-specific software and packages to CVMFS. Therefore we use a general-purpose virtual appliance running Ubuntu, which is started and configured at runtime by using specific contextualization scripts by the microservice described below.
In particular, the virtual machine instance installs at runtime some general-purpose packages for SAS and HEASoft, some Python-based packages for astrophysics, and mounts the CVMFS directory containing the EXTraS software. Moreover, considering that the I/O and temporary data for an analysis can amount to several GB, the virtual machine also uses a block storage device that is dynamically created and mounted at runtime by the microservice using the Open Cloud Computing Interface (OCCI).

E-Token Server
During the aforementioned DRIHM project we realized that it is unfeasible to require all the users of a science gateway to get a digital certificate issued by a recognized Certification Authority (CA). EGI in fact relies on a single sign-on mechanism to access the federated services, based on X.509 certificates and VO membership. The solution we adopted there, and in the EXTraS portal, relies on the use of robot certificates and e-Token servers [27].

Figure 5: The portal allows querying the XSA to validate the inserted OBSID.
Robot certificates have been introduced by the Interoperable Global Trust Federation (IGTF) in 2010. The formal definition is the following: “Robots, also known as automated clients, are entities that perform automated tasks without human intervention. Production ICT environments typically support repetitive, ongoing processes - either internal system processes or processes relating to the applications being run (e.g. by a site or by a portal system). These procedures and repetitive processes are typically automated, and generally run using an identity with the necessary privileges to perform their tasks” [28]. In practice, they were introduced to allow users who cannot get, or are not familiar with, personal digital certificates to exploit any distributed infrastructure relying on them in their research activities. The robot certificate is usually associated with a specific application (or function) that the application developer/provider wants to share with the whole VO [29]. This is exactly the scenario arising in the EXTraS activities, because portal users are provided with the possibility to run only pre-defined software tools.

Robot certificates are used, together with one or more MyProxy servers, to generate short-lived certificates called proxy certificates. These proxy certificates are then actually used to perform the execution of the software on the resources supporting the VO. An evolution of this concept is represented by the per-user sub-proxy (PUSP), which allows the identification of the individual users that operate using a common robot certificate.
This is achieved by creating a proxy credential from the robot credential, with the proxy certificate containing user-identifying information in its additional proxy CN field. To be compliant with the EGI policies, the EXTraS portal has been configured to send a PUSP generation request via its network API. In this request, the portal specifies the ID it associates with the user, which will be used by the e-Token server to generate the short-term proxy certificate that can be used to interact with the extras-fp7.eu VO resources.

Figure 6: The left side of the image shows the interface provided by the FedCloud Submission Handler module during the execution of the job, the right one when it finishes with a success.
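The idea of the user-identifying CN field can be illustrated with a small sketch. The DN layout and the "user:&lt;id&gt;" convention below are assumptions made for illustration only - real PUSPs are issued by the e-Token server from the robot credential, not built client-side with string handling.

```typescript
// Illustrative only: a PUSP extends the robot certificate's subject with
// an extra CN carrying the portal-assigned user ID, so individual users
// behind a common robot certificate can still be told apart.

function puspSubject(robotDn: string, portalUserId: string): string {
  // Hypothetical DN layout, e.g.
  // "/DC=eu/DC=example/CN=Robot: extras portal" + "/CN=user:alice42"
  return `${robotDn}/CN=user:${portalUserId}`;
}

// Recover the portal user ID from such a subject (for accounting/traceability).
function userIdFromSubject(subject: string): string | null {
  const match = subject.match(/\/CN=user:([^/]+)$/);
  return match ? match[1] : null;
}
```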
CVMFS
The CERN Virtual Machine File System (CernVM-FS or CVMFS) has been designed as a scalable, reliable and low-maintenance software distribution service with the aim of replacing local package managers and shared software areas on cluster file systems. Its original purpose was to support the research activities in High Energy Physics at CERN by deploying software through a POSIX read-only file system in user space [30]. With this approach, files are hosted on standard web servers and mounted in the universal namespace /cvmfs, using outgoing HTTP connections only. As stated in the documentation, CVMFS focuses specifically on the software use case by means of aggressive caching and reduction of latency. Software in fact usually comprises many small files that are frequently opened and read as a whole. Furthermore, the software use case includes frequent look-ups for files in multiple directories when search paths are examined. Data and metadata are therefore transferred on demand, and data integrity is verified by cryptographic hashes.

The FedCloud Microservice
A microservice is a self-contained reusable component that fulfills a specific task. Microservices are loosely coupled components that provide better scalability with respect to other solutions [31]. For example, if a single microservice of a portal is accessed by many more users than the others, and thus needs to manage many more connections and activities, it is possible to instantiate it multiple times, performing load balancing without wasting resources, thus scaling at a very low granularity. On the contrary, with a traditional single Web Service approach, the whole Web Service needs to be instantiated multiple times, increasing the consumption of partially exploited resources. Furthermore, it is possible to update, improve or fix bugs of each microservice independently from the others. This enables a faster development cycle and simpler analysis, debugging and deployment phases for each microservice, improving the maintenance of the whole system and enabling fast prototyping of new functionalities.

In the current EXTraS portal scalability does not represent a key issue, while the independent development cycle does. In particular, we designed one microservice per computational infrastructure. Their duty in fact is to get the kind of analysis and the parameters the user requested, together with the (set of) OBSIDs, and to submit the execution to a specific computational infrastructure. Each microservice therefore exposes two different APIs. The first one is used by a corresponding submission handler to submit and monitor the experiments. The other one supports the communication with the computing resources used to run the experiments, in order to provide a nearly real-time feedback log to the user and to manage the job termination. In the current production release of the EXTraS portal we support only one computational infrastructure, therefore only one microservice has been deployed.
It is responsible for running the job through a proper contextualization of one (or more) instances of the virtual appliance on EGI FedCloud, and for destroying the virtual machine when the analysis has been executed with success or an error occurs. In detail, the FedCloud Submission Handler performs an AJAX call to the microservice providing the user and job ID as stored in the portal repository. The microservice a) interacts with the Persistence API to retrieve the parameter list, the kind of analysis to perform and the list of OBSIDs to process; b) interacts with the e-Token server to get a PUSP; c) selects (presently in a round-robin way) one of the Federated Cloud sites supporting the extras-fp7.eu VO; d) runs an instance of the virtual appliance providing a proper contextualization script via an OCCI client (https://github.com/EGI-FCTF/rOCCI-cli); e) receives updates and log information from a process local to the virtual machine that provides stdout and stderr in a progressive way via POST requests; f) deletes the virtual machine when the status update notifies the end of the analysis or an error. Result retrieval is performed by the contextualization script via scp.
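The submission steps above can be sketched as an asynchronous pipeline with injected dependencies, so that each external system (Persistence API, e-Token server, FedCloud/OCCI) can be stubbed. All interfaces and names are illustrative, not the microservice's actual code; the status-update and tear-down steps (e and f) are driven by callbacks from the virtual machine and are only hinted at here.

```typescript
// Illustrative dependency contract for the submission pipeline.
interface Deps {
  // step a: Persistence API lookup of the stored job configuration
  fetchJobConfig(userId: string, jobId: string): Promise<{
    analysis: string;
    params: Record<string, string>;
    obsids: string[];
  }>;
  getPusp(userId: string): Promise<string>; // step b: e-Token server
  pickSite(): string;                       // step c: round-robin site choice
  // step d: start a contextualized VM via OCCI, returning its identifier
  startVm(site: string, pusp: string, contextScript: string): Promise<string>;
  deleteVm(site: string, vmId: string): Promise<void>; // step f
}

// Run one job: one VM per OBSID, as in the portal's one-job-per-OBSID model.
async function runJob(deps: Deps, userId: string, jobId: string): Promise<string[]> {
  const cfg = await deps.fetchJobConfig(userId, jobId); // a
  const pusp = await deps.getPusp(userId);              // b
  const vmIds: string[] = [];
  for (const obsid of cfg.obsids) {
    const site = deps.pickSite();                       // c
    const ctx = `run ${cfg.analysis} on ${obsid}`;      // contextualization (illustrative)
    vmIds.push(await deps.startVm(site, pusp, ctx));    // d
  }
  return vmIds; // steps e/f are triggered later by the VM's status updates
}
```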
5. The Analysis of Highly Variable and Transient Sources
The analysis of highly variable and transient sources aims at identifying burst-like variability during EPIC observations. It is based on standard source detection algorithms that are applied to time-resolved images derived from the XMM-Newton observations. Images are analyzed to identify new point sources that might have brightened, considering different energy bands. For each EPIC observation, a set of images can be obtained by dividing the observing time into sub-exposures corresponding to different time intervals. The time intervals for sub-exposures can either have a fixed duration or they can be defined with a preliminary search for an excess of counts within a small region of the detector in limited time periods. This step of the analysis is performed using a Bayesian Blocks algorithm [32] on the events detected in partially overlapping regions having a size comparable to the characteristic dimension of the telescope point spread function. The main parameters for the transient analysis on one or more observations, named hereafter the experiment, are the choice of instruments (i.e. among the three EPIC cameras),
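The fixed-duration option can be illustrated with a short sketch that splits an observation's time span into equal sub-exposures, from which the time-resolved images would then be accumulated. This is a simplified illustration, not the EXTraS pipeline code.

```typescript
// Split [tStart, tStop) into sub-exposures of duration dt; the last
// interval is truncated at tStop. Times are in seconds since the start
// of the observation (purely illustrative units).

type Interval = { start: number; stop: number };

function fixedSubExposures(tStart: number, tStop: number, dt: number): Interval[] {
  if (dt <= 0) throw new Error("sub-exposure duration must be positive");
  const intervals: Interval[] = [];
  for (let t = tStart; t < tStop; t += dt) {
    intervals.push({ start: t, stop: Math.min(t + dt, tStop) });
  }
  return intervals;
}
```

The alternative, variable-duration option described above would instead derive the interval boundaries from the Bayesian Blocks segmentation of the event list.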
6. Conclusions and Future Works
This paper presented an updated release of the EXTraS portal, a science gateway for the astrophysics community devoted to the search and characterization of variable sources in the soft X-ray energy range by exploiting the XMM-Newton observations. The portal relies on recent general-purpose technologies, architectural patterns and best practices adopted in the development of enterprise web applications. The EXTraS portal was validated in December 2016 and officially presented to the scientific community in June 2017 [35]. Presently, the registered users are a few dozen, but they have submitted analysis tasks for the equivalent of 15,795 processor hours.

Thanks to this flexible architecture, we can add and manage microservices in order to improve resource usage and availability. To this extent we are working on an extension of PortalTS, named EasyGateway, able to interoperate with other toolkits. In particular, we found that the joint use of EasyGateway and Apache Airavata leads to a rich user interface from the Airavata side, with support for submission to a large set of middleware and queue managers, for science gateways relying on PortalTS [36].

The integration of the other analysis tools, both those developed in the project and those from the scientific community, represents the future direction. We are in particular considering the possibility to provide a remote visualization service based on [37], for a better exploitation of experiment results. This feature will be of particular importance for the outreach and student involvement activities of the project and, in perspective, to provide the portal to the large community of citizen scientists interested in astrophysics research activities.
Acknowledgment
EXTraS has received funding from the European Union’s 7th Framework Programme for research, technological development and demonstration under grant agreement no. 607452. This work used the EGI infrastructure with the support of CYFRONET-CLOUD, INFN-CATANIA-STACK and RECAS-BARI. The authors are grateful to Marica Antonacci from RECAS-BARI and Catalin Condurache from the Science and Technology Facilities Council for their valuable assistance with the setup of the computing infrastructure.