Navigating Diverse Data Science Learning: Critical Reflections Towards Future Practice
Yehia Elkhatib
MetaLab, School of Computing and Communications
Lancaster University, United Kingdom
{i.lastname}@lancaster.ac.uk

Abstract—Data Science is currently a popular field of science attracting expertise from very diverse backgrounds. Current learning practices need to acknowledge this and adapt to it. This paper summarises some experiences relating to such learning approaches from teaching a postgraduate Data Science module, and draws some learned lessons that are of relevance to others teaching Data Science.
Index Terms—Data Science, teaching, diverse learning
I. INTRODUCTION
As Data Science (DS) continues to be a growing field with promising prospects [1]–[3], it is attracting significant attention from many, including learners of different learning backgrounds and application areas. From a DS educator's perspective, the result is a very diverse cohort of learners. This typically includes (in no particular order) mathematicians, statisticians, operations researchers, computer scientists of all colours, other scientists (e.g. environmental scientists, psychologists, etc.), and business managers and analysts, to name a few. This in itself poses a number of challenges to the educator. In addition, as is common with emerging fields of science, teaching DS commonly starts as a specialist graduate-level course; in due course, it establishes itself as a standalone undergraduate speciality. Until then, however, learners come in with fairly dense knowledge of, exposure to, and preference for certain learning approaches. This poses even further challenges to the educator. In fact, it also creates challenges for the learners themselves in terms of how to interact with colleagues who studied different disciplines and are potentially accustomed to different learning methods and materials.

In this paper, I focus on the challenges posed to the educator. I use the term educator to refer to the lecturer, professor, or teacher, and the term learner to refer to the students or members of the learning cohort. I reflect on my own experiences of teaching on an MSc-level course created specifically for DS. Established in 2014, the DS MSc course at Lancaster University was one of the first of its kind and continues to draw a large number of learners from all over the world. The cohort consists of 50-70 learners who come from very different backgrounds. The experiences related in this paper are drawn from a module I teach on this MSc course. The module is an introductory one intended to equip learners with basic data analysis and experimentation skills that are essential to industrial and academic DS work. The module covers a wide range of DS foundations in the span of 10 weeks.

The paper provides the following contributions:
1) Identify the make-up of a DS practitioner and team structure.
2) Explore some of the challenges associated with teaching DS (at a graduate level).
3) Enumerate a number of successful and unsuccessful practices and approaches.
4) Distill a number of lessons learned for the benefit of other educators in the DS field.
II. WHAT'S IN A DATA SCIENTIST?

Before delving into the challenges and experiences, it is important to ensure a common understanding of what a data scientist is. While many others (e.g. [4]–[6]) have focused on the skills of a 'data scientist', I instead focus on their roles.
A. Many Not One
Some assume that a data scientist is a single, well-defined role. Indeed, it requires a unique set of skills that sets it apart from other established roles in the modern ICT industry (e.g. systems developer, or network engineer). However, as the DS market develops, so does our understanding of what it is capable of and entails. As such, I came to appreciate that a data scientist is neither a single role nor a necessarily well-defined one. Instead, it is a collection of different roles that complement each other. I separate these into core and auxiliary roles.

Before defining these, it is worth noting that the data scientist role is itself rather malleable and context specific. In one industry, for example, a data scientist could be someone who analyses data streams for insights that directly affect tactical and strategic company directions, while in other industries the same term could be used to refer to someone who collects and curates data. Some of this divergence stems from misunderstanding of the different roles that a data scientist might have. And although these roles are varied yet fluid, i.e. the distinction between them is not always clear even to the role holders themselves, understanding them helps bring some common understanding to what being a data scientist means. It also helps us to appreciate the diverse set of skills required, and to better assemble the right teams and required support systems.

B. Core Roles

The core roles are described as follows and depicted in Figure 1. A minimal cleaning sketch illustrating the first of these follows the list.

• Janitor is perhaps the role that is most hidden from view and is thus the most underappreciated. Data is rarely ever perfectly ready to use as is, and features a fair amount of missing data points, outliers, duplicates, and wrongly labelled data points. This is caused by capturing methods, sampling approach, or human/machine intervention on the path through which the data passed. Processing such inconsistencies manually comes naturally to many people, especially those acquainted with the data provenance, but automated methods are not yet sophisticated enough to reason about them in a completely unsupervised fashion. As such, the data janitor role entails a non-insignificant effort to clean and pre-process the data in order to prepare it for analysis.

• Scout performs exploratory data analysis for sanity checking and early insights. This uncovers data structure (if this is not known in advance) and identifies inconsistencies, both of which help other roles that will work with the data at a deeper level. The scout also usually forms initial hypotheses that will seed and feed into the work of other roles.

• Analyst is what most people attribute to being a data scientist. They dig deep into the data in order to extract meaning, discern patterns, identify the essential chronicle of the data and what it describes, and uncover evidence of unforeseen narratives. This entails, at a high level, forming hypotheses and designing corresponding tests. The implementation of such tests could follow any of a number of methodologies.

• Decision Builder carries on the work of the analyst and builds products that will automate decision making, or alternatively provide decision-making support based on the outcomes of the analysis. This commonly includes adaptive machine learning and deep learning methods, with the aim of transforming the insights of the analysis into actionable decisions.

• Curator is responsible for holding and maintaining the data. This includes traversing concerns of storage formats, access interfaces, data governance, custodianship, and responsible sharing.

• Engineer defines different setups in order for other roles to be able to interact with the data efficiently and reliably. They would also be responsible for managing the interface between development and production products and environments.

Core roles sometimes overlap, and usually interact through iterative processes that need to adapt to changes in the incoming data and the analysis objectives.
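To make the janitor role concrete, the following is a minimal sketch of typical cleaning steps, assuming a hypothetical pandas DataFrame with a numeric 'reading' column and a 'label' column; the column names and thresholds are illustrative, not prescriptive.

```python
import pandas as pd

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative janitor-role pre-processing: duplicates,
    missing values, and a simple outlier screen."""
    df = df.drop_duplicates()            # remove duplicate records
    df = df.dropna(subset=["label"])     # discard rows with no label
    df["reading"] = df["reading"].fillna(df["reading"].median())  # impute gaps
    # Screen outliers using the median absolute deviation, a robust
    # heuristic; 1.4826 scales MAD to be comparable to a std. deviation.
    med = df["reading"].median()
    mad = (df["reading"] - med).abs().median()
    keep = (df["reading"] - med).abs() <= 3 * 1.4826 * mad
    return df[keep].reset_index(drop=True)
```

In practice, each of these steps would be informed by the data's provenance rather than applied blindly, which is precisely why the role resists full automation.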
C. Auxiliary Roles
Data scientists seldom work in isolation. They interact with teams responsible for creating data, they work with others who help them in their analysis, and they communicate with those who have a vested interest in the data. In fact, working in isolation renders their job meaningless beyond fascination with data.

Fig. 1. Core data scientist roles.

Fig. 2. Auxiliary data scientist roles.

This interaction, along with the increasing sophistication of data science, typically results in data scientists resolving to work in teams. Consequently, the distinction between the core roles described above begins to become clearer. Additionally, a number of other roles begin to emerge as DS teams grow. I refer to these as auxiliary roles (Figure 2) and they are as follows; a small sketch of one facilitator task follows the list.

• Domain Specialist provides much-needed domain expertise to help decipher provenance, data significance, sources of bias, and implications.

• Infrastructure Manager provides support to build and operate data systems beyond the role of the data engineer. For example, a data engineer might set up a Spark application pipeline and an associated development environment, whilst the infrastructure manager would help streamline data pipeline production and deal with the management of the underlying system infrastructure.

• Communicator takes on the responsibility of communicating analytic outcomes beyond the visualisation of exploratory and confirmatory analysis results done by the core team. This includes building products for interacting with constructed data systems, interactive visualisation to communicate results to audiences outside the DS team (e.g. business management), and creating easy-to-digest insights (e.g. in the form of infographics).

• Facilitator provides additional support in terms of setting up systems to confirm or dispel certain hypotheses that are emerging from the core team; e.g. setting up and carrying out A/B experiments, or procuring external data sets.
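Since the paper only names A/B experimentation without detail, here is a minimal sketch, with hypothetical conversion counts, of how a facilitator might check whether two variants differ significantly, using a standard two-proportion z-test.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # pooled standard error
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: 120/2000 conversions for variant A, 158/2000 for B
z, p = two_proportion_z(120, 2000, 158, 2000)
print(f"z = {z:.2f}, p = {p:.3f}")  # difference is significant if p < 0.05
```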
III. CURRICULUM DESIGN OR IMPLEMENTATION?

There is a real need to focus not only on the learning outcomes and approaches, but also on the practical means of implementing these approaches, and to identify the exact structure by which the learning outcomes are to be achieved. This is something that in many ways transcends the design of a course; it drills deep into how each pedagogical element is created and delivered, and needs continuous monitoring.

Here, I will focus on in-class activities as an example. I put emphasis on discussion in class, giving learners between 1 and 3 group learning activities during every lecture, usually ending with a few minutes of open discussion. Instead of relaying knowledge via PowerPoint, this method encourages continuous stimulation of the learners' critical and creative thinking skills through direct fundamental questions, group brainstorming, rhetorical questions, application to top-tier papers, and coming up with solutions to practical dilemmas [7]–[10]. These activities also encourage learners to get to know each other, and the educator to know the learners, and additionally allow the educator to gauge comprehension and application across topics and subjects. (More on this in §VI.)

One thing that soon became noticeable is the difference between the performance of learners with a CS background compared to those of other backgrounds. This was unexpected, as in-class activities were mainly about applying concepts and exercising analytical skills followed by a discussion, and never included programming skills of any kind. Upon closer inspection, I realised that the difference was in fact between home and overseas learners.

I discussed this informally with a small subset of overseas learners. It soon became apparent that this issue relates directly to the learning setting, which is unfamiliar to some due to their previous experiences. Some overseas learners found it difficult to participate in group exercises, especially when they had doubts about their understanding of the lecture material. Home learners, on the other hand, were generally much more confident in articulating their understanding (regardless of how correct it may be) and expressing their views, which made overseas learners doubt themselves and shy away from in-class participation. It is important to note that this is not due to a language barrier, as all learners are comfortable using and comprehending English at postgraduate level.

I was able to discern two parts to this issue. The first relates to the input: some, if not many, overseas learners are not used to being asked to apply critical thinking to what the educator tells them, or what they read in a book or an academic paper. This creates a barrier to applying their knowledge and also to gaining new knowledge through application and discussion. The second part of this problem relates to the output: many learners, especially overseas ones, fail to realise that learning is largely a social activity [11], [12]. Perhaps they were (either in their previous educational institution or discipline) not encouraged to share with their colleagues, possibly even implicitly trained to treat colleagues as competitors. This instils an unwillingness to participate in group activities, and thus learners miss out on one of the key learning activities of the module.

My approach to solving these two issues was simple and effective. First, I made clear and explicit remarks about how no work is infallible, including the work I introduce as part of their learning activities.
Before each group exercise, I gave brief examples of how to critique similar work and ways of improving said work. Second, I made a conscious effort to interleave my lecture material with checkpoints; these are frequent but short pauses where I very briefly reflect with the learners on something I just introduced or discussed with them, allowing them a moment to focus on the processes and not just the artefacts/outcomes [13]. Third, I designed clear guidelines for what I expect (and, perhaps more importantly, what I do not expect!) each group discussion to produce, and I made these expectations as minimal and elemental as possible. My rationale was that any learner can easily apply her/his thinking to essentially "fill in the blanks", encouraging them to participate through a low barrier whilst explaining their reasoning in a step-by-step manner [14].

As a result, learner participation increased very rapidly, no longer with any clear distinction between home and overseas learner participation. This was also reflected in their coursework submissions.
IV. CONTEXTUALISE THIS
There are a number of well-known factors that help effectively attain learning outcomes, such as use of clear language, an adequate level of complexity given the cohort's educational attainment, an appropriate amount of information given lecture and module duration, clear setting of expectations and learning outcomes, and appropriate use of examples. However, a common but hidden thread through these is contextualisation, which is well documented (cf. [15]–[17]) to allow learners to use their own set of skills and understanding, to appreciate the relevance of the learning material, and consequently to be motivated to actively engage with the course material.

Let us focus on the use of examples to discuss this. I tend to use real-world examples whenever possible in order to contextualise the learning material and make it as relatable as possible. For instance, when describing what data bias is, I give plenty of examples from industry and academia detailing different techniques to quantify and identify biases, and how to set up processes to identify potential bias and its effect on validity (a small sketch of one such technique closes this section).

Based on clear feedback from the learners, this form of contextualisation helps them tie new concepts to old ones, and illuminates routes from theory to practice. However, it is also not easy. It takes a lot of time to think of appropriate examples that the learners would be able to relate to, both at the time of the lecture and when they move beyond the course. This is especially true when considering their various backgrounds.

There is no silver bullet here. My approach is to match the learners' backgrounds and use examples from as varied DS-related fields as possible. In other words, rather than restricting myself to examples from my own research areas of distributed computing and networking [18]–[21], I actively seek examples from further fields by engaging with colleagues from other CS sub-disciplines as well as other disciplines like environmental science, politics, psychology, accounting and finance, etc.
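Returning to the bias example mentioned above, the following is a minimal sketch of one common way to quantify sampling bias: a chi-square goodness-of-fit test comparing a sample's categorical make-up against known population proportions. The categories and counts are hypothetical.

```python
from scipy.stats import chisquare

# Hypothetical population proportions and observed sample counts
population = {"urban": 0.55, "suburban": 0.30, "rural": 0.15}
observed = {"urban": 720, "suburban": 210, "rural": 70}

n = sum(observed.values())
expected = [population[k] * n for k in observed]  # counts expected if unbiased

stat, p_value = chisquare(list(observed.values()), f_exp=expected)
print(f"chi2 = {stat:.1f}, p = {p_value:.4f}")
# A small p-value suggests the sample over/under-represents some groups,
# which should prompt a check of its effect on the analysis's validity.
```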
V. ASSESSMENT RATIONALE AND ANALYSIS
The coursework structure relied on different elements: knowledge-based elements to test the grasp of information (in the form of simple in-class questions, as described above), comprehension-based elements to gauge the understanding of information, and application-based elements to evaluate problem-solving abilities. This section focuses on the latter two elements and their associated challenges.
A. Comprehension-based Assessment
Knowledge-based assessment takes the form of straightforward questions. These were embedded in the lecture delivery strategy during regular checkpoints, as explained in §III. Modifying these into comprehension exercises is important in order to raise discussion and engender deeper understanding.

Self-assessment is an established technique where learners appraise their own understanding with minimal guidance [11]. I relied on mutual-assessment, a slightly modified version of self-assessment where learners appraise each other's understanding. This was set up as follows. Once or twice during each lecture, the learners are given a problem that tests their understanding of the lecture material thus far. They are asked to write their answers on paper within 3-5 minutes, then swap the answer sheets with one of their colleagues (their nearest neighbour if time is constrained; otherwise, I ask them to move around the room). They then give each other feedback about how they fared, and discuss for another 3-5 minutes. The whole class then reconvenes for another 3-5 minutes to have a wider discussion of the main points, or sometimes to answer questions about things the learners could not clarify to each other. Many chose to share this verbally with the rest of the class, whilst some preferred to write it down on provided notes that I collect and read out.

This approach was extremely successful for four main reasons. First, the learners were able to test and enhance each other's understanding, and provide very personalised feedback to one another (much more than I could, due to the scale of the class). This is even more effective than in-class interactive and online quizzes, which I tried in an earlier module, where feedback was inevitably brief and too factual despite being personalised. Second, this approach is very practical as it scales really well, reducing the amount of individual marking the educator has to do and allowing them to focus on understanding their cohort. Third, the conclusion of each exercise provides the educator with a vital feedback loop. In effect, it signals the parts of the learning material that are the 'muddiest point' [22] and might have been explained better, and whether these were common to many learners or restricted to only a subset. Moreover, due to the huge diversity of this particular cohort, the feedback is generally quite wide-ranging and many times includes things I myself would not have thought of. This was a hugely educational and eye-opening learning approach. Fourth, and this leads to the topic of the next subsection, is the promotion of a lively classroom culture with close interaction between members of a cohort that would otherwise fall into cloistered cliques of CS, DS, etc.

One complication, though, is the reliability and fairness of the marks provided by the learners. To ensure this, I needed to devise an additional screening process to moderate the marks given by the learners. This, however, is not a great burden, and is rather manageable and reasonable compared to the gained benefits. Reflecting on this, a potential future direction of improvement is to set a loose marking scheme for the learners to use as a reference when marking each other's work.
B. Application-based Assessment
As highlighted in §II, there are different roles that any data scientist might take. Common practice in industry is to assemble DS teams where different data scientists take on one or more of these roles and work closely together [23]. Accordingly, application-based assessment elements are structured around group work where learners work together to tackle a certain data challenge. There are a number of useful lessons in this regard.

First, the data problems that the learners are asked to tackle are all real ones provided by industrial partners. As such, they give a real sense of the challenges that DS teams are currently facing in industry. This is very appealing to learners and helps in maintaining their engagement with the required work even in the face of difficulties. In fact, many of our learners continue to work with the industrial partners beyond the course on very similar challenges. The main restraint here relates to drawing important challenges from industrial partners and involving them in defining the student projects without expecting a significant time commitment on their side.

Second, the way learner teams are set up is of crucial importance. If the learners set the teams themselves, the teams tend to exhibit high self-similarity in terms of background and skills. If, instead, they are set up by the educator, there is a risk of coercing incompatible personalities to work together. Hence, a hybrid approach was adopted: at the beginning of term, learners are asked to express their skills and experiences through self-assessment forms. The form includes Likert scale scores on different abilities such as 'Statistical Modelling' and 'Data Handling'. The scores are then used as input to a clustering algorithm that produces teams of 4-5 learners with a good balance of skills across each team (a minimal sketch of one way to do this follows below). The produced teams are then proposed to the students, giving them an opportunity to gauge whether they would be interested in working with their classmates. At this stage, most learners accept the created teams or slightly adjust them. As an added bonus to this exercise, the self-scores are revisited during the last lecture to give learners the opportunity to reflect on how their abilities and skills progressed through the module. Incidentally, several learners have observed how this was useful further still by helping them be mindful of their perceived knowledge.
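The paper does not specify the exact algorithm, so the following is a minimal sketch of one plausible approach, assuming hypothetical Likert scores: cluster learners into skill profiles with k-means, then deal members from different profiles into teams round-robin so that each team mixes complementary skills.

```python
import numpy as np
from sklearn.cluster import KMeans

def balanced_teams(scores: np.ndarray, team_size: int = 5, seed: int = 0):
    """Form teams with a spread of skill profiles.

    scores: (n_learners, n_skills) matrix of Likert self-assessments.
    Returns a list of teams, each a list of learner indices.
    """
    n = len(scores)
    n_profiles = max(2, n // team_size)
    labels = KMeans(n_clusters=n_profiles, n_init=10,
                    random_state=seed).fit_predict(scores)
    # Round-robin draft: take one learner from each skill profile in turn,
    # so consecutive picks come from different profiles.
    by_profile = [list(np.where(labels == c)[0]) for c in range(n_profiles)]
    ordered = []
    while any(by_profile):
        for bucket in by_profile:
            if bucket:
                ordered.append(bucket.pop())
    n_teams = max(1, -(-n // team_size))  # ceiling division
    return [ordered[i::n_teams] for i in range(n_teams)]

# Hypothetical cohort: 23 learners, 4 skill dimensions scored 1-5
rng = np.random.default_rng(42)
teams = balanced_teams(rng.integers(1, 6, size=(23, 4)).astype(float))
print(teams)  # 5 teams of 4-5 learners with mixed profiles
```

In practice, the clustering output is a starting point: as noted above, the proposed teams are shown to the learners, who may adjust them.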
Third, the ability to work well within a team is not something that is initially attained by all learners. Also, learners of different backgrounds rely on different collaborative systems. An explicit point is made to the learners that teamwork is a crucial part of their training in light of the nature of DS teams (see §II). General advice is given to the learners about how to allocate and monitor group work. Additional guidance is also provided on both individual and group levels. Plenty of practical tips are supplied regarding things that are not covered by the syllabus, such as collaboration and peer review tools (e.g. GitHub, Jupyter [24], and other generic [25] and field-specific tools [26], [27]) and best practices regarding project management and presentation strategies that are suitable for academia and industry.

Finally, assessment criteria needed to be more detached from the means of accomplishment. In other words, the application-based assessment needs to be able to accommodate a wide range of preferences in terms of processing tools, programming languages and frameworks, and presentation styles. For instance, the learners are given the flexibility to accomplish the group work using Python, Java, R, or Matlab, as long as the end results are presentable in a format that is suitable for an academic or managerial audience, and the code and artefacts are clearly annotated and self-describing. Obviously, this raises the expectations on the educator, but it is a normal state of affairs for any application-based course, especially considering the diversity of DS roles and learner backgrounds.
VI. LEARNING THE LEARNERS
Besides the aforementioned strategies and practices, the educator needs to realise the importance of knowing their learners well. There is a great deal to be learned from the teaching literature, peer observation, and critical evaluation approaches. However, the actions needed to implement these best practices and methodologies all seem to hinge on simply getting to know the learners in terms of their backgrounds, abilities, interests and experiences, and consequently acting on this recognition to provide a tailored learning experience [28]. For example, I design my lectures to leave plenty of room for self-exploration and creative thought, assembling a myriad of avenues of inquiry in different directions. However, on deep reflection I came to realise that Socratic teaching methods are not enough. Extra work is required in order to cater to learners with diverse learning backgrounds and skills.

Realising this requires looking up from the lecture material, knowing the learners through direct interactions, and tuning the fine details of the syllabus to best suit the learners' abilities and experiences. Such fine details include, for instance, adopting an in-lecture checkpointing practice (§III) and allowing learners to reflect on the subject, assisting them in dissecting it and exposing its strengths and weaknesses (§V-A). Other examples where knowing learner abilities is important are contextualising lecture material (§IV) and planning group efforts (§V-B).

This level of knowing the learners is not something that is inherently included in a course syllabus, nor is it something that can be easily allocated as a time-limited task (such as delivering a lecture, supervising a lab, etc.). Instead, it is an underlying activity that an educator is responsible for in order to ensure optimal module delivery. Furthermore, it does not necessarily require a significant amount of effort or incur big changes to the curriculum. It is easily blended into the learning approach through lectures, lab sessions, and assessment exercises, as demonstrated through examples in this paper. The key is to introduce cognisance for a tailored delivery that is easy to track and tweak, if and where necessary.
VII. FINAL REMARKS
Any educator working with a diverse cohort of learners stands to greatly benefit from observing practices in other fields of science. Due to the interdisciplinary nature of our DS course at Lancaster, I had the opportunity to interact with colleagues from across the university and assimilate their educational approaches. Although starting points and goals are quite similar in a pedagogical sense, the methods are often quite different. I attribute some of this to disciplinary differences and conventions, but also to variances in learning backgrounds and previous experiences. This forced me to reflect on my own practices in a critical light, and also to identify the distinctions between approaches and recognise their development. Learners are different, and thus educators need to expand their educational toolboxes to cater to such diverse cohorts.

REFERENCES
[4] D. J. Patil, Building Data Science Teams. O'Reilly Media, Inc., 2011.
[5] D. Conway, "The data science Venn diagram," Mar. 2013. [Online]. Available: http://drewconway.com/zia/2013/3/26/the-data-science-venn-diagram
[6] L. Lyon and E. Mattern, "Education for real-world data science roles (part 2): A translational approach to curriculum development," International Journal of Digital Curation, vol. 11, no. 2, pp. 13–26, 2017.
[7] M. McLean and R. Blackwell, "Opportunity knocks? Professionalism and excellence in university teaching," Teachers and Teaching, vol. 3, no. 1, pp. 85–99, 1997.
[8] Y. Steinert and L. S. Snell, "Interactive lecturing: strategies for increasing participation in large group presentations," Medical Teacher, vol. 21, no. 1, pp. 37–42, 1999.
[9] L. S. Shulman, "Signature pedagogies in the professions," Daedalus, vol. 134, no. 3, pp. 52–59, 2005.
[10] J. Hardin, R. Hoerl, N. J. Horton, D. Nolan, B. Baumer, O. Hall-Holt, P. Murrell, R. Peng, P. Roback, D. T. Lang, and M. D. Ward, "Data science in statistics curricula: Preparing students to 'think with data'," The American Statistician, vol. 69, no. 4, pp. 343–353, 2015.
[11] P. Orsmond, S. Merry, and K. Reiling, "A study in self-assessment: tutor and students' perceptions of performance criteria," Assessment & Evaluation in Higher Education, vol. 22, no. 4, pp. 357–368, 1997.
[12] J. Qadir, "What every student should know: Seven learning impediments and their remedies," IEEE Potentials, vol. 34, no. 3, pp. 30–35, May 2015.
[13] M. P. Rowe, B. M. Gillespie, K. R. Harris, S. D. Koether, L.-J. Y. Shannon, and L. A. Rose, "Redesigning a general education science course to promote critical thinking," CBE-Life Sciences Education, vol. 14, no. 3, 2015.
[14] V. A. Aleven and K. R. Koedinger, "An effective metacognitive strategy: learning by doing and explaining with a computer-based cognitive tutor," Cognitive Science, vol. 26, no. 2, pp. 147–179, 2002.
[15] D. I. Cordova and M. R. Lepper, "Intrinsic motivation and the process of learning: Beneficial effects of contextualization, personalization, and choice," Journal of Educational Psychology, vol. 88, no. 4, pp. 715–730, 1996.
[16] J. Holbrook and M. Rannikmäe, "Contextualisation, de-contextualisation, re-contextualisation: a science teaching approach to enhance meaningful learning for scientific literacy," Contemporary Science Education, pp. 69–82, 2010.
[17] N. Hood, A. Littlejohn, and C. Milligan, "Context counts: How learners' contexts influence learning in a MOOC," Computers & Education, vol. 91, pp. 83–91, 2015.
[18] Y. Elkhatib, G. Tyson, and M. Welzl, "Can SPDY really make the web faster?" in Proceedings of the IFIP International Conference on Networking, Jun. 2014.
[19] Y. Elkhatib, R. Killick, M. Mu, and N. Race, "Just browsing? Understanding user journeys in online TV," in Proceedings of the ACM International Conference on Multimedia. ACM, 2014, pp. 965–968.
[20] G. Blair, Y.-D. Bromberg, G. Coulson, Y. Elkhatib, L. Réveillère, H. B. Ribeiro, E. Rivière, and F. Taïani, "Holons: Towards a systematic approach to composing systems of systems," in Proceedings of the 14th International Workshop on Adaptive and Reflective Middleware, ser. ARM 2015. New York, NY, USA: ACM, 2015, pp. 5:1–5:6.
[21] Y. Elkhatib, B. F. Porter, H. B. Ribeiro, M. F. Zhani, J. Qadir, and E. Rivière, "On using micro-clouds to deliver the fog," Internet Computing, vol. 21, no. 2, pp. 8–15, Mar. 2017.
[22] F. Mosteller, "The 'muddiest point in the lecture' as a feedback device," On Teaching and Learning: The Journal of the Harvard-Danforth Center, vol. 3, pp. 10–21, 1989.
[23] B. Dinter, D. Douglas, R. H. L. Chiang, F. Mari, S. Ram, and D. Schoder, Big Data Panel at SIGDSS Pre-ICIS Conference 2013: A Swiss-Army Knife? The Profile of a Data Scientist. Cham: Springer International Publishing, 2015, pp. 7–11.
[24] F. Perez and B. E. Granger, "IPython: A system for interactive scientific computing," Computing in Science & Engineering, vol. 9, no. 3, pp. 21–29, May 2007.
[25] J. Tennant, J. Dugan, D. Graziotin, D. Jacques, F. Waldner, D. Mietchen, Y. Elkhatib, L. B. Collister, C. Pikas, T. Crick, P. Masuzzo, A. Caravaggi, D. Berg, K. Niemeyer, T. Ross-Hellauer, S. Mannheimer, L. Rigling, D. Katz, B. Greshake Tzovaras, J. Pacheco-Mendoza, N. Fatima, M. Poblet, M. Isaakidis, D. Irawan, S. Renaut, C. Madan, L. Matthias, J. Nørgaard Kjær, D. O'Donnell, C. Neylon, S. Kearns, M. Selvaraju, and J. Colomb, "A multi-disciplinary perspective on emergent and future innovations in peer review," F1000Research, vol. 6, no. 1151, 2017.
[26] C. Vitolo, Y. Elkhatib, D. Reusser, C. J. Macleod, and W. Buytaert, "Web technologies for environmental big data," Environmental Modelling & Software, vol. 63, pp. 185–198, Jan. 2015.
[27] S. Greene, P. Johnes, J. Bloomfield, S. Reaney, R. Lawley, Y. Elkhatib, J. Freer, N. Odoni, C. Macleod, and B. Percy, "A geospatial framework to support integrated biogeochemical modelling in the United Kingdom," Environmental Modelling & Software, vol. 68, pp. 219–232, 2015.
[28] S. Manning, B. P. Stanford, and S. Reeves, "Valuing the advanced learner: Differentiating up," The Clearing House: A Journal of Educational Strategies, Issues and Ideas, vol. 83, no. 4, pp. 145–149, 2010.