Modern Microscopy with the Web of Things: The OpenFlexure Microscope Software Stack

Joel T. Collins, Joe Knapper, Julian Stirling, Samuel McDermott, Filip Ayazi, and Richard Bowman*

Centre for Photonics and Photonic Materials, Department of Physics, University of Bath, UK.
Cavendish Laboratory, University of Cambridge, UK.
* [email protected]

January 25, 2021

Abstract
Automated and computerised control of scientific instrumentation is almost ubiquitous in the modern laboratory. Most instrumentation is controlled over decades-old communication busses or is accessed via proprietary system libraries. This limits which languages and operating systems can be used to control instruments, and poses a significant problem when interfacing multiple instruments into the same experiment. Here we present the OpenFlexure Microscope software stack as an example of how a scientific instrument can be controlled using existing, cross-platform, language-independent, industry-supported standards. We split the control code into client and server applications interfaced via a web API that conforms to the W3C Web of Things standard. This enables simple control of the microscope from multiple languages, provides a modern graphical control interface, and minimises duplicated code. Network control also makes the software stack more robust, allows multiple microscopes to be controlled by one computer, and facilitates sharing of equipment between local or remote users. Using a Web of Things approach in research laboratories has the potential to solve many of the key challenges of experiment integration, using technology that is already well established.
Microscopists are increasingly required to write complicated experiment automation code, in multiple languages, to bind together inconsistent interfaces and proprietary software to run their experiments. Currently, a wide mix of often proprietary or “legacy” connectors and protocols are commonplace even in modern microscopes. This has led to many research labs relying on outdated hardware and software because of the requirement for interfaces that are largely obsolete in every other field of technology. Common practice is to make use of third-party solutions to retrofit interoperability, often maintained predominantly by microscope users rather than manufacturers. Since not all hardware is supported by these solutions, time must often be spent developing compatibility layers in the project’s programming language of choice to enable even the most basic control.

In recent years, open web technologies have been widely adopted for controlling domestic hardware, in the “Web of Things” (WoT) [1]. These network and web technologies have already addressed many of the problems faced by laboratories and have been proven robust, fast, and secure by their long-standing ubiquity throughout modern life. Nowadays, support staff deeply familiar with networking and web technologies are already in place at most research laboratories and teaching institutions. While prior work has introduced web technology into laboratories [2, 3, 4, 5, 6], these have lacked the comprehensive standardisation required for true interoperability. Recently, however, W3C, the primary international standards organization for the open web, have moved to standardise the Web of Things, with solid industry and community support [7].

Here, we introduce the software stack developed for the OpenFlexure Microscope [8, 9], an open-source, 3D-printed, and fully-automated laboratory microscope.
The microscope has been deployed around the world in a wide range of operating environments, posing unique challenges as it is used in almost equal measure by novice and expert microscopists. While most commercial microscopes use direct connections to a single, physical computer, we use a web API (application programming interface) to enable local and remote control through internet protocol (IP) networks. The software stack makes use of modern networking technologies and Web of Things standards for microscope control and data management. The presented architecture for network-connected microscopy has allowed the OpenFlexure Microscope to be used in a diverse range of settings without re-implementation or code duplication. Additionally, the extensibility and interoperability have allowed users to develop additional functionality and entirely new imaging modes without having to re-implement the more complex instrument control code.

By designing our software stack around the W3C WoT Architecture [7], we have avoided introducing yet another competing device control system, and paved the way for comprehensive integration with existing microscopy software solutions such as µManager [10, 11, 12, 13] and Microscope Cockpit [14, 15]. Most modern programming languages have well-developed libraries for handling web requests. This means that WoT extensions can be developed for various microscopy control platforms to enable networked interaction with not just the OpenFlexure Microscope, but any device using the W3C WoT API model. This architecture is not limited to microscopy, and the same approach would work for most instrumentation. Client applications controlling multiple instruments for larger experiments can then be written in any modern language that supports web requests.
The microscope’s software stack includes low-level device control code, logic to integrate the hardware components into a useful instrument, automation for common tasks, a graphical interface for interactive control, and APIs for scripting automated experiments from various languages. Treating the microscope as an IoT device naturally splits these functions into client applications (the graphical interface and scripting APIs) and a server handling the bulk of the logic and hardware control.

This split between the server application and clients has several important advantages. First, it enables multiple client applications to connect to a microscope simultaneously. This allows, amongst other things, a graphical interface to display a real-time camera feed while a script controls sample manipulation, data acquisition, and analysis. Conversely, a single client can manage multiple microscopes simultaneously. This has allowed clinicians to image sample slides from several microscopes concurrently, dramatically increasing data acquisition throughput.

Lastly, by separating the more complex server application from comparatively simple client applications, it becomes significantly easier to write client libraries in a broader set of languages. This means that microscope users can script experiments without having to re-implement the hardware control code in their language of choice, interface with a binary library, or learn a new language. It also ensures consistency between different languages and avoids duplicated effort, as most of the complexity is in the server application.
Our server runs on a Raspberry Pi computer embedded in the microscope (Figure 1). This application handles communication with the sample translation stage, imaging camera, and any additional hardware, as well as logic for data management and additional functions such as tiled scans and autofocus. Running the server on an embedded computer ensures the hardware control code is running in a very well controlled environment. We automatically build and distribute an SD card image with a correctly configured operating system and our server and client applications pre-installed [16]. This eliminates the most troublesome aspect of distributing instrument control software, which is correctly installing and configuring the low-level drivers and libraries on an unknown computer, often clashing with system-level changes made to support other instruments connected to the same machine.

Figure 1: OpenFlexure Microscope implementation of the client–server architecture. A Raspberry Pi runs Python code to control physical hardware (camera, Arduino-based motor controller), and sets up an HTTP web server and mDNS discovery records. Users can either control the microscope from the Raspberry Pi itself using HDMI output and USB peripherals, or connect over any IP network.

Client applications can run on the embedded computer, making the microscope a stand-alone system to which keyboard, monitor, and mouse may be attached. More usually, client applications will run on other devices connected via a wired or wireless IP network, using the Raspberry Pi’s ethernet and WiFi interfaces. By using IP networking for instrument control we enable control of multiple instruments with any external router or switch. Replacing different (often proprietary and expensive) connectors and adaptors [17] with commodity hardware makes experimental science more efficient and more accessible to resource-constrained scientists.

The internet protocol itself allows for high-speed, low-latency, plug-and-play communication.
Fully remote control can be enabled using existing, well-established secure protocols such as SSH forwarding and VPN connections. Clients can automatically detect the server’s IP address and capabilities via mDNS (multicast Domain Name System) [18], which is already used extensively by consumer WoT devices.
Figure 2: Simplified example of an HTTP request flow for reading and writing a “property” resource. a) Reading a property requires a GET request sent to the URL corresponding to the property to be read (camera shutter speed). The server then sends a response containing the property’s value (17 milliseconds) in the response body. b) Writing a property requires a PUT request sent to the same URL, containing the new value in the request body (22 milliseconds). The server then sends an “OK” response confirming the operation’s success.
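The request flow in Figure 2 can be reproduced end-to-end with only the Python standard library. The sketch below stands up a toy server emulating a single property endpoint and reads and writes it over HTTP; the URL path and property value follow the figure, but the server itself is an illustration, not the microscope's actual API implementation.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

# Toy state emulating the single property from Figure 2.
STATE = {"shutter_speed": 17}  # milliseconds

class PropertyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/settings/camera/shutter_speed":
            body = json.dumps(STATE["shutter_speed"]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def do_PUT(self):
        length = int(self.headers["Content-Length"])
        STATE["shutter_speed"] = json.loads(self.rfile.read(length))
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

def read_property(url):
    """GET the property's value (Figure 2a)."""
    with urlopen(url) as resp:
        return json.loads(resp.read())

def write_property(url, value):
    """PUT a new value to the property (Figure 2b)."""
    req = Request(url, data=json.dumps(value).encode(), method="PUT")
    req.add_header("Content-Type", "application/json")
    with urlopen(req) as resp:
        return resp.status

server = HTTPServer(("127.0.0.1", 0), PropertyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/settings/camera/shutter_speed"

print(read_property(url))   # 17
write_property(url, 22)
print(read_property(url))   # 22
server.shutdown()
```

Because the interaction is plain HTTP with JSON bodies, the same two functions could be written in essentially any modern language.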
The server application is written in Python, with much of the back-end code released as a separate library (Python-LabThings) [19]. This library uses the Flask [20] web application framework, and includes various utilities to simplify thread-based concurrency, mDNS discovery, hardware synchronisation, and documentation generation. We also make use of the standard scientific Python libraries [21, 22, 23, 24]. Client–server interactions are governed by a web API based on the W3C Web of Things interaction model. This model defines properties, actions, and events that map onto HTTP URLs. The HTTP method then describes the type of operation to be handled. For example, an HTTP GET request will read the value of a property, whereas an HTTP PUT request will write a new value to the property (Figure 2).

W3C WoT compatibility is introduced by automatically generating a description of the microscope’s web API functionality in a standardised format [25]. This allows clients to traverse the API with minimal prior knowledge, aligning closely with the principles of representational state transfer (REST) [26, 27], an API architectural style widely used across web service and WoT device APIs [27]. The widespread existing use of REST APIs means that users can interact with the OpenFlexure Microscope using existing standard libraries. We make use of the OpenAPI standard [28] for automatic generation of interactive documentation [29].

Many long-running tasks, such as acquiring large tile scans, must run in the background without blocking new API requests. Each request and action is therefore handled by its own thread to allow concurrency. Access to physical hardware is carefully managed to avoid conflicting instructions by use of re-entrant locks (RLocks). In the simplest case, the lock prevents requests from sending instructions to a device already in use.
For example, if the translation stage is currently in the middle of a long movement, a request to move elsewhere will be denied until the initial move has completed, releasing the stage’s lock. Clients can continue to interact with the microscope while an action is running, as long as they do not require a locked piece of hardware. This, for example, allows the live camera stream to be monitored during long-running experiments without users having to manually manage multiple threads.

Background tasks are handled automatically by the server. Whenever an “action” is requested, the server will start the function in a new thread and immediately send a “created” response back to the client, including the URL of an “action resource” the client can poll to check its progress or final return value. In the future, we can improve efficiency by allowing the server to asynchronously push updates to clients without polling, for example by utilising Server-Sent Events [30].

Most automation and integration can be done using client-side code, but for modifications that are best done on the server (e.g. adding new hardware, or low-latency integration) we use an extension system. The main server application handles only the most basic microscope functionality: capturing images, moving the stage, and managing device settings. All functionality beyond this is provided by extensions. This allows functionality to be customised on each microscope. Functions deemed broadly useful have been included and enabled by default in the server application; however, all can be disabled as required. Extensions are written as Python scripts that have direct access to the physical components comprising the microscope (e.g. camera and translation stage) through Python objects. Extensions can provide HTTP API endpoints and HTML interfaces that are displayed as part of the microscope’s web app.
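The action-and-lock pattern described above can be sketched in a few dozen lines of standard-library Python. The class names (`ActionRunner`, `StageStub`) and timings below are illustrative and not part of the LabThings API; the sketch only shows the shape of the mechanism: each invocation runs in its own thread, the caller immediately receives an identifier it can poll, and a re-entrant lock stops two actions driving the same hardware at once.

```python
import threading
import time
import uuid

class StageStub:
    """Stands in for the translation stage; one lock per device."""
    def __init__(self):
        self.lock = threading.RLock()
        self.position = 0

    def move(self, steps):
        # Refuse rather than queue if the stage is already moving.
        if not self.lock.acquire(blocking=False):
            raise RuntimeError("Stage is busy")
        try:
            time.sleep(0.05)  # pretend the move takes time
            self.position += steps
        finally:
            self.lock.release()

class ActionRunner:
    def __init__(self):
        self.actions = {}

    def invoke(self, func, *args):
        """Start func in a thread; return an id to poll
        (the analogue of the HTTP 'created' response)."""
        action_id = str(uuid.uuid4())
        self.actions[action_id] = {"status": "running", "result": None}
        def run():
            entry = self.actions[action_id]
            try:
                entry["result"] = func(*args)
                entry["status"] = "completed"
            except Exception as exc:
                entry["result"] = str(exc)
                entry["status"] = "error"
        threading.Thread(target=run, daemon=True).start()
        return action_id

    def poll(self, action_id):
        return self.actions[action_id]

stage = StageStub()
runner = ActionRunner()
first = runner.invoke(stage.move, 100)
time.sleep(0.01)
second = runner.invoke(stage.move, 100)  # likely denied: stage locked
time.sleep(0.2)
print(runner.poll(first)["status"], runner.poll(second)["status"])
```

In the real server the action identifier is returned as a URL, and polling it is an ordinary HTTP GET; the threading and locking logic on the server side is otherwise analogous.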
Our primary client for the OpenFlexure Microscope is a web application included in the microscope’s internal API server. This application provides a comprehensive graphical interface for the microscope, including a live stream of the camera, capture functionality, basic data management, and full extension support. By developing the client as a browser-accessible web application, we are able to support many different operating systems without any additional code, while simultaneously drawing on the expertise brought by a large community of existing libraries.

The web application is accompanied by a desktop application (OpenFlexure Connect) handling device discovery and connection. The application finds and displays discovered microscopes using mDNS, as well as allowing manual connections and saving a list of commonly accessed microscopes. Upon connecting, the application finds and loads the microscope’s graphical user interface (Figure 3). Using a modular interface served by the microscope allows the client to only render user interface elements for enabled functionality. Server extensions are able to define new graphical interface components to be rendered within the main client application. For example, the interface to manage where images are stored is defined in the relevant extension.

For experiment scripting, we have created a Python client for the microscope [31, 32] that converts the web API into native Python functions. Extension functionality is also mapped to Python functions automatically. This enables both scripted experiments and interactive sessions using, for example, IPython/Jupyter notebooks [33, 34]. This lowers the barrier to entry for scripting microscopy experiments, since many students and lab users are familiar with IPython notebooks, especially for data analysis. The ability to run the graphical client at the same time, e.g. to view the live video feed, further simplifies setting up and monitoring automated experiments.
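The idea of converting a web API into native Python attributes can be sketched without any network at all. In the sketch below the HTTP layer is replaced by a fake in-memory transport so the example is self-contained; the class names and property paths are illustrative, and the real OpenFlexure Python client is considerably richer than this.

```python
import json

class FakeTransport:
    """Stands in for HTTP GET/PUT against the microscope server."""
    def __init__(self):
        self.store = {"settings/camera/shutter_speed": 17}

    def get(self, path):
        return json.dumps(self.store[path])

    def put(self, path, body):
        self.store[path] = json.loads(body)

class PropertyProxy:
    """Expose each server property as a native Python attribute."""
    def __init__(self, transport, paths):
        # Bypass __setattr__ so these don't become "properties".
        object.__setattr__(self, "_transport", transport)
        object.__setattr__(self, "_paths", paths)

    def __getattr__(self, name):
        # Attribute read -> GET request -> decoded JSON value.
        return json.loads(self._transport.get(self._paths[name]))

    def __setattr__(self, name, value):
        # Attribute write -> PUT request with the JSON-encoded value.
        self._transport.put(self._paths[name], json.dumps(value))

camera = PropertyProxy(FakeTransport(),
                       {"shutter_speed": "settings/camera/shutter_speed"})
print(camera.shutter_speed)   # 17
camera.shutter_speed = 22
print(camera.shutter_speed)   # 22
```

Swapping `FakeTransport` for a real HTTP session is all that separates this sketch from a working scripting client, which is why such clients stay lightweight.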
We have also created a MATLAB client [35] with similar features. As the scripting clients are lightweight wrappers for the HTTP API, this involves a minimal amount of duplicated code.

The flexibility of the client–server architecture allows task- or experiment-specific interfaces to be created quickly. For example, a client for controlling the microscope with a USB game pad was developed [36]. This client is useful in remote field applications where a keyboard and mouse are not practical.

Figure 3: An example side pane from the OpenFlexure Connect microscope client. Using a web API for instrument communication allows the client to be written in any programming language. Using JavaScript we were able to create a cross-platform, responsive, and modern user interface using general purpose user interface frameworks.
Central to the microscope interface is the integrated live video display. The Raspberry Pi camera is supported by GPU firmware that provides accelerated JPEG compression of each video frame, allowing us to serve a real-time Motion JPEG (MJPEG) live stream of the camera. The server starts a background thread on startup that records JPEG frames from the camera into a buffer. When clients connect to the stream, the server will begin sending a multi-part stream of frames taken from that buffer. A synchronisation event is created for each client thread, ensuring that clients never receive the same frame twice. As a new frame is read from the camera, the event for each client is set, at which point the response handler for each client will pass the frame on to the client, unset the event, and then wait for the event to be set again, dropping frames if necessary to ensure latency is minimised for clients not able to receive frames at the full rate. This system is based on the work of Miguel Grinberg [37], and is included in the Python-LabThings library. We use MJPEG in preference to more sophisticated video formats in order to minimise latency in the stream. The MJPEG format also makes it easy to extract or drop individual frames, and enables our fast auto-focus algorithm.
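The frame-distribution pattern described above can be sketched with a shared condition variable: a capture thread publishes each JPEG frame once, and every connected client blocks until the next frame arrives, so no client ever receives the same frame twice. This is a minimal sketch in the spirit of the pattern popularised by Miguel Grinberg [37]; the class names and the stand-in frame bytes are illustrative, not the Python-LabThings implementation.

```python
import threading
import time

BOUNDARY = b"--frame\r\nContent-Type: image/jpeg\r\n\r\n"

class FrameBuffer:
    def __init__(self):
        self._frame = None
        self._condition = threading.Condition()

    def write(self, jpeg_bytes):
        """Called by the camera thread for every new frame."""
        with self._condition:
            self._frame = jpeg_bytes
            self._condition.notify_all()  # wake every waiting client

    def wait_for_frame(self, timeout=1.0):
        """Block until the next frame arrives (one wake-up per frame)."""
        with self._condition:
            if not self._condition.wait(timeout):
                return None  # camera stalled; caller can drop the client
            return self._frame

def mjpeg_stream(buffer, n_frames):
    """Yield a multipart MJPEG response body, one part per new frame."""
    for _ in range(n_frames):
        frame = buffer.wait_for_frame()
        if frame is None:
            break
        yield BOUNDARY + frame + b"\r\n"

buffer = FrameBuffer()
frames = [b"jpeg-0", b"jpeg-1", b"jpeg-2"]  # stand-ins for real JPEG data

received = []
def client():
    for part in mjpeg_stream(buffer, len(frames)):
        received.append(part)

t = threading.Thread(target=client)
t.start()
for f in frames:
    time.sleep(0.1)   # give the client time to reach wait()
    buffer.write(f)
t.join()
print(len(received))
```

Because slow clients simply miss `notify_all` wake-ups while they are still transmitting, frames are dropped for them automatically and latency stays bounded.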
The Raspberry Pi camera can capture JPEG images and the raw 8 megapixel Bayer data from the camera sensor [38, 39]. This raw Bayer data can be used for more advanced analysis, avoiding artefacts from gamma correction, demosaicing, and compression. The server records metadata about the state of the microscope at the time of capture (camera settings, stage position, calibration data, and custom metadata added by the user), stored as a JavaScript Object Notation (JSON) formatted string in the “UserComment” EXIF field. Captures are stored locally on the Raspberry Pi, either on the SD card or an available USB storage device, and can be listed (including metadata) and downloaded through the HTTP API. Separating the actions of capturing images and downloading them avoids the need to transfer large amounts of data over the network during time-critical phases of an experiment. The standard graphical client provides a gallery interface to view captured images and their metadata.

As previously mentioned, multiple microscopes can be run in parallel to increase data throughput for a single operator. The greatest time saving can be achieved by setting microscopes to automatically scan over a large area, building a composite image of hundreds of overlapping fields of view (FOVs). The server has the option to perform such scans with movements, paths, and capture types chosen by the user. Capture options and metadata are the same as for individual captures, and individual images are saved, with metadata, as the scan runs [9]. Scans run as background tasks, so the microscope’s video feed and position can be monitored as they run, and scans can be aborted without losing or corrupting images that are already acquired.
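The JSON-in-EXIF metadata scheme described above can be sketched with the standard library alone. The EXIF specification prefixes the UserComment field with an 8-byte character-code header ("ASCII\0\0\0" for ASCII data); the metadata values below are illustrative, and writing the packed bytes into an actual JPEG would require an EXIF library, which this stdlib-only sketch omits.

```python
import json

# 8-byte EXIF UserComment character-code header for ASCII payloads.
ASCII_HEADER = b"ASCII\x00\x00\x00"

def pack_user_comment(metadata):
    """Serialise a metadata dict into EXIF UserComment bytes."""
    return ASCII_HEADER + json.dumps(metadata).encode("ascii")

def unpack_user_comment(raw):
    """Recover the metadata dict from EXIF UserComment bytes."""
    if not raw.startswith(ASCII_HEADER):
        raise ValueError("unexpected character code header")
    return json.loads(raw[len(ASCII_HEADER):].decode("ascii"))

# Illustrative capture metadata (not the microscope's actual schema).
metadata = {
    "camera": {"shutter_speed_ms": 17},
    "stage": {"position": [1200, -340, 55]},
    "custom": {"sample": "blood smear"},
}
packed = pack_user_comment(metadata)
print(unpack_user_comment(packed) == metadata)   # True
```

Embedding the metadata in the image file itself, rather than in a sidecar file, means a downloaded capture always carries a complete record of the microscope state that produced it.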
Due to the parallelogram-based mechanisms controlling the motion of the OFM translation stage, changes to the x–y position move the sample over a spherical cap relative to the optics, rather than a plane [8]. This necessitates an auto-focus procedure which can be run reliably at each x–y location in an automatic scan. As a typical diagnostic scan may require over 100 x–y sites to be imaged, the software must focus rapidly while still being sufficiently reliable to not invalidate a large scan with any out-of-focus images.

A basic auto-focus procedure captures a z-stack of images regularly spaced between points expected to be far above and below the focal point. At each height, an image is captured and converted to greyscale. A Laplacian convolution is applied to the whole image, assigning higher values to areas of greater spatial brightness variance. These values are raised to the fourth power and summed over the image to provide a sharpness value. The translation stage is then returned to the z-position with the highest sharpness. This procedure is based on methods used previously to detect focused areas in an out-of-focus image [40], and while highly reliable, typically takes 10–20 seconds to complete, limited by capturing and image processing time.

A fast auto-focus procedure utilises the MJPEG preview stream as a metric of focus. By disabling bit rate control, the stream becomes a simple series of independent JPEG images, each with identical compression settings. This JPEG compression uses the discrete cosine transform to describe blocks of the image [41], where each block is described using a superposition of the fewest discrete cosine functions possible, minimising the storage space required. As focused images typically have sharper feature boundaries, the storage size of an MJPEG frame will peak when the sample is in focus.
By tracking frame size and z-position as the objective moves through the focal point without stopping, the position of peak sharpness can be identified and returned to. Monitoring the frame size while moving back to the focus and comparing to the initial data can be used to correct for imperfections such as backlash. Far from focus, sample features can overflow from one JPEG block to its neighbours. On a sparse, predominantly dark sample, this can introduce information into blocks which would otherwise be empty. If this increase in the number of blocks containing some non-zero information outweighs the reduction of blocks containing a large amount of information, the JPEG size may maximise away from focus. However, for feature-dense samples the size of a JPEG image can generally be used as a reliable measure of focus. This auto-focus method has a greater positional resolution than the discrete steps of the simpler auto-focus, as MJPEG frame size can be tracked on-the-fly, reducing the time taken to less than 5 seconds.

Due to chief ray angle compensation on the Raspberry Pi camera module’s Sony IMX219 image sensor, the raw images captured by the microscope suffer from vignetting even when the sample is uniformly illuminated. However, flat-field correction allows us to recover uniform images in software [42]. We use a forked version [39] of the “picamera” library [38] to access the lens shading table in the camera’s GPU-based image processing pipeline, enabling us to correct for vignetting in both captured images and the real-time preview. A reduction in saturation at the edges of the image remains, but this can be corrected by post-processing at the expense of higher noise in the image [42].
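The two focus metrics used by the auto-focus routines above can be sketched in a few lines of NumPy. The Laplacian metric follows the text directly (greyscale, Laplacian, fourth power, sum); the frame-size metric simply locates the z-position of peak MJPEG frame size. The discrete Laplacian kernel and the synthetic test images are illustrative choices, not the exact implementation used by the microscope.

```python
import numpy as np

def laplacian_sharpness(grey):
    """Sum of the fourth power of a discrete Laplacian of a 2D image."""
    g = grey.astype(float)
    # 5-point Laplacian: up + down + left + right - 4 * centre.
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return float(np.sum(lap ** 4))

def best_focus_from_sizes(z_positions, frame_sizes):
    """Return the z-position with the largest MJPEG frame size."""
    return z_positions[int(np.argmax(frame_sizes))]

# Synthetic images: a checkerboard has strong edges ("in focus"),
# a smooth gradient has none ("out of focus").
y, x = np.indices((64, 64))
sharp = ((x + y) % 2) * 255.0
blurred = x * (255.0 / 63.0)
print(laplacian_sharpness(sharp) > laplacian_sharpness(blurred))  # True

# Synthetic frame sizes peaking at z = 0.
z = np.arange(-50, 51, 10)
sizes = 40_000 - 12 * z ** 2
print(best_focus_from_sizes(z, sizes))  # 0
```

In practice the fast routine fits the peak from frame sizes sampled during a continuous move, rather than from a discrete stack, which is what brings the time down to under 5 seconds.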
It is often convenient to move the microscope stage by a given displacement in pixels on the camera, but the axes and step size of the translation stage rarely align perfectly with the camera. We calibrate this relationship by moving back and forth along the stage’s x and y axes in turn, analysing the resulting displacement in the image from the camera [43]. We combine the calibrations into a 2 × 2 transformation matrix, similarly to µManager’s Pixel Calibrator plugin [13], but treating the camera’s coordinate system as ground truth rather than the stage. We avoid hard-coded step sizes by moving the stage in gradually increasing steps, and measure mechanical backlash by comparing motion in opposite directions. This enables an intuitive click-to-move feature in the microscope’s graphical interface, and will in the future be used when scanning and tiling images.

The same image analysis used for stage calibration is used as a 2D displacement encoder, allowing for closed-loop sample translation and closed-loop scanning. This significantly reduces the error in each individual move, as well as ensuring that errors do not accumulate over the course of a large scan. Going forward, we will extend this functionality to include simultaneous localisation and mapping (SLAM) [44]. This will enable the creation of a map of the sample by comparing predictions based on commands sent to the motors to observations from the camera. This will enable accurate movements to features or areas of interest using the camera and estimated motor position.
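The calibration described above amounts to a small least-squares problem: move the stage by known step vectors, measure the resulting image displacements in pixels, and fit the 2 × 2 matrix linking the two coordinate systems. The "true" matrix and measurements in the sketch below are synthetic, and the fitting procedure is an illustration of the idea rather than the microscope's exact implementation.

```python
import numpy as np

def fit_calibration(step_moves, pixel_shifts):
    """Least-squares fit of A such that pixel_shift ~= A @ step_move."""
    # lstsq solves step_moves @ X = pixel_shifts row-wise, with X = A.T
    A_T, *_ = np.linalg.lstsq(step_moves, pixel_shifts, rcond=None)
    return A_T.T

def steps_for_pixel_move(A, pixel_displacement):
    """Invert the calibration: motor steps for a desired pixel move."""
    return np.linalg.solve(A, pixel_displacement)

# Synthetic ground truth: stage axes slightly rotated and scaled
# relative to the camera.
A_true = np.array([[0.09, -0.01],
                   [0.02,  0.11]])
moves = np.array([[100, 0], [-100, 0], [0, 100], [0, -100], [80, 80]],
                 dtype=float)
shifts = moves @ A_true.T   # noiseless "measured" pixel displacements

A_fit = fit_calibration(moves, shifts)
print(np.allclose(A_fit, A_true))   # True

# Click-to-move: motor steps needed to shift the image by (50, -20) px.
steps = steps_for_pixel_move(A_fit, np.array([50.0, -20.0]))
print(np.allclose(A_fit @ steps, [50.0, -20.0]))  # True
```

With real, noisy displacement measurements the over-determined fit averages out measurement error, and comparing forward and reverse moves separately yields the backlash estimate mentioned above.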
In this manuscript we have demonstrated the benefits of adopting a client–server architecture for instrument control that is built upon standardised web APIs. This architecture allows for local and remote control of an instrument, allows multiple devices to safely access an instrument simultaneously, or a single device to control multiple instruments. Due to the ubiquity of web requests, experiments can be scripted in almost any modern language on any operating system using only standard libraries. We have shown how this architecture has enabled a highly extensible set of interfaces for the OpenFlexure Microscope, from simple, cross-platform graphical clients to robust scripting libraries.

We have created a desktop client that provides a graphical interface for the OpenFlexure Microscope suitable for scientific, educational, and clinical use. This can be extended via server plug-ins to allow the interface to be adapted for new purposes. We enable remote scripting of experiments with our Python and MATLAB clients, which can run alongside the graphical interface and integrate well with notebook-based programming.

The core architecture of our software is written as stand-alone libraries which are not specific to the microscope itself [19]. This allows other instruments to adopt our standards-compliant client–server architecture. We hope that this will enable greater standardisation of instrument control software built upon open protocols, and reduce the number of instruments that are tied to specific programming languages, proprietary software libraries, and legacy communication busses.
We would like to acknowledge financial support from EPSRC (EP/R013969/1, EP/R011443/1) and the Royal Society (URF\R1\, \EA\).

References

[1] Jeretta Horn Nord, Alex Koohang, and Joanna Paliszkiewicz. The Internet of Things: Review and theoretical framework. Expert Systems with Applications, 133:97–108, Nov 2019.
[2] Jeffrey M. Perkel. The Internet of Things comes to the lab. Nature, 542(7639):125–126, Feb 2017.
[3] J. Zornig, S. Chen, and H. Dinh. RESTlabs: A prototype web 2.0 architecture for Remote Labs. In , pages 1–3. IEEE, Jul 2012.
[4] Mohammed Misbah Uddin, Suresh Vakati, and Abul K. M. Azad. Potential of embedded processors and cloud for remote experimentation. In Michael E. Auer and Dominik May, editors, Cross Reality and Data Science in Engineering, pages 472–487, Cham, 2021. Springer International Publishing.
[5] Mohammed Moussa, Abdelhalim Benachenhou, Smail Belghit, Abderrahmane Adda Benattia, and Abderrahmane Boumehdi. An Implementation of Microservices Based Architecture for Remote Laboratories. In Michael E. Auer and Dominik May, editors, Cross Reality and Data Science in Engineering, pages 154–161, Cham, 2021. Springer International Publishing.
[6] Christophe Salzmann and Denis Gillet. Smart device paradigm, Standardization for online labs. In
[8] Review of Scientific Instruments, 87(2):025104, Feb 2016.
[9] Joel T. Collins, Joe Knapper, Julian Stirling, Joram Mduda, Catherine Mkindi, Valeriana Mayagaya, Grace A. Mwakajinga, Paul T. Nyakyi, Valerian L. Sanga, Dave Carbery, Leah White, Sara Dale, Zhen Jieh Lim, Jeremy J. Baumberg, Pietro Cicuta, Samuel McDermott, Boyko Vodenicharski, and Richard Bowman. Robotic microscopy for everyone: the OpenFlexure microscope. Biomedical Optics Express, 11(5):2447, May 2020.
[10] Micro-Manager. https://micro-manager.org/wiki/Micro-Manager (Accessed on 10/06/2020).
[11] Writing plugins for Micro-Manager. https://micro-manager.org/wiki/Writing_plugins_for_Micro-Manager (Accessed on 10/06/2020).
[12] Arthur Edelstein, Nenad Amodaj, Karl Hoover, Ron Vale, and Nico Stuurman. Computer control of microscopes using µManager. Current Protocols in Molecular Biology, 92(1):14.20.1–14.20.17, 2010.
[13] Arthur Edelstein, Mark Tsuchida, Nenad Amodaj, Henry Pinkard, Ronald Vale, and Nico Stuurman. Advanced methods of microscope control using µManager software. Journal of Biological Methods, 1(2):e10, 2014.
[14] Mick A. Phillips, David Miguel Susano Pinto, Nicholas Hall, Julio Mateos-Langerak, Richard M. Parton, Danail V. Stoychev, Thomas Park, Tiago Susano Pinto, John W. Sedat, Martin J. Booth, Ilan Davis, and Ian M. Dobbie. Microscope-Cockpit: Python-based bespoke microscopy for bio-medical science. bioRxiv, 2021.
[15] MicronOxford/cockpit: Cockpit is a microscope graphical user interface. https://github.com/MicronOxford/cockpit (Accessed on 10/06/2020).
[16] OpenFlexure / pi-gen · GitLab. https://gitlab.com/openflexure/pi-gen (Accessed on 21/01/2021).
[17] Shop - NI. (Accessed on 10/06/2020).
[18] S. Cheshire and M. Krochmal. Multicast DNS. RFC 6762, RFC Editor, February 2013.
[19] labthings/python-labthings: Python implementation of LabThings, based on the Flask microframework. https://github.com/labthings/python-labthings (Accessed on 10/08/2020).
[20] pallets/flask: The Python micro framework for building web applications. https://github.com/pallets/flask (Accessed on 10/08/2020).
[21] Charles R. Harris, K. Jarrod Millman, Stéfan J. van der Walt, Ralf Gommers, Pauli Virtanen, David Cournapeau, Eric Wieser, Julian Taylor, Sebastian Berg, Nathaniel J. Smith, Robert Kern, Matti Picus, Stephan Hoyer, Marten H. van Kerkwijk, Matthew Brett, Allan Haldane, Jaime Fernández del Río, Mark Wiebe, Pearu Peterson, Pierre Gérard-Marchant, Kevin Sheppard, Tyler Reddy, Warren Weckesser, Hameer Abbasi, Christoph Gohlke, and Travis E. Oliphant. Array programming with NumPy. Nature, 585(7825):357–362, September 2020.
[22] Pauli Virtanen, Ralf Gommers, Travis E. Oliphant, Matt Haberland, Tyler Reddy, David Cournapeau, Evgeni Burovski, Pearu Peterson, Warren Weckesser, Jonathan Bright, Stéfan J. van der Walt, Matthew Brett, Joshua Wilson, K. Jarrod Millman, Nikolay Mayorov, Andrew R. J. Nelson, Eric Jones, Robert Kern, Eric Larson, CJ Carey, İlhan Polat, Yu Feng, Eric W. Moore, Jake VanderPlas, Denis Laxalde, Josef Perktold, Robert Cimrman, Ian Henriksen, E. A. Quintero, Charles R. Harris, Anne M. Archibald, Antônio H. Ribeiro, Fabian Pedregosa, Paul van Mulbregt, and SciPy 1.0 Contributors. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. Nature Methods, 2020.
[23] G. Bradski. The OpenCV Library. Dr. Dobb's Journal of Software Tools, 2000.
[24] J. D. Hunter. Matplotlib: A 2D graphics environment. Computing in Science & Engineering, 9(3):90–95, 2007.
[26] Roy Thomas Fielding. Architectural Styles and the Design of Network-based Software Architectures. PhD thesis, University of California, Irvine, 2000.
[27] Leonard Richardson, Mike Amundsen, and Sam Ruby. RESTful Web APIs. O'Reilly Media, Inc., 2013.
[28] OpenAPI specification. https://swagger.io/specification/ (Accessed on 10/08/2020).
[29] REST API documentation tool – Swagger UI. https://swagger.io/tools/swagger-ui/
[31] OpenFlexure / openflexure-microscope-pyclient · GitLab. https://gitlab.com/openflexure/openflexure-microscope-pyclient (Accessed on 10/09/2020).
[32] labthings/python-labthings-client: A simple Python client for LabThings devices. https://github.com/labthings/python-labthings-client (Accessed on 10/16/2020).
[33] F. Perez and B. E. Granger. IPython: A system for interactive scientific computing. Computing in Science & Engineering, 9(3):21–29, May 2007.
[34] Thomas Kluyver, Benjamin Ragan-Kelley, Fernando Pérez, Brian Granger, Matthias Bussonnier, Jonathan Frederic, Kyle Kelley, Jessica Hamrick, Jason Grout, Sylvain Corlay, Paul Ivanov, Damián Avila, Safia Abdalla, and Carol Willing. Jupyter notebooks – a publishing format for reproducible computational workflows. In F. Loizides and B. Schmidt, editors, Positioning and Power in Academic Publishing: Players, Agents and Agendas, pages 87–90. IOS Press, 2016.
[35] OpenFlexure / openflexure-microscope-matlab-client · GitLab. https://gitlab.com/openflexure/openflexure-microscope-matlab-client (Accessed on 21/01/2021).
[36] OpenFlexure / openflexure-microscope-snesclient · GitLab. https://gitlab.com/openflexure/openflexure-microscope-snesclient (Accessed on 10/08/2020).
[37] Miguel Grinberg. Flask video streaming revisited - miguelgrinberg.com. https://blog.miguelgrinberg.com/post/flask-video-streaming-revisited (Accessed on 10/06/2020).
[38] waveform80/picamera: A pure Python interface to the Raspberry Pi camera module. https://github.com/waveform80/picamera (Accessed on 10/06/2020).
[39] jtc42/picamerax: A pure Python interface for the Raspberry Pi camera module, with extra features and fixes. https://github.com/jtc42/picamerax (Accessed on 10/08/2020).
[40] Xingxing Hao, Hui Zhao, and Jing Liu. Multifocus color image sequence fusion based on mean shift segmentation. Applied Optics, 54(30):8982, Oct 2015.
[41] Gregory K. Wallace. The JPEG still picture compression standard. Communications of the ACM, 34(4):30–44, Apr 1991.
[42] Richard W. Bowman, Boyko Vodenicharski, Joel T. Collins, and Julian Stirling. Flat-Field and Colour Correction for the Raspberry Pi Camera Module.