On the Battery Consumption of Mobile Browsers
Matteo Varvello†, Benjamin Livshits†‡
†Brave Software, ‡Imperial College London
ABSTRACT
Mobile web browsing has recently surpassed desktop browsing both in terms of popularity and traffic. Following its desktop counterpart, the mobile browser ecosystem has grown from a few browsers (Chrome, Firefox, and Safari) to a plethora of browsers, each with unique characteristics (battery friendly, privacy preserving, lightweight, etc.). In this paper, we introduce a browser benchmarking pipeline for Android browsers encompassing automation, in-depth experimentation, and result analysis. We tested 15 Android browsers using Cappuccino, a novel testing suite we built for third-party Android applications. We perform a battery-centric analysis of these browsers and show that: 1) popular browsers tend to also consume the most, 2) adblocking produces significant battery savings (between 20 and 40%, depending on the browser), and 3) dark mode offers an extra 10% battery savings on AMOLED screens. We exploit the last observation to build AttentionDim, a screen dimming mechanism driven by browser events. Via integration with the Brave browser and a study with 10 volunteers, we show potential battery savings of up to 30%, on devices with both AMOLED and LCD screens.
INTRODUCTION
When it comes to mobile apps, users are tied to the official app of the service they access. This is not the case for mobile browsers, where plenty of options are currently available for both Android and iOS [1]. While iOS browsers must use Safari's rendering engine [2], Android browsers are allowed more freedom although, in reality, most browsers rely on a common Chromium source base [3]. Such a competitive environment constantly stimulates the development of new browsers as well as new browser functionalities. In recent years, there has been a growing interest in reducing the power consumption of browsers (and apps in general), motivated by ever-increasing phone usage and app complexity. Adblocking, either in the form of an addon [4, 5] or directly integrated in the browser [6–8], is probably the most popular feature which has recently been connected with battery savings [21, 22]. Dark theme [9] is another feature which, originally introduced to reduce eye strain, is now credited with high battery savings in the presence of AMOLED screens, which effectively turn pixels off when dark. The Yandex browser also offers a mysterious power saving mode [10].
The goal of this work is to shed some light on the Android browser ecosystem. Our approach is clearly battery-centric, but it also covers other metrics which directly impact battery usage, like CPU and bandwidth utilization. A strawman research approach to this problem consists in building a local testbed, e.g., one Android device connected to a power meter, and writing automation code for each browser and device to be tested. Such an approach does not offer reproducible research, which is paramount to guarantee transparency when commercial entities are involved. Scalability is another issue, given that manual work can rapidly become overwhelming.
Motivated by the above, we have built a generic browser testing suite, which provides both fairness and transparency, where human-generated automation is plugged in as needed. To do so, we have built Cappuccino, the alter ego of the Espresso test recorder [11]. In the same way as Espresso can automatically generate testing code from human input, Cappuccino automatically generates automation for third-party apps. We integrated Cappuccino with BatteryLab [27], a research platform for battery measurements, to realize a fully transparent and extensible browser testing suite.
We used the above approach to benchmark the battery consumption (and more) of 15 Android browsers under different configurations, workloads, and devices. We find that: 1) popular browsers tend to also consume the most, 2) adblocking produces significant battery savings (between 20 and 40%, depending on the browser), 3) Yandex's power saving mode does not produce any extra battery saving, and 4) dark theme offers an extra 10% of savings on AMOLED devices. The latter observation motivates us to build AttentionDim, a screen dimming mechanism driven by browser events. Via integration with a commercial browser and a study with 10 volunteers, we show that AttentionDim can further reduce battery consumption by up to 30%, independently of the device's screen technology.
The underlying goal of this paper is to benchmark the energy consumption of multiple Android browsers. A strawman approach to this research question requires: 1) building a local testbed composed of an Android device and a power meter [18, 19, 23, 26], 2) writing code to automate each browser, e.g., how to launch and instrument it, 3) writing code to instrument the device and the power meter, e.g., to collect performance metrics and minimize experimental noise. Such a strawman approach does not provide reproducible research, which we argue is a necessity when commercial products are at play. Further, it does not scale, given that automation code needs to be written per browser and device. In fact, while some operations are common across browsers, e.g., launch and open a URL, others are browser-specific, rarely exported as flags, or unusable on regular Android devices [12].
Fortunately, the research community has recently released BatteryLab [27], a testbed which largely simplifies battery measurements. In short, BatteryLab consists of a set of remote devices connected to power meters where experimenters can run ad-hoc experiments. BatteryLab also offers simple APIs to collect fine-grained battery readings along with other metrics like CPU and bandwidth usage. This testbed not only eliminates two of the three limitations of the strawman approach above, but further fosters our transparency goal. Accordingly, we are left with the need of building a generic browser testing pipeline, which we describe in the upcoming subsections.
Figure 1: Cappuccino's GUI. On the left, the user follows instructions to record an automation. On the right, a virtual device is shown. The example refers to the generation of Opera's night-mode selection.
Today, testing third-party Android apps (i.e., apps whose source code is not available) requires an experimenter to write their own automation by manually interacting with a device while recording the actions executed.
Such actions can then be translated into Android Debug Bridge (ADB [20]) commands to generate automation scripts. Better tools are instead available to developers who have access to the code to be tested. For example, the Espresso test recorder [11] automatically generates testing code from natural interaction with an Android emulator. In the following, we describe Cappuccino, the equivalent of Espresso for third-party apps.
The intuition behind Cappuccino derives from the work in [17], where the authors crowdsource human input by streaming (emulated) Android devices in the browser. In the same spirit, Cappuccino streams a real (or emulated) device in the browser, where a custom version of noVNC records the user input and maps it to generic ADB commands, with the goal of building automated scripts. We say these commands are generic since they are expressed as ratios of the screen resolution, accounting, for example, for on-screen toolbars as in Samsung devices, so that they can be reused across several devices.
Figure 1 shows the workflow of Cappuccino's human input collection. First, an experimenter provides information about the app to be tested, and proceeds with installation and launch. This allows Cappuccino to learn how to launch the newly installed app, i.e., the package and activity name to be used. Next, the experimenter starts generating automations. Some predefined automation_labels are offered (e.g., onboarding) and custom ones are possible too (e.g., enableAdblocking). Start and stop recording buttons are used to inform Cappuccino of when to start and stop recording human inputs. A replay button is also available to execute the automation script just derived from the human input. In this way, an experimenter can evaluate the accuracy of the automation generated and decide whether or not to save it.
Thanks to the BatteryLab team, we integrated Cappuccino with BatteryLab. This means that app/browser automation can be saved at BatteryLab's access server under the pair <browser, automation_label> and easily accessed during browser testing (see next section). All automations used in this paper were produced by Cappuccino. A word of caution is needed, though, since browser automations are simple, as they mostly require clicks and, at most, a few scrolls. More complex app automations can be hard to replay, especially due to divergent behavior between devices. Further evaluating Cappuccino on a broader set of mobile apps is part of our future work.
Algorithm 1 shows the pseudocode of a generic job for browser testing.
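As a concrete illustration of the resolution-ratio encoding described above, a recorded tap can be stored as a fraction of the recording device's resolution and replayed on a device with a different resolution via ADB's `input tap`. This is a sketch under our own naming assumptions; the paper does not show Cappuccino's actual format.

```python
# Sketch: encode a recorded tap as resolution ratios so it can be
# replayed on devices with different screen sizes. Function and field
# names here are illustrative, not Cappuccino's actual API; only
# `adb shell input tap` is a standard Android command.

def record_tap(x, y, width, height):
    """Store a tap as fractions of the recording device's resolution."""
    return {"x_ratio": x / width, "y_ratio": y / height}

def to_adb_command(tap, width, height):
    """Translate a generic tap back into a device-specific ADB command."""
    x = round(tap["x_ratio"] * width)
    y = round(tap["y_ratio"] * height)
    return f"adb shell input tap {x} {y}"

# Tap recorded at (540, 960) on a 1080x1920 device...
tap = record_tap(540, 960, 1080, 1920)
# ...replayed at the same relative position on a 720x1280 device.
print(to_adb_command(tap, 720, 1280))  # adb shell input tap 360 640
```

The same ratio scheme extends naturally to swipes (two coordinate pairs plus a duration), which is all a click-and-scroll browser automation needs.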
Such a job targets BatteryLab and as such it relies on its APIs, e.g., for device preparation and battery measurements. However, it can be extended to run on other device farm solutions, granted that they allow ADB [20] access to their devices, like for instance Samsung's Remote Test Lab [25]. In the following, we capitalize all calls to BatteryLab's API. Our generic browser testing job expects as input the id of the device where to run (device), the list of browsers to be tested (browser_list), a JSON containing the desired automation (automation_dict), and a JSON containing the websites to be tested and how (workload_dict), e.g., load in a new tab, time spent on site, webpage interaction strategy. The browser testing workflow consists of four phases: device setup, browser setup, data collection, and testing.
Device Setup – The device under test is configured to minimize noise on the measurements. For instance, background processes and app notifications are disabled (DEVICE_SETUP, L1 in Algorithm 1).
Browser Setup – Before a test, a browser might need to be installed (L4). By default, the INSTALL API relies on the Play Store and thus installs the most recent version of an app. For custom testing, a URL pointing to the .apk to be tested can also be provided. Next, the browser is set up with a clean profile, i.e., its cache is emptied and local data like configuration, cookies, and history are erased. This step (L5) is equivalent across browsers since it relies on the OS to clean a target app/package. Next, we deal with a browser's onboarding process (L6), a common operation where the user is asked to customize the browser, e.g., by choosing a search engine or turning adblocking on/off. This step differs across browsers and is thus the first step where we rely on the human-driven automation described above. This is also the first step where the browser can be customized for a specific setting to be tested by the experimenter. More human-driven settings are set up with the for loop in L7-L8.
Data Collection – Once both phone and browser are configured for a test, we enter the "data collection" phase, where fine-grained battery measurements (1,500 current/voltage samples per second), CPU and memory usage (5-second frequency, via /proc/stat), and bandwidth consumption (via /proc/net/) are collected. This phase only starts when the CPU load returns to common rest values (between 0 and 5% for more than 15 seconds) after the CPU spikes caused by device and browser preparation.
Testing – For each page, the browser launches it, waits for a certain amount of time, and interacts (or not) with the page by executing several scrolls up and down. Pages and load details are described in workload_dict, either as pure ADB commands or as Cappuccino automations.
This section offers an empirical evaluation of the Android browser ecosystem. We start by describing the browsers, workloads, and devices under test, along with the rationale behind their selection. Next, we report on the benchmarking results. Our rationale for browser selection is threefold. First, we target popular browsers which are indeed used in the wild. Second, we target adblocking browsers, either native or enhanced with adblocking addons, because of the potential energy benefits associated with adblocking [21, 22]. Finally, we target browsers which advertise energy saving capabilities. Based on this strategy, we have selected 15 browsers to be tested (see Table 2, Appendix A).
Algorithm 1: Pseudocode for browser testing.
Input: Device identifier device, Browser list browser_list, Automation JSON automation_dict, Workload JSON workload_dict
Output: JSON file with performance metrics
1:  device_status ← DEVICE_SETUP(device)
2:  for browser ∈ browser_list do
3:      dict ← automation_dict[browser]
4:      INSTALL(browser)
5:      CLEAN(browser)
6:      ONBOARDING(browser, device, dict["onboarding"])
7:      for setting ∈ dict["settings"] do
8:          SETUP(device, browser, setting)
9:      end
10:     DATA_COLLECTION(device)
11:     RUN_TEST(device, browser, url_list, workload)
12: end
13: device_status ← CLEANUP(device)
We call workload the content of the workload_dict JSON file (see Section 2.2) describing which websites should be tested and how. According to [13], most users keep the number of open tabs between 1 and 10. Accordingly, we opted to open testing webpages sequentially, each in a new tab. Each page is requested for T seconds (empirically estimated at 10 seconds) to allow full page loads. Note that waiting for onload only works for some of the Chromium-based browsers, and would cause uneven experiment durations. Next, we simulate multiple user interactions by scrolling the page down N times and then up N/2 times, for 30 seconds. With respect to the pages to be tested, we pick 10 popular news websites around the world (news workload) as well as 10 (hard to find) ads-free websites (ads-free workload). The rationale of our selection is that these two workloads are realistic representations of, respectively, the best and worst case scenarios for adblocking browsers. The full list of websites selected can be found in Table 3 (Appendix A).
From BatteryLab, we use two 2018 devices: a Samsung J7 Duo (J7DUO) and a Samsung Galaxy J3 (SMJ337A). The main difference between the two is their screen technology (AMOLED for the J7DUO, LCD for the SMJ337A), which is expected to make a significant difference when measuring the battery savings associated with dark mode. With respect to their hardware, the J7DUO is more powerful, with twice as many cores (octa- vs quad-core) and RAM (4 vs 2 GB).
Figure 2: Performance evaluation of 15 Android browsers; news workload; J7DUO. (a) Battery discharge (mAh). (b) Bandwidth consumption (MBytes). (c) CDF of CPU utilization during a single run.
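The per-page workflow described above (open in a new tab, wait T seconds, then scroll down N times and up N/2 times) can be expressed as a plain command sequence. The sketch below uses our own helper names and an assumed workload shape; only `am start` and `input swipe` are standard Android commands.

```python
# Sketch: turn one workload entry into a sequence of shell commands
# (open URL in the target browser, wait T seconds, then scroll).
# Helper names and parameters are illustrative, not the paper's code.

def page_visit_commands(package, url, wait_s=10, scrolls_down=4):
    cmds = [
        # Open the URL via an Android VIEW intent targeted at the browser.
        f"adb shell am start -a android.intent.action.VIEW -d {url} {package}",
        f"sleep {wait_s}",  # allow the page to fully load (T = 10s)
    ]
    # Scroll down N times, then up N/2 times, simulating user interaction.
    for _ in range(scrolls_down):
        cmds.append("adb shell input swipe 500 1500 500 300")
    for _ in range(scrolls_down // 2):
        cmds.append("adb shell input swipe 500 300 500 1500")
    return cmds

cmds = page_visit_commands("com.android.chrome", "https://example.com")
print(len(cmds))  # 2 setup commands + 4 down-scrolls + 2 up-scrolls = 8
```

In practice the swipe coordinates would be expressed as resolution ratios, as Cappuccino does, rather than the fixed pixel values used here for brevity.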
Figure 2 summarizes the performance evaluation (battery discharge, bandwidth consumption, and CPU utilization) of all browsers under test with the default configuration, while considering the news workload and the J7DUO. Barplots report, for each metric, the average, with error bars for the standard deviation (values computed over 5 runs). Given that the CPU consumption evolves over time, we instead report one representative Cumulative Distribution Function (CDF) per browser (see Figure 2(c)). We use circle markers (barplots) and dashed lines (CDFs) to highlight adblocking browsers.
Figure 2(a) shows that the most popular browsers are quite similar with respect to battery consumption, with Baidu leading the pack with a minimum consumption of 150 mAh. Most adblocking browsers are more power efficient, with the exception of Vivaldi, Firefox equipped with the uBlock addon, and Opera Mini, which shows a staggering 225 mAh during our test. Among adblocking browsers, Brave consumes the least, followed by Opera and Kiwi, which suffers from a higher standard deviation than most browsers. Figure 2(b) justifies this result, showing that adblocking browsers can save tens of MBytes by not downloading ads. Not all ad-blockers are equal, though: most notably, Firefox uBlock seems quite relaxed, and Kiwi suffers, again, from quite variable results, potentially due to a less mature development.
To explain the strange case of Opera Mini, consuming little bandwidth but much battery, we resort to Figure 2(c), which shows the CDF of the CPU consumption during our tests, per browser. Opera Mini shows the highest CPU consumption, with a median of 35% and peaks up to 70%, twice as much as browsers like Brave and Firefox Focus, which have a median of 15%.
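The per-browser CDFs in Figure 2(c) can be derived from the periodic CPU samples with a few lines of code. This is a generic empirical-CDF sketch, not the authors' actual analysis script.

```python
# Sketch: empirical CDF (and median) of CPU utilization samples, as
# plotted per browser in Figure 2(c). Generic analysis code, not the
# paper's own script.

def empirical_cdf(samples):
    """Return sorted (value, cumulative_fraction) pairs."""
    xs = sorted(samples)
    n = len(xs)
    return [(x, (i + 1) / n) for i, x in enumerate(xs)]

def median(samples):
    xs = sorted(samples)
    n = len(xs)
    mid = n // 2
    return xs[mid] if n % 2 else (xs[mid - 1] + xs[mid]) / 2

cpu = [12, 15, 14, 35, 70, 15, 16]  # toy CPU utilization samples (%)
cdf = empirical_cdf(cpu)
print(median(cpu))  # 15
print(cdf[-1])      # (70, 1.0)
```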
The previous subsection shows that adblocking results in significant power savings. We here generalize the analysis with respect to other techniques, namely dark mode, which can potentially save battery by allowing pixels to be turned off on AMOLED screens, and Yandex's power saving mode, to the best of our knowledge the only explicit power saving feature offered by an Android browser. In this analysis we also introduce the ads-free workload, to estimate the potential cost of adblocking in the absence of ads. Further, we introduce the second device (SMJ337A), which is instead equipped with an LCD screen, i.e., dark pixels are not actually turned off. To reduce the measurement space, we select a mix of popular and best performing browsers with different levels of adblocking: Chrome, Opera, and Brave.
Figure 3: Battery discharge of adblocking, dark mode, and Yandex power saving. The ads-free and news workloads are on the left/right of the dashed line. Results refer to both the J7DUO and the SMJ337A (circle markers).
Table 1: Screen brightness reduction power savings (J7DUO/SMJ337A).
Brightness | Scenario      | Current (mA, median) | Savings (Aggressive) | Savings (Conservative)
0          | -             | 145/108              | 0/0%                 | 0/0%
-          | Indoor        | 189/130              | 23/17%               | 23/17%
-          | Indoor        | 239/157              | 39/31%               | 39/31%
-          | Cloud Outdoor | 299/201              | 51/46%               | 28/28%
-          | Outdoor       | 379/243              | 61/55%               | 21/17%
-          | Sunny Outdoor | 417/247              | 65/56%               | 28/17%
Figure 3 shows the energy consumption across browsers, devices, and workloads. The figure shows that in the absence of ads (left of the dashed line) browsers have very similar battery consumption, irrespective of the device. We also notice that the less powerful SMJ337A consumes ∼20% less than the
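The "aggressive" savings in Table 1 follow directly from the measured currents: dropping to brightness zero reduces the median current from I to the brightness-zero baseline, so the saving is (I - baseline) / I. A quick check of the J7DUO column (our own arithmetic, not the paper's code):

```python
# Verify Table 1's "aggressive" savings for the J7DUO: dimming to
# brightness 0 drops the median current from I to 145 mA, so the
# relative saving is (I - 145) / I. Our own arithmetic as a sanity
# check, not the authors' analysis code.

I_ZERO = 145  # J7DUO median current (mA) at brightness 0

def aggressive_saving(current_ma, baseline=I_ZERO):
    """Relative saving (%) of dropping brightness to zero."""
    return 100 * (current_ma - baseline) / current_ma

for current, reported in [(189, 23), (239, 39), (299, 51), (417, 65)]:
    computed = aggressive_saving(current)
    print(f"{current} mA -> {computed:.1f}% (Table 1: {reported}%)")
```

The computed values match the table's J7DUO column to within rounding, confirming how the aggressive column was derived.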
J7DUO. This could be due to many things, such as the bigger screen and overall more advanced hardware to be powered. Next, we focus on the news workload. Regardless of the device, the trend is the same as the one observed before, with the most aggressive adblocking browser (Brave) bringing the highest battery savings. In addition, dark mode offers about 11-13% extra savings for Opera and Brave, respectively. For Chrome-Dark we do not measure additional savings; the "*" indicates that for Chrome we selected the basic dark mode setting, i.e., the one available via the GUI without accessing chrome://flags, which does not darken the page but only the browser's GUI. We purposely selected this mode to measure the potential benefits of this feature, which are negligible in our tests. As expected, dark mode only brings benefits to the J7DUO, since the SMJ337A is not equipped with an AMOLED screen. Last but not least, we did not observe any difference when activating Yandex's power saving mode, despite the 9% battery saving advertised.
While browsing, a user "wastes" quite some time, e.g., when typing a URL or while a webpage is loading. For example, under bad network conditions a user can spend tens of seconds waiting for a webpage to load, and eventually give up with no content displayed. Our intuition is to minimize the screen power consumption during these wasted times. We thus propose AttentionDim, a screen dimming strategy which leverages the browser state, e.g., loading versus content ready, to define the screen brightness.
We motivate this idea by investigating the potential savings deriving from screen dimming. Table 1 shows, for several increasing brightness values in Android, the median current drawn and the savings achievable via full screen dimming, i.e., dropping the screen brightness to zero, as well as via the more conservative strategy we detail below. This experiment shows that, even with a conservative strategy, screen dimming offers potential savings between 17 and 40%, on both AMOLED- and LCD-equipped devices.
The above savings highly depend on actual device usage, e.g., slow mobile networks offer more opportunities for savings. Accordingly, rather than focusing on in-lab experiments, we have directly integrated AttentionDim into the Brave browser (Android) and performed experiments in the wild. We picked Brave since it resulted as one of the "greener" browsers in the previous analysis. Nevertheless, the code is generic and can be used by any Chromium-based browser. Implementation and evaluation details are reported in the following.
The idea behind AttentionDim is to use onLoad(), a browser event which signals when a page is loaded, as an approximation of the user attention which requires regular screen brightness. We have identified three events where user attention is lowered: 1) URL typing, 2) menu settings, 3) webpage loading. Personal preferences are at play, but AttentionDim is thought for the user who is willing to sacrifice a bit of their user experience for a longer battery duration. Other "events" are possible, e.g., video buffering, but they require more complex browser modifications and were thus left as future work.
AttentionDim consists of a module which controls the screen brightness from the browser. This module currently sits in ChromeTabbedActivity, i.e., it can be adopted by all Chromium-based browsers, and it is triggered by the above events to dim the screen and then restore the brightness when the event completes. When dimming is triggered, this module detects whether the user is using auto or manual brightness so that it can either: 1) manually restore the previous brightness value when the event completes, or 2) reactivate auto dimming and let the OS decide the brightness value to be restored.
We experimented with several dimming strategies and then settled on the following one, based on feedback received from our volunteers. When the original screen brightness is low (i.e., <= 100) we opt for an aggressive strategy, i.e., we lower the screen brightness B to zero. For mid brightness values (e.g., 150) we set B to half of the original value (50 and 75). We instead use a fixed B for high values, since outdoor and sunny conditions are quite challenging and we need to prevent leaving users in the dark. Last but not least, we implemented a settings option and a simple GUI to allow users to deactivate AttentionDim when in trouble and to get a sense of the battery savings provided.
We recruited 10 Android users who installed our modified version of Brave for up to 30 consecutive days, totaling about 500 hours of browsing; we urged our volunteers to use the browser as normal. The test spans 6 devices, since multiple volunteers shared the same device model.
Figure 4: Performance evaluation of AttentionDim in the wild. (a) CDF of fraction of time spent dimming. (b) CDF of screen brightness. (c) CDF of estimated energy savings.
Figure 4(a) shows the CDF of the fraction of time spent dimming, per device. The CDF is calculated using the beginning of a dimming event as both the start time of such event and the end time of the previous non-dimming event.
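The brightness policy described above (zero for low brightness, half for mid values, a fixed value for high values) can be sketched as follows. The mid/high threshold and the high-brightness fallback value are our assumptions, since the text leaves the exact constants unspecified.

```python
# Sketch of AttentionDim's dimming policy as described in the text:
# low brightness -> 0, mid -> half, high -> a fixed fallback.
# The threshold of 200 and the fallback of 100 are OUR assumptions;
# the paper does not spell out these constants.

def dimmed_brightness(b, high_threshold=200, high_fallback=100):
    """Return the brightness to apply while a low-attention event runs."""
    if b <= 100:              # low: aggressive, screen fully dimmed
        return 0
    if b <= high_threshold:   # mid (e.g., 150): halve the brightness
        return b // 2
    return high_fallback      # high: fixed value, never leave users dark

print(dimmed_brightness(80))   # 0
print(dimmed_brightness(150))  # 75
print(dimmed_brightness(255))  # 100
```

The actual module would feed this value to the Android window brightness attributes and restore the previous (manual or auto) setting once the event, e.g., onLoad(), completes.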
Start/end timers are also triggered whenever the user closes or (re)launches the browser. As expected, the amount of dimming is very much user- and time-dependent, meaning that some users experience a higher amount of dimming, and the dimming duration spans a broad distribution. Generally speaking, very short dimming events (e.g., lower than 10% of the time) are rare. Across users, the median dimming event lasts between a minimum of 30% and a maximum of 70% of the time.
Figure 4(b) shows the CDF (per device) of the screen brightness when AttentionDim operated. Most brightness values reported are smaller than 100 (indoor usage). One of the Pixel devices is an exception, since most of the values it reported were quite high, either because of outdoor usage or a manually set brightness. It has to be noted that this device was also only used for a limited amount of time, as the sharp CDF suggests. Finally, we combine the information from Figures 4(a) and 4(b) and Table 1 to estimate actual battery savings. Figure 4(c) shows encouraging battery savings of about 20-30%, which means potentially extending the battery life by up to one hour.
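To first order, the estimate behind Figure 4(c) combines the fraction of time spent dimming with the brightness-dependent savings of Table 1. The sketch below is our own formulation of that combination, not the authors' code.

```python
# Sketch: estimate overall battery savings by weighting the dimming
# savings (Table 1) by the fraction of time spent dimming. A first-order
# formulation of ours, not the paper's actual estimation code.

def estimated_saving(dim_fraction, saving_while_dimmed):
    """Overall saving = fraction of time dimmed * saving while dimmed."""
    return dim_fraction * saving_while_dimmed

# E.g., dimming 50% of the time at an indoor brightness whose
# aggressive saving is 39% (J7DUO, Table 1):
print(estimated_saving(0.5, 0.39))  # 0.195 -> ~20% battery saving
```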
RELATED WORK
Browser benchmarking is a largely discussed topic in the greater "web community", while less attention has been dedicated to it by the research community. In the following, we report on two research papers and one blog which, to the best of our knowledge, share some similarities with our work.
Nielson et al. [24] benchmark four browsers (Firefox, IE, Opera, and Safari) that were popular at the time of their work (2008). Their results show substantial differences among browsers across the range of tests considered, particularly in rendering speed and JavaScript string operation performance. Our work is similar in spirit to [24] but differs in many aspects. First, the browsing ecosystem has largely changed in the last 10 years, e.g., with the prevalence of Android and mobile browsing, which was not the case at that time. Second, our work also aims at building an automated and scalable testing suite that offers both transparency and reproducibility.
Greenspector [21], a startup offering software-based battery measurements, has recently ranked Android browsers based on their energy consumption (and other metrics). We perform a similar evaluation, but relying on highly accurate hardware-based measurements versus (proprietary) software-based measurements. Further, we test more workloads, websites, and features (such as dark mode, for instance). Nevertheless, our testing suite was designed to foster extensible and reproducible research in browser performance.
Last but not least, [22] analyzes the power consumption of the Brave browser, with respect to its adblocking functionalities, over the Odroid-XU3 development board [14]. We also report numbers with respect to Brave, but in the context of two real Android devices. In our tests, we could not verify the reported results with respect to the extra cost of adblocking when ads are indeed missing (see Figure 3).
CONCLUSION
This paper has investigated the battery consumption of 15 Android browsers, 3 top performing browsers in dark mode, Yandex's power saving feature, and AttentionDim, a novel screen dimming mechanism driven by browser events like onLoad. Given the scale of our measurements, we have also built a fully automated testing suite. For browser automation, we have built Cappuccino, the first record and replay tool for third-party Android apps. For browser testing, we integrated with BatteryLab, a research platform which allows remote hardware-based battery measurements on a few Android devices. Our results show that adblocking offers significant battery savings (around 30%), which can be further enhanced via dark mode (an extra 10%). Yandex's power saving feature resulted in more of a marketing stunt than a beneficial solution. Finally, we integrated AttentionDim into the Brave browser, one of the top performing browsers, and ran an experiment in the wild involving 10 users and up to 500 hours of real browsing. Our results show that AttentionDim reduced, on average, the battery consumption of our volunteers by about 20-30%.
REFERENCES
[17] Almeida, M., Bilal, M., Finamore, A., Leontiadis, I., Grunenberger, Y., Varvello, M., and Blackburn, J. CHIMP: Crowdsourcing human inputs for mobile phones. pp. 45–54.
[18] Bui, D. H., Liu, Y., Kim, H., Shin, I., and Zhao, F. Rethinking energy-performance trade-off in mobile web page loading. In Proc. ACM MobiCom (2015).
[19] Cao, Y., Nejati, J., Wajahat, M., Balasubramanian, A., and Gandhi, A. Deconstructing the energy consumption of the mobile page load. Proc. of the ACM on Measurement and Analysis of Computing Systems 1, 1 (June 2017), 6:1–6:25.
[20] Google Inc. Android Debug Bridge. https://developer.android.com/studio/command-line/adb.
[21] GreenSpector. Top 2018 least energy-hungry browsers for your smartphone, 2019. https://greenspector.com/en/articles/2018-11-19-least-energy-hungry-browsers/.
[22] Heitmann, N., Pirker, B., Park, S., and Chakraborty, S. Towards building better mobile web browsers for ad blocking: The energy perspective (WIP paper). In The 21st ACM SIGPLAN/SIGBED Conference on Languages, Compilers, and Tools for Embedded Systems (2020), pp. 146–150.
[23] Hwang, C., Pushp, S., Koh, C., Yoon, J., Liu, Y., Choi, S., and Song, J. Raven: Perception-aware optimization of power consumption for mobile games. In Proc. ACM MobiCom (2017).
[24] Nielson, J., Williamson, C., and Arlitt, M. Benchmarking modern web browsers. In (2008).
[25] Samsung. Remote Test Lab. https://developer.samsung.com/remote-test-lab.
[26] Thiagarajan, N., Aggarwal, G., Nicoara, A., Boneh, D., and Singh, J. P. Who killed my battery?: Analyzing mobile browser energy consumption.
[27] Varvello, M., Katevas, K., Plesa, M., Haddadi, H., and Livshits, B. BatteryLab: A distributed power monitoring platform for mobile devices. In Proc. ACM HotNets (2019).
Browser | Version | Chrome/Firefox Vrs | Popularity
Chrome | | |
QQ | | |
Samsung Browser | | |
Opera Mini | | |
Baidu | | |
Opera | | |
Firefox | | |
Yandex | | |
Edge | | |
Brave | | |
Firefox Focus | | |
Firefox uBlock | | |
DuckDuckGo | | |
Vivaldi | | |
Kiwi | Quadea | 77.0.3865.92 | >1M
Table 2: Android browsers selected for performance evaluation. Versions refer to the most recent version available at the time of testing (May 2020).
Table 3: Workload description (news and ads-free).
Ads-Free | News
A BROWSER AND WORKLOAD DETAILS
Table 2 summarizes the name, version, and underlying engine (i.e.,