Darren Sawyer
NetApp
Publications
Featured research published by Darren Sawyer.
High Performance Distributed Computing | 2010
Anna Povzner; Darren Sawyer; Scott A. Brandt
Data centers often consolidate a variety of workloads to increase storage utilization and reduce management costs. Each workload, however, has its own performance targets that need to be met, requiring isolation from the effects of other workloads sharing the system. Satisfying the global throughput and latency targets of each workload is challenging in fully distributed storage systems because workloads can have different data layouts and different requests from the same workload can be serviced by different nodes. Quality-of-service schemes that manage individual system resources usually rely on resource reservations, often requiring assumptions about the layout of data. On the other hand, solutions for distributed storage tend to treat the storage system as a black box, metering requests issued to the system and often under-utilizing system resources. We show that our multi-layered approach that locally manages individual disk resources can deliver global throughput and latency targets while efficiently utilizing system resources. Our system uses upper-level control mechanisms to assign deadlines to requests based on workload performance targets, and low-level disk I/O schedulers designed to meet request deadlines while maximizing throughput at the disk. We provide a novel disk scheduler called Horizon that meets deadlines while efficiently re-ordering requests and keeping disk queues well utilized. Our experimental results show that Horizon can meet more than 90% of deadlines while remaining efficient even in the presence of low-latency bursty workloads.
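The core idea the abstract describes (an upper layer translating per-workload latency targets into request deadlines, and a lower layer dispatching the most urgent request first) can be sketched as a minimal earliest-deadline-first queue. This is an illustrative sketch only, not the Horizon scheduler itself; all class and parameter names are hypothetical.

```python
import heapq

class DeadlineScheduler:
    """Sketch of deadline-driven request dispatch (hypothetical names).

    The upper layer assigns each request a deadline derived from its
    workload's latency target; the lower layer services requests in
    earliest-deadline-first order.
    """

    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker so equal deadlines stay FIFO

    def submit(self, arrival_time, latency_target, request):
        # Upper-level control: latency target becomes an absolute deadline.
        deadline = arrival_time + latency_target
        heapq.heappush(self._heap, (deadline, self._seq, request))
        self._seq += 1

    def dispatch(self):
        # Low-level scheduler: pop the request with the earliest deadline.
        if not self._heap:
            return None
        _, _, request = heapq.heappop(self._heap)
        return request

sched = DeadlineScheduler()
sched.submit(arrival_time=0.0, latency_target=0.05, request="B")
sched.submit(arrival_time=0.0, latency_target=0.02, request="A")
print(sched.dispatch())  # → "A" (the tighter latency target wins)
```

The paper's actual scheduler additionally re-orders requests for disk efficiency (e.g., to reduce seeks) within the slack their deadlines allow, which this sketch omits.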
International Conference on Performance Engineering | 2011
Shaun Dunning; Darren Sawyer
In order to effectively measure the performance of large-scale data management solutions at NetApp, we use a fully automated infrastructure to execute end-to-end system performance tests. Both the software and user requirements of this infrastructure are complex: the system under test runs a multi-protocol, highly specialized operating system, and the infrastructure serves a diverse audience of developers, analysts, and field engineers (including both sales and support). In this paper we describe our approach to rapidly constructing automated performance system tests by using a lightweight domain-specific language ("little language") called SLSL to more effectively express test specifications. Using a real-world example, we illustrate the efficacy of SLSL in terms of its expressiveness, flexibility, and ease of use by showing a complex test configuration expressed with just a few language constructs. We also demonstrate how SLSL can be used in conjunction with our performance measurement lab to quickly deploy performance tests that yield highly repeatable measurements.
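To illustrate the kind of compact test specification a little language enables, here is a hypothetical sketch of a declarative performance-test description embedded in Python. SLSL's actual syntax and constructs are not public in this listing, so every name and field below is an assumption, not SLSL itself.

```python
from dataclasses import dataclass, field

@dataclass
class PerfTest:
    """Hypothetical declarative test spec, in the spirit of a little
    language: a complex configuration stated in a few constructs."""
    name: str
    protocol: str
    clients: int
    workload: dict = field(default_factory=dict)

    def describe(self):
        # Render the spec as a one-line summary for a test runner.
        return (f"{self.name}: {self.clients} {self.protocol} clients, "
                f"workload={self.workload}")

# A multi-client protocol test expressed in a handful of declarations.
test = PerfTest(
    name="nfs_random_read",
    protocol="NFSv3",
    clients=32,
    workload={"op": "read", "pattern": "random", "block_size_kib": 8},
)
print(test.describe())
```

The design point is the same one the abstract makes: the tester declares *what* to measure (protocol, client count, workload shape) and the automated infrastructure supplies the *how* (deployment, measurement, repetition).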
Measurement and Modeling of Computer Systems | 2011
Shaun Dunning; Darren Sawyer
Archive | 2008
Darren Sawyer; Kesari Mishra; Swaminathan Ramany
Archive | 2013
Lakshmi N. Bairavasundaram; Gokul Soundararajan; Vipul Mathur; Kaladhar Voruganti; Darren Sawyer
Archive | 2004
Richard E. Honicky; Swaminathan Ramany; Darren Sawyer
Archive | 2011
Lakshmi N. Bairavasundaram; Gokul Soundararajan; Vipul Mathur; Kaladhar Voruganti; Darren Sawyer
Archive | 2014
Dan Truong; Alexander Sideropoulos; Michael Cao; Raymond Luk; Darren Sawyer
Int. CMG Conference | 2005
Swaminathan Ramany; Richard E. Honicky; Darren Sawyer
Archive | 2004
Swaminathan Ramany; Manpreet Singh; Darren Sawyer