Publication


Featured research published by Christopher T. Marsden.


Archive | 2007

Co-Regulating Internet Security: The London Action Plan

Ian Brown; Christopher T. Marsden

In this article we have used existing literature, a large-scale electronic survey of stakeholder participants, an expert workshop and interviews with a range of LAP stakeholders to answer the following research questions:

• Where should action on Internet security sit in the continuous spectrum from self- to co-regulation to ‘full’ regulation?
• How are divergent national approaches accommodated, with shifting alliances between government, industry groups and users?
• How well does the London Action Plan framework include all stakeholders, including civil society?
• What lessons can we learn from the London Action Plan for the development of other self/co-regulatory organisations?


Archive | 2017

Network neutrality: From policy to law to regulation

Christopher T. Marsden

This study explains the concept of network neutrality and its history as an extension of the rights and duties of common carriers, as well as its policy history as examined in US and European regulatory proceedings from 1999 onwards. The book compares national and regional legislation and regulation of net neutrality from an interdisciplinary and international perspective. It also examines the future of net neutrality battles in Europe, the United States and developing countries such as India and Brazil, and explores the case studies of Specialized Services and Content Delivery Networks for video over the Internet, and zero rating or sponsored data plans. Finally, Network Neutrality offers co-regulatory solutions based on FRAND and non-exclusivity. This is a must-read for researchers and advocates in the net neutrality debate, and for those interested in the context of communications regulation, law and economic regulation, human rights discourse and policy, and the impact of science and engineering on policy and governance.


Archive | 2016

Zero Rating and Mobile Net Neutrality

Christopher T. Marsden

Several developed countries have recently legislated or regulated for net neutrality, the principle that Internet Service Providers (ISPs) should not discriminate between different applications, services and content accessed by their users. This came after 20 years of attempted discrimination between content streams within the walled gardens of both fixed and mobile ISPs, such as AOL in the 1990s, BT Openworld (sic) around 2000, and Vodafone Live/360 in 2002–11, which was intended to challenge the Apple AppStore and Android/GooglePlay. Alongside their walled gardens, these ISPs enforced monthly data caps preventing their customers from having unlimited use of the Internet.


Critical Studies in Media Communication | 2014

Hyper-power and Private Monopoly: The Unholy Marriage of (Neo)corporatism and the Imperial Surveillance State

Christopher T. Marsden

American hyper-power world dominance by public and private agencies has replaced the British Empire’s hyper-power world domination of the period 1815–1914. Edward Snowden’s revelations of United States and United Kingdom surveillance have given rise to several important papers examining the geographical and territorial limits of the internet, comparing it to the imperial telegraph and even to the Roman imperial road. This paper recalls earlier telegraphy research and explains how the previous hyper-power, the British Empire, was able to control communications in order to extend the extraterritorial application of its domestic law. I explain that the nineteenth-century telegraph ‘cables that girdled the Earth’ were sunk into the sea in Cornwall, southwest England, and that today’s internet fibre cables are in the same places, with the result that the greatest National Security Agency espionage-gathering operation is a joint US/UK operation run from the small town of Bude, Cornwall. Add to that historical espionage the invention of encryption/decryption computing, from Babbage’s Difference Engine to the machines of Turing and Tommy Flowers, the Colossus Marks I and II, which broke Enigma and Lorenz in World War II. The recipe now exists for what the National Security Agency calls ‘Total Information Awareness’ and the Orwellian nightmare of totally efficient surveillance and ‘war is peace’ according to the Ministry of Truth. But it existed before, and we should learn from the past.


Standardization and Innovation in Information Technology | 2013

Interoperability as a standard-based ICT competition remedy

Ian Brown; Christopher T. Marsden

This paper examines how a standards-based pro-competition legal interoperability framework can be applied to ensure that future Internet services markets remain open, innovative and competitive. We assess regulatory intervention according to the code solution or solutions used. We analyse the regulatory shaping of ‘code’ standards (the technological environment of the Internet comprising hardware, software and their interactions, notably the protocols and standards used to achieve interoperability) to achieve more economically efficient and socially just regulation. We go on to explore standards-based solutions that involve both competition analysis and interoperability requirements in strategic communications sectors. We conclude that such standards frameworks are urgently needed to enable citizens to make the most effective use of the opportunities offered by new information and communications technologies (ICTs).


International Review of Law, Computers & Technology | 2012

Internet co-regulation and constitutionalism: Towards European judicial review

Christopher T. Marsden

This article analyzes co-regulation by defining and exploring its recent institutional history in the Internet environment. It then assesses the legal definitions and taxonomies of co-regulation before constructing a 12-point scale of self- and co-regulation. The term ‘co-regulation’ encompasses a range of different regulatory phenomena which have in common the fact that the regulatory regime is made up of a complex interaction of general legislation and a self-regulatory body. Co-regulation has enriched conceptions of ‘soft law’ or ‘governance’ in the literature over the past ten years, but, like those umbrella terms, it refers to forms of hybrid regulation that do not meet the administrative and statute-based legitimacy of regulation, yet clearly perform more elements of public policy than self-regulation, which is defined by the absence of formal roles for the nation-state or European law. Co-regulation is often identified with the rise of the ‘new governance’ in the 1990s. Recent European case law has placed a long-overdue emphasis on human rights in judicial review of co-regulatory arrangements. Without regulation responsive to both the market and the need for constitutional protection of fundamental rights, Internet regulatory measures cannot be self-sustaining.


Communications of the ACM | 2017

How law and computer science can work together to improve the information society

Christopher T. Marsden

Seeking to remedy bad legislation with good science.


Archive | 2014

Bitcoin: The Wrong Implementation of the Right Idea at the Right Time

Andres Guadamuz; Christopher T. Marsden

This paper is a study into some of the regulatory implications of cryptocurrencies, using the CAMPO research framework (Context, Actors, Methods, Practice, Outcomes). We explain in CAMPO format why virtual currencies are of interest, how self-regulation has failed, and what useful lessons can be learned. We are hopeful that the full paper will produce useful and semi-permanent findings on the usefulness of virtual currencies in general, blockchains as a means of mining currency, and the profundity of current ‘media darling’ currency Bitcoin as compared with the development of blockchain generator Ethereum.

While virtual currencies can play a role in creating better trading conditions in virtual communities, despite the risks of non-sovereign issuance and therefore regulation only by code (Brown/Marsden 2013), the methodology used poses significant challenges to researching this ‘community’, if Bitcoin can even be said to have created a single community, as opposed to enabling an alternate method of exchange for potentially all virtual community transactions.

First, Bitcoin users have transparency of ownership but anonymity in many transactions, necessary for libertarians or outright criminals in such illicit markets as #SilkRoad. Studying community dynamics is therefore much more difficult than in even such pseudonymous or avatar-based communities as Habbo Hotel, World of Warcraft or SecondLife. The ethical implications of studying such communities raise similar problems to those of Tor, Anonymous, LulzSec and other anonymous hacker communities.

Second, journalistic accounts of Bitcoin markets are subject to sensationalism, hype and inaccuracy, even more so than in the earlier hype cycle for SecondLife, exacerbated by the first issue of anonymity.

Third, the virtual currency area is subject to slowly emerging regulation by financial authorities and police forces, which appears to be driving much of the early-adopter community ‘underground’. Thus, the community in 2016 may not bear much resemblance to that in 2012.

Fourth, there has been relatively little academic empirical study of the community, or indeed of virtual currencies in general, until relatively recently.

Fifth, the dynamism of the virtual currency environment in the face of deepening mistrust of the financial system after the 2008 crisis is such that any research conclusions must by their nature be provisional and transient.

All these challenges, particularly the final three, also sharpen the motivation for research: an alternative financial system, separated from the real-world sovereign and regulated by code with limited enforcement from offline policing, both returns the study to the libertarian self-regulated environment of early 1990s MUDs and offers a tantalising prospect of a tool to evade the perils of ‘private profit, socialized risk’ which existing large financial institutions created in the 2008–12 disaster. The need for further research into virtual currencies based on blockchain mining, and into their usage by virtual communities, is thus pressing and should motivate researchers to solve the many problems in the methodology for exploring such an environment.


Archive | 2013

Regulating Code: Towards Prosumer Law?

Ian Brown; Christopher T. Marsden

In this interdisciplinary paper written by a socio-legal scholar and a computer scientist, we explain a novel holistic approach to Internet regulation in the broader public interest. We argue for ‘prosumer law’ and give an example of our proposed solution to the problems of dominant social networking sites. What should prosumer law consist of? We examine the international governance of information, especially the apparent incompatibility of human rights and trade-related concerns exposed in such multi-stakeholder fora as the OECD. Finally, we argue for holistic regulation of the Internet, taking a trans-disciplinary perspective to solve those ‘hard cases’ we have examined. Prosumer law suggests a more directed intervention, to prevent Facebook or Google or any other network from erecting a fence around its piece of the information commons: to ensure interoperability with open standards. It is not sufficient for it to permit data deletion as that only covers the user’s tracks. It requires some combination of interconnection and interoperability, more than transparency and the theoretical possibility to switch. It needs the ability for exiting prosumers to interoperate to permit exit. We argue that it is untrue to state that there is so much convergence between platforms that there is no clear distinction between open commons and closed proprietary environments, though ‘voluntary forfeiture’ of IPR to permit greater innovation has always been commonplace. We base our argument on the empirical case studies presented in ‘Regulating Code’ (MIT Press, 2013), but we extend our argument from that monograph to assess the environmental preconditions for prosumer law to operate in Europe. We describe the multistakeholder environment for Internet governance and regulation, in which user groups lobbied along with business and governments. We also describe the insights of new institutionalism, with exit and competition for standards becoming increasingly critical in the information economy. We then describe interoperability as a means of lowering entry barriers and increasing consumer welfare. We consider United States administrative and academic arguments (Wu 2010, Zittrain 2008, Lessig 1999, 2006) for self-regulation to have demonstrably failed, and focus on the European regulatory space as more fertile ground to explore prosumerism as both a market-based and citizen-oriented regulatory tool.


Archive | 2007

Codifying cyberspace: communications self-regulation in the age of internet convergence

Damian Tambini; Danilo Leonardi; Christopher T. Marsden

Collaboration


Dive into Christopher T. Marsden's collaborations.

Top Co-Authors

Ian Brown
University College London

Damian Tambini
London School of Economics and Political Science

Lilian Edwards
University of Strathclyde

Ruth Levitt
Queen Mary University of London