Publication


Featured research published by Colin George Harrison.


International Workshop on Mobile Object Systems | 1996

Mobile Agents: Are They a Good Idea?

David M. Chess; Colin George Harrison; Aaron Kershenbaum

Mobile agents are programs, typically written in a script language, which may be dispatched from a client computer and transported to a remote server computer for execution. Several authors have suggested that mobile agents offer an important new method of performing transactions and information retrieval in networks. Other writers have pointed out, however, that mobile agents introduce severe concerns for security. We consider the advantages offered by mobile agents and assess them against alternate methods of achieving the same function. We conclude that, while the individual advantages of agents do not represent an overwhelming motivation for their adoption, the creation of a pervasive agent framework facilitates a very large number of network services and applications.
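The abstract's core idea can be illustrated with a minimal sketch: a client packages a small script (the "agent") and a server executes it against its local resources, returning the result. All names here (`dispatch_agent`, the catalogue data) are illustrative, not from the paper.

```python
# A minimal sketch of the mobile-agent idea: the client dispatches a
# script to the server, which runs it close to the data and returns
# whatever result the agent leaves behind.

def dispatch_agent(script: str, server_env: dict):
    """Simulate transporting an agent script to a server and running it."""
    namespace = dict(server_env)    # the server exposes its local resources
    exec(script, namespace)         # the agent executes at the server side
    return namespace.get("result")  # the agent records its answer here

# The "server" holds a product catalogue that the agent queries locally,
# avoiding repeated client-server round trips over the network.
server = {"catalogue": {"disk": 120, "ram": 80, "cpu": 400}}

agent = "result = {name: price for name, price in catalogue.items() if price < 200}"

print(dispatch_agent(agent, server))  # prints {'disk': 120, 'ram': 80}
```

The single round trip here stands in for the paper's point of comparison: the same query done with conventional remote calls would need one request per catalogue lookup.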


IEEE Personal Communications | 1995

Itinerant agents for mobile computing

David M. Chess; Benjamin N. Grosof; Colin George Harrison; David W. Levine; Colin Parris; Gene Tsudik

Describes a framework for itinerant agents that can be used to implement secure, remote applications in large, public networks such as the Internet or the IBM Global Network. Itinerant agents are programs, dispatched from a source computer, that roam among a set of networked servers until they accomplish their task. This is an extension to the client/server model in which the client sends a portion of itself to the server for execution. An additional feature of itinerant agents is their ability to migrate from server to server, perhaps seeking one that can help with the user's task or perhaps collecting information from all of them. A major focus of the article is the agent meeting point, an abstraction that supports the interaction of agents with each other and with server-based resources. The article begins with an overview of the operation of an itinerant agent framework and a review of previous work. The authors consider likely applications of itinerant agents and discuss one specific example in detail. They give an architectural description of the structure of itinerant agents, the languages employed to create them, and the execution environments required at the servers, as well as a detailed description of how an itinerant agent is processed at a server. Security issues are then discussed, and finally they consider the technical advantages of the itinerant agent framework and the services it enables.


IBM Journal of Research and Development | 2010

Foundations for smarter cities

Colin George Harrison; Barbara A. Eckman; R. Hamilton; Perry G. Hartswick; Jayant R. Kalagnanam; J. Paraszczak; Peter Williams

This paper describes the information technology (IT) foundation and principles for Smarter Cities™. Smarter Cities are urban areas that exploit operational data, such as that arising from traffic congestion, power consumption statistics, and public safety events, to optimize the operation of city services. The foundational concepts are instrumented, interconnected, and intelligent. Instrumented refers to sources of near-real-time real-world data from both physical and virtual sensors. Interconnected means the integration of those data into an enterprise computing platform and the communication of such information among the various city services. Intelligent refers to the inclusion of complex analytics, modeling, optimization, and visualization in the operational business processes to make better operational decisions. This approach enables the adaptation of city services to the behavior of the inhabitants, which permits the optimal use of the available physical infrastructure and resources, for example, in sensing and controlling consumption of energy and water, managing waste processing and transportation systems, and applying optimization to achieve new efficiencies among these resources. Additional roles exist in intelligent interaction between the city and its inhabitants and further contribute to operational efficiency while maintaining or enhancing quality of life.
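The instrumented / interconnected / intelligent loop described in the paper can be sketched as three stages feeding one another. The function names, sensor identifiers, and threshold below are invented for illustration, not from the paper.

```python
# A toy sketch of the three foundational concepts: instrumented sensing,
# interconnected data integration, and intelligent analytics driving an
# operational decision (here: flagging high power consumption).

def instrument():
    """Near-real-time readings from physical or virtual sensors."""
    return [("meter-1", 4.2), ("meter-2", 9.7), ("meter-3", 3.1)]

def interconnect(readings):
    """Integrate the readings onto a shared enterprise platform."""
    return {sensor: value for sensor, value in readings}

def intelligent(platform, threshold=8.0):
    """Apply analytics to the integrated data to drive decisions."""
    return sorted(s for s, v in platform.items() if v > threshold)

alerts = intelligent(interconnect(instrument()))
print(alerts)  # prints ['meter-2']
```

In a real deployment each stage would be a distributed system in its own right; the sketch only shows how data flows from sensing through integration to an operational decision.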


IEEE Computer | 2011

Smarter Cities and Their Innovation Challenges

Milind R. Naphade; Guruduth Banavar; Colin George Harrison; J. Paraszczak; Robert J. T. Morris

The transformation to smarter cities will require innovation in planning, management, and operations. Several ongoing projects around the world illustrate the opportunities and challenges of this transformation. Cities must get smarter to address an array of emerging urbanization challenges, and as the projects highlighted in this article show, several distinct paths are available. The number of cities worldwide pursuing smarter transformation is growing rapidly. However, these efforts face many political, socioeconomic, and technical hurdles. Changing the status quo is always difficult for city administrators, and smarter city initiatives often require extensive coordination, sponsorship, and support across multiple functional silos. The need to visibly demonstrate a continuous return on investment also presents a challenge. The technical obstacles will center on achieving system interoperability, ensuring security and privacy, accommodating a proliferation of sensors and devices, and adopting a new closed-loop human-computer interaction paradigm.


IBM Journal of Research and Development | 2009

Instrumenting the planet

Ching-Hua Chen-Ritzo; Colin George Harrison; J. Paraszczak; F. Parr

During the last 50 years, population growth, along with increasingly affluent societies, has resulted in a greater demand for our limited physical infrastructures and natural resources than ever before. In addition, the risks of climate change have heightened the need for more sophisticated ways of controlling carbon emissions. Today, numerous streams of data are being collected from sensors that monitor the environment. When used in conjunction with computational models, these streams can be important sources of data for understanding physical phenomena and human behavior. In this paper, we present a vision of a pervasively instrumented world in which these streams of real-world data are combined with mathematical models to improve the ability to manage the consumption of increasingly scarce resources. Such an instrumented world requires a class of information technology systems that combine very large numbers of sensors and actuators with computing platforms for capturing and analyzing such data streams. We provide details on the characteristics, requirements, and possible applications of such platforms and the key roles that they will play in addressing various societal challenges.


EKS | 2005

Enabling ICT adoption in developing Knowledge Societies

Colin George Harrison

The deployment of ICT in its present form requires simultaneously mastering many skills and having a developed infrastructure of human and technical resources. These are frequently lacking in regions remote from the affluent neighbourhoods of major cities, whether in developed or developing economies. Moreover, potential users in these developing Knowledge Societies may have different needs, or a different balance of needs, from the established user base. The affluent neighbourhoods of major cities already provide an ICT ecology, and their users’ needs are heavily pre-determined by the prevailing Internet culture. In developing Knowledge Societies, however, the introduction of ICT — like any major infrastructure investment — is likely to be a communal decision, prioritised against other needs, and conditioned by local values. So the introduction of ICT into such a community needs to consider 1) what needs do we wish to meet, 2) what ICT infrastructure can meet those needs, and 3) how can we bootstrap the ICT ecology that will enable the deployment to become rapidly self-sustaining. The technology selection and deployment process thus requires a much broader assessment, and the choices may — paradoxically — be wider than for an established Knowledge Society. In my contribution, I will propose a framework for preparing for the creation of a new Knowledge Society that is based in part on current experiments in developing economies and in part on a view of the evolution of the underlying technologies.


Proceedings of the IFIP TC3/WG3.1&3.2 Open Conference on Informatics and The Digital Society: Social, Ethical and Cognitive Issues on Informatics and ICT | 2002

e-Learning Technology: Convergence with the Mainstream

Colin George Harrison

The evolution of technology-based learning systems over the last thirty years has been driven by the desire to implement increasingly rich pedagogical models. This has led to remarkable innovations in interaction, collaboration, and the use of rich media in learning systems. However, these technology innovations have rarely been absorbed into the mainstream of information technology and consequently have failed to evolve into broadly implemented standards. Abstractly, we can view learning as a sequence of encounters among instructors, learners, and information. These encounters may be synchronous or asynchronous, local or distant, formal or informal, solitary or en masse. Computer Science is seeking to develop technologies that will support a new set of Web experiences. These technologies will provide a generalised model for describing encounters among people and information and are based on open industry standards. The application space for these experiential technologies will eventually include e-commerce, professional communications, games and entertainment, but learning appears likely to be the first practical application.


Archive | 2000

Invited Talk (Abstract): Data or Computation – Which Should We Move?

Colin George Harrison

The evolution of system architectures has been shaped by the varying evolutions of the costs of system operations. That is: the cost of computation, the cost of storage, the cost of communication, and the cost of operational management. In recent years the costs of computation and storage have continued to drop exponentially (at least). Communication costs have traditionally remained high relative to other costs and this has had a profound influence on the structure of system architectures, favouring centralized information systems with remote clients. But these costs too are in steep decline and in recent years end-to-end latencies in the Internet have approached values that in client-server days could only be achieved within campus networks. These trends will change this architectural balance, making much more attractive the widespread distribution of application function – even function requiring significant bandwidth. The current impetus towards exploring this model of computation is the desire to create dynamic business architectures in which components or entire business processes are delivered as remote network services. In this approach the business process is modeled as a form of workflow and the computation proceeds by moving the business data successively through several remote transaction systems. We contrast this approach with that of mobile agents in which the transaction itself moves through the network and consider why the mobile agent approach is not being adopted for dynamic business architectures.
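The contrast the abstract draws can be sketched with plain functions: in the workflow model the business data is routed through a fixed chain of remote services, while in the mobile-agent model the transaction logic itself travels and runs against data held at each node. All names and data below are invented for illustration.

```python
# Workflow model: the order data moves through each remote service in turn.
def credit_check(order):
    order["credit_ok"] = order["amount"] <= 500
    return order

def reserve_stock(order):
    order["reserved"] = order["credit_ok"]
    return order

def run_workflow(order, services):
    """Route the business data successively through each service."""
    for service in services:
        order = service(order)
    return order

# Agent model: one piece of transaction logic visits each server and
# executes against the data held there, accumulating what it finds.
def run_agent(agent, servers):
    findings = []
    for server_data in servers:
        findings.append(agent(server_data))  # runs "at" the server
    return findings

order = run_workflow({"amount": 250}, [credit_check, reserve_stock])
quotes = run_agent(lambda stock: min(stock.values()),
                   [{"disk": 120, "ram": 80}, {"disk": 95, "ram": 110}])
print(order)   # prints {'amount': 250, 'credit_ok': True, 'reserved': True}
print(quotes)  # prints [80, 95]
```

The sketch makes the trade-off concrete: the workflow ships data to fixed computation, the agent ships computation to fixed data, and which is cheaper depends on the relative costs of communication and computation that the talk discusses.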


IEEE Transactions on Biomedical Engineering | 1987

PUPA: A Pulse Programming Assistant for NMR Imaging

Doug Foxvog; Xiaofeng Li; Juan E. Vargas; John R. Bourne; R. Mushlin; Colin George Harrison

The design of pulse programs for magnetic resonance imaging (MRI) experiments is tedious and complex, requiring a deep understanding of the interactions that exist between magnetic fields generated during an MRI experiment. This paper describes an intelligent system that understands how to construct the multichannel temporal sequences of pulses needed to control an MRI experiment. PUPA, the Pulse Programming Assistant, provides assistance to a relatively naive user of MRI systems. Knowledge is coded in the form of rules and semantic networks. A natural language facility and menu system are provided for communication with the user.


Archive | 1993

Wide-area wireless LAN access

Colin George Harrison; Dieter Jaepel
