
Publication


Featured research published by Vinton G. Cerf.


ACM Special Interest Group on Data Communication | 2009

A brief history of the internet

Barry M. Leiner; Vinton G. Cerf; David D. Clark; Robert E. Kahn; Leonard Kleinrock; Daniel C. Lynch; Jonathan B. Postel; Lawrence G. Roberts; Stephen Wolff

This paper was first published online by the Internet Society in December 2003 and is being re-published in ACM SIGCOMM Computer Communication Review because of its historic import. It was written at the urging of its primary editor, the late Barry Leiner. He felt that a factual rendering of the events and activities associated with the development of the early Internet would be a valuable contribution. The contributing authors did their best to incorporate only factual material into this document. There are sure to be many details that have not been captured in the body of the document, but it remains one of the most accurate renderings of the early period of development available.


IEEE Transactions on Communications | 1974

A Protocol for Packet Network Intercommunication

Vinton G. Cerf; Robert E. Kahn

A protocol that supports the sharing of resources that exist in different packet switching networks is presented. The protocol provides for variation in individual network packet sizes, transmission failures, sequencing, flow control, end-to-end error checking, and the creation and destruction of logical process-to-process connections. Some implementation issues are considered, and problems such as internetwork routing, accounting, and timeouts are exposed.
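
To make the mechanisms listed above a little more concrete (sequencing, flow control, end-to-end error checking, and fragmentation across networks with different maximum packet sizes), here is a minimal Python sketch of an internetwork segment. The field names, widths, and checksum are illustrative assumptions for this sketch, not the format defined in the paper.

# Illustrative sketch only: a toy internetwork segment with the mechanisms the
# 1974 paper describes. Field names, widths, and the checksum are assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class Segment:
    src: int        # source process identifier
    dst: int        # destination process identifier
    seq: int        # sequence number, used for ordering and retransmission
    window: int     # receiver's advertised window (flow control)
    payload: bytes

    def checksum(self) -> int:
        # Simplified end-to-end check: 16-bit sum over header fields and payload.
        return (self.src + self.dst + self.seq + self.window + sum(self.payload)) & 0xFFFF


def fragment(segment: Segment, max_payload: int) -> List[Segment]:
    # Split a segment so each piece fits a network's maximum packet size;
    # the sequence number advances by the byte offset of each piece.
    pieces = []
    for offset in range(0, len(segment.payload), max_payload):
        chunk = segment.payload[offset:offset + max_payload]
        pieces.append(Segment(segment.src, segment.dst, segment.seq + offset,
                              segment.window, chunk))
    return pieces


if __name__ == "__main__":
    seg = Segment(src=1, dst=2, seq=0, window=4096, payload=b"hello, internetwork")
    for piece in fragment(seg, max_payload=8):
        print(piece.seq, piece.payload, hex(piece.checksum()))

A receiver in this sketch would reassemble the pieces by byte offset and verify each checksum before acknowledging, which mirrors the end-to-end error checking the abstract mentions.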


Communications of the ACM | 1997

The past and future history of the Internet

Barry M. Leiner; Vinton G. Cerf; David D. Clark; Robert E. Kahn; Leonard Kleinrock; Daniel C. Lynch; Jonathan B. Postel; Lawrence G. Roberts; Stephen Wolff

The Internet also represents one of the most successful examples of sustained investment and commitment to research and development in information infrastructure. Beginning with early research in packet switching, the government, industry, and academia have been partners in evolving and deploying this exciting new technology. Today, terms like “[email protected]” and “http://www.acm.org” trip lightly off the tongue of random people on the street. The Internet today is a widespread information infrastructure, the initial prototype of what is often called the National (or Global or Galactic) Information Infrastructure. Its history is complex and involves many aspects—technological, organizational, and community. And its influence reaches not only to the technical fields of computer communications but throughout society as we move toward increasing use of online tools to accomplish electronic commerce, information acquisition, and community operations.


ACM Special Interest Group on Data Communication | 2005

A protocol for packet network intercommunication

Vinton G. Cerf; Robert E. Kahn

A protocol that supports the sharing of resources that exist in different packet switching networks is presented. The protocol provides for variation in individual network packet sizes, transmission failures, sequencing, flow control, end-to-end error checking, and the creation and destruction of logical process-to-process connections. Some implementation issues are considered, and problems such as internetwork routing, accounting, and timeouts are exposed.


IEEE Internet Computing | 2010

Internet predictions [Guest editor's introduction]

Vinton G. Cerf; Munindar P. Singh

More than a dozen leading experts give their opinions on where the Internet is headed and where it will be in the next decade in terms of technology, policy, and applications. They cover topics ranging from the Internet of Things to climate change to the digital storage of the future.


Communications of the ACM | 2014

Unconventional computing

Vinton G. Cerf

On top of that, in the last couple of years, IBM has demonstrated two remarkable achievements: the Watson Artificial Intelligence system and the August 8, 2014 cover story of Science entitled “Brain Inspired Chip.” The TrueNorth chipset and the programming language it uses have demonstrated remarkable power efficiency compared to more conventional processing elements.

What all of these topics have in common for me is the prospect of increasingly unconventional computing methods that may naturally force us to rethink how we analyze problems for purposes of getting computers to solve them for us. I consider this to be a refreshing development, challenging the academic, research, and practitioner communities to abandon or adapt past practices and to consider new ones that can take advantage of new technologies and techniques.

It has always been my experience that half the battle in problem solving is to express the problem in such a way that the solution may suggest itself. In mathematics, it is often the case that a change of variables can dramatically restructure the way in which the problem or formula is presented, leading one to find related problems whose solutions may be more readily applied. Changing from Cartesian to polar coordinates often dramatically simplifies an expression. For example, a Cartesian equation for a circle of radius a centered at (0,0) is x² + y² = a², but the polar version is simply r(φ) = a.

It may prove to be the case that the computational methods for solving problems with quantum computers, neural chips, and Watson-like systems will admit very different strategies and tactics than those applied in more conventional architectures. The use of graphics processing units (GPUs) to solve problems, rather than generating textured triangles at high speed, has already forced programmers to think differently about the way in which they express and compute their results. The parallelism of the GPUs and their ability to process many small “programs” at once has made them attractive for evolutionary or genetic programming, for example.

One question is: Where will these new technologies take us? We have had experiences in the past with unusual designs. The Connection Machine designed by Danny Hillis was one of the first really large-scale computing machines (65K one-bit processors hyperconnected together). LISP was one of the programming languages used for the Connection Machines, along with URDU, among others. This brings to mind the earlier LISP machines made by Symbolics and LISP Machines, Inc., among others. The rapid advance in speed of more conventional processors largely overtook the advantage of special-purpose, potentially language-oriented computers. This was particularly evident with the rise of the so-called RISC (Reduced Instruction Set Computing) machines developed by John Hennessy (the MIPS system) and David Patterson (Berkeley RISC and Sun Microsystems SPARC), among many others.

David E. Shaw, at Columbia University, pioneered one of the explorations into a series of designs of a single instruction stream, multiple data stream (SIMD) supercomputer he called Non-Von (for “non-Von-Neumann”). Using single-bit arithmetic logic units, this design bears some similarity to the Connection Machine, although their interconnection designs were quite different. It has not escaped my attention that David Shaw is now the chief scientist of D.E. Shaw Research and is focused on computational biochemistry and bioinformatics. This topic also occupies his time at Columbia University, where he holds a senior research fellowship and adjunct professorship.

Returning to new computing and memory technologies, one has the impression that the limitations of conventional silicon technology may be overcome with new materials and with new architectural designs, as is beginning to be apparent with the new IBM Neural chip. I have only taken time to offer a very incomplete and sketchy set of observations about unconventional computing in this column, but I think it is arguable that in this second decade of the 21st century, we are starting to see serious opportunities for rethinking how we may compute.
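
To make the change-of-variables example above concrete, the short derivation below works through the circle case in LaTeX; it only restates the standard polar substitution and is not taken from the column itself.

\begin{align*}
x &= r\cos\varphi, \qquad y = r\sin\varphi \\
x^2 + y^2 &= r^2\cos^2\varphi + r^2\sin^2\varphi = r^2 \\
x^2 + y^2 = a^2 &\;\Longrightarrow\; r^2 = a^2 \;\Longrightarrow\; r(\varphi) = a
\end{align*}

The polar form no longer mentions the angle at all, which is the kind of simplification the column points to.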


Proceedings of the IEEE | 2004

On the evolution of Internet technologies

Vinton G. Cerf

The Internet has been evolving from its origins in the early 1970s, based on work sponsored by the U.S. Defense Advanced Research Projects Agency. While the basic design was known in 1973 and first published in 1974 and the system essentially deployed in the academic and military communities on January 1, 1983, much has happened in the intervening 20 years. The first commercial Internet services emerged in 1989 after the interconnection of the Internet to commercial e-mail services. By 1993, commercial versions of the World Wide Web had appeared, and by 2003, voice over IP service was growing rapidly, after its first commercial introduction around 1995 (See Vocaltec: http://www.vocaltec.com/html/about/company.shtml). The Internet of the future will be shaped by the tectonic forces of regulation, commercialization, technological change, and a wide range of policy concerns expressed at local, national, regional and international levels. In this paper, the effect of these forces is considered and an attempt made to project their effects into the future.


Communications of the ACM | 2012

Where is the science in computer science?

Vinton G. Cerf

Managing large software projects is intrinsically difficult. Although high software quality is a definite must, other issues like time and cost play major roles in large software development. For example, if a software company can produce the highest quality products but cannot predict how long development will take and how much it will cost, then that company will not have any business. Software metrics are one answer to those problems. Software metrics are the measurement of periodic progress towards a goal [3]. Metrics are used to indicate various problems in a development process. Currently, there are a large number of documented metrics. However, there does not exist one perfect formula that satisfies every development’s quality and productivity goals. The key to a good software metrics program is to identify the specific goals of the development and to assist in reaching those goals. I will address this concept through the development of specific measurements and analyses that will improve the quality of a specific system, the Mission Data System (MDS) at the Jet Propulsion Laboratory. I will attempt to identify certain software metrics that can help JPL reach their development goals. To accomplish this I have created the Hackystat Jet Propulsion Laboratory Build System (hackyJPLBuild). This system measures and analyzes the build system of MDS. The research question of this thesis is whether it is possible to collect data from the build process of a large-scale software project in order to understand, predict, and prevent problems in the quality and productivity of the actual system. To evaluate this research question I will conduct three case studies: (1) can the hackyJPLBuild system accurately represent the build process of MDS, (2) can threshold values indicate problematic issues in MDS, and (3) can hackyJPLBuild predict future problematic issues in MDS? Initial results of case study 1 indicate that hackyJPLBuild can accurately represent the build process of MDS. In fact, hackyJPLBuild has already identified some potential flaws in the MDS build process. Case studies 2 and 3 have not been conducted yet.
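
As a rough sketch of the threshold idea in case study 2 above, the Python fragment below flags builds whose measured values exceed configured limits. The metric names and threshold values are hypothetical examples chosen for illustration; they are not the actual measurements or limits used by hackyJPLBuild or MDS.

# Hypothetical example: flag a build as problematic when a metric crosses its
# threshold. Metric names and limits are invented for illustration only.
from typing import Dict, List

THRESHOLDS = {
    "build_time_minutes": 90,   # builds slower than this are flagged
    "failed_targets": 0,        # any failed target is a problem
    "compiler_warnings": 250,   # warning counts above this are flagged
}


def flag_problems(build: Dict[str, float],
                  thresholds: Dict[str, float] = THRESHOLDS) -> List[str]:
    # Return the names of metrics whose measured values exceed their thresholds.
    return [name for name, limit in thresholds.items() if build.get(name, 0) > limit]


if __name__ == "__main__":
    nightly = {"build_time_minutes": 112, "failed_targets": 0, "compiler_warnings": 310}
    print(flag_problems(nightly))  # ['build_time_minutes', 'compiler_warnings']

A prediction step (case study 3) could then fit a trend to these per-build measurements, but that is beyond this illustrative fragment.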


IEEE Computer | 2007

An information avalanche

Vinton G. Cerf

The Internet has added yet another dimension to the production and consumption of information. Long merely consumers of content, Internauts are now also major producers of it. Search engines make it possible to sift through the enormous quantity of material that is finding its way into digital form. Going online has become an adventure in discovery for those who eagerly surf the billions of Web pages housed in the global Internet. Because of its global reach, the Internet seriously threatens to undermine IP protection regimes that have long served copyright holders.


Nature | 2009

The day the Internet age began.

Vinton G. Cerf

Forty years ago today the first message was sent between computers on the ARPANET. Vinton G. Cerf, who was a principal programmer on the project, reflects on how our online world was shaped by its innovative origins.

Collaboration


Dive into Vinton G. Cerf's collaborations.

Top Co-Authors

David D. Clark (Massachusetts Institute of Technology)
Barry M. Leiner (Research Institute for Advanced Computer Science)
Adrian J. Hooke (California Institute of Technology)
James E. White (University of California)