Publications


Featured research published by Ted G. Lewis.


IEEE Computer | 1995

Where is client/server software headed?

Ted G. Lewis

Computers have moved to desktops; now, the underlying software is itself evolving. The direction computing will take is clear, but the route is a matter for some speculation. We examine several promising software ideas, especially distributed software and its underlying infrastructure, called middleware. Industrial software is being driven by document-centric design, reusable component implementation based on industry standards, and end-user programming.


IEEE Computer | 1996

The next 10,000₂ years. II

Ted G. Lewis

For pt. I see ibid., vol. 4, p. 64-70 (1996). As microprocessor advances begin to level off, communication network deployment will keep accelerating, and software engineering must face the prospect of radical change if it is to keep pace. The paper considers how the intersection of these ascending and descending technologies will propel the high-tech world into a new model of computing by the year 2012. Software is the steam that drives the engines of the Information Age, but clearly it is not keeping up with developments on the hardware side. Historical trends suggest that further progress in programmer productivity and programming-language power over the next 10,000₂ years is highly unlikely. With a large percentage of programmers maintaining legacy code, the resources available for innovation are limited. In fact, software innovation will have to come from a five percent fringe of artisans and nontraditional thinkers outside the current programming language and software engineering establishment.


IEEE Computer | 1994

Where is computing headed?

Ted G. Lewis

Technological change is putting entire industries on the betting line. For computer technologists, these shifts can mean opportunity or disappointment as one technology is replaced by another. Therefore, it is important that we consider economic and technical forces when we plan for the future. By studying predictable technology and asking "what if" questions where developments are less certain, we can envision the state of computing in another 10 years.


IEEE Computer | 1996

The next 10,000₂ years. II

Ted G. Lewis

Forecasts technological breakdowns and breakthroughs for the next 16 (10,000 to the base 2) years. Change has always been a part of recent history. Indeed, Earth-shaking change occurs about every 150-200 years. It takes about 50 years to make the transition from the old to the new, and we are nearing the end of just such a 50-year period. Change is caused by both technological breakthroughs and technological breakdowns. In the current 50-year transition, the breakthrough is in networking and software development, and the breakdown is in processor (VLSI) technology. Both forces will propel the high-tech world into a new model of computing by the year 2012. The new model will be based on a networked, global megacomputer that obeys the Gustafson-Barsis speedup law instead of Amdahl's law of parallelism. The next century's information superhighway will actually be a network of cable TV operators, not telephone companies. A new era of programming that eliminates traditional programming languages (and scolds the software engineering community for failure) will arise and lead to a software economy: an electronic commerce dominated by software artisans.
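
The distinction between the two speedup laws is worth making concrete. A minimal sketch contrasting Amdahl's law (fixed problem size) with the Gustafson-Barsis law (problem size scales with processor count); the parallel fraction p = 0.95 and the processor counts are illustrative values, not figures from the paper:

    # Amdahl's law: fixed workload, so the serial fraction (1 - p)
    # caps speedup at 1 / (1 - p) no matter how many processors n.
    def amdahl(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    # Gustafson-Barsis law: workload scales with n, so speedup
    # grows roughly linearly in the processor count.
    def gustafson_barsis(p, n):
        return (1.0 - p) + p * n

    for n in (10, 100, 1000):
        print(n, round(amdahl(0.95, n), 1), round(gustafson_barsis(0.95, n), 1))

With p = 0.95, Amdahl's speedup saturates near 20 while the Gustafson-Barsis figure keeps growing with n, which is why the abstract ties the scaled law to a global megacomputer.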


IEEE Software | 1995

Scheduling in hard real-time applications

Jiang Zhu; Ted G. Lewis; Weldon Jackson; Russel L. Wilson

A major problem with hard real-time systems is how to be assured that they really work. The authors present theorems to extract timing information from a design diagram and then use it to analyze whether a uniprocessor system will meet its deadlines. Their work also involves the development of new graphical languages for the design of hard real-time systems.
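
The paper's own theorems are not reproduced in this abstract. As a stand-in, here is a minimal sketch of a classic uniprocessor feasibility test, the Liu-Layland utilization bound for rate-monotonic scheduling (an assumed example of such a test, not the authors' method):

    # Sufficient (not necessary) schedulability test for n periodic
    # tasks under rate-monotonic scheduling on one processor.
    def rm_feasible(tasks):
        # tasks: list of (computation_time, period) pairs
        n = len(tasks)
        utilization = sum(c / t for c, t in tasks)
        bound = n * (2 ** (1.0 / n) - 1)  # Liu-Layland bound, ~0.693 as n grows
        return utilization <= bound

    # Three illustrative tasks: utilization 0.65 is under the n = 3 bound (~0.78).
    print(rm_feasible([(1, 4), (1, 5), (2, 10)]))  # True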


Communications of the ACM | 2016

Exponential laws of computing growth

Peter J. Denning; Ted G. Lewis

Moore's Law is one small component in an exponentially growing planetary computing ecosystem.
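
The article's point rests on how quickly any fixed doubling period compounds. A toy sketch, using the folk two-year doubling period often attached to Moore's Law (an illustrative figure, not one taken from the article):

    # A quantity that doubles every `period` years grows by
    # 2 ** (years / period); two-year doubling gives ~1024x in 20 years.
    def growth(years, period=2.0):
        return 2 ** (years / period)

    print(growth(20))  # 1024.0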


IEEE Computer | 1995

Where is software headed? A virtual roundtable

Ted G. Lewis; D. Power; Bertrand Meyer; J. Grimes; M. Potel; R. Vetter; P. Laplante; Wolfgang Pree; Gustav Pomberger; M.D. Hill; J.R. Larus; D.A. Wood; B.W. Weide

To find out where software is headed, experts in academia and industry share their vision of software's future. It is a snapshot in time of where we have been and possibly where we are headed. The subjects discussed are: the desktop; software technology; objects; software agents; software engineering; parallel software; and the curriculum. The results suggest a strong polarization within the software community: a chasm exists between academia and industry. It appears that these two groups hold radically different views on where software is headed. The overall impression is a heavy emphasis on programming languages, operating systems, and algorithms in the academic group, in contrast to a clear emphasis on standards and market-leading trends in the industrial group. Academics worry about evolutionary or incremental changes to already poorly designed languages and systems, while industrialists race to keep up with revolutionary changes in everything. Academics are looking for better ideas, industrialists for better tools. To an industrial person, things are moving fast; they are revolutionary. To an academic, things are moving too slowly, and in the wrong direction; they are only evolutionary changes that are slaves to an installed base.


WIT Transactions on State-of-the-art in Science and Engineering | 2012

Model-based Risk Analysis for Critical Infrastructures

Ted G. Lewis; Rudolph P. Darken; Thomas J. Mackin; Donald Dudenhoeffer

This chapter describes a risk-informed decision-making process for analysing and protecting large-scale critical infrastructure and key resource (CI/KR) systems, and a Model-Based Risk Analysis (MBRA) tool for modelling risk, quantifying it, and optimally allocating fixed resources to reduce system vulnerability. MBRA is one of the first tools to adopt a systems approach to risk-informed decision-making. It applies the network science metrics of height, degree, betweenness, and contagiousness to a network of interdependent infrastructure assets across multiple sectors. Resource allocation is applied across entire networks to reduce risk and to determine threat, vulnerability, and consequence values using Stackelberg game theory. MBRA is compared with non-network assessment tools: CARVER, the Maritime Security Risk Analysis Model (MSRAM), and the Knowledge Display and Aggregation System (KDAS), three leading infrastructure analysis tools currently in use by practitioners. MBRA has been used successfully to model a variety of sectors, ranging from water, power, energy, and telecommunications to transportation.
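
MBRA's equations are not given in this abstract. As a rough illustration of the risk-informed allocation it describes, a common formulation in this literature scores each asset as risk = threat x vulnerability x consequence and spends a fixed budget where each unit removes the most risk; the assets, numbers, and linear vulnerability buy-down below are all assumptions, not MBRA's model:

    # Greedy risk-informed budget allocation over a few illustrative assets.
    assets = {  # name: (threat, vulnerability, consequence)
        "substation": (0.8, 0.6, 100.0),
        "pipeline":   (0.5, 0.9, 60.0),
        "switch":     (0.3, 0.4, 200.0),
    }
    budget_units = 5   # discrete spending units
    reduction = 0.1    # assumed vulnerability cut per unit spent

    vuln = {name: tvc[1] for name, tvc in assets.items()}
    for _ in range(budget_units):
        # Spend the next unit where it removes the most risk T * dV * C.
        best = max(vuln, key=lambda a: assets[a][0] * min(reduction, vuln[a]) * assets[a][2])
        vuln[best] = max(0.0, vuln[best] - reduction)

    for name, (t, _, c) in assets.items():
        print(name, "residual risk:", round(t * vuln[name] * c, 1))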


IEEE Computer | 1997

If Java is the answer, what was the question?

Ted G. Lewis

Within a few short years, Java, Java Beans, and everything to do with Java will be pervasive. Java's adoption curve will rival just about everything else in Silicon Valley for living in real time. The technology will burn brightly for a time and then burn itself out. Before that happens, though, Java will be as common as a household mop. Product hype is as much a part of the computer industry as celebrity is an essential part of Hollywood. Excellence often falls victim to PR. In the case of Java, it is particularly difficult to separate the PR from the reality. So your first question might be: is Java really an improvement? Simply put, no. If today's languages are inadequate for today's software engineering challenges, then Java must be inadequate, too. Remember, most of Java is warmed-over C/C++. In spite of its celebrity status, Java lacks many of the features needed to improve the dismal science of software engineering, just like its predecessors. How so, you ask? The author presents an analysis, with a minimum of hype, of the pros and cons of Java.


Cognitive Systems Research | 2013

Cognitive stigmergy: A study of emergence in small-group social networks

Ted G. Lewis

This paper proposes a model and theory of leadership emergence whereby (1) small social groups are modeled as small-world networks and a betweenness metric is shown to be a property of networks with strong leadership, and (2) a theory of group formation based on stigmergy explains how such networks evolve and form. Specifically, dominant actors are observed to emerge from simulations of artificial termites constructing a wood chip network in a random walk, suggesting a correlation between various preferential attachment rules and emergent network topologies. Three attachment rules are studied: maximizing node betweenness (intermediary power), maximizing node degree (node connectivity), and limiting radius (size of the network in terms of network distance). The simulation results suggest that a preference for maximizing betweenness produces networks with structure similar to the 62-node 9-11 terrorist network. Further simulations of emergent networks with small-world properties (small radius) and high betweenness centrality (strong leader) are also shown to match the topological structure of the 9-11 terrorist network. Interestingly, the same properties are not found in a small sampling of human-made physical infrastructure networks such as power grids, transportation systems, water and pipeline networks, suggesting a difference between social network emergence and physical infrastructure emergence. Additionally, a contagion model is applied to random and structured networks to understand the dynamics of anti-leader sentiment (uprisings and counter-movements that challenge the status quo). For random networks, simulated pro-leader (pro-government) and anti-leader (pro-rebel) sentiments are propagated throughout a social network like opposing diseases to determine which sentiment eventually prevails. Simulations of the rise of rebel sentiment versus the ratio of rebel to government sentiment show that rebel sentiment rises at less than a 100% rebel-to-government ratio when government sentiment is high (strong leadership), but requires a ratio greater than 100% when government sentiment is low (weak leadership). However, when applied to the structured 9-11 terrorist network, rebel sentiment is slow to rise against strong leadership, because of the high-betweenness structure of the 9-11 network. These results suggest a theory of how and why human stigmergy evolves networks with strong leaders, and why successful social networks are resilient against anti-leader sentiment. The author concludes that a combination of small-world and high-betweenness structure explains how social networks develop strong leadership and why the resulting networks are resilient against being overthrown by a dissenting majority.
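
The attachment rules the paper simulates are easy to sketch. A minimal sketch (standing in for the paper's termite simulation, which is not reproduced here) of the betweenness-preferential rule, where each newcomer links to the node with the highest betweenness centrality; it assumes the networkx library is available:

    import networkx as nx

    # Grow a network where each new node attaches to the current
    # top intermediary, i.e. the node with maximum betweenness.
    G = nx.path_graph(3)  # small seed network on nodes 0, 1, 2
    for new_node in range(3, 20):
        bc = nx.betweenness_centrality(G)
        hub = max(bc, key=bc.get)   # the "strong leader" node
        G.add_edge(new_node, hub)

    bc = nx.betweenness_centrality(G)
    print(round(max(bc.values()), 2))  # one node dominates the network

Under this rule a single hub accumulates nearly all intermediary power, the strong-leader topology the paper associates with the 9-11 network.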

Collaboration


Dive into Ted G. Lewis's collaborations.

Top Co-Authors

Rudy Darken, Naval Postgraduate School
Hesham El-Rewini, University of Nebraska Omaha
Jiang Zhu, Oregon State University
Lihua Zhao, Oregon State University
Thomas J. Mackin, California Polytechnic State University
Leslie Marsh, University of British Columbia
Alex Mayberry, Naval Postgraduate School
Don Brutzman, Naval Postgraduate School