Publication


Featured research published by George V. Meghabghab.


Archive | 2008

Search Engines, Link Analysis, and User's Web Behavior: A Unifying Web Mining Approach

George V. Meghabghab; Abraham Kandel

Contents: Basic WWW Technologies; Web Graphs; Link Analysis of Web Graphs; PageRank Algorithm Applied to Web Graphs; Stochastic Simulations of Search Engines; Modeling Human Behavior on the Web.
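The chapter list above names link analysis and the PageRank algorithm. As a rough illustration of the latter (not code from the book), a minimal power-iteration sketch over a toy adjacency-list Web graph might look like this; the damping factor of 0.85, the dangling-page handling, and the toy graph are assumptions made for the example:

```python
# Minimal PageRank power iteration over a toy Web graph (illustrative only).
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for p, outgoing in links.items():
            if outgoing:
                share = damping * rank[p] / len(outgoing)
                for q in outgoing:
                    new_rank[q] += share
            else:
                # Dangling page: spread its rank uniformly over all pages.
                for q in pages:
                    new_rank[q] += damping * rank[p] / n
        rank = new_rank
    return rank

if __name__ == "__main__":
    toy_graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
    print(pagerank(toy_graph))
```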


Information Processing and Management | 2001

Iterative radial basis functions neural networks as metamodels of stochastic simulations of the quality of search engines in the World Wide Web

George V. Meghabghab

Stochastic simulation has been very effective in many domains but never applied to the WWW. This study is a first attempt at using neural networks in the stochastic simulation of the number of rejected Web pages per search query. The evaluation of the quality of search engines should involve not only the resulting set of Web pages but also an estimate of the rejected set of Web pages. The iterative radial basis functions (RBF) neural network developed by Meghabghab and Nasr [Iterative RBF neural networks as meta-models for stochastic simulations, in: Second International Conference on Intelligent Processing and Manufacturing of Materials, IPMM’99, Honolulu, Hawaii, 1999, pp. 729–734] was adapted to the actual evaluation of the number of rejected Web pages on four search engines, i.e., Yahoo, Alta Vista, Google, and Northern Light. Nine input variables were selected for the simulation: (1) precision, (2) overlap, (3) response time, (4) coverage, (5) update frequency, (6) Boolean logic, (7) truncation, (8) word and multi-word searching, and (9) portion of the Web pages indexed. Typical stochastic simulation meta-modeling uses regression models in response surface methods. RBF networks are a natural candidate for such an attempt because they use a family of surfaces, each of which naturally divides an input space into two regions, X+ and X−, and each of the n test patterns is assigned to either class X+ or X−. This technique divides the resulting set of responses to a query into accepted and rejected Web pages. To test the hypothesis that the evaluation of any search engine query should involve an estimate of the number of rejected Web pages as part of the evaluation, the RBF meta-model was trained on 937 examples from a set of 9000 different simulation runs on the nine input variables. Results show that two of the variables, response time and portion of the Web indexed, can be eliminated without affecting evaluation results. Results also show that the number of rejected Web pages for a specific set of search queries on these four engines is very high. Moreover, a goodness measure of a search engine for a given set of queries can be designed as a function of the coverage of the search engine and the normalized age of a new document in the result set for the query. This study concludes that unless search engine designers address the issues of rejected Web pages, indexing, and crawling, the usage of the Web as a research tool for academic and educational purposes will remain hindered.
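The paper's iterative RBF procedure is not reproduced here, but a generic radial basis function classifier over nine input variables conveys the basic idea of dividing the input space into an accepted region X+ and a rejected region X−. The Gaussian basis, randomly chosen centers, least-squares readout, and synthetic labels below are assumptions for the sketch, not the authors' training method:

```python
import numpy as np

def gaussian_rbf(X, centers, width):
    """Gaussian basis activations: one column per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def train_rbf_classifier(X, y, n_centers=20, width=1.0, seed=0):
    """Fit a linear readout on RBF features by least squares.
    X: (n_samples, 9) simulation inputs; y: +1 accepted (X+), -1 rejected (X-)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_centers, replace=False)]
    H = gaussian_rbf(X, centers, width)
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return centers, width, w

def predict(X, centers, width, w):
    return np.sign(gaussian_rbf(X, centers, width) @ w)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(937, 9))          # e.g. 937 training examples, 9 input variables
    y = np.sign(X[:, 0] + 0.5 * X[:, 3])   # synthetic accept/reject labels
    centers, width, w = train_rbf_classifier(X, y)
    print("training accuracy:", (predict(X, centers, width, w) == y).mean())
```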


Modeling, Analysis and Simulation of Computer and Telecommunication Systems | 2000

Stochastic simulations of rejected World Wide Web pages

George V. Meghabghab

Studies the use of neural networks in a stochastic simulation of the number of rejected Web pages per search query. The evaluation of the quality of search engines should involve not only the resulting set of Web pages but also an estimate of the rejected set of pages. The iterative radial basis function (RBF) neural network developed by G. Meghabghab and G. Nasr (1999) was adapted to an actual evaluation of the number of rejected Web pages on four search engines, viz. Yahoo, Alta Vista, Google and Northern Light. Nine input variables were selected for the simulation. Typical stochastic simulation meta-models use regression models in response surface methods. An RBF divides the resulting set of responses to a query into accepted and rejected Web pages. The RBF meta-model was trained on 937 examples from a set of 9,000 different simulation runs on nine input variables. The results show that the number of rejected Web pages for a specific set of search queries on these four engines is very high. Also, a goodness measure of a search engine for a given set of queries can be designed which is a function of the coverage of the search engine and the normalized age of a new document in the resulting set for the query. This study concludes that, unless search engine designers address the issues of rejected Web pages, indexing and crawling, the usage of the Web as a research tool for academic and educational purposes will remain hindered.
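The meta-modeling workflow described here, stated abstractly, is: run many stochastic simulations, fit a cheap surrogate on a small subset of the runs, and use the surrogate in place of further runs. The toy simulator of rejected-page counts, the linear surrogate (standing in for the iterative RBF), and the exact 937-of-9,000 split below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_rejected_pages(inputs):
    """Toy stochastic simulator: rejected-page count per query as a noisy
    function of nine engine characteristics (purely illustrative)."""
    weights = np.array([-3.0, 1.0, 0.2, -2.5, -1.0, -0.5, -0.5, -0.8, -2.0])
    base = 50 + inputs @ weights
    return np.maximum(0, base + rng.normal(scale=5.0, size=len(inputs)))

# 9,000 simulation runs over nine input variables (precision, overlap, ...).
X = rng.uniform(0.0, 1.0, size=(9000, 9))
y = simulate_rejected_pages(X)

# Fit a cheap surrogate on 937 runs (a linear model here, standing in for the
# iterative RBF meta-model), then check it on the held-out runs.
train = rng.choice(9000, size=937, replace=False)
test = np.setdiff1d(np.arange(9000), train)
A = np.c_[X[train], np.ones(len(train))]
coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
pred = np.c_[X[test], np.ones(len(test))] @ coef
print("held-out RMSE:", np.sqrt(np.mean((pred - y[test]) ** 2)))
```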


Systems, Man and Cybernetics | 2001

Fuzzy cognitive state map vs. Markovian modeling of user's web behavior

George V. Meghabghab

The vast majority of college students have been reared as researchers in an environment where boundaries for information have been clearly marked, i.e., that of books and paper text. Increasingly, however, they are called upon to perform tasks in an environment that is not clearly bounded, that of hyperspace. How do we learn to surf in this unfamiliar medium? What strategies do people use when surfing through the unbounded space of hyperlinks, or the World Wide Web (WWW)? In order to effectively teach students new surfing skills, we must be able to understand how neophyte web users form the cognitive neurological networks that result in a mental pathway, or cognitive map, that makes more navigable the route to further information as well as the information they set out to find. A Markovian model of users' behavior is introduced and compared to a fuzzy cognitive map (FCM) that represents the opinions of experts on how users surf the web. Experts are divided on what causes users' queries on the web to fail. This paper shows that a viable FCM model can be developed, and some limit-cycle equilibria are uncovered. An FCM limit cycle repeats a sequence of events and actions. Limit cycles can reveal cognitive and behavioral patterns of users on the web. An adaptive FCM is built to reflect its causal behavior in time. This change reflects users' behavior as their knowledge of the web increases with time. The causal behavior learns from data: users learn new patterns and reinforce old ones.
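A fuzzy cognitive map of the kind described can be run as a simple iterated map: concepts form a state vector, expert opinions become signed edge weights, and the state is thresholded at each step until it reaches a fixed point or starts repeating (a limit cycle). The concept names, weight matrix, and bivalent threshold in this sketch are invented for illustration and are not the experts' map from the paper:

```python
import numpy as np

def run_fcm(W, state, max_steps=50):
    """Iterate a fuzzy cognitive map with a bivalent (0/1) threshold and report
    whether the trajectory settles into a fixed point or a limit cycle."""
    seen = {}
    for step in range(max_steps):
        key = tuple(state)
        if key in seen:
            period = step - seen[key]
            label = "fixed point" if period == 1 else f"limit cycle of period {period}"
            return label, state
        seen[key] = step
        state = (W.T @ state > 0).astype(int)   # threshold activation of each concept
    return "no repetition within max_steps", state

if __name__ == "__main__":
    # Hypothetical concepts (not the experts' map from the paper):
    # C0 "too many hits", C1 "user narrows the query",
    # C2 "too few hits",  C3 "user broadens the query".
    # W[i, j] = +1 means concept i promotes concept j.
    W = np.array([
        [0, 1, 0, 0],
        [0, 0, 1, 0],
        [0, 0, 0, 1],
        [1, 0, 0, 0],
    ])
    print(run_fcm(W, np.array([1, 0, 0, 0])))   # uncovers a period-4 limit cycle
```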


Systems, Man and Cybernetics | 1992

Hierarchical analysis of visual motion

George V. Meghabghab; Abraham Kandel

A hierarchical data structure for analyzing visual motion is presented. Although the literature on perception is abundant with studies on visual motion, none of the studies investigated the importance of a hierarchical model in the analysis of visual motion. The model was implemented on a supercomputer (Cyber 205). The algorithms of hierarchical correlation were performed on binary images. The results are compared with those obtained using similar serial algorithms. The impact of such a hierarchy on component directional selectivity and on pattern directional selectivity is studied.
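One way to picture hierarchical correlation on binary images is a coarse-to-fine search over an image pyramid, with exhaustive correlation at each level; the sketch below follows that idea. The 2x2 downsampling, pyramid depth, and search radius are assumptions, and the paper's Cyber 205 parallel implementation is of course not reproduced:

```python
import numpy as np

def downsample(img):
    """Halve a binary image by 2x2 max-pooling (a pixel is set if any child is)."""
    h, w = img.shape
    return img[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def best_shift(a, b, center, radius):
    """Exhaustive correlation: shift of b (searched around 'center') that best overlaps a."""
    best, best_score = center, -1
    for dy in range(center[0] - radius, center[0] + radius + 1):
        for dx in range(center[1] - radius, center[1] + radius + 1):
            shifted = np.roll(np.roll(b, dy, axis=0), dx, axis=1)
            score = np.sum(a & shifted)
            if score > best_score:
                best, best_score = (dy, dx), score
    return best

def hierarchical_motion(frame0, frame1, levels=3, radius=2):
    """Estimate a global (dy, dx) displacement from frame0 to frame1,
    coarse to fine over an image pyramid."""
    pyr0, pyr1 = [frame0], [frame1]
    for _ in range(levels - 1):
        pyr0.append(downsample(pyr0[-1]))
        pyr1.append(downsample(pyr1[-1]))
    shift = (0, 0)
    for f0, f1 in zip(reversed(pyr0), reversed(pyr1)):
        shift = (2 * shift[0], 2 * shift[1])        # propagate the coarser estimate
        shift = best_shift(f1, f0, shift, radius)   # refine at this resolution
    return shift

if __name__ == "__main__":
    frame0 = np.zeros((64, 64), dtype=int)
    frame0[20:30, 20:30] = 1                        # a moving square
    frame1 = np.roll(np.roll(frame0, 5, axis=0), 3, axis=1)
    print(hierarchical_motion(frame0, frame1))      # expect (5, 3)
```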


Information Sciences | 1992

Motion analysis with orientational filtering

George V. Meghabghab; Abraham Kandel

The primate visual system recognizes the true direction of pattern motion using local detectors that are only capable of detecting the component of motion perpendicular to the orientation of the moving edge. A multilayered model is presented with input patterns (binary images), each consisting of rigid geometrical forms moving in a particular direction. Input layers are given component orientation and frequency similar to those recorded in visual area V1, which projects to area MT. The interaction between two consecutive layers of the multilayered model seems to play an important role in solving the aperture problem.
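The aperture problem mentioned here can be made concrete with a small intersection-of-constraints calculation: each oriented detector measures only the velocity component along its preferred normal direction, and the pattern velocity is the vector consistent with all of those components. The example numbers and the least-squares combination below are illustrative assumptions, not the multilayered model itself:

```python
import numpy as np

def recover_pattern_velocity(normals, component_speeds):
    """Intersection of constraints: find the velocity v such that n_i . v = s_i
    for every local (component) measurement, in the least-squares sense."""
    N = np.asarray(normals, dtype=float)        # one unit normal per row
    s = np.asarray(component_speeds, dtype=float)
    v, *_ = np.linalg.lstsq(N, s, rcond=None)
    return v

if __name__ == "__main__":
    true_v = np.array([2.0, 1.0])               # pattern moving right and up
    # Edges at different orientations; each detector sees only the component
    # of motion along its normal (the aperture problem).
    normals = [[1.0, 0.0], [0.0, 1.0], [np.sqrt(0.5), np.sqrt(0.5)]]
    speeds = [n @ true_v for n in np.array(normals)]
    print(recover_pattern_velocity(normals, speeds))  # ~ [2. 1.]
```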


Information Processing and Management | 1994

INN: an Intelligent Negotiating Neural Network for information systems: a design model

George V. Meghabghab; Dania B. Meghabghab


Journal of the Association for Information Science and Technology | 1991

Application of information theory to query negotiation: toward an optimal questioning strategy

George V. Meghabghab; D. Bilal


The Florida AI Research Society | 1999

Enhanced Simulated Annealing Techniques for Multiprocessor Scheduling

George E. Nasr; A. Harb; George V. Meghabghab


Collaboration


Dive into George V. Meghabghab's collaborations.

Top Co-Authors

Abraham Kandel

University of South Florida

D. Bilal

University of Virginia

George E. Nasr

Lebanese American University
