Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Simon Y. Berkovich is active.

Publication


Featured researches published by Simon Y. Berkovich.


2013 Fourth International Conference on Computing for Geospatial Research and Application | 2013

Internet of Things as a Methodological Concept

Nima Bari; Ganapathy Mani; Simon Y. Berkovich

Nowadays, we are witnessing the formation of a new technological marvel: the Internet of Things. This construction is able to combine into a single operational entity all the bits and pieces of the world around us. Thus, why could this unique establishment not present the long-sought essence in the Nature of Things? The two pillars of modern fundamental science, relativity and quantum mechanics, are just approximate descriptions of some properties of such a constructive possibility. The machinery of the physical world develops on a cellular automaton model that employs, as its transformation rule, a mechanism of distributed mutual synchronization with the property of fault tolerance. This infrastructure yields traveling-wave solutions that exactly correspond to the spectrum of the stable elementary particles of matter, with an upper bound on the propagation speed. On top of the considered cellular automaton infrastructure there appears a secondary formation that constitutes the mechanism of the Holographic Universe, which is the basis for the Internet of Things. The holographic activities determine all the quantum-mechanical properties of the physical world, including nonlocal entanglement. For living systems, the arrangement of the Internet of Things elucidates the most puzzling biological capability, morphogenesis, which otherwise cannot find any reasonable explanation. In this paper, we present the worldview of the Internet of Things and the application of this methodology from geospatial computing to physics. We give specific details on applying the IoT concept to geospatial analysis in various fields, from agriculture to medicine. We also provide a detailed analysis of the profound impact of the Internet of Things on our physical world, which is vital knowledge when it comes to geospatial research. Finally, we present calendar variations of the quantum world that can be used for geospatial data gathering by fine-tuning equipment based on the time of the year.


International Conference and Exhibition on Computing for Geospatial Research & Application | 2010

A new multi-core pipelined architecture for executing sequential programs for parallel geospatial computing

Duoduo Liao; Simon Y. Berkovich

Parallel programming on multi-core processors has become the industry's biggest software challenge. This paper proposes a novel parallel architecture for executing sequential programs using multi-core pipelining based on program slicing, supported by a new memory/cache dynamic-management technology. The architecture is well suited to processing large geospatial data in parallel without parallel programming. It addresses the problem of having to relocate data from one memory hierarchy to another in a multi-core environment: a new memory-management technology inserts a layer of abstraction between the processor and the memory hierarchy, allowing data to stay in one place while the processor effectively migrates as tasks change. The architecture makes full use of the pipeline, automatically partitioning data and scheduling it onto multiple cores through the pipeline. Its most important advantage is that most existing sequential programs can be used directly with nearly no change, unlike conventional parallel programming, which must take scheduling, load balancing, and data distribution into account. The new parallel architecture can also be applied to other multi-core/many-core architectures or heterogeneous systems. The design of the new multi-core architecture is described in detail, and its time complexity and performance are analyzed in depth. Experimental results and a performance comparison with existing multi-core architectures demonstrate the effectiveness, flexibility, and versatility of the new architecture, in particular for parallel processing of large geospatial data, with the examples of Digital Elevation Model (DEM) generation from Light Detection and Ranging (LiDAR) datasets.
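The "data stays in place while the work migrates" idea can be sketched in a few lines (our illustration with assumed names, not the paper's architecture): each data slice remains with one worker for its whole lifetime, and the successive pipeline stages are applied to the slice where it lives, instead of the slice being shipped between stage-dedicated cores.

```python
from concurrent.futures import ThreadPoolExecutor

def pipeline_in_place(data_slices, stages):
    """Each slice stays with one worker; the pipeline stages migrate to
    the data rather than the data being relocated between cores."""
    def run_all_stages(slice_):
        for stage in stages:
            slice_ = stage(slice_)   # the stage runs where the slice lives
        return slice_
    with ThreadPoolExecutor() as pool:
        return list(pool.map(run_all_stages, data_slices))
```

For example, applying an increment stage followed by a doubling stage to the slices `[1, 2]` and `[3]` yields `[4, 6]` and `[8]`, with no slice ever leaving its worker.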


Software - Practice and Experience | 2000

A bit-counting algorithm using the frequency division principle

Simon Y. Berkovich; Gennadi M. Lapir; Marilyn Mack

The paper addresses the omnipresent problem of bit counting. This problem is of particular importance for information systems, where the choice of a rational access strategy may require repeated evaluation of the cardinalities of retrieved sets of data items. Several methods are available to implement this procedure, involving shifting, table look-up, exploiting the properties of fixed-point arithmetic, and manipulations with bitwise logical operations. This paper presents a novel approach to the organization of bit counting based on the principle of frequency division (FD). The developed algorithm emulates a set of 32 binary counters using the bit-parallelism of computer word operations. The overflowing bits generated by these counters at a lower frequency are processed with the arithmetic-logic method, which is most efficient for sparse binary vectors. The suggested FD procedure is one of the fastest among the known, widely available procedures for bit counting. In future computers, with 64-bit words and larger, the gain in speed due to the FD technique will be higher, and the performance of this software could be comparable to that of specialized hardware.
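The frequency-division scheme can be sketched with bit-sliced "vertical" counters: one small counter per bit position of the word, stored across a few word-wide planes, with the rarer overflow bits handled by the sparse arithmetic-logic method (here Kernighan's `x &= x - 1` loop). A minimal sketch of the idea, not the paper's tuned implementation:

```python
def popcount_sparse(x):
    """Kernighan's method: one iteration per set bit, best for sparse words."""
    n = 0
    while x:
        x &= x - 1
        n += 1
    return n

def bitcount_fd(words, planes=4):
    """Count set bits in a stream of 32-bit words with bit-sliced counters:
    counters[i] holds bit i of 32 parallel counters, one per bit position.
    Overflow bits leave the counters 2**planes times less often and are
    counted with the sparse method."""
    counters = [0] * planes
    total = 0
    for w in words:
        carry = w & 0xFFFFFFFF
        for i in range(planes):
            # bit-sliced increment: ripple the carry through the planes
            counters[i], carry = counters[i] ^ carry, counters[i] & carry
        total += popcount_sparse(carry) << planes   # rare overflow bits
    for i in range(planes):
        total += popcount_sparse(counters[i]) << i  # flush residual counts
    return total
```

The inner loop costs a handful of word operations per input word regardless of how many bits are set; only the overflow stream, which arrives at 1/2^planes of the input frequency, pays the per-bit cost.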


ACM Transactions on Computer Systems | 1984

A computer communication technique using content-induced transaction overlap

Simon Y. Berkovich; Colleen Roe Wilson

A new technique for multiaccess computer communication is presented. Data is sent as the decentralized preorder traversal of its binary tree representation using Content-Induced Transaction Overlap (CITO). This communication protocol is analogous to an algorithm for ordered retrieval in associative memory. The CITO technique eliminates the inherent redundancy of sequenced transmission, which can result in data compression, and it exhibits simultaneously good performance in throughput, delay, and stability. Several enhancements unique to this technique make it particularly suitable for computer communication systems whose stations are in close proximity. CITO relies on synchronization at the bit level; hence, propagation delay and bit transmission rate must be considered as complementary constraints in applications.
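The prefix-sharing effect can be illustrated with a small simulation (our sketch, not the paper's protocol machinery: real CITO resolves the traversal through bitwise channel contention among stations, and decoding requires tree-structure delimiters omitted here). Messages that share a prefix contribute that prefix to the channel only once, because the channel output is the preorder traversal of their common binary trie:

```python
def cito_preorder(messages):
    """Emit the preorder traversal of the binary trie of the messages;
    shared prefixes appear once, which is the source of compression."""
    trie = {}
    for m in messages:
        node = trie
        for bit in m:
            node = node.setdefault(bit, {})
    out = []
    def walk(node):
        for bit in sorted(node):      # '0' branch first, as in ordered retrieval
            out.append(bit)
            walk(node[bit])
    walk(trie)
    return "".join(out)
```

For the messages `000`, `001`, `011` (9 raw bits), the traversal transmits only 6 bits, since the shared prefixes `0` and `00` each go out once.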


IEEE Transactions on Communications | 1990

The compression effects of the binary tree overlapping method on digital imagery

Louis J. Dimento; Simon Y. Berkovich

A new method of digital image compression called binary tree overlapping (BTO) is described. With this method, an image is divided into bit planes that are transformed into a binary tree representation in which identical initial portions of different lines in the bit planes are transmitted as one path in the tree. To increase compression, distortion can be introduced by treating two lines that differ in fewer than a preset number of bit positions as identical. A time-efficient implementation of BTO based on a communication technique called content-induced transaction overlap is outlined. With this technique, only simple logical operations are needed, and the compression time is proportional to the number of bits in the compressed image. Simulation studies indicate that BTO produces good-quality images at a compression ratio of about three. In terms of implementation and compression efficiency, BTO is comparable to first-order DPCM.
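The lossy step, treating nearly identical lines as identical so that they share a full trie path, can be sketched as follows (an assumed simplification with names of our choosing; the paper's actual merge criterion and trie encoding are more involved):

```python
def bto_merge_rows(rows, max_diff=2):
    """Replace a bit-plane row with the previously kept row whenever the
    two differ in fewer than max_diff bit positions; merged rows then
    compress to a single trie path."""
    kept = []
    for row in rows:
        if kept and sum(a != b for a, b in zip(kept[-1], row)) < max_diff:
            kept.append(kept[-1])   # close enough: reuse the previous line
        else:
            kept.append(row)
    return kept
```

Raising `max_diff` trades image fidelity for longer shared prefixes and hence a higher compression ratio.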


IEEE Annals of the History of Computing | 2006

MESM and the Beginning of the Computer Era in the Soviet Union

Anne Fitzpatrick; Tatiana Kazakova; Simon Y. Berkovich

The MESM (Malaya Elektronnaya Schetnaya Mashina, "small electronic computing machine") was the first computer built in the USSR. This history of early Soviet computers examines the technical characteristics of the machines and the background of Soviet computer development.


Microprocessors and Microsystems | 1998

A combinatorial architecture for instruction-level parallelism

Efraim Berkovich; Simon Y. Berkovich

The work presents a new principle for microprocessor design based on a pairwise-balanced combinatorial arrangement of processing and memory elements. The proposed apparatus uses two-operand instructions, so that the set of executable machine instructions is partitioned by these address pairs. This partitioning allows concurrent processing of data-independent instructions. Because the partitioning is done at compile time, the design extracts substantial instruction-level parallelism from executable code without the overhead of run-time methods. Sequential consistency of the concurrent execution of instructions, including indirect addressing and conditional jumps, is ensured by inserted directives and queue regulation. Generating executable code requires only minor adjustments to a standard compiler. The hardware is built of regular modular components, and the design provides a straightforward division of labor among the different functional units. The suggested combinatorial architecture offers a family of constructions with various degrees of performance enhancement.
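The compile-time idea, issuing consecutive two-operand instructions together whenever their address pairs are disjoint, can be sketched as a greedy grouping pass (a strong simplification of the paper's pairwise-balanced design; the function and instruction encoding are our assumptions):

```python
def partition_ilp(instrs):
    """Greedy pass over (op, addr_a, addr_b) tuples: an instruction joins
    the current issue group iff its address pair is disjoint from every
    pair already in the group, so group members are data-independent
    and can execute concurrently."""
    groups = [[]]
    for op, a, b in instrs:
        if any(not {a, b}.isdisjoint({x, y}) for _, x, y in groups[-1]):
            groups.append([])       # conflict: start a new issue group
        groups[-1].append((op, a, b))
    return groups
```

Because grouping respects program order, an instruction never issues before one it depends on; the groups correspond to cycles in which the partitioned functional units work in parallel.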


Mechanisms of Ageing and Development | 1985

Probability of monozygotic twinning as a reflection of the genetic control of cell development

Simon Y. Berkovich; Sherman Bloom

The stability of the incidence of monozygotic twinning (MZT) suggests that its origin is genetically, rather than environmentally, controlled. Available data, though scant, support our hypothesis that the MZT probability is (1/2)^K, where K is a species-specific integer parameter. For humans, MZT occurs in about four of 1000 births, which is close to one occurrence in 2^8 births, i.e. K = 8. Environmental factors are not the cause of MZT, but may influence its expression; when this influence is in effect under some extreme experimental conditions, the above form of the MZT probability is still observed. The binary structure of the MZT probability provides insight into the genetic control mechanism of cell division.
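The quoted human rate follows from the hypothesis with one line of arithmetic (a simple sanity check of the abstract's numbers, not from the paper):

```python
# With the species-specific parameter K = 8, the predicted MZT
# probability is (1/2)**8 = 1/256, matching "about four of 1000 births".
K = 8
rate_per_1000 = (0.5 ** K) * 1000
print(rate_per_1000)   # 3.90625 occurrences per 1000 births
```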


IEEE International Conference on Multimedia Big Data | 2015

Novel Metaknowledge-Based Processing Technique for Multimedia Big Data Clustering Challenges

Nima Bari; Roman Vichr; Kamran Kowsari; Simon Y. Berkovich

Past research has challenged us with the task of showing relational patterns between text-based data and then clustering for predictive analysis using the Golay code technique. Here we focus on a novel approach to extracting metaknowledge from multimedia datasets. Our collaboration has been an ongoing study of the relational patterns between data points based on meta-features extracted from metaknowledge in multimedia datasets; the features selected are those suited to the mining technique we apply, the Golay code algorithm. In this paper we summarize our findings on optimizing the metaknowledge representation as a 23-bit encoding of structured and unstructured multimedia data, so that it can be processed by the 23-bit Golay code for cluster recognition.
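The 23-bit choice ties the template to the perfect binary Golay (23,12,7) code: every 23-bit vector lies within Hamming distance 3 of exactly one of the 4096 codewords, so nearest-codeword assignment partitions the entire feature space into clusters with no ambiguity. A brute-force sketch (the generator polynomial is a standard one for this code; the clustering wrapper is our illustration, not the authors' implementation):

```python
# A standard generator polynomial of the (23,12) binary Golay code:
# x^11 + x^9 + x^7 + x^6 + x^5 + x + 1
G = 0b101011100011

def polymul_gf2(a, b):
    """Carry-less multiplication of polynomials over GF(2)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

# All 2**12 = 4096 codewords are the GF(2) multiples of the generator.
CODEWORDS = [polymul_gf2(m, G) for m in range(1 << 12)]

def golay_cluster(v):
    """Assign a 23-bit feature vector to its nearest codeword; since the
    code is perfect with covering radius 3, that codeword is unique."""
    return min(CODEWORDS, key=lambda c: bin(c ^ v).count("1"))
```

Corrupting any codeword in up to three bit positions still maps back to the same codeword (the minimum distance is 7), which is what makes the code usable as a fixed, hash-like cluster assignment for 23-bit meta-feature vectors.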


International Conference on Data Science | 2014

23-bit metaknowledge template towards Big Data knowledge discovery and management

Nima Bari; Roman Vichr; Kamran Kowsari; Simon Y. Berkovich

The global influence of Big Data is not only growing but seemingly endless. The trend is leaning toward knowledge that is attained easily and quickly from massive pools of Big Data. Today we are living in the technological world that Dr. Usama Fayyad and his distinguished research fellows predicted nearly two decades ago in their introductory explanations of Knowledge Discovery in Databases (KDD) [1]. Indeed, they were precise in their outlook on Big Data analytics. The continued improvement of the interoperability of machine learning, statistics, and database building and querying has fused to create an increasingly popular science: Data Mining and Knowledge Discovery. Next-generation computational theories are geared toward extracting insightful knowledge from even larger volumes of data at higher speeds. As the trend grows in popularity, a highly adaptive solution for knowledge discovery will be necessary. In this paper, we introduce the investigation and development of 23 bit-questions for a metaknowledge template for Big Data processing and clustering purposes. This research aims to demonstrate the construction of the methodology and the validity and benefits it brings to knowledge discovery from Big Data.

Collaboration


Dive into Simon Y. Berkovich's collaborations.

Top Co-Authors

Duoduo Liao, George Washington University
Nima Bari, George Washington University
Ganapathy Mani, George Washington University
Maryam Yammahi, George Washington University
Kamran Kowsari, George Washington University
Marilyn Mack, Goddard Space Flight Center
Adi Alhudhaif, George Washington University
Chen Shen, George Washington University
Tong Yan, George Washington University
Alexander Kuznetsov, George Washington University