Sergey Frenkel
Russian Academy of Sciences
Publications
Featured research published by Sergey Frenkel.
Journal of the Optical Society of America A: Optics, Image Science, and Vision | 2013
Eyal Cohen; Shlomi Dolev; Sergey Frenkel; Boris Kryzhanovsky; Alexandr Palagushkin; Michael Rosenblit; Victor N. Zakharov
We present an optical computing system for solving NP-hard problems. As nano-optical computing is a promising avenue for the next generation of computers performing parallel computations, we investigate the application of submicron, or even subwavelength, computing-device designs. The system uses a setup of exponential-size masks, of exponential space complexity, produced in polynomial-time preprocessing. The masks are later used to solve the problem in polynomial time. The size of the masks is reduced to nanoscale density. Simulations were carried out to choose a proper design, and actual implementations show the feasibility of such a system.
IEEE International Conference on the Science of Electrical Engineering | 2016
Shlomi Dolev; Sergey Frenkel; Michael Rosenblit; Ram Prasadh Narayanan; K Muni Venkateswarlu
The behavior of biological collaborative systems is fascinating and has urged researchers to mimic it with programmable matter. Such matter constitutes a particle system in which particles bind with their neighbors to swarm and navigate. A caterpillar-swarm-inspired particle system involves a layered architecture with a predefined number of layers. In this work we discuss a coordinated layered particle system inspired by caterpillar swarms. We first propose a novel design for producible nano-particles that use electrodes to harvest electricity from blood serum, energy that can later be used for swarming, inter/outer communication, coordination, sensing, and acting according to an instructing program. The benefit of moving and acting in a swarm is demonstrated by a design for telescopic movement in pipes (e.g., blood vessels), wherein each layer uses the accumulated speed of all layers below it and thus moves faster, mimicking the faster motion of a caterpillar swarm.
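A small numerical sketch of the telescopic-speed argument, assuming every layer crawls at the same relative speed over the layer directly beneath it; the speed value and layer count below are illustrative assumptions, not figures from the paper.

```python
# Minimal sketch (hypothetical numbers): ground speed of each layer in a
# telescopic, caterpillar-like stack, assuming every layer crawls at the
# same relative speed over the layer directly beneath it.

RELATIVE_SPEED = 1.0   # speed of a layer relative to the layer below (arbitrary units)
NUM_LAYERS = 4         # assumed number of layers in the stack

def ground_speeds(num_layers: int, v: float) -> list[float]:
    """Layer i moves at v relative to layer i-1, so its ground speed
    accumulates the speeds of all layers below it."""
    speeds = []
    below = 0.0  # the pipe wall itself does not move
    for _ in range(num_layers):
        current = below + v
        speeds.append(current)
        below = current
    return speeds

if __name__ == "__main__":
    for i, s in enumerate(ground_speeds(NUM_LAYERS, RELATIVE_SPEED)):
        print(f"layer {i}: ground speed = {s}")
    # The topmost layer moves NUM_LAYERS times faster than a single particle.
```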
IEEE International Symposium on Network Computing and Applications | 2014
Shlomi Dolev; Sergey Frenkel; Marina Kopeetsky
Self-organization is based on adaptivity. Adaptivity should start with the most basic, fundamental communication tasks, such as encoding the information to be transmitted or stored. Obviously, the less signal is transmitted, the less energy transmission consumes. In this paper we present a novel on-line, entropy-adaptive compression scheme for streaming inputs of unbounded length. The scheme extends the sliding-window-dictionary Lempel-Ziv compression, is adaptive, and is tailored to on-line compression of inputs with non-stationary entropy. Specifically, the window-dictionary size is changed adaptively to fit the best current compression rate for the input. The on-line Entropy Adaptive Compression scheme (EAC), introduced and analyzed in this paper, examines all possible sliding-window sizes over the next input portion and chooses the window size that implies the best compression ratio for that portion. The size found is then used in the actual compression of this portion. We suggest an adaptive encoding scheme, which optimizes the parameters block by block, and base the compression performance on the optimality proof of the Lempel-Ziv algorithm when applied to blocks. The EAC scheme was tested over files of different types (docx, ppt, jpeg, xls) and over synthesized files generated as segments of homogeneous Markov chains. Our experiments demonstrate that the EAC scheme typically provides a higher compression ratio than LZ77 does, when examined in the scope of on-line per-block compression of transmitted (or compressed) files.
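As a rough illustration of the per-block window-size search (not the authors' EAC implementation), the sketch below uses zlib's DEFLATE, an LZ77 variant whose wbits parameter (9 to 15) selects a 512 B to 32 KB sliding window; the block size and candidate windows are assumptions made for the example.

```python
# Sketch: for each block, try several sliding-window sizes and keep the one
# that compresses best, mimicking the EAC per-portion window selection.
import zlib

BLOCK_SIZE = 64 * 1024          # assumed block size (not from the paper)
CANDIDATE_WBITS = range(9, 16)  # sliding windows of 2**9 .. 2**15 bytes

def compress_block(block: bytes, wbits: int) -> bytes:
    co = zlib.compressobj(level=9, method=zlib.DEFLATED, wbits=wbits)
    return co.compress(block) + co.flush()

def eac_like_compress(data: bytes) -> list[tuple[int, bytes]]:
    """For each block, pick the window size giving the shortest output,
    then emit (window, compressed block) pairs."""
    out = []
    for start in range(0, len(data), BLOCK_SIZE):
        block = data[start:start + BLOCK_SIZE]
        best_w = min(CANDIDATE_WBITS, key=lambda w: len(compress_block(block, w)))
        out.append((best_w, compress_block(block, best_w)))
    return out
```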
OSC'10: Proceedings of the Third International Conference on Optical SuperComputing | 2010
Eyal Cohen; Shlomi Dolev; Sergey Frenkel; Rami Puzis; Michael Rosenblit
We present a design of a micro-optical architecture for solving instances of NP-hard problems using nano-technology. The architecture uses pre-processed masks to block some of the light propagating through them. We demonstrate how such a device can be used to solve instances of the Hamiltonian cycle and Permanent problems.
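As a software analogue of the mask-based filtering described above (an illustrative sketch, not the optical architecture itself), the snippet below enumerates all candidate cycles of a tiny 4-vertex graph, lets each precomputed mask block the candidates that use a missing edge, and reads off the Hamiltonian cycles as the "light" that survives every mask. The example graph is an assumption.

```python
# Each candidate solution corresponds to one light path; each mask blocks the
# paths violating one constraint; paths lit after all masks are solutions.
from itertools import permutations

# example graph: adjacency over vertices 0..3 (illustrative assumption)
EDGES = {(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)}
N = 4

def has_edge(u, v):
    return (u, v) in EDGES or (v, u) in EDGES

# exponential-size candidate set: all cycles through vertex 0
candidates = [(0,) + p for p in permutations(range(1, N))]

def mask_for_missing_edge(u, v):
    """Mask that blocks every candidate cycle using the absent edge (u, v)."""
    passes = []
    for cyc in candidates:
        pairs = {(cyc[i], cyc[(i + 1) % N]) for i in range(N)}
        passes.append((u, v) not in pairs and (v, u) not in pairs)
    return passes  # True = light passes, False = blocked

# preprocessing stage: one mask per non-edge of the graph
masks = [mask_for_missing_edge(u, v)
         for u in range(N) for v in range(u + 1, N) if not has_edge(u, v)]

# "shine light" through all masks: elementwise AND over candidates
light = [all(m[i] for m in masks) for i in range(len(candidates))]
solutions = [c for c, lit in zip(candidates, light) if lit]
print(solutions)   # surviving light spots = Hamiltonian cycles
```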
Haifa Verification Conference | 2017
Sergey Frenkel
System designers need various design tools that can help them both to estimate possible security threats and to select one way or another of neutralizing them. There are many approaches to evaluating (verifying) the degree to which programs are protected against possible attacks.
IEEE Convention of Electrical and Electronics Engineers in Israel | 2010
Shlomi Dolev; Sergey Frenkel
Holographic coding has the very appealing property that partial information about the data can be obtained from any part of the coded information. We present holographic coding schemes based on the Walsh orthogonal codes. The schemes use only addition for coding and decoding. We propose randomizing the data so that the values of the Walsh-code coefficients are approximately normally distributed, ensuring, with high probability, a fixed gain of information. The data is XORed with randomly chosen bits, either from random data stored during a preprocessing stage or from pseudo-random data produced by a pseudo-random generator. We suggest schemes to cope with erasures in the scope of Walsh codes. We suggest parity-based schemes that support erasure correction of the Walsh coefficients and can tolerate a bounded number of erasures without using multiplication. We then suggest a scheme based on Preparata's use of discrete Fourier coefficients, extending the data with zeros. Lastly, we present a rateless erasure coding scheme.
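A minimal sketch of the addition-only Walsh-Hadamard coding idea (not the paper's schemes): data mapped to ±1 is encoded as sums against Walsh codes, and an approximate reconstruction is obtained from a subset of coefficients, illustrating the "partial information from any part" property. The vector length and the "keep half the coefficients" choice are assumptions for the example.

```python
import numpy as np

def sylvester_hadamard(n: int) -> np.ndarray:
    """Build an n x n Hadamard matrix (n must be a power of two);
    its rows are the Walsh codes, with entries +/-1."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

N = 8
H = sylvester_hadamard(N)

def encode(bits: np.ndarray) -> np.ndarray:
    """Map bits {0,1} to {-1,+1} and take Walsh coefficients (sums of +/- values)."""
    return H @ (2 * bits - 1)

def decode_partial(coeffs: np.ndarray, keep: np.ndarray) -> np.ndarray:
    """Approximate reconstruction from a subset of coefficients (rest treated as erased)."""
    x_hat = (H.T @ np.where(keep, coeffs, 0)) / N
    return (x_hat > 0).astype(int)   # threshold back to bits

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bits = rng.integers(0, 2, N)
    c = encode(bits)
    keep = np.arange(N) < N // 2     # pretend only half the coefficients survived
    print("original :", bits)
    print("recovered:", decode_partial(c, keep))
```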
International Symposium on Cyber Security Cryptography and Machine Learning | 2018
Sergey Frenkel; Victor N. Zakharov
The design of secure software systems is connected with the choice of mathematical models of those systems. A widely used approach to malware detection (or to classification as "benign/malicious") is based on measuring the similarity of system-call traces. At present, both set-theoretical metrics between system-call traces (for example, Jaccard similarity and the edit (Levenshtein) distance (ED) [1]) and Markov-chain-based models of attack effects are used. Jaccard similarity is used when the traces are considered as an unordered set. The edit distance, namely the minimal number of edit operations (deletion, insertion, and substitution of a single symbol) required to convert one sequence into the other, is used because it reflects the ordering and semantics of the traces. However, computing the edit distance between two strings requires quadratic (in the number of symbols) time and space [1]. The traces can also be represented as system-call graphs [2], whose nodes are the system calls (or the items of the q-grams [1]). That is, the description of traces as ordered strings can be considered a special case of the graph representation, for which the same similarity metrics can be used with the same computational complexity.
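The two similarity measures mentioned above can be sketched in a few lines; the example traces and q = 2 are illustrative assumptions, not data from the paper.

```python
# Jaccard similarity over q-grams of a system-call trace, and the classic
# O(len(a) * len(b)) dynamic-programming edit (Levenshtein) distance.

def qgrams(trace, q=2):
    return {tuple(trace[i:i + q]) for i in range(len(trace) - q + 1)}

def jaccard(a, b, q=2):
    A, B = qgrams(a, q), qgrams(b, q)
    return len(A & B) / len(A | B) if A | B else 1.0

def edit_distance(a, b):
    # quadratic DP, computed row by row
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # delete
                           cur[j - 1] + 1,           # insert
                           prev[j - 1] + (x != y)))  # substitute
        prev = cur
    return prev[-1]

if __name__ == "__main__":
    t1 = ["open", "read", "write", "close"]
    t2 = ["open", "read", "read", "write", "close"]
    print("Jaccard(q=2):", jaccard(t1, t2))
    print("Edit distance:", edit_distance(t1, t2))
```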
Wireless Networks | 2017
Shlomi Dolev; Sergey Frenkel; Marina Kopeetsky; Muni Venkateswarlu Kumaramangalam
Since energy efficiency, high bandwidth, and low transmission delay are challenging issues in mobile networks due to resource constraints, the design of new communication methods is of great importance. In particular, lossless data compression may provide high performance under constrained resources. In this paper we present a novel on-line, entropy-adaptive compression scheme for streaming inputs of unbounded length. The scheme extends the sliding-window-dictionary Lempel-Ziv compression and is adaptive and tailored to on-line compression of inputs with non-stationary entropy. Specifically, the window-dictionary size is changed adaptively to fit the best current compression rate for the input. The on-line entropy adaptive compression scheme (EAC), introduced and analyzed in this paper, examines all possible sliding-window sizes over the next input portion and chooses the window size that implies the best compression ratio for that portion. The size found is then used in the actual compression of this portion. We suggest an adaptive encoding scheme, which optimizes the parameters block by block, and base the compression performance on the optimality proof of LZ77 when applied to blocks (Ziv in IEEE Trans Inf Theory 55(5):1941-1944, 2009). This adaptivity can be useful for many communication tasks, in particular for efficient utilization of energy-consuming wireless devices through data compression. Due to the dynamic and non-uniform structure of multimedia data, adaptive approaches to data processing are of special interest. The EAC scheme was tested on files of different types (docx, ppt, jpeg, xls) and on synthesized files generated as segments of homogeneous Markov chains. Our experiments demonstrate that the EAC scheme typically provides a higher compression ratio than LZ77 does, when examined in the scope of on-line per-block compression of transmitted (or compressed) files. We propose techniques intended to control the adaptive on-line compression process by estimating the relative entropy between two sequential blocks of data. This approach may enhance the performance of mobile networks.
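A hedged sketch of the block-to-block control idea, assuming a simple byte-level empirical distribution and an arbitrary threshold (neither is taken from the paper): estimate the relative entropy (KL divergence) between two consecutive blocks and re-run the window-size search only when it exceeds the threshold.

```python
import math
from collections import Counter

KL_THRESHOLD = 0.1  # assumed re-tuning trigger, not a value from the paper

def byte_distribution(block: bytes, alphabet=256, smoothing=1.0):
    """Empirical byte distribution with additive smoothing so every KL term is finite."""
    counts = Counter(block)
    total = len(block) + smoothing * alphabet
    return [(counts.get(b, 0) + smoothing) / total for b in range(alphabet)]

def relative_entropy(prev_block: bytes, next_block: bytes) -> float:
    """KL divergence D(next || prev) between the two blocks' byte distributions."""
    p = byte_distribution(next_block)
    q = byte_distribution(prev_block)
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))

def should_retune(prev_block: bytes, next_block: bytes) -> bool:
    """Trigger a new window-size search only when the entropy has drifted."""
    return relative_entropy(prev_block, next_block) > KL_THRESHOLD
```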
International Conference on Cyber Security Cryptography and Machine Learning | 2017
Sergey Frenkel; Victor N. Zakharov
At present, the problem of accounting for the effects of possible faults that can occur in the program memory area, whether due to physical effects or to malicious attacks, and that can distort the values of variables, operations, codes, etc., is solved by applying the widely used Fault Injection (FI) simulation technique. The main drawback of FI is the need for specialized, expensive software that cannot be reused to solve other design problems, in particular verification and testing.
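As a rough illustration of what FI simulation does (this is not the paper's proposed alternative), the sketch below flips a randomly chosen bit in a data buffer and checks whether a reference computation is distorted.

```python
import random

def inject_bit_flip(buf: bytearray, rng: random.Random) -> tuple[int, int]:
    """Flip one random bit in-place; return (byte index, bit index)."""
    i = rng.randrange(len(buf))
    b = rng.randrange(8)
    buf[i] ^= 1 << b
    return i, b

if __name__ == "__main__":
    rng = random.Random(42)
    data = bytearray(b"critical program state")
    golden = sum(data)                   # reference ("golden") result
    i, b = inject_bit_flip(data, rng)
    print(f"flipped bit {b} of byte {i}")
    print("fault detected" if sum(data) != golden else "fault masked")
```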
International Symposium on Stochastic Models in Reliability Engineering, Life Science and Operations Management | 2016
Sergey Frenkel; Marina Kopeetsky; Roman Molotkovski
This paper proposes an improvement over the Lempel-Ziv-Welch (LZW) compression algorithm by employing a new method that uses exponential decay (ED) as a tool to manage and remove infrequently used entries in the LZW dictionary. The presented results demonstrate that ED may be an efficient tool for managing and refreshing the LZW dictionary. The achieved compression ratio is higher than with traditional methods such as Dictionary Reset (DR) and Least Recently Used (LRU). The experimental results demonstrate that refreshing the dictionary by ED may provide a higher compression ratio than the original LZW algorithm. In order to investigate the benefits of the ED method, it is compared with other LRU-based enhancements; in particular, we consider an LRU-like LZW scheme with Huffman coding of the difference from the last used word.
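A compressor-side sketch of the ED idea under stated assumptions (the decay factor, threshold, and dictionary size are not taken from the paper): each dictionary entry carries a score that decays when the dictionary fills up, and stale multi-byte entries are evicted so their codes can be reused. A real codec would need the decoder to apply the identical policy to stay synchronized.

```python
MAX_CODES = 4096   # assumed dictionary capacity
DECAY = 0.9        # assumed exponential-decay factor
THRESHOLD = 0.05   # assumed eviction threshold

def lzw_ed_compress(data: bytes) -> list[int]:
    dictionary = {bytes([i]): i for i in range(256)}   # string -> code
    strings = {i: bytes([i]) for i in range(256)}      # code -> string
    scores = {i: 1.0 for i in range(256)}
    free_codes = list(range(MAX_CODES - 1, 255, -1))   # codes 256..MAX_CODES-1
    out, w = [], b""

    def add_entry(s: bytes) -> None:
        if not free_codes:
            # decay all scores and evict stale multi-byte entries
            for c in list(scores):
                scores[c] *= DECAY
                if scores[c] < THRESHOLD and len(strings[c]) > 1:
                    del dictionary[strings[c]], strings[c], scores[c]
                    free_codes.append(c)
        if free_codes:
            c = free_codes.pop()
            dictionary[s], strings[c], scores[c] = c, s, 1.0

    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            out.append(dictionary[w])
            scores[dictionary[w]] += 1.0               # reward the used entry
            add_entry(wc)
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out
```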