Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Panayiotis Vlamos is active.

Publication


Featured research published by Panayiotis Vlamos.


Behaviour & Information Technology | 2013

Using Facebook out of habit

Michail N. Giannakos; Konstantinos Chorianopoulos; Konstantinos K. Giotopoulos; Panayiotis Vlamos

This article investigates the uses and gratifications of the popular social networking site Facebook. In the exploratory stage, 70 users generated phrases to describe the manner in which they used Facebook. Interestingly, some users not only described the uses but also mentioned how they perceive them. These phrases were coded into 14 items and clustered into four factors. The principal component analysis conducted in the third stage of the study, which was addressed to 222 Facebook users, verified the validity of the four factors: Social Connection, Social Network Surfing, Wasting Time and Using Applications. Previous user studies on Facebook have examined the immediate social effects of this popular social networking site, but they have not considered emerging uses of the platform, such as gaming and applications, which have a social component as a feature rather than as a core principle. The ‘Wasting Time’ factor and the growth of the ‘Using Applications’ factor indicate that Facebook has already become an integral part of the daily computing routine, alongside the rest of the entertainment desktop and web applications.
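The factor validation in the third stage rests on principal component analysis. As a purely illustrative sketch (synthetic data and a pure-Python power iteration, none of it from the study itself), the leading component of a small "survey" matrix can be extracted like this:

```python
import random

def first_principal_component(data, iters=200):
    """Estimate the leading principal component of the data
    via power iteration on the covariance matrix (pure-Python sketch)."""
    n, d = len(data), len(data[0])
    # Mean-centre each column.
    means = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix (d x d).
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    # Power iteration: repeatedly apply cov to a vector and renormalise.
    v = [random.random() for _ in range(d)]
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

# Synthetic "survey" data: two strongly correlated items, one noisy item.
random.seed(1)
rows = []
for _ in range(100):
    t = random.gauss(0, 1)
    rows.append([t, t + random.gauss(0, 0.1), random.gauss(0, 0.1)])

pc1 = first_principal_component(rows)
# The first two loadings dominate and share a sign; the third stays small.
print([round(abs(c), 2) for c in pc1])
```

Items that load together on the same component, as the two correlated columns do here, are the ones that end up clustered into a single factor such as ‘Wasting Time’.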


Chemical Engineering Science | 2000

An interpretation of the behavior of EoS/GE models for asymmetric systems

Georgios M. Kontogeorgis; Panayiotis Vlamos

Abstract The recently developed EoS/G^E models are mixing rules for the energy parameters of equations of state, based on the equality of the excess Gibbs energies (G^E) obtained from the EoS and from an explicit G^E/activity-coefficient model, e.g. UNIFAC or NRTL. Depending on the model, this equality is postulated at infinite or zero pressure or at other conditions (system pressure, constant-volume packing fraction). In a number of publications over recent years, the achievements and shortcomings of the various EoS/G^E models have been presented via phase-equilibrium calculations. This short communication provides an explanation of the behavior of several literature EoS/G^E models, especially those based on a zero reference pressure (PSRK, MHV1, MHV2), in the prediction of phase equilibria for asymmetric systems, as well as an interpretation of the LCVM and κ-MHV1 models, which provide an empirical (yet, as shown here, theoretically justified) solution to these problems.
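The zero-reference-pressure matching mentioned above can be written out explicitly. As a sketch, the MHV1 mixing rule takes the well-known form below (symbols follow the usual convention rather than this paper's notation; q_1 is the model constant, roughly −0.593 for SRK):

```latex
\frac{a}{b\,RT}
  = \sum_i x_i \,\frac{a_i}{b_i\,RT}
  + \frac{1}{q_1}\left[\frac{G^E_\gamma}{RT}
  + \sum_i x_i \ln\frac{b}{b_i}\right],
\qquad b = \sum_i x_i b_i,\quad q_1 \approx -0.593\ \text{(SRK)}
```

Here G^E_γ is supplied by the external activity-coefficient model (e.g. UNIFAC), so the EoS inherits its composition dependence through the mixing rule.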


British Journal of Educational Technology | 2013

Using Webcasts in Education: Evaluation of Its Effectiveness

Michail N. Giannakos; Panayiotis Vlamos

Abstract Educational webcasts are nowadays widely used by many organizations and institutions all over the world. However, the educational effectiveness of webcasts used as an autonomous method is yet to be explored. In this paper, we attempt to clarify certain issues concerning their educational effectiveness. Following specific instructions, an educational webcast was developed, and a between-groups evaluation experiment was then conducted. The experiment compared traditional learning with an educational webcast. A total of 66 gymnasium (middle school) students were placed in two groups based on a pretest method. The results of the evaluation showed that educational webcasts can be very effective under certain conditions. On the one hand, the educational effectiveness of the webcast was particularly high when applied to tasks requiring simple comprehension. On the other hand, the webcast performed poorly in the consolidation of complex tasks.

Practitioner Notes

What is already known about this topic
• Advocates of webcasting in education indicate that these technologies can improve student performance; they mention that webcasting lectures in their entirety is useful for revision and reviewing purposes and assists students in filling learning gaps.
• Webcasts give students the opportunity to actively engage with the material by allowing learners to directly and repeatedly access a specific section of a presentation and/or control the playback speed of the media file.
• Prior studies (eg, McKinney, Dyck & Luber, 2009; Traphagan, Kucsera & Kishi, 2010) also found that students who reported using lecture webcasts as a replacement for the in-class lecture exhibited lower performance.

What this paper adds
• We focus on the effectiveness of the webcast as an autonomous learning tool and find that in some cases webcast performance can reach that of traditional learning.
• This research found that in simple comprehension tasks, webcasts seem to perform much better than traditional learning.
• Findings indicate that in tasks where a greater degree of comprehension is required, webcasts and traditional learning seem to perform equally.
• This research found that in complex tasks requiring additional comprehension and a great degree of consolidation, webcasts performed very poorly, and few of the students coped with the complex task.

Implications for practice and/or policy
• Asynchronous, autonomous teaching with webcasts may demonstrate good performance.
• The webcast is more efficient for light, surface-level tasks (eg, multiple-choice questions).
• For knowledge that has to be comprehended and consolidated in order to be used in combination with other knowledge for solving complex tasks, the webcast is not recommended because its performance in this area is very poor.


Bioinformation | 2011

Modeling the mitochondrial dysfunction in neurodegenerative diseases due to high H+ concentration.

Athanasios Alexiou; John Rekkas; Panayiotis Vlamos

Considering the latest research, disruptions in the regulation of mitochondrial dynamics, low energy production, increased reactive oxygen species and mtDNA damage are relevant to human diseases, mainly neurodegenerative diseases and cancer. This article represents the inner mitochondrial membrane as a natural superconductor and gives the corresponding mathematical model; nevertheless, the creation of electric complexes in the inner mitochondrial membrane due to the unusual concentration of protons disrupts the normal flow of electrons and the production of ATP. We therefore propose the term ‘electric thromboses’ to explain this inadequate electron flow, simultaneously presenting a natural mechanism for this important and unique phenomenon.


Computational and Theoretical Polymer Science | 2000

Application of the sCPA equation of state for polymer solutions

Georgios M. Kontogeorgis; I.V. Yakoumis; Panayiotis Vlamos

Abstract Specific interactions, for example hydrogen bonding, dominate in numerous industrially important polymeric systems, both polymer solutions and blends. Typical cases are water-soluble polymers, including biopolymers of special interest to biotechnology (e.g. the system polyethyleneglycol/dextran/water). Furthermore, most polymer blends are non-compatible, and the requirement for compatible polymer pairs is often the presence of hydrogen-bonding interactions (e.g. polyvinylchloride/chlorinated polyethylene). In this work we first give a short comparative evaluation of existing thermodynamic models suitable for polymeric systems that explicitly take into account specific interactions such as hydrogen bonding. The range of application of the models in terms of phase equilibria and their specific characteristics (accuracy of calculation, degree of complexity) are discussed. Finally, vapor–liquid equilibrium (VLE) calculations for a number of polymer+solvent systems (including five different polymers) with a novel and very promising model are presented. This model takes the form of an equation of state that is (in its general formulation) non-cubic with respect to volume and has separate terms for physical and chemical interactions. The model has recently been proposed and has already been successfully applied to non-polymeric hydrogen-bonding systems (alcohol/water/hydrocarbon). This is the first time it has been extended to polymer solutions.
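The structure described above, a physical cubic term plus a chemical association term, is commonly written as follows. This is a sketch of the standard CPA pressure-explicit form, not necessarily this paper's exact notation: X_{A_i} is the fraction of association sites A on molecule i that are not bonded, g the radial distribution function, and ρ the molar density:

```latex
P = \underbrace{\frac{RT}{V_m - b} - \frac{a(T)}{V_m\,(V_m + b)}}_{\text{physical (SRK)}}
  \;-\; \underbrace{\frac{RT}{2\,V_m}\left(1 + \rho\,\frac{\partial \ln g}{\partial \rho}\right)
  \sum_i x_i \sum_{A_i} \bigl(1 - X_{A_i}\bigr)}_{\text{chemical (association)}}
```

The chemical term vanishes for non-associating species (all X_{A_i} = 1), so the model reduces to plain SRK there, which is what makes it attractive for mixed polymer/solvent systems.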


Advances in Artificial Intelligence | 2012

A cultural algorithm for the representation of mitochondrial population

Athanasios Alexiou; Panayiotis Vlamos

We propose a novel Cultural Algorithm for the representation of mitochondrial population in mammalian cells as an autonomous culture. While mitochondrial dysfunctions are highly associated with neurodegenerative diseases and related disorders, an alternative theoretical framework is described for the representation of mitochondrial dynamics. A new perspective of bioinspired algorithm is produced, combining the particle-based Brownian dynamics simulation and the combinatorial representation of mitochondrial population in the lattice, involving the optimization problem of ATP production in mammalian cells.
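For readers unfamiliar with the class of algorithm being adapted here: a cultural algorithm evolves a population under the guidance of a shared belief space. The sketch below is a generic single-variable skeleton (the acceptance ratio, knowledge sources and objective are illustrative assumptions, not the paper's mitochondrial model):

```python
import random

def cultural_algorithm(fitness, bounds, pop_size=30, generations=60):
    """Minimal cultural-algorithm sketch: a population space evolves under
    the influence of a belief space (situational + normative knowledge)."""
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    belief = {"best": None, "lo": lo, "hi": hi}   # belief space
    for _ in range(generations):
        pop.sort(key=fitness)                     # minimisation
        # Acceptance: the top 20% of individuals update the belief space.
        elites = pop[: max(1, pop_size // 5)]
        belief["best"] = elites[0]                # situational knowledge
        belief["lo"] = min(elites)                # normative range
        belief["hi"] = max(elites)
        # Influence: new individuals are drawn around the situational best,
        # with a spread given by the normative range (floor keeps exploring).
        span = max(belief["hi"] - belief["lo"], 0.05)
        pop = [min(hi, max(lo, random.gauss(belief["best"], span)))
               for _ in range(pop_size)]
        pop[0] = belief["best"]                   # elitism
    return belief["best"]

random.seed(0)
x = cultural_algorithm(lambda v: (v - 2.0) ** 2, (-10.0, 10.0))
print(round(x, 2))
```

In the paper's setting, the "individuals" would encode mitochondrial states and the objective would relate to ATP production, but the acceptance/influence loop between population and belief space is the same.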


2009 First International Conference on Advances in Satellite and Space Communications | 2009

New Approaches in Image Compression and Noise Removal

Luminita State; Catalina Cocianu; Corina Sararu; Panayiotis Vlamos

Principal Component Analysis is a well-known statistical method for feature extraction, and it has been broadly used in a large series of image processing applications. The multiresolution support provides a suitable framework for noise filtering and image restoration by noise suppression. The procedure is to determine the statistically significant wavelet coefficients and from these to specify the multiresolution support. In the third section, we introduce the Generalized Multiresolution Noise Removal and Noise Feature Principal Component Analysis algorithms. Generalized Multiresolution Noise Removal extends the Multiresolution Noise Removal algorithm to the case of general uncorrelated Gaussian noise, while Noise Feature Principal Component Analysis allows the restoration of an image through a noise-decorrelation process. A comparative analysis of the performance of these two algorithms is performed experimentally against the standard Adaptive Mean Variance Restoration and Minimum Mean Squared Error algorithms. In the fourth section, we propose the Compression Shrinkage Principal Component Analysis algorithm and its model-free version as Shrinkage-Principal Component Analysis based methods for noise removal and image restoration. Concluding remarks are supplied in the final section of the paper.
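The multiresolution-support idea (keep only statistically significant wavelet coefficients) can be illustrated with a toy 1D Haar denoiser. This is a hedged stand-in, not the paper's algorithms: Haar in place of their wavelet, a plain k·sigma hard threshold, and a synthetic step signal:

```python
import random

def haar_forward(signal):
    """One-level orthonormal Haar transform: (approximation, detail)."""
    a = [(signal[2 * i] + signal[2 * i + 1]) / 2 ** 0.5
         for i in range(len(signal) // 2)]
    d = [(signal[2 * i] - signal[2 * i + 1]) / 2 ** 0.5
         for i in range(len(signal) // 2)]
    return a, d

def haar_inverse(a, d):
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / 2 ** 0.5)
        out.append((ai - di) / 2 ** 0.5)
    return out

def denoise(signal, levels=3, k=3.0, sigma=0.1):
    """Keep only 'statistically significant' detail coefficients
    (|d| > k * sigma) -- a crude multiresolution support."""
    if levels == 0 or len(signal) < 2:
        return list(signal)
    a, d = haar_forward(signal)
    d = [c if abs(c) > k * sigma else 0.0 for c in d]
    a = denoise(a, levels - 1, k, sigma)   # recurse on the coarser scale
    return haar_inverse(a, d)

random.seed(0)
n = 64
clean = [1.0 if 16 <= i < 48 else 0.0 for i in range(n)]
noisy = [c + random.gauss(0, 0.1) for c in clean]
restored = denoise(noisy)

mse = lambda x, y: sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)
print(mse(noisy, clean) > mse(restored, clean))
```

Because the Haar transform is orthonormal, the noise standard deviation is the same at every scale, so a single sigma can be used for the significance test at all levels.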


ieee international conference on information technology and applications in biomedicine | 2010

A theoretical artificial approach on reducing mitochondrial abnormalities in Alzheimer's disease

Athanasios Alexiou; Panayiotis Vlamos; Kimon G. Volikas

Considering the latest research on the significant association between mitochondrial disorders and human diseases, mainly of the nervous system, we propose a novel solution concerning the enrichment of the healthy mitochondrial population in cells. It is a purely theoretical proposal and refers mostly to mitochondrial encephalomyopathies, which involve brain disorders such as Alzheimer's disease.


Journal of Computers | 2009

Canonical Polygon Queries on the Plane: A New Approach

Spyros Sioutas; Dimitrios Sofotassios; Kostas Tsichlas; Dimitrios Sotiropoulos; Panayiotis Vlamos

The polygon retrieval problem on points is the problem of preprocessing a set of n points on the plane, so that given a polygon query, the subset of points lying inside it can be reported efficiently. It is of great interest in areas such as Computer Graphics, CAD applications, Spatial Databases and GIS development tasks. In this paper we study the problem of canonical k-vertex polygon queries on the plane. A canonical k-vertex polygon query always meets the following specific property: a point retrieval query can be transformed into a linear number (with respect to the number of vertices) of point retrievals for orthogonal objects such as rectangles and triangles (throughout this work we call a triangle orthogonal iff two of its edges are axis-parallel). We present two new algorithms for this problem. The first one requires O(n log^2 n) space and O(k log^3 n / log log n + A) query time. A simple modification of the first algorithm leads us to a second solution, which consumes O(n^2) space and O(k log n / log log n + A) query time, where A denotes the size of the answer and k is the number of vertices. The best previous solution for the general polygon retrieval problem uses O(n^2) space and answers a query in O(k log n + A) time, where k is the number of vertices. It is also very complicated and difficult to implement in a standard imperative programming language such as C or C++.
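The "canonical" property above means a polygon query reduces to retrievals over axis-aligned rectangles and orthogonal triangles. The sketch below shows that reduction with naive O(n) scans over made-up geometry (the paper's contribution is answering each such orthogonal retrieval in polylogarithmic time with specialised data structures, which this sketch does not attempt):

```python
def in_rectangle(p, rect):
    """rect = (x1, y1, x2, y2), axis-aligned."""
    x, y = p
    x1, y1, x2, y2 = rect
    return x1 <= x <= x2 and y1 <= y <= y2

def in_orthogonal_triangle(p, tri):
    """Orthogonal triangle with vertices (x0, y0), (x0 + w, y0),
    (x0, y0 + h): two axis-parallel edges, hypotenuse x/w + y/h = 1
    in local coordinates."""
    x0, y0, w, h = tri
    x, y = p[0] - x0, p[1] - y0
    return 0 <= x and 0 <= y and x / w + y / h <= 1

def query(points, rects, tris):
    """Report points lying in the union of the orthogonal pieces
    a canonical polygon query decomposes into (naive scan)."""
    return [p for p in points
            if any(in_rectangle(p, r) for r in rects)
            or any(in_orthogonal_triangle(p, t) for t in tris)]

pts = [(0.5, 0.5), (2.5, 0.2), (2.2, 1.5), (5.0, 5.0)]
# A unit square plus an orthogonal triangle glued to its right edge.
hits = query(pts, rects=[(0, 0, 1, 1)], tris=[(1, 0, 2, 2)])
print(hits)
```

A k-vertex canonical polygon decomposes into O(k) such pieces, which is why the query bounds above carry a factor of k.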


international symposium on signal processing and information technology | 2015

Molecular basis of Huntington's disease and brain imaging evidence

Antonia Plerou; Catherine Bobori; Panayiotis Vlamos

Huntington's disease is a neurodegenerative disease characterized by motor and cognitive impairment. The disease is caused by a mutation of the gene that produces the huntingtin protein, causing repetition of the trinucleotide CAG. The mutant protein reacts with other proteins inside and outside the cell, disrupting its normal function and causing cell death. Recent advances in signal analysis have endowed EEG with the status of a true brain-mapping and brain-imaging method capable of providing spatio-temporal information regarding brain (dys)function. The authors aim to review objectively and quantitatively the neurophysiological basis of the disease in HD patients as compared to normal controls, using brain imaging in general and EEG brain-imaging methods in particular.

Collaboration


Dive into Panayiotis Vlamos's collaboration.

Top Co-Authors

Catalina Cocianu

Bucharest University of Economic Studies


Michail N. Giannakos

Norwegian University of Science and Technology
