C. P. Panos
Aristotle University of Thessaloniki
Publications
Featured research published by C. P. Panos.
Journal of Chemical Physics | 2005
K. Ch. Chatzisavvas; Ch. C. Moustakidis; C. P. Panos
Shannon information entropies in position and momentum spaces and their sum are calculated as functions of Z (2 ≤ Z ≤ 54) in atoms. Roothaan-Hartree-Fock electron wave functions are used. The universal property S = a + b ln Z is verified. In addition, we calculate the Kullback-Leibler relative entropy, the Jensen-Shannon divergence, Onicescu's information energy, and a recently proposed complexity measure. Shell effects at closed-shell atoms are observed. The complexity measure shows local minima at the closed-shell atoms, indicating that for these atoms complexity decreases with respect to neighboring atoms. It is seen that complexity fluctuates around an average value, indicating that the atom cannot grow in complexity as Z increases. Onicescu's information energy is correlated with the ionization potential. The Kullback and Jensen-Shannon distances are employed to compare the Roothaan-Hartree-Fock density distributions with densities from previous works.
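A minimal numerical sketch of how these quantities can be evaluated on a radial grid; the simple exponential densities, the grid, and the normalization below are illustrative assumptions standing in for the Roothaan-Hartree-Fock densities used in the paper:

```python
import numpy as np

# Radial grid and volume element for spherically symmetric densities.
r = np.linspace(1e-4, 20.0, 4000)
w = 4.0 * np.pi * r**2

def normalize(rho):
    """Enforce the unit normalization  \int rho d^3r = 1."""
    return rho / np.trapz(rho * w, r)

rho = normalize(np.exp(-2.0 * r))     # hydrogen-like stand-in density
sigma = normalize(np.exp(-1.5 * r))   # a second density to compare against

S_r = -np.trapz(rho * np.log(rho) * w, r)   # Shannon entropy in position space
E_r = np.trapz(rho**2 * w, r)               # Onicescu information energy

# Kullback-Leibler relative entropy D(rho || sigma)
kl = np.trapz(rho * np.log(rho / sigma) * w, r)

# Jensen-Shannon divergence via the mixture density m = (rho + sigma)/2
m = 0.5 * (rho + sigma)
js = 0.5 * np.trapz(rho * np.log(rho / m) * w, r) \
   + 0.5 * np.trapz(sigma * np.log(sigma / m) * w, r)

print(S_r, E_r, kl, js)
```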
Physics Letters A | 1998
S.E. Massen; C. P. Panos
The position- and momentum-space information entropies of the electron distributions of atomic clusters are calculated using a Woods-Saxon single-particle potential. The same entropies are also calculated for nuclear distributions according to the Skyrme parametrization of the nuclear mean field. It turns out that a similar functional form S = a + b ln N for the entropy as a function of the number of particles N holds approximately for atoms, nuclei and atomic clusters. It is conjectured that this is a universal property of a many-fermion system in a mean field.
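For reference, a compact statement of the quantities involved, assuming the usual convention in this literature that the position- and momentum-space densities are normalized to unity:

\[
S_r = -\int \rho(\mathbf{r})\,\ln\rho(\mathbf{r})\,d^3r,\qquad
S_k = -\int n(\mathbf{k})\,\ln n(\mathbf{k})\,d^3k,\qquad
S = S_r + S_k \simeq a + b\ln N .
\]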
Physics Letters A | 2007
K.D. Sen; C. P. Panos; K. Ch. Chatzisavvas; Ch. C. Moustakidis
The net Fisher information measure I_T, defined as the product of the position- and momentum-space Fisher information measures I_r and I_k and derived from the non-relativistic Hartree–Fock wave functions for atoms with Z = 1–102, is found to correlate well with the inverse of the experimental ionization potential. Strong direct correlations of I_T are also reported with the static dipole polarizability of atoms with Z = 1–88. The complexity measure, defined as the ratio of the net Onicescu information measure E_T and I_T, exhibits clearly marked regions corresponding to the periodicity of the atomic shell structure. The reported correlations highlight the need for using the net information measures in addition to either the position- or momentum-space analogues. With reference to the correlation of the experimental properties considered here, the net Fisher information measure is found to be superior to the net Shannon information entropy.
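For orientation, the standard definitions behind the net measures quoted above (the paper's exact conventions may differ slightly):

\[
I_r = \int \frac{|\nabla\rho(\mathbf{r})|^{2}}{\rho(\mathbf{r})}\,d^3r,\qquad
I_k = \int \frac{|\nabla_{k}\, n(\mathbf{k})|^{2}}{n(\mathbf{k})}\,d^3k,\qquad
I_T = I_r\, I_k,\qquad
C = \frac{E_T}{I_T}.
\]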
Physics Letters A | 2007
C. P. Panos; K. Ch. Chatzisavvas; Ch. C. Moustakidis; E.G. Kyrkou
The simple complexity measure Γ_{α,β} of Shiner, Davison and Landsberg (SDL) and the statistical measure C of López-Ruiz, Mancini and Calbet (LMC) are compared in atoms as functions of the atomic number Z. Shell effects, i.e. local minima at the closed-shell atoms, are observed, as well as certain qualitative trends of Γ_{α,β}(Z) and C(Z). If we impose the condition that Γ and C behave similarly as functions of Z, then we can conclude that complexity increases with Z and that for atoms the strength of disorder is α ≃ 0 and that of order is β ≃ 4.
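A minimal sketch of the two measures in one common discrete convention; the paper works with continuous electron densities, so the definitions and probabilities below are stand-in assumptions rather than the authors' exact prescription:

```python
import numpy as np

def sdl_complexity(p, alpha, beta):
    """Shiner-Davison-Landsberg:  Gamma_{alpha,beta} = Delta^alpha * Omega^beta,
    with disorder Delta = S / S_max and order Omega = 1 - Delta."""
    p = p / p.sum()
    S = -np.sum(p * np.log(p))
    delta = S / np.log(len(p))          # S_max = ln n for n outcomes
    return delta**alpha * (1.0 - delta)**beta

def lmc_complexity(p):
    """Lopez-Ruiz-Mancini-Calbet:  C = H * D, Shannon entropy times the
    disequilibrium (squared distance from the uniform distribution)."""
    p = p / p.sum()
    H = -np.sum(p * np.log(p))
    D = np.sum((p - 1.0 / len(p))**2)
    return H * D

p = np.array([0.5, 0.3, 0.15, 0.05])    # illustrative probabilities
print(sdl_complexity(p, alpha=0.0, beta=4.0), lmc_complexity(p))
```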
Physics Letters A | 2001
S.E. Massen; C. P. Panos
A direct connection between the information entropy S and the kinetic energy T is obtained for nuclei and atomic clusters, which establishes T as a measure of the information in a distribution. It is conjectured that this is a universal property for fermionic many-body systems. We also check rigorous inequalities previously found to hold between S and T for atoms and verify that they hold for nuclei and atomic clusters as well. These inequalities relate Shannon's information entropy in position space to an experimental quantity, i.e. the rms radius of nuclei and clusters.
Physics Letters A | 2009
C. P. Panos; N.S. Nikolaidis; K. Ch. Chatzisavvas; C.C. Tsouros
We present a very simple method for the calculation of the Shannon, Fisher and Onicescu entropies in atoms, as well as the SDL and LMC complexity measures, as functions of the atomic number Z. Fractional occupation probabilities of electrons in atomic orbitals are employed, instead of the more complicated continuous electron probability densities in position and momentum space used so far in the literature. Our main conclusions are compatible with the results of more sophisticated approaches and correlate fairly well with experimental data. A practical way towards scalability of the quantification of complexity for systems with more components than the atom is indicated. We also discuss whether the complexity of the electronic structure of atoms increases with Z. A Pair (α, β) of Order-Disorder Indices (PODI), which can be introduced for any quantum many-body system, is evaluated in atoms (α = 0.085, β = 1.015). We conclude, by observing the trend of closed-shell atoms, that "atoms are ordered systems, which grow in complexity as Z increases".
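A minimal sketch of the occupation-probability idea for a single atom; the argon configuration, the choice of measures, and the complexity form e^S · E below are illustrative assumptions, not necessarily the paper's exact prescription:

```python
import numpy as np

# Fractional occupation probabilities p_i = n_i / Z for the occupied subshells
# of argon (Z = 18), used in place of continuous densities.
occupations = {"1s": 2, "2s": 2, "2p": 6, "3s": 2, "3p": 6}
n = np.array(list(occupations.values()), dtype=float)
p = n / n.sum()

S = -np.sum(p * np.log(p))   # discrete Shannon entropy
E = np.sum(p**2)             # discrete Onicescu information energy
C = np.exp(S) * E            # one common form of an LMC-type complexity

print(S, E, C)
```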
Physics Letters A | 2002
S. E. Massen; Ch. C. Moustakidis; C. P. Panos
It is shown that a similar functional form S = a + b ln N holds approximately for the information entropy S as a function of the number of particles N for atoms, nuclei and atomic clusters (fermionic systems) and for correlated boson atoms in a trap (bosonic systems). It is also seen that rigorous inequalities previously found to hold between S and the kinetic energy T for fermionic systems hold for bosonic systems as well. It is found that Landsberg's order parameter is an increasing function of N for the above systems. It is conjectured that the above properties are universal, i.e. they do not depend on the kind of constituent particles (fermions or correlated bosons) or the size of the system. Shannon's information entropy for a continuous probability distribution p(x) is defined as S = -∫ p(x) ln p(x) dx.
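A small sketch of how the coefficients a and b in the conjectured form S = a + b ln N can be extracted by a least-squares fit; the (N, S) values below are placeholders, not data from the paper:

```python
import numpy as np

# Hypothetical (N, S) pairs; replace with computed entropies for real systems.
N = np.array([8.0, 20.0, 40.0, 82.0, 126.0])
S = np.array([5.1, 5.9, 6.5, 7.1, 7.5])

# Linear fit of S against ln N: polyfit returns [slope, intercept] = [b, a].
b, a = np.polyfit(np.log(N), S, 1)
print(a, b)
```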
Physics Letters A | 2001
C. P. Panos
The order parameter Ω, defined by Landsberg, is calculated for nuclei and atomic clusters within the framework of information theory. It is seen that Ω is an increasing function of the number of particles N and of the total kinetic energy T. The values of Ω are almost the same for the two systems under consideration. It is conjectured that these properties are universal.
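For context, Landsberg's order parameter is commonly written in terms of the disorder Δ = S/S_max (the choice of S_max is system-dependent and is an assumption here):

\[
\Omega = 1 - \Delta = 1 - \frac{S}{S_{\max}},
\]

so that Ω = 0 corresponds to maximal disorder and Ω = 1 to perfect order.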
International Journal of Modern Physics E: Nuclear Physics | 1998
G. A. Lalazissis; S. E. Massen; C. P. Panos; S. S. Dimitrova
The information entropy of a nuclear density distribution is calculated for a number of nuclei. Various phenomenological models for the density distribution using different geometries are employed. Nuclear densities calculated within various microscopic mean-field approaches are also employed. It turns out that the entropy increases from crude phenomenological models to more sophisticated (microscopic) ones. It is concluded that the larger the information entropy, the better the quality of the nuclear density distribution. An alternative approach is also examined: the net information content, i.e. the sum of the information entropies in position and momentum space, S_r + S_k. It is indicated that S_r + S_k is a maximum when the best fit to experimental data of the density and momentum distributions is attained.
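In this context it may help to recall the rigorous lower bound on the net information content for unit-normalized three-dimensional conjugate densities (the Białynicki-Birula-Mycielski inequality), which any fitted density pair must respect:

\[
S_r + S_k \;\ge\; 3\,(1+\ln\pi) \approx 6.43 .
\]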
Physical Review C | 2001
C. P. Panos; S. E. Massen; C. G. Koutroulos
We calculate the information entropy of single-particle states in position space