Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Joanna Tyrcha is active.

Publication


Featured research published by Joanna Tyrcha.


Physical Review E | 2009

Ising model for neural data: Model quality and approximate methods for extracting functional connectivity

Yasser Roudi; Joanna Tyrcha; John Hertz

We study pairwise Ising models for describing the statistics of multineuron spike trains, using data from a simulated cortical network. We explore efficient ways of finding the optimal couplings in these models and examine their statistical properties. To do this, we extract the optimal couplings for subsets of size up to 200 neurons, essentially exactly, using Boltzmann learning. We then study the quality of several approximate methods for finding the couplings by comparing their results with those found from Boltzmann learning. Two of these methods--inversion of the Thouless-Anderson-Palmer equations and an approximation proposed by Sessak and Monasson--are remarkably accurate. Using these approximations for larger subsets of neurons, we find that extracting couplings using data from a subset smaller than the full network tends systematically to overestimate their magnitude. This effect is described qualitatively by infinite-range spin-glass theory for the normal phase. We also show that a globally correlated input to the neurons in the network leads to a small increase in the average coupling. However, the pair-to-pair variation in the couplings is much larger than this and reflects intrinsic properties of the network. Finally, we study the quality of these models by comparing their entropies with that of the data. We find that they perform well for small subsets of the neurons in the network, but the fit quality starts to deteriorate as the subset size grows, signaling the need to include higher-order correlations to describe the statistics of large networks.
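
For readers who want to try the inversion step described above, here is a minimal sketch of the naive mean-field and TAP estimates of the couplings from measured magnetizations and connected correlations. It assumes ±1 spins binned from spike trains; the function and variable names are ours, and the Boltzmann-learning reference solution and the Sessak-Monasson expansion are not included.

```python
import numpy as np

def infer_couplings(S):
    """Mean-field and TAP inversion of an equilibrium pairwise Ising model.

    S : array of shape (T, N), spins coded as +/-1 (binned spike data).
    Returns (J_nmf, J_tap), both N x N with zero diagonal.
    """
    m = S.mean(axis=0)                      # magnetizations <s_i>
    C = np.cov(S, rowvar=False)             # connected correlations
    Cinv = np.linalg.inv(C)

    # Naive mean-field: J_ij = -(C^-1)_ij for i != j
    J_nmf = -Cinv.copy()
    np.fill_diagonal(J_nmf, 0.0)

    # TAP inversion: solve 2 m_i m_j J^2 + J + (C^-1)_ij = 0 pairwise,
    # keeping the root that reduces to naive mean-field as m_i m_j -> 0.
    mm = np.outer(m, m)
    disc = 1.0 - 8.0 * mm * Cinv
    with np.errstate(invalid="ignore", divide="ignore"):
        J_tap = (np.sqrt(np.maximum(disc, 0.0)) - 1.0) / (4.0 * mm)
    J_tap = np.where(np.abs(mm) < 1e-8, J_nmf, J_tap)  # fallback where m_i m_j ~ 0
    np.fill_diagonal(J_tap, 0.0)
    return J_nmf, J_tap

# Toy usage on random +/-1 data standing in for binned spike trains:
rng = np.random.default_rng(0)
S = np.where(rng.random((5000, 20)) < 0.3, 1, -1)
J_nmf, J_tap = infer_couplings(S)
```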


Journal of Mathematical Biology | 1988

Asymptotic stability in a generalized probabilistic/deterministic model of the cell cycle

Joanna Tyrcha

A new mathematical model of the cell cycle is presented which generalizes the probabilistic/deterministic model of Lasota-Mackey [1] and the tandem model of Tyson and Hannsgen [7]. By the use of a multiplicative (exponential) Lyapunov function a stability theorem is proved, parallel to the results of Lasota-Mackey. Some open problems related to the tandem model are also solved.


Journal of Mathematical Biology | 1992

The statistical dynamics of recurrent biological events

Andrzej Lasota; Michael C. Mackey; Joanna Tyrcha

In this paper we develop a general modeling framework within which many models for systems which produce events at irregular times through a combination of probabilistic and deterministic dynamics can be comprehended. We state and prove new sufficient conditions for the global asymptotic behaviour of the density evolution in these systems, and apply our results to many previously published models for the cell division cycle. In addition, we develop a new interpretation for the statistics of action potential production in excitable cells.
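
A notational sketch of what "global asymptotic behaviour of the density evolution" means in the standard Lasota-Mackey sense; the operator P and the limiting density f_* below are generic placeholders, not the paper's specific construction.

```latex
% Sketch (generic notation): the density f_n of the state at the n-th event
% evolves under a Markov (transfer) operator P on L^1, and P is
% asymptotically stable if there is a unique stationary density f_* that
% attracts every initial density.
\[
  f_{n+1} = P f_n , \qquad
  \lim_{n \to \infty} \bigl\| P^{n} f - f_{*} \bigr\|_{L^{1}} = 0
  \quad \text{for every initial density } f .
\]
```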


Biological Cybernetics | 1998

A neural network solution to the transverse patterning problem depends on repetition of the input code

Xiangbao Wu; Joanna Tyrcha; William B. Levy

Using computer simulations, this paper investigates how input codes affect a minimal computational model of the hippocampal region CA3. Because encoding context seems to be a function of the hippocampus, we have studied problems that require learning context for their solution. Here we study a hippocampally dependent, configural learning problem called transverse patterning. Previously, we showed that the network does not produce long local context codings when the sequential input patterns are orthogonal, and it fails to solve many context-dependent problems in such situations. Here we show that this need not be the case if we assume that the input changes more slowly than a processing interval. Stuttering, i.e., repeating inputs, allows the network to create long local context firings even for orthogonal inputs. With these long local context firings, the network is able to solve the transverse patterning problem. Without stuttering, transverse patterning is not learned. Because stuttering is so useful, we investigate the relationship between the stuttering repetition length and relative context length in a simple, idealized sequence prediction problem. The relative context length, defined as the average length of the local context codes divided by the stuttering length, interacts with activity levels and has an optimal stuttering repetition length. Moreover, the increase in average context length can reach this maximum without loss of relative capacity. Finally, we note that stuttering is an example of maintained or introduced redundancy that can improve neural computations.
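
As a small illustration of the quantity defined above, a sketch of the relative context length (mean local-context-code length divided by the stuttering length); the numbers are hypothetical.

```python
import numpy as np

def relative_context_length(context_code_lengths, stutter_length):
    """Relative context length as defined in the abstract: mean local
    context code length divided by the stuttering (repetition) length."""
    return float(np.mean(context_code_lengths)) / stutter_length

# Hypothetical local context codes lasting 6-12 time steps while each
# input is repeated (stuttered) 4 times:
lengths = [6, 8, 9, 12, 7, 10]
print(relative_context_length(lengths, stutter_length=4))   # ~2.17
```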


PLOS ONE | 2014

Plasticity in the Macromolecular-Scale Causal Networks of Cell Migration

John G. Lock; Mehrdad Jafari Mamaghani; Xiaowei Gong; Joanna Tyrcha; Staffan Strömblad

Heterogeneous and dynamic single cell migration behaviours arise from a complex multi-scale signalling network comprising both molecular components and macromolecular modules, among which cell-matrix adhesions and F-actin directly mediate migration. To date, the global wiring architecture characterizing this network remains poorly defined. It is also unclear whether such a wiring pattern may be stable and generalizable to different conditions, or plastic and context dependent. Here, synchronous imaging-based quantification of migration system organization, represented by 87 morphological and dynamic macromolecular module features, and migration system behaviour, i.e., migration speed, facilitated Granger causality analysis. We thereby leveraged natural cellular heterogeneity to begin mapping the directionally specific causal wiring between organizational and behavioural features of the cell migration system. This represents an important advance on commonly used correlative analyses that do not resolve causal directionality. We identified organizational features such as adhesion stability and adhesion F-actin content that, as anticipated, causally influenced cell migration speed. Strikingly, we also found that cell speed can exert causal influence over organizational features, including cell shape and adhesion complex location, thus revealing causality in directions contradictory to previous expectations. Importantly, by comparing unperturbed and signalling-modulated cells, we provide proof-of-principle that causal interaction patterns are in fact plastic and context dependent, rather than stable and generalizable.
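
A minimal sketch of a directional Granger-causality test of the kind the abstract describes, using statsmodels on synthetic series; the feature name and lag choice are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
T = 200
adhesion_stability = rng.normal(size=T)                      # hypothetical feature series
speed = 0.5 * np.roll(adhesion_stability, 1) + rng.normal(scale=0.5, size=T)

def granger_p(cause, effect, maxlag=2):
    """Smallest p-value of the F-test that `cause` Granger-causes `effect`."""
    data = np.column_stack([effect, cause])   # column order required by statsmodels
    res = grangercausalitytests(data, maxlag=maxlag)
    return min(res[lag][0]["ssr_ftest"][1] for lag in range(1, maxlag + 1))

# Test both directions, as in the abstract's directionally specific wiring:
print("feature -> speed:", granger_p(adhesion_stability, speed))
print("speed -> feature:", granger_p(speed, adhesion_stability))
```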


Journal of Statistical Mechanics: Theory and Experiment | 2013

The effect of nonstationarity on models inferred from neural data

Joanna Tyrcha; Yasser Roudi; Matteo Marsili; John Hertz

Neurons subject to a common nonstationary input may exhibit a correlated firing behavior. Correlations in the statistics of neural spike trains also arise as the effect of interaction between neurons. Here we show that these two situations can be distinguished with machine learning techniques, provided that the data are rich enough. In order to do this, we study the problem of inferring a kinetic Ising model, stationary or nonstationary, from the available data. We apply the inference procedure to two data sets: one from salamander retinal ganglion cells and the other from a realistic computational cortical network model. We show that many aspects of the concerted activity of the salamander retinal neurons can be traced simply to the external input. A model of non-interacting neurons subject to a nonstationary external field outperforms a model with stationary input with couplings between neurons, even accounting for the differences in the number of model parameters. When couplings are added to the nonstationary model, for the retinal data, little is gained: the inferred couplings are generally not significant. Likewise, the distribution of the sizes of sets of neurons that spike simultaneously and the frequency of spike patterns as a function of their rank (Zipf plots) are well explained by an independent-neuron model with time-dependent external input, and adding connections to such a model does not offer significant improvement. For the cortical model data, robust couplings, well correlated with the real connections, can be inferred using the nonstationary model. Adding connections to this model slightly improves the agreement with the data for the probability of synchronous spikes but hardly affects the Zipf plot.
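
A sketch of the independent-neuron half of the comparison: fitting a time-dependent external field to trial-averaged data under P(s_i(t)=+1) = (1 + tanh h_i(t))/2, with couplings omitted. The trial structure and the sinusoidal drive below are made up for illustration.

```python
import numpy as np

def fit_time_dependent_fields(S, eps=1e-3):
    """Fit an independent-neuron model with a time-dependent external field.

    S : array of shape (trials, T, N) with spins coded as +/-1.
    With no couplings and P(s_i(t)=+1) = (1 + tanh(h_i(t))) / 2, the
    maximum-likelihood field is h_i(t) = atanh(m_i(t)), where m_i(t) is
    the trial-averaged magnetization.
    """
    m = S.mean(axis=0)                   # shape (T, N)
    m = np.clip(m, -1 + eps, 1 - eps)    # keep atanh finite
    return np.arctanh(m)

# Toy usage with a hypothetical common sinusoidal drive:
rng = np.random.default_rng(2)
T, N, trials = 100, 10, 400
h_true = 0.8 * np.sin(np.linspace(0, 4 * np.pi, T))[:, None] * np.ones((T, N))
p = (1 + np.tanh(h_true)) / 2
S = np.where(rng.random((trials, T, N)) < p, 1, -1)
h_hat = fit_time_dependent_fields(S)
```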


BMC Neuroscience | 2010

Inferring network connectivity using kinetic Ising models

John Hertz; Yasser Roudi; Andreas Thorning; Joanna Tyrcha; Erik Aurell; Hong-Li Zeng

One approach that has been explored recently for analyzing functional connectivity involves parametrizing the spike pattern distribution by an Ising model with symmetric connectivity. However, the connections found using this procedure do not generally agree well with the true synaptic connectivity [1]. Here we try, instead, a kinetic Ising network with asymmetric connections [2], updated either asynchronously or synchronously. For these models, it is possible to derive an iterative algorithm for the optimal connection strengths by formally maximizing the log-likelihood of the measured history [2]. For weak coupling, one can expand to first order and derive an approximate closed-form expression for the connections that takes the form J = A D C^{-1}, where C is the equal-time correlation matrix, D is the correlation matrix with time lag one step, and A is a diagonal matrix with elements A_{ii} = 1/(1 - ⟨S_i⟩^2). We have tested the iterative algorithm on data generated by randomly connected, synchronously updated Ising networks for a range of sizes, firing rates, noise levels and concentrations of nonzero connections. In all cases, the rms reconstruction error falls off approximately like a -1/3 power of the length of the run used to generate the correlation statistics. The approximate formula gives qualitatively good results, enabling us to identify the strongest connections reliably, though the magnitudes obtained tend to be off by a scaling factor that depends on noise level and mean firing rate. These conclusions hold for the asynchronous model as well. Furthermore, if the inference is done on a subset of the neurons in the network, the connections among them can still be recovered approximately. This approximation becomes excellent in the limit of weak connection concentration, where it permits reliable identification of the nonzero connections. We applied the approximate formula to data from a realistic cortical network model [3]. Fig. 1 shows histograms of the connection strengths found in 30 randomly chosen sets of n = 50 inhibitory neurons, separated according to whether there actually was a synapse connecting the pair in question (blue) or not (green). If the cn(n-1) pairs with the most negative inferred couplings are identified as connected, with c the connection probability in the population, we find average false-positive and false-negative rates of 5.6% and 7.2%, respectively. To illustrate the point visually, Fig. 2 shows the actual connections in the original simulated network and Fig. 3 shows those identified by this procedure for a set of 25 neurons.
Figure 1: Histograms of connections. Figure 2: Connections in model. Figure 3: Inferred connections.
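
The approximate closed-form expression quoted above translates directly into a few lines of linear algebra. A sketch assuming synchronously binned ±1 data; this is the weak-coupling formula only, not the full iterative maximum-likelihood algorithm.

```python
import numpy as np

def approx_kinetic_couplings(S):
    """Weak-coupling estimate J = A D C^{-1} from the abstract.

    S : array of shape (T, N), spins +/-1, one row per synchronous time step.
    C : equal-time connected correlations, D : one-step time-lagged
    correlations, A : diag(1 / (1 - <S_i>^2)).
    """
    m = S.mean(axis=0)
    dS = S - m
    C = dS.T @ dS / len(S)                   # C_ij = <ds_i(t) ds_j(t)>
    D = dS[1:].T @ dS[:-1] / (len(S) - 1)    # D_ij = <ds_i(t+1) ds_j(t)>
    A = np.diag(1.0 / (1.0 - m**2))
    return A @ D @ np.linalg.inv(C)
```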


Journal of Statistical Mechanics: Theory and Experiment | 2015

Belief propagation and replicas for inference and learning in a kinetic Ising model with hidden spins

Claudia Battistin; John Hertz; Joanna Tyrcha; Yasser Roudi

We propose a new algorithm for inferring the state of hidden spins and reconstructing the connections in a synchronous kinetic Ising model, given the observed history. Focusing on the case in which the hidden spins are conditionally independent of each other given the state of observable spins, we show that calculating the likelihood of the data can be simplified by introducing a set of replicated auxiliary spins. Belief Propagation (BP) and Susceptibility Propagation (SusP) can then be used to infer the states of hidden variables and learn the couplings. We study the convergence and performance of this algorithm for networks with both Gaussian-distributed and binary bonds. We also study how the algorithm behaves as the fraction of hidden nodes and the amount of data are changed, showing that it outperforms the TAP equations for reconstructing the connections.
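
For orientation, a sketch of the synchronous kinetic Ising dynamics the algorithm targets, with part of the network treated as hidden. This only generates the setting; it does not implement the belief-propagation/replica inference itself, and the network sizes are arbitrary.

```python
import numpy as np

def simulate_kinetic_ising(J, h, T, rng):
    """Synchronous kinetic Ising dynamics: all spins update in parallel with
    P(s_i(t+1)=+1 | s(t)) = (1 + tanh(h_i + sum_j J_ij s_j(t))) / 2."""
    N = len(h)
    s = rng.choice([-1, 1], size=N)
    traj = np.empty((T, N), dtype=int)
    for t in range(T):
        fields = h + J @ s
        s = np.where(rng.random(N) < (1 + np.tanh(fields)) / 2, 1, -1)
        traj[t] = s
    return traj

# Gaussian-distributed couplings, with a fraction of the spins hidden:
rng = np.random.default_rng(3)
N, n_hidden, T = 20, 5, 2000
J = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))
traj = simulate_kinetic_ising(J, np.zeros(N), T, rng)
observed = traj[:, n_hidden:]   # the inference procedure sees only these columns
hidden = traj[:, :n_hidden]     # kept here only to check reconstructions
```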


BMC Bioinformatics | 2008

Mixture models for analysis of melting temperature data

Christoffer Nellåker; Fredrik Uhrzander; Joanna Tyrcha; Håkan Karlsson

Background: In addition to their use in detecting undesired real-time PCR products, melting temperatures are useful for detecting variations in the desired target sequences. Methodological improvements in recent years allow the generation of high-resolution melting-temperature (Tm) data. However, there is currently no convention on how to statistically analyze such high-resolution Tm data.
Results: Mixture model analysis was applied to Tm data. Models were selected based on Akaike's information criterion. Mixture model analysis correctly identified categories in Tm data obtained for known plasmid targets. Using simulated data, we investigated the number of observations required for model construction. The precision of the reported mixing proportions from data fitted to a preconstructed model was also evaluated.
Conclusion: Mixture model analysis of Tm data allows the minimum number of different sequences in a set of amplicons and their relative frequencies to be determined. This approach allows Tm data to be analyzed, classified, and compared in an unbiased manner.
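
A minimal modern stand-in for the analysis described above, using scikit-learn's Gaussian mixtures with AIC-based model selection; this is not the authors' implementation, and the Tm values are simulated.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_tm_mixture(tm_values, max_components=6):
    """Fit 1-D Gaussian mixtures to melting temperatures and pick the
    number of components by Akaike's information criterion."""
    X = np.asarray(tm_values, dtype=float).reshape(-1, 1)
    models = [GaussianMixture(n_components=k, random_state=0).fit(X)
              for k in range(1, max_components + 1)]
    best = min(models, key=lambda m: m.aic(X))
    return best, best.predict(X)    # fitted model and per-amplicon class labels

# Hypothetical Tm readings clustering around two product variants:
rng = np.random.default_rng(4)
tm = np.concatenate([rng.normal(82.1, 0.15, 60), rng.normal(84.3, 0.15, 40)])
model, labels = fit_tm_mixture(tm)
print(model.n_components, model.weights_)   # mixing proportions ~ relative frequencies
```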


Statistics | 1989

Multivariate gamma distributions - properties and shape estimation

Teresa Kowalczyk; Joanna Tyrcha

A sequence of suitably defined multivariate gamma distributions with decreasing skewness is proved to converge to the respective multivariate normal distribution. Other properties of multivariate gamma distributions are given, and the generation of pseudorandom numbers is presented. Parameter estimation is shown to reduce to the evaluation of the shape parameter in the univariate case. A new estimator of the shape parameter, based on the mode of the smallest sample observation, is proposed. A simulation study suggests that the estimator performs better than several other estimators in use.
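
A sketch of one standard shared-component construction of correlated gamma variables, included to illustrate the kind of multivariate gamma and pseudorandom generation discussed above; it is not necessarily the construction or the shape estimator used in the paper.

```python
import numpy as np

def correlated_gammas(shape_shared, shape_own, size, rng):
    """Shared-component construction of a bivariate gamma vector:
    X_k = G0 + G_k with independent unit-scale gammas, so each margin is
    gamma(shape_shared + shape_own) and the pair is positively correlated
    through G0. Illustrative only, not the paper's construction."""
    g0 = rng.gamma(shape_shared, size=size)
    x1 = g0 + rng.gamma(shape_own, size=size)
    x2 = g0 + rng.gamma(shape_own, size=size)
    return x1, x2

rng = np.random.default_rng(5)
x1, x2 = correlated_gammas(shape_shared=2.0, shape_own=3.0, size=100_000, rng=rng)
print(np.corrcoef(x1, x2)[0, 1])   # ~ 2 / (2 + 3) = 0.4
# The skewness of a gamma variable with shape p is 2 / sqrt(p), so it vanishes
# as the shape grows, consistent with the convergence to the normal law above.
```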

Collaboration


Dive into Joanna Tyrcha's collaborations.

Top Co-Authors

Yasser Roudi - Norwegian University of Science and Technology
Xiangbao Wu - University of Virginia
Andrzej Lasota - Polish Academy of Sciences
Erik Aurell - Royal Institute of Technology