Kooperativer Bibliotheksverbund Berlin-Brandenburg

  • 1
    Language: English
    In: BMC Bioinformatics, July 22, 2014, Vol.15(1)
    Description: Background: Network inference deals with the reconstruction of molecular networks from experimental data. Given N molecular species, the challenge is to find the underlying network. Due to data limitations, this is typically an ill-posed problem and requires the integration of prior biological knowledge or strong regularization. Here we focus on the situation in which time-resolved measurements of a system's response after systematic perturbations are available. Results: We present a novel method to infer signaling networks from time-course perturbation data. We utilize dynamic Bayesian networks with probabilistic Boolean threshold functions to describe protein activation. The model posterior distribution is analyzed using evolutionary MCMC sampling and subsequent clustering, resulting in probability distributions over alternative networks. We evaluate our method on simulated data and study its performance with respect to data set size and levels of noise. We then use our method to study EGF-mediated signaling in the ERBB pathway. Conclusions: Dynamic Probabilistic Threshold Networks is a new method to infer signaling networks from time-series perturbation data. It exploits the dynamic response of a system after external perturbation for network reconstruction. On simulated data, we show that the approach outperforms current state-of-the-art methods. On the ERBB data, our approach recovers a significant fraction of the known interactions and predicts novel mechanisms in the ERBB pathway.
    Keywords: Probability Distributions
    ISSN: 1471-2105
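The record above describes protein activation with probabilistic Boolean threshold functions inside a dynamic Bayesian network. Below is a minimal sketch of such an update rule, assuming a toy three-node cascade; the weights, thresholds and noise model are illustrative stand-ins, not the authors' implementation (which additionally infers the network itself via evolutionary MCMC sampling).

```python
# A minimal sketch of a dynamic Bayesian network step with probabilistic
# Boolean threshold functions: each protein switches on at time t+1 with
# high probability if the weighted sum of its parents' states at time t
# exceeds a threshold. All names and parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def step(state, weights, thresholds, p_true=0.9):
    """Advance all nodes one time step.

    state      : (N,) 0/1 vector of protein activities at time t
    weights    : (N, N) signed edge weights, weights[i, j] = effect of j on i
    thresholds : (N,) activation thresholds
    p_true     : probability the Boolean threshold rule is obeyed (noise model)
    """
    drive = weights @ state                       # summed parental input per node
    rule = (drive > thresholds).astype(int)       # deterministic threshold output
    follow = rng.random(state.shape[0]) < p_true  # nodes that obey the rule
    return np.where(follow, rule, 1 - rule)       # the rest flip (probabilistic part)

# Tiny 3-node cascade: node 0 activates node 1, node 1 activates node 2.
W = np.array([[0, 0, 0],
              [1, 0, 0],
              [0, 1, 0]], dtype=float)
theta = np.full(3, 0.5)
s = np.array([1, 0, 0])                           # perturbation: node 0 switched on
for t in range(4):
    s = step(s, W, theta)
    print(t + 1, s)
```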
  • 2
    Description: To cope with the complexity of large networks, a number of dimensionality reduction techniques for graphs have been developed. However, the extent to which information is lost or preserved when these techniques are employed has remained unclear. Here we develop a framework, based on algorithmic information theory, to quantify the extent to which information is preserved when network motif analysis, graph spectra and spectral sparsification methods are applied to over twenty different biological and artificial networks. We find that spectral sparsification is highly sensitive to high numbers of edge deletions, leading to significant inconsistencies, and that graph spectral methods are the most irregular, capturing algebraic information in a condensed fashion but largely losing the information content of the original networks. However, the approach shows that network motif analysis excels at preserving the relative algorithmic information content of a network, thereby validating and generalizing the remarkable fact that, despite their inherent combinatorial possibilities, local regularities preserve information to such an extent that essential properties are fully recoverable across different networks, allowing the family group to which they belong (e.g. genetic vs. social network) to be determined. Our algorithmic information methodology thus provides a rigorous framework enabling a fundamental assessment of, and comparison between, different data dimensionality reduction methods, thereby facilitating the identification and evaluation of the capabilities of old and new methods. Comment: 29 pages, 6 figures
    Keywords: Quantitative Biology - Molecular Networks ; Computer Science - Information Theory ; Quantitative Biology - Quantitative Methods
    Source: Cornell University
  • 3
    Language: English
    In: Physical review. E, July 2017, Vol.96(1-1), pp.012308
    Description: In estimating the complexity of objects, in particular of graphs, it is common practice to rely on graph- and information-theoretic measures. Here, using integer sequences with properties such as Borel normality, we explain how these measures are not independent of the way in which an object, such as a graph, can be described or observed. From observations that can reconstruct the same graph and are therefore essentially translations of the same description, we see that, when applying a computable measure such as Shannon entropy, not only is it necessary to preselect a feature of interest where there is one, and to make an arbitrary selection where there is not, but also more general properties, such as the causal likelihood of a graph as a measure (as opposed to randomness), can be largely misrepresented by computable measures such as entropy and entropy rate. We introduce recursive and nonrecursive (uncomputable) graphs and graph constructions based on these integer sequences, whose different lossless descriptions have disparate entropy values, thereby enabling the study and exploration of a measure's range of applications and demonstrating the weaknesses of computable measures of complexity.
    Keywords: Computer Science - Information Theory ; Computer Science - Computational Complexity ; Mathematics - Combinatorics ; F.1.3
    E-ISSN: 2470-0053
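The central point of the record above, that Shannon entropy is a property of a chosen description rather than of the object itself, can be illustrated in a few lines. The sketch below computes the entropy of two lossless descriptions of the same star graph (adjacency-matrix bits vs. degree sequence) and obtains different values; the graph and the descriptions are illustrative choices, not the constructions from the paper.

```python
# Two lossless descriptions of the same graph with different Shannon
# entropy values. Purely illustrative.
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits) of the empirical symbol distribution."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

N = 8                                    # star graph: node 0 linked to all others
adj_bits = []
for i in range(N):
    for j in range(N):
        adj_bits.append(1 if (i == 0) != (j == 0) else 0)
degrees = [N - 1] + [1] * (N - 1)        # same graph, described by its degrees

print("entropy of adjacency bits :", round(entropy(adj_bits), 3))
print("entropy of degree sequence:", round(entropy(degrees), 3))
```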
  • 4
    Description: We undertake an extensive numerical investigation of the graph spectra of thousands of regular graphs, a set of random Erdős–Rényi graphs, the two most popular types of complex networks, and an evolving genetic network, using novel conceptual and experimental tools. Our objective in so doing is to contribute to an understanding of the meaning of the eigenvalues of a graph relative to its topological and information-theoretic properties. We introduce a technique for identifying the most informative eigenvalues of evolving networks by comparing the behavior of their graph spectra to their algorithmic complexity. We suggest that these techniques can be extended to further investigate the behavior of evolving biological networks. In the extended version of this paper we apply these techniques to seven tissue-specific regulatory networks as a static example, and to the network of a naïve pluripotent immune cell in the process of differentiating towards a Th17 cell as an evolving example, finding the most and least informative eigenvalues at every stage. Comment: Forthcoming in 3rd International Work-Conference on Bioinformatics and Biomedical Engineering (IWBBIO), Lecture Notes in Bioinformatics, 2015
    Keywords: Computer Science - Information Theory ; Mathematics - Dynamical Systems ; Mathematics - Spectral Theory
    Source: Cornell University
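As a rough illustration of the comparison described in the record above, the sketch below computes the adjacency spectrum of an Erdős–Rényi graph alongside a crude compression-based proxy for its algorithmic complexity. zlib compression is only a stand-in for the algorithmic-probability estimators used in this line of work; the graph size and edge probability are illustrative assumptions.

```python
# Graph spectrum plus a compressed-size proxy for algorithmic complexity.
import zlib
import numpy as np

rng = np.random.default_rng(1)

def random_graph(n, p):
    """Erdős–Rényi G(n, p) adjacency matrix (symmetric, no self-loops)."""
    a = (rng.random((n, n)) < p).astype(np.uint8)
    a = np.triu(a, 1)
    return a + a.T

def spectrum(a):
    """Eigenvalues of the adjacency matrix, largest first."""
    return np.sort(np.linalg.eigvalsh(a.astype(float)))[::-1]

def compressed_size(a):
    """Lossless compressed size (bytes) of the packed adjacency bits."""
    return len(zlib.compress(np.packbits(a).tobytes(), 9))

g = random_graph(50, 0.1)
print("largest eigenvalues:", np.round(spectrum(g)[:3], 2))
print("compressed size (bytes):", compressed_size(g))
```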
  • 5
    Description: Without loss of generalisation to other systems, including possibly non-deterministic ones, we demonstrate the application of methods drawn from algorithmic information dynamics to the characterisation and classification of emergent and persistent patterns, motifs and colliding particles in Conway's Game of Life (GoL), a cellular automaton serving as a case study illustrating the way in which such ideas can be applied to a typical discrete dynamical system. We explore the issue of local observations of closed systems whose orbits may appear open because of inaccessibility to the global rules governing the overall system. We also investigate aspects of symmetry related to complexity in the distribution of patterns that occur with high frequency in GoL (which we thus call motifs) and analyse the distribution of these motifs with a view to tracking the changes in their algorithmic probability over time. We demonstrate how the tools introduced are an alternative to other computable measures that are unable to capture changes in emergent structures in evolving complex systems that are often too small or too subtle to be properly characterised by methods such as lossless compression and Shannon entropy. Comment: 18 pages + 1 sup page, 8 figures in total. Online complexity calculator: http://complexitycalculator.com/
    Keywords: Nonlinear Sciences - Cellular Automata And Lattice Gases ; Computer Science - Information Theory ; Mathematics - Dynamical Systems
    Source: Cornell University
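A compact sketch of the setting described in the record above: evolve Conway's Game of Life on a torus and count the frequency of local 3x3 patterns over time, the simplest version of tracking recurring motifs. The paper goes further and estimates the algorithmic probability of such motifs; the counting here is only an illustrative first step, and the grid size, density and window size are assumptions.

```python
# Game of Life on a torus with naive 3x3 motif counting.
from collections import Counter
import numpy as np

rng = np.random.default_rng(4)

def gol_step(grid):
    """One Game of Life update on a toroidal grid (numpy roll for neighbors)."""
    nbrs = sum(np.roll(np.roll(grid, i, 0), j, 1)
               for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(np.uint8)

def motif_counts(grid, k=3):
    """Count all k-by-k windows (as flattened bit tuples), with wrap-around."""
    n = grid.shape[0]
    return Counter(tuple(np.roll(np.roll(grid, -i, 0), -j, 1)[:k, :k].ravel())
                   for i in range(n) for j in range(n))

g = (rng.random((32, 32)) < 0.35).astype(np.uint8)
for t in range(30):
    g = gol_step(g)
for pattern, count in motif_counts(g).most_common(3):
    print(f"motif occurs {count}x, live cells in 3x3 window: {sum(pattern)}")
```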
  • 6
    In: Journal of Complex Networks, 2016, Vol. 4(3), pp.342-362
    Description: To cope with the complexity of large networks, a number of dimensionality reduction techniques for graphs have been developed. However, the extent to which information is lost or preserved when these techniques are employed has remained unclear. Here, we develop a framework, based on algorithmic information theory, to quantify the extent to which information is preserved when network motif analysis, graph spectra and spectral sparsification methods are applied to over 20 different biological and artificial networks. We find that spectral sparsification is highly sensitive to high numbers of edge deletions, leading to significant inconsistencies, and that graph spectral methods are the most irregular, capturing algebraic information in a condensed fashion but largely losing the information content of the original networks. However, the approach shows that network motif analysis excels at preserving the relative algorithmic information content of a network, thereby validating and generalizing the remarkable fact that, despite their inherent combinatorial possibilities, local regularities preserve information to such an extent that essential properties are fully recoverable across different networks, allowing the family group to which they belong (e.g. genetic vs. social network) to be determined. Our algorithmic information methodology thus provides a rigorous framework enabling a fundamental assessment of, and comparison between, different data dimensionality reduction methods, thereby facilitating the identification and evaluation of the capabilities of old and new methods.
    Keywords: Dimensionality Reduction Techniques ; Kolmogorov Complexity ; Network ; Graph Spectra ; Graph Motifs ; Graph Sparsification
    ISSN: 2051-1310
    E-ISSN: 2051-1329
    Source: Oxford University Press
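A rough sketch of the core measurement in the record above: quantify how much information a reduction step preserves by comparing lossless compressed sizes of the network before and after edge deletion. Compressed size is only a crude stand-in for the algorithmic-information estimates used in the paper, and the graph, deletion fractions and compressor are illustrative assumptions.

```python
# Information preservation under edge deletion, via compressed sizes.
import zlib
import numpy as np

rng = np.random.default_rng(2)

def compressed_size(a):
    """Lossless compressed size (bytes) of the packed adjacency bits."""
    return len(zlib.compress(np.packbits(a).tobytes(), 9))

def delete_edges(a, fraction):
    """Return a copy of adjacency matrix a with a fraction of edges removed."""
    b = a.copy()
    i, j = np.triu_indices_from(b, 1)
    edges = np.flatnonzero(b[i, j])                  # positions of existing edges
    drop = rng.choice(edges, size=int(fraction * edges.size), replace=False)
    b[i[drop], j[drop]] = 0
    b[j[drop], i[drop]] = 0                          # keep the matrix symmetric
    return b

a = (rng.random((60, 60)) < 0.15).astype(np.uint8)
a = np.triu(a, 1)
a = a + a.T
for f in (0.1, 0.5, 0.9):
    ratio = compressed_size(delete_edges(a, f)) / compressed_size(a)
    print(f"{int(f * 100)}% of edges deleted -> {ratio:.2f} of original compressed size")
```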
  • 7
    Language: English
    In: Entropy, 01 June 2019, Vol.21(6), p.560
    Description: The principle of maximum entropy (Maxent) is often used to obtain prior probability distributions, as a method of obtaining a Gibbs measure under some restriction that gives the probability that a system will be in a certain state compared to the rest of the elements in the distribution. Because classical entropy-based Maxent collapses cases, confounding all distinct degrees of randomness and pseudo-randomness, here we take into consideration the generative mechanism of the systems in the ensemble in order to separate objects that may comply with the principle under some restriction and whose entropy is maximal, but that may be generated recursively, from those that are actually algorithmically random, thereby offering a refinement of classical Maxent. We take advantage of a causal algorithmic calculus to derive a thermodynamic-like result based on how difficult it is to reprogram a computer code. Using the distinction between computable and algorithmic randomness, we quantify the cost in information loss associated with reprogramming. To illustrate this, we apply the algorithmic refinement of Maxent to graphs and introduce a Maximal Algorithmic Randomness Preferential Attachment (MARPA) algorithm, a generalisation of previous approaches. We discuss the practical implications of evaluating network randomness. Our analysis provides the insight that the reprogrammability asymmetry appears to originate from a non-monotonic relationship to algorithmic probability, and it motivates further analysis of the origin and consequences of these asymmetries, of reprogrammability, and of computation.
    Keywords: Second Law of Thermodynamics ; Reprogrammability ; Algorithmic Complexity ; Generative Mechanisms ; Deterministic Systems ; Algorithmic Randomness ; Principle of Maximum Entropy ; Maxent
    E-ISSN: 1099-4300
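The classical starting point referenced in the record above can be made concrete with a worked example: under an expected-energy constraint, the maximum-entropy distribution is the Gibbs measure p_i proportional to exp(-beta * E_i). The sketch below covers only this classical baseline with illustrative energies; the paper's algorithmic refinement, which separates pseudo-random from algorithmically random objects, is not implemented here.

```python
# Classical Maxent baseline: the Gibbs measure and its Shannon entropy.
import numpy as np

def gibbs(energies, beta):
    """Maximum-entropy distribution under an expected-energy constraint."""
    w = np.exp(-beta * np.asarray(energies, dtype=float))
    return w / w.sum()

E = [0.0, 1.0, 2.0, 4.0]          # illustrative state energies
for beta in (0.0, 1.0, 5.0):
    p = gibbs(E, beta)
    H = -(p * np.log2(p)).sum()   # entropy is maximal (uniform) at beta = 0
    print(f"beta={beta}: p={np.round(p, 3)}, entropy={H:.3f} bits")
```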
  • 8
    Description: We introduce a definition of algorithmic symmetry able to capture essential aspects of geometric symmetry. We review, study and apply a method for approximating the algorithmic complexity (also known as Kolmogorov-Chaitin complexity) of graphs and networks based on the concept of Algorithmic Probability (AP). AP is a concept (and method) capable of recursively enumerating all properties of a computable (causal) nature beyond statistical regularities. We explore the connections of algorithmic complexity, both theoretical and numerical, with geometric properties, mainly symmetry and topology, from an (algorithmic) information-theoretic perspective. We show that approximations to algorithmic complexity by lossless compression and an Algorithmic Probability-based method can characterize properties of polyominoes, polytopes, regular and quasi-regular polyhedra, as well as polyhedral networks, thereby demonstrating their profiling capabilities. Comment: 18 pages, 4 figures + Appendix (1 figure)
    Keywords: Computer Science - Computational Complexity ; Computer Science - Computational Geometry ; Computer Science - Discrete Mathematics ; Computer Science - Information Theory
    Source: Cornell University
  • 9
    Description: We survey and introduce concepts and tools located at the intersection of information theory and network biology. We show that Shannon's information entropy, compressibility and algorithmic complexity quantify different local and global aspects of synthetic and biological data. We show examples such as the emergence of giant components in Erdős–Rényi random graphs, and the recovery of topological properties from numerical kinetic properties simulating gene expression data. We provide exact theoretical calculations, numerical approximations and error estimations of entropy, algorithmic probability and Kolmogorov complexity for different types of graphs, characterizing their variant and invariant properties. We introduce formal definitions of complexity for both labeled and unlabeled graphs and prove that the Kolmogorov complexity of a labeled graph is a good approximation of its unlabeled Kolmogorov complexity and thus a robust definition of graph complexity. Comment: 28 pages. Forthcoming in the journal Seminars in Cell and Developmental Biology
    Keywords: Quantitative Biology - Molecular Networks ; Quantitative Biology - Quantitative Methods
    Source: Cornell University
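One of the examples mentioned in the record above, the emergence of a giant component in Erdős–Rényi random graphs, can be reproduced in a few lines. The sketch below tracks the size of the largest connected component as the mean degree crosses 1; the graph size and the union-find implementation are illustrative choices.

```python
# Giant component emergence in G(n, p) around mean degree 1.
import numpy as np

rng = np.random.default_rng(3)

def largest_component(n, p):
    """Size of the largest connected component of a G(n, p) sample."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)   # union the two components
    sizes = np.bincount([find(i) for i in range(n)])
    return sizes.max()

n = 200
for c in (0.5, 1.0, 1.5, 3.0):              # mean degree c, so p = c / n
    print(f"mean degree {c}: largest component = {largest_component(n, c / n)}")
```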
  • 10
    Language: English
    In: Seminars in Cell and Developmental Biology, March 2016, Vol.51, pp.44-52
    Description: Network inference is a rapidly advancing field, with new methods being proposed on a regular basis. Understanding the advantages and limitations of different network inference methods is key to their effective application in different circumstances. The common structural properties shared by diverse networks naturally pose a challenge when it comes to devising accurate inference methods, but surprisingly, there is a paucity of comparison and evaluation methods. Historically, every new methodology has only been tested against purpose-designed synthetic networks (with known true values) and validated real-world biological networks. In this paper we aim to assess the impact of taking into consideration aspects of topological and information content in the evaluation of the final accuracy of an inference procedure. Specifically, we compare the best inference methods, in both graph-theoretic and information-theoretic terms, for preserving topological properties and the original information content of synthetic and biological networks. New methods for performance comparison are introduced by borrowing ideas from gene set enrichment analysis and by applying concepts from algorithmic complexity. Experimental results show that no individual algorithm outperforms all others in all cases, and that the challenging and non-trivial nature of network inference is evident in the struggle of some of the algorithms to turn in a performance superior to random guesswork. Special care should therefore be taken to suit the method to the purpose at hand. Finally, we show that evaluations from data generated using different underlying topologies have different signatures that can be used to better choose a network reconstruction method.
    Keywords: Network Reverse Engineering ; Network Reconstruction ; Evaluation of Networks ; Information Content ; Shannon Entropy ; Algorithmic Complexity ; Biology
    ISSN: 1084-9521
    E-ISSN: 1096-3634
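The evaluation task in the record above reduces, in its simplest graph-theoretic form, to scoring an inferred edge set against a ground-truth edge set. The sketch below computes precision, recall and F1 over undirected edges; the paper's comparisons add information-theoretic and enrichment-style scores, which are not reproduced here, and the edge lists are illustrative.

```python
# Baseline evaluation of an inferred network against ground truth.
def edge_set(edges):
    """Normalize an undirected edge list to a set of sorted pairs."""
    return {tuple(sorted(e)) for e in edges}

def score(true_edges, inferred_edges):
    """Precision, recall and F1 over undirected edges."""
    t, g = edge_set(true_edges), edge_set(inferred_edges)
    tp = len(t & g)                              # correctly inferred edges
    precision = tp / len(g) if g else 0.0
    recall = tp / len(t) if t else 0.0
    f1 = 2 * precision * recall / (precision + recall) if tp else 0.0
    return precision, recall, f1

truth    = [(0, 1), (1, 2), (2, 3), (3, 0)]
inferred = [(0, 1), (1, 2), (1, 3)]              # one spurious, two missed edges
print("precision=%.2f recall=%.2f F1=%.2f" % score(truth, inferred))
```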