LIVIVO - The Search Portal for Life Sciences

Search results

Results 1 - 10 of 15

  1. Article ; Online: Accelerating network layouts using graph neural networks.

    Both, Csaba / Dehmamy, Nima / Yu, Rose / Barabási, Albert-László

    Nature communications

    2023  Volume 14, Issue 1, Page(s) 1560

    Abstract Graph layout algorithms used in network visualization represent the first and the most widely used tool to unveil the inner structure and the behavior of complex networks. Current network visualization software relies on the force-directed layout (FDL) algorithm, whose high computational complexity makes the visualization of large real networks computationally prohibitive and traps large graphs into high energy configurations, resulting in hard-to-interpret "hairball" layouts. Here we use Graph Neural Networks (GNN) to accelerate FDL, showing that deep learning can address both limitations of FDL: it offers a 10 to 100 fold improvement in speed while also yielding layouts which are more informative. We analytically derive the speedup offered by GNN, relating it to the number of outliers in the eigenspectrum of the adjacency matrix, predicting that GNNs are particularly effective for networks with communities and local regularities. Finally, we use GNN to generate a three-dimensional layout of the Internet, and introduce additional measures to assess the layout quality and its interpretability, exploring the algorithm's ability to separate communities and the link-length distribution. The novel use of deep neural networks can help accelerate other network-based optimization problems as well, with applications from reaction-diffusion systems to epidemics.
    Language English
    Publishing date 2023-03-21
    Publishing country England
    Document type Journal Article
    ZDB-ID 2553671-0
    ISSN (online) 2041-1723
    DOI 10.1038/s41467-023-37189-2
    Database MEDical Literature Analysis and Retrieval System OnLINE

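To make the reparametrization idea in this record concrete, below is a minimal, hypothetical PyTorch sketch: node coordinates are produced by a small message-passing network, and gradient descent on a simple force-directed (spring-electrical) energy runs through the network's weights instead of the raw positions. The energy, architecture, and all names are illustrative assumptions, not the authors' released code.

```python
import torch

def fdl_energy(pos, edges, k=1.0):
    """Simple spring-electrical (Fruchterman-Reingold-style) layout energy."""
    src, dst = edges
    attract = ((pos[src] - pos[dst]) ** 2).sum()       # springs on edges
    diff = pos[:, None, :] - pos[None, :, :]           # all pairwise displacements
    dist = diff.norm(dim=-1) + torch.eye(len(pos))     # pad diagonal to avoid 0-division
    repulse = (k ** 2 / dist).sum()                    # pairwise repulsion
    return attract + repulse

n, d = 50, 2
edges = torch.randint(0, n, (2, 200))                  # random sparse graph
A = torch.zeros(n, n)
A[edges[0], edges[1]] = 1.0
A[edges[1], edges[0]] = 1.0
A_hat = A / A.sum(1, keepdim=True).clamp(min=1)        # row-normalized propagation

X = torch.randn(n, 16)                                 # fixed random node features
W1 = (0.1 * torch.randn(16, 16)).requires_grad_()
W2 = (0.1 * torch.randn(16, d)).requires_grad_()

opt = torch.optim.Adam([W1, W2], lr=0.01)
for _ in range(500):
    H = torch.tanh(A_hat @ X @ W1)                     # one message-passing step
    pos = A_hat @ H @ W2                               # GNN output = 2-D coordinates
    loss = fdl_energy(pos, edges)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the GNN couples the coordinates of neighboring nodes, one parameter update moves whole communities coherently, which matches the abstract's prediction that the speedup is largest for networks with communities and local regularities.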

  2. Book ; Online: Latent Space Symmetry Discovery

    Yang, Jianke / Dehmamy, Nima / Walters, Robin / Yu, Rose

    2023  

    Abstract Equivariant neural networks require explicit knowledge of the symmetry group. Automatic symmetry discovery methods aim to relax this constraint and learn invariance and equivariance from data. However, existing symmetry discovery methods are limited to linear symmetries in their search space and cannot handle the complexity of symmetries in real-world, often high-dimensional data. We propose a novel generative model, Latent LieGAN (LaLiGAN), which can discover nonlinear symmetries from data. It learns a mapping from data to a latent space where the symmetries become linear and simultaneously discovers symmetries in the latent space. Theoretically, we show that our method can express any nonlinear symmetry under certain conditions. Experimentally, our method can capture the intrinsic symmetry in high-dimensional observations, which results in a well-structured latent space that is useful for other downstream tasks. We demonstrate the use cases for LaLiGAN in improving equation discovery and long-term forecasting for various dynamical systems.
    Keywords Computer Science - Machine Learning
    Subject code 006 ; 531
    Publishing date 2023-09-29
    Publishing country US
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

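A rough sketch of the latent-space construction this record describes, under loose assumptions: an autoencoder maps data to a latent space, a learned generator L acts there linearly as z -> exp(tL) z, and a discriminator pressures transformed-and-decoded samples to match the data distribution. Losses and architectures below are illustrative, not the paper's exact objective.

```python
import torch
import torch.nn as nn

d_x, d_z = 10, 4
enc = nn.Sequential(nn.Linear(d_x, 32), nn.Tanh(), nn.Linear(32, d_z))   # data -> latent
dec = nn.Sequential(nn.Linear(d_z, 32), nn.Tanh(), nn.Linear(32, d_x))   # latent -> data
L = nn.Parameter(torch.randn(d_z, d_z) * 0.1)    # learned Lie algebra generator
disc = nn.Sequential(nn.Linear(d_x, 32), nn.Tanh(), nn.Linear(32, 1))    # real vs transformed

def transform(x):
    """Encode, act linearly with a random group element exp(t*L), decode."""
    t = torch.randn(x.shape[0], 1, 1)            # random group parameter per sample
    g = torch.matrix_exp(t * L)                  # batch of group elements
    z = enc(x)
    return dec((g @ z.unsqueeze(-1)).squeeze(-1))

x = torch.randn(64, d_x)                         # stand-in for real observations
recon = ((dec(enc(x)) - x) ** 2).mean()          # autoencoding loss
adv = -disc(transform(x)).mean()                 # symmetry generator tries to fool disc
(recon + adv).backward()                         # one illustrative gradient step
```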

  3. Book ; Online: Generative Adversarial Symmetry Discovery

    Yang, Jianke / Walters, Robin / Dehmamy, Nima / Yu, Rose

    2023  

    Abstract Despite the success of equivariant neural networks in scientific applications, they require knowing the symmetry group a priori. However, it may be difficult to know which symmetry to use as an inductive bias in practice. Enforcing the wrong symmetry could even hurt the performance. In this paper, we propose a framework, LieGAN, to automatically discover equivariances from a dataset using a paradigm akin to generative adversarial training. Specifically, a generator learns a group of transformations applied to the data, which preserve the original distribution and fool the discriminator. LieGAN represents symmetry as an interpretable Lie algebra basis and can discover various symmetries such as the rotation group $\mathrm{SO}(n)$ and the restricted Lorentz group $\mathrm{SO}(1,3)^+$ in trajectory prediction and top-quark tagging tasks. The learned symmetry can also be readily used in several existing equivariant neural networks to improve accuracy and generalization in prediction.
    Keywords Computer Science - Machine Learning
    Subject code 531
    Publishing date 2023-01-31
    Publishing country US
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

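The adversarial recipe in this abstract can be sketched in a few lines, with the caveat that everything below is a hypothetical minimal form rather than the paper's implementation: the "generator" is a learned Lie algebra basis, and transformed samples exp(sum_i w_i L_i) x must fool a discriminator trained on the original data.

```python
import torch
import torch.nn as nn

n, c = 2, 1                                    # data dimension, number of generators
Ls = nn.Parameter(torch.randn(c, n, n) * 0.1)  # learned Lie algebra basis
disc = nn.Sequential(nn.Linear(n, 32), nn.ReLU(), nn.Linear(32, 1))

x = torch.randn(256, n)                        # e.g. samples from a rotation-invariant density
w = torch.randn(256, c, 1, 1)                  # random coefficients per sample
g = torch.matrix_exp((w * Ls).sum(dim=1))      # group elements exp(sum_i w_i L_i)
x_t = (g @ x.unsqueeze(-1)).squeeze(-1)        # transformed samples

# Non-saturating GAN objectives, sketched; training would alternate ascent on each.
d_obj = torch.log(torch.sigmoid(disc(x)) + 1e-8).mean() \
      + torch.log(1 - torch.sigmoid(disc(x_t.detach())) + 1e-8).mean()   # discriminator
g_obj = torch.log(torch.sigmoid(disc(x_t)) + 1e-8).mean()                # basis fools disc
```

If the data density really is rotation-invariant, the trained basis should converge (up to scale) to the skew-symmetric generator of $\mathrm{SO}(2)$.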

  4. Book ; Online: Symmetry Teleportation for Accelerated Optimization

    Zhao, Bo / Dehmamy, Nima / Walters, Robin / Yu, Rose

    2022  

    Abstract Existing gradient-based optimization methods update parameters locally, in a direction that minimizes the loss function. We study a different approach, symmetry teleportation, that allows parameters to travel a large distance on the loss level set, in order to improve the convergence speed in subsequent steps. Teleportation exploits symmetries in the loss landscape of optimization problems. We derive loss-invariant group actions for test functions in optimization and multi-layer neural networks, and prove a necessary condition for teleportation to improve convergence rate. We also show that our algorithm is closely related to second order methods. Experimentally, we show that teleportation improves the convergence speed of gradient descent and AdaGrad for several optimization problems including test functions, multi-layer regressions, and MNIST classification.

    Comment: 20 pages, 8 figures, NeurIPS 2022
    Keywords Computer Science - Machine Learning
    Subject code 510
    Publishing date 2022-05-21
    Publishing country US
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

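The level-set move described here admits a compact illustration. The sketch below uses a two-layer linear model, whose loss ||Y - U V X||^2 is invariant under U -> Ug, V -> g^-1 V, and performs a crude random search over that orbit for a point with larger gradient norm; the paper derives group actions and conditions analytically, so treat this purely as an assumption-laden toy.

```python
import torch

torch.manual_seed(0)
X, Y = torch.randn(5, 100), torch.randn(3, 100)          # regression data
U = torch.randn(3, 4, requires_grad=True)
V = torch.randn(4, 5, requires_grad=True)

def loss(U, V):
    return ((Y - U @ V @ X) ** 2).mean()                  # invariant under U->Ug, V->g^-1 V

def grad_norm(U, V):
    gU, gV = torch.autograd.grad(loss(U, V), (U, V))
    return gU.norm() ** 2 + gV.norm() ** 2

def teleport(U, V, trials=100, scale=0.5):
    """Crude random search over the symmetry orbit for a larger gradient."""
    best, bU, bV = grad_norm(U, V), U, V
    for _ in range(trials):
        g = torch.matrix_exp(scale * torch.randn(4, 4))   # random invertible group element
        U2 = (U @ g).detach().requires_grad_()
        V2 = (torch.linalg.inv(g) @ V).detach().requires_grad_()
        gn = grad_norm(U2, V2)
        if gn > best:
            best, bU, bV = gn, U2, V2
    return bU, bV

U, V = teleport(U, V)   # loss unchanged, gradient norm (weakly) increased
```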

  5. Book ; Online: Faster Optimization on Sparse Graphs via Neural Reparametrization

    Dehmamy, Nima / Both, Csaba / Long, Jianzhi / Yu, Rose

    2022  

    Abstract In mathematical optimization, second-order Newton's methods generally converge faster than first-order methods, but they require the inverse of the Hessian, hence are computationally expensive. However, we discover that on sparse graphs, graph neural networks (GNN) can implement an efficient Quasi-Newton method that can speed up optimization by a factor of 10-100x. Our method, neural reparametrization, modifies the optimization parameters as the output of a GNN to reshape the optimization landscape. Using a precomputed Hessian as the propagation rule, the GNN can effectively utilize the second-order information, reaching a similar effect as adaptive gradient methods. As our method solves optimization through architecture design, it can be used in conjunction with any optimizer, such as Adam and RMSProp. We show the application of our method on scientifically relevant problems including heat diffusion, synchronization and persistent homology.
    Keywords Computer Science - Machine Learning ; Mathematics - Optimization and Control ; Physics - Computational Physics
    Subject code 510
    Publishing date 2022-05-26
    Publishing country US
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

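A minimal sketch of the reparametrization trick this abstract outlines, under stated assumptions: rather than optimizing a graph signal phi directly, write phi as the output of a fixed propagation operator applied to free parameters, so the operator acts like a preconditioner. The Dirichlet-energy problem and the choice of propagation matrix below are illustrative guesses, not the paper's benchmarks.

```python
import torch

n = 100
A = torch.zeros(n, n)
idx = torch.arange(n)
A[idx, (idx + 1) % n] = 1.0
A[(idx + 1) % n, idx] = 1.0                      # ring graph
Lap = torch.diag(A.sum(1)) - A                   # graph Laplacian

def energy(phi):
    # Dirichlet energy with two pinned values so the minimum is nontrivial
    return phi @ Lap @ phi + 10 * ((phi[0] - 1) ** 2 + (phi[n // 2] + 1) ** 2)

P = torch.matrix_exp(-0.5 * Lap)                 # fixed, precomputed propagation rule
theta = torch.randn(n, requires_grad=True)       # free parameters
opt = torch.optim.Adam([theta], lr=0.05)
for _ in range(200):
    phi = P @ theta                              # reparametrized graph signal
    e = energy(phi)
    opt.zero_grad()
    e.backward()
    opt.step()
```

The gradient with respect to theta is P times the gradient with respect to phi, so the smoothing operator redistributes curvature much like a Quasi-Newton preconditioner, which is the effect the abstract describes.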

  6. Book ; Online: Symmetries, flat minima, and the conserved quantities of gradient flow

    Zhao, Bo / Ganev, Iordan / Walters, Robin / Yu, Rose / Dehmamy, Nima

    2022  

    Abstract Empirical studies of the loss landscape of deep networks have revealed that many local minima are connected through low-loss valleys. Ensemble models sampling different parts of a low-loss valley have reached SOTA performance. Yet, little is known about the theoretical origin of such valleys. We present a general framework for finding continuous symmetries in the parameter space, which carve out low-loss valleys. Importantly, we introduce a novel set of nonlinear, data-dependent symmetries for neural networks. These symmetries can transform a trained model such that it performs similarly on new samples. We then show that conserved quantities associated with linear symmetries can be used to define coordinates along low-loss valleys. The conserved quantities help reveal that using common initialization methods, gradient flow only explores a small part of the global minimum. By relating conserved quantities to convergence rate and sharpness of the minimum, we provide insights on how initialization impacts convergence and generalizability. We also find the nonlinear action to be viable for ensemble building to improve robustness under certain adversarial attacks.

    Comment: Preliminary version; comments welcome
    Keywords Computer Science - Machine Learning ; Mathematics - Representation Theory
    Subject code 006
    Publishing date 2022-10-31
    Publishing country US
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

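One conserved quantity of the kind this record studies can be checked numerically. For loss(U @ V), gradient flow gives dU/dt = -G V^T and dV/dt = -U^T G with G the gradient with respect to the product U V; both U^T U and V V^T then pick up the same time derivative, so Q = U^T U - V V^T is conserved along the flow. Small-step gradient descent approximates the flow, so Q should drift only at the order of the learning rate. The following is a self-contained check, not code from the paper.

```python
import torch

torch.manual_seed(0)
U = torch.randn(3, 4, requires_grad=True)
V = torch.randn(4, 5, requires_grad=True)
T = torch.randn(3, 5)                        # regression target

Q0 = (U.T @ U - V @ V.T).detach()            # conserved quantity at initialization
lr = 1e-3
for _ in range(1000):
    loss = ((U @ V - T) ** 2).sum()
    gU, gV = torch.autograd.grad(loss, (U, V))
    with torch.no_grad():
        U -= lr * gU
        V -= lr * gV

drift = (U.T @ U - V @ V.T - Q0).norm()      # ~0 up to O(lr) discretization error
print(float(drift))
```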

  7. Article ; Online: Systemic stress test model for shared portfolio networks.

    Vodenska, Irena / Dehmamy, Nima / Becker, Alexander P / Buldyrev, Sergey V / Havlin, Shlomo

    Scientific reports

    2021  Volume 11, Issue 1, Page(s) 3358

    Abstract We propose a dynamic model for systemic risk using a bipartite network of banks and assets in which the weight of links and node attributes vary over time. Using market data and bank asset holdings, we are able to estimate a single parameter as an indicator of the stability of the financial system. We apply the model to the European sovereign debt crisis and observe that the results closely match real-world events (e.g., the high risk of Greek sovereign bonds and the distress of Greek banks). Our model could become complementary to existing stress tests, incorporating the contribution of interconnectivity of the banks to systemic risk in time-dependent networks. Additionally, we propose an institutional systemic importance ranking, BankRank, for the financial institutions analyzed in this study to assess the contribution of individual banks to the overall systemic risk.
    Language English
    Publishing date 2021-02-08
    Publishing country England
    Document type Journal Article ; Research Support, U.S. Gov't, Non-P.H.S. ; Research Support, Non-U.S. Gov't
    ZDB-ID 2615211-3
    ISSN (online) 2045-2322
    DOI 10.1038/s41598-021-82904-y
    Database MEDical Literature Analysis and Retrieval System OnLINE

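The bipartite bank-asset dynamics can be caricatured with a short fire-sale loop: a price shock erodes bank equity, distressed banks liquidate holdings, and sales depress prices further. The linear price-impact rule and all parameters below are assumptions for illustration, not the paper's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_banks, n_assets = 10, 5
H = rng.uniform(0, 1, (n_banks, n_assets))   # holdings: units of asset j held by bank i
p = np.ones(n_assets)                        # asset prices (initially 1)
equity = H @ p * 0.1                         # assume a 10% equity buffer per bank
alpha = 0.05                                 # assumed linear price impact per unit sold

p[0] *= 0.7                                  # initial shock: asset 0 loses 30%
for _ in range(20):
    losses = H @ (1.0 - p)                   # mark-to-market loss per bank
    distress = np.clip(losses / (equity + 1e-9), 0.0, 1.0)
    sales = (distress[:, None] * H).sum(axis=0)   # distressed banks sell pro rata
    H *= 1.0 - distress[:, None]             # balance sheets shrink
    p = np.clip(p - alpha * sales, 0.01, None)    # sales depress prices further
```

Sweeping a stability parameter such as alpha and watching whether the cascade dies out or spreads is the toy analogue of the single stability indicator the abstract estimates from market data.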

  8. Article ; Online: Understanding the onset of hot streaks across artistic, cultural, and scientific careers.

    Liu, Lu / Dehmamy, Nima / Chown, Jillian / Giles, C Lee / Wang, Dashun

    Nature communications

    2021  Volume 12, Issue 1, Page(s) 5392

    Abstract Across a range of creative domains, individual careers are characterized by hot streaks, which are bursts of high-impact works clustered together in close succession. Yet it remains unclear if there are any regularities underlying the beginning of hot streaks. Here, we analyze career histories of artists, film directors, and scientists, and develop deep learning and network science methods to build high-dimensional representations of their creative outputs. We find that across all three domains, individuals tend to explore diverse styles or topics before their hot streak, but become notably more focused after the hot streak begins. Crucially, hot streaks appear to be associated with neither exploration nor exploitation behavior in isolation, but a particular sequence of exploration followed by exploitation, where the transition from exploration to exploitation closely traces the onset of a hot streak. Overall, these results may have implications for identifying and nurturing talents across a wide range of creative domains.
    Language English
    Publishing date 2021-09-13
    Publishing country England
    Document type Journal Article ; Research Support, U.S. Gov't, Non-P.H.S.
    ZDB-ID 2553671-0
    ISSN (online) 2041-1723
    DOI 10.1038/s41467-021-25477-8
    Database MEDical Literature Analysis and Retrieval System OnLINE

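The exploration-to-exploitation signature reported here is easy to illustrate on synthetic data: topic diversity (entropy of style labels) should be high before the hot-streak onset and low after. The paper builds its representations with deep learning and network science; the toy below just uses integer labels.

```python
import numpy as np

rng = np.random.default_rng(1)
onset = 50
pre = rng.integers(0, 10, size=onset)                 # exploring ten styles
post = rng.choice([2, 3], size=50, p=[0.8, 0.2])      # exploiting mostly one style
career = np.concatenate([pre, post])                  # synthetic career of 100 works

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    q = counts / counts.sum()
    return float(-(q * np.log(q)).sum())

print(entropy(career[:onset]), entropy(career[onset:]))   # high, then low diversity
```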

  9. Article ; Online: A structural transition in physical networks.

    Dehmamy, Nima / Milanlouei, Soodabeh / Barabási, Albert-László

    Nature

    2018  Volume 563, Issue 7733, Page(s) 676–680

    Abstract In many physical networks, including neurons in the brain ...
    MeSH term(s) Algorithms ; Animals ; Axons/physiology ; Brain/anatomy & histology ; Brain/cytology ; Brain/physiology ; Friction ; Gels/chemistry ; Mammals/anatomy & histology ; Models, Biological ; Models, Structural ; Nerve Net/anatomy & histology ; Nerve Net/cytology ; Nerve Net/physiology ; Printing, Three-Dimensional ; Stress, Mechanical
    Chemical Substances Gels
    Language English
    Publishing date 2018-11-28
    Publishing country England
    Document type Journal Article ; Research Support, N.I.H., Extramural ; Research Support, Non-U.S. Gov't ; Research Support, U.S. Gov't, Non-P.H.S.
    ZDB-ID 120714-3
    ISSN (online) 1476-4687
    ISSN (print) 0028-0836
    DOI 10.1038/s41586-018-0726-6
    Database MEDical Literature Analysis and Retrieval System OnLINE


  10. Book ; Online: Automatic Symmetry Discovery with Lie Algebra Convolutional Network

    Dehmamy, Nima / Walters, Robin / Liu, Yanchen / Wang, Dashun / Yu, Rose

    2021  

    Abstract Existing equivariant neural networks require prior knowledge of the symmetry group and discretization for continuous groups. We propose to work with Lie algebras (infinitesimal generators) instead of Lie groups. Our model, the Lie algebra convolutional network (L-conv), can automatically discover symmetries and does not require discretization of the group. We show that L-conv can serve as a building block to construct any group equivariant feedforward architecture. Both CNNs and Graph Convolutional Networks can be expressed as L-conv with appropriate groups. We discover direct connections between L-conv and physics: (1) group invariant loss generalizes field theory, (2) the Euler-Lagrange equation measures robustness, and (3) equivariance leads to conservation laws and Noether currents. These connections open up new avenues for designing more general equivariant networks and applying them to important problems in physical sciences.
    Keywords Computer Science - Machine Learning ; Computer Science - Artificial Intelligence ; Mathematics - Group Theory
    Subject code 512
    Publishing date 2021-09-15
    Publishing country US
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

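An L-conv-style layer can be sketched as acting with learned combinations of infinitesimal generators, h = W0 f + W1 (L f), rather than convolving over a discretized group. Below, the single generator is a fixed finite-difference d/dx for 1-D translations; in the paper the generators themselves can be learned, so the shapes and names here are assumptions.

```python
import torch
import torch.nn as nn

n = 64                                           # grid points of a 1-D signal
# central-difference d/dx: a fixed generator of 1-D translations
Li = (torch.diag(torch.ones(n - 1), 1) - torch.diag(torch.ones(n - 1), -1)) / 2

class LConv(nn.Module):
    """One L-conv-style layer: h = W0 f + W1 (Li f)."""
    def __init__(self, c_in, c_out):
        super().__init__()
        self.W0 = nn.Linear(c_in, c_out, bias=False)
        self.W1 = nn.Linear(c_in, c_out, bias=False)

    def forward(self, f):                        # f: (batch, n, c_in)
        return self.W0(f) + self.W1(Li @ f)

f = torch.randn(8, n, 3)                         # batch of 3-channel signals
layer = LConv(3, 16)
print(layer(f).shape)                            # torch.Size([8, 64, 16])
```

Stacking such layers with nonlinearities gives the "I + epsilon L" building block the abstract says can express CNNs (translation generators) and graph convolutions (adjacency-based generators).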
