LIVIVO - The Search Portal for Life Sciences

Your last searches

  1. AU="Lu, Yulong"
  2. AU=Tan Z C
  3. AU="Kwatra, Shawn"

Search results

Results 1 - 10 of 15

  1. Book ; Online: Two-Scale Gradient Descent Ascent Dynamics Finds Mixed Nash Equilibria of Continuous Games

    Lu, Yulong

    A Mean-Field Perspective

    2022  

    Abstract Finding the mixed Nash equilibria (MNE) of a two-player zero-sum continuous game is an important and challenging problem in machine learning. A canonical algorithm for finding the MNE is the noisy gradient descent ascent method, which in the infinite-particle limit gives rise to the Mean-Field Gradient Descent Ascent (GDA) dynamics on the space of probability measures. In this paper, we first study the convergence of a two-scale Mean-Field GDA dynamics for finding the MNE of the entropy-regularized objective. More precisely, we show that for each finite temperature (or regularization parameter), the two-scale Mean-Field GDA with a suitable finite scale ratio converges exponentially to the unique MNE without assuming the convexity or concavity of the interaction potential. The key ingredient of our proof lies in the construction of new Lyapunov functions that dissipate exponentially along the Mean-Field GDA. We further study the simulated annealing of the Mean-Field GDA dynamics. We show that with a temperature schedule that decays logarithmically in time, the annealed Mean-Field GDA converges to the MNE of the original unregularized objective.

    Comment: A few typos and mistakes were fixed
    Keywords Mathematics - Optimization and Control ; Computer Science - Machine Learning ; Mathematics - Analysis of PDEs ; Mathematics - Probability ; Statistics - Machine Learning
    Subject code 519
    Publishing date 2022-12-16
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
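
    As a concrete illustration of the canonical algorithm named in the abstract, the sketch below simulates two-scale noisy gradient descent ascent with interacting particles. The interaction potential K(x, y) = sin(x)·cos(y), the hyperparameters, and the discretization are illustrative assumptions, not taken from the paper.

    ```python
    # A toy Euler-Maruyama discretization of two-scale noisy gradient descent
    # ascent (GDA) for the entropy-regularized game
    #     E(mu, nu) = E_{x~mu, y~nu}[K(x, y)] + beta^{-1} * (entropy terms),
    # with a HYPOTHETICAL potential K(x, y) = sin(x) * cos(y); none of these
    # choices come from the paper.
    import numpy as np

    rng = np.random.default_rng(0)
    N, beta, dt, ratio, steps = 200, 10.0, 1e-3, 5.0, 20_000

    x = rng.normal(size=N)  # particles for the min player (measure mu)
    y = rng.normal(size=N)  # particles for the max player (measure nu)

    def dK_dx(x, y):  # partial_x K(x, y) for K = sin(x) cos(y), pairwise
        return np.cos(x)[:, None] * np.cos(y)[None, :]

    def dK_dy(x, y):  # partial_y K(x, y), pairwise
        return -np.sin(x)[:, None] * np.sin(y)[None, :]

    for _ in range(steps):
        # mean-field drift: average the interaction gradient over the opponent
        gx = dK_dx(x, y).mean(axis=1)  # descent direction for each x_i
        gy = dK_dy(x, y).mean(axis=0)  # ascent direction for each y_j
        noise = np.sqrt(2 * dt / beta)
        x += -dt * gx + noise * rng.normal(size=N)
        # the ascent player runs on a faster time scale (finite scale ratio);
        # its diffusion is rescaled accordingly
        y += ratio * dt * gy + np.sqrt(ratio) * noise * rng.normal(size=N)
    ```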

  2. Book ; Online: Optimal Deep Neural Network Approximation for Korobov Functions with respect to Sobolev Norms

    Yang, Yahong / Lu, Yulong

    2023  

    Abstract This paper establishes the nearly optimal rate of approximation for deep neural networks (DNNs) when applied to Korobov functions, effectively overcoming the curse of dimensionality. The approximation results presented in this paper are measured with respect to $L_p$ norms and $H^1$ norms. Our approximation results achieve a remarkable "super-convergence" rate, outperforming traditional methods and any continuous function approximator. These results are non-asymptotic, providing error bounds that account for both the width and depth of the networks simultaneously.
    Keywords Mathematics - Numerical Analysis ; Computer Science - Machine Learning ; Statistics - Machine Learning ; 68Q25 ; 41A25 ; 41A46 ; 65D07
    Publishing date 2023-11-08
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
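
    For reference, one common definition of the Korobov spaces mentioned in the abstract is sketched below; conventions vary between references, so this is an assumed formulation rather than necessarily the one used in the paper.

    ```latex
    % One common definition of the Korobov space on the unit cube
    % (an assumed convention, not necessarily the paper's exact space):
    \[
      X^{2,p}\big([0,1]^d\big)
      = \Big\{ f \in L_p\big([0,1]^d\big) \,:\,
          f\big|_{\partial[0,1]^d} = 0,\;
          D^{\alpha} f \in L_p\big([0,1]^d\big)
          \ \text{for all}\ \|\alpha\|_\infty \le 2 \Big\}.
    \]
    % All mixed weak derivatives of order at most 2 in each coordinate lie
    % in L_p; this mixed-derivative regularity is what makes
    % dimension-robust approximation rates possible.
    ```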

  3. Article ; Online: Evolution of renal cyst to renal carcinoma: a case report and review of literature.

    Lu, Yulong / Hu, Jialin / Feng, Ninghan

    International journal of clinical and experimental pathology

    2021  Volume 14, Issue 4, Page(s) 463–468

    Abstract Background: Renal cysts are a common benign condition, and progression from a simple renal cyst to renal cell carcinoma is rare.
    Case presentation: A 62-year-old woman who had had a simple renal cyst for over 20 years complained of intermittent lumbar pain in the recent 2 years. At her latest admission, the cystic lesion displayed enhancement of the cystic wall on CT scan and a cystic to partially solid change on ultrasound, so we performed a partial nephrectomy and found that the lesion had undergone a cystic-to-solid transition. The pathology turned out to be renal clear cell carcinoma.
    Conclusions: Although malignant transformation of a renal cyst is a low-probability event, patients with a long history of a cyst, especially those with symptoms, should seek medical treatment in time, and, if necessary, lesion biopsy or resection may be considered.
    Language English
    Publishing date 2021-04-15
    Publishing country United States
    Document type Journal Article
    ZDB-ID 2418306-4
    ISSN (online) 1936-2625
    Database MEDical Literature Analysis and Retrieval System OnLINE

  4. Book ; Online: Transfer Learning Enhanced DeepONet for Long-Time Prediction of Evolution Equations

    Xu, Wuzhe / Lu, Yulong / Wang, Li

    2022  

    Abstract Deep operator network (DeepONet) has demonstrated great success in various learning tasks, including learning solution operators of partial differential equations. In particular, it provides an efficient approach to predicting evolution equations over a finite time horizon. Nevertheless, the vanilla DeepONet suffers from stability degradation in long-time prediction. This paper proposes a transfer-learning aided DeepONet to enhance the stability. Our idea is to use transfer learning to sequentially update the DeepONets as the surrogates for propagators learned in different time frames. The evolving DeepONets can better track the varying complexities of the evolution equations, while needing only efficient re-training of a tiny fraction of the operator networks. Through systematic experiments, we show that the proposed method not only improves the long-time accuracy of DeepONet while maintaining similar computational cost, but also substantially reduces the sample size of the training set.
    Keywords Computer Science - Machine Learning ; Mathematics - Numerical Analysis
    Subject code 006
    Publishing date 2022-12-08
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
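
    The sketch below illustrates the transfer-learning idea from the abstract: train a DeepONet-style surrogate on one time window, then update only a small sub-network for the next window. The architecture, data, and choice of which layers to re-train are hypothetical placeholders, not the authors' code.

    ```python
    # Sketch of the transfer-learning idea with a HYPOTHETICAL
    # DeepONet-style surrogate and synthetic placeholder data.
    import torch
    import torch.nn as nn

    class DeepONet(nn.Module):
        def __init__(self, n_sensors=64, width=128, p=64):
            super().__init__()
            # branch net encodes the input function at sensor locations
            self.branch = nn.Sequential(nn.Linear(n_sensors, width), nn.Tanh(),
                                        nn.Linear(width, p))
            # trunk net encodes the query coordinate
            self.trunk = nn.Sequential(nn.Linear(1, width), nn.Tanh(),
                                       nn.Linear(width, p))

        def forward(self, u, x):
            # u: (batch, n_sensors) sensor values; x: (batch, 1) query points
            return (self.branch(u) * self.trunk(x)).sum(dim=-1)

    def fit(model, params, u, x, target, steps=1000, lr=1e-3):
        opt = torch.optim.Adam(params, lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            loss = ((model(u, x) - target) ** 2).mean()
            loss.backward()
            opt.step()

    model = DeepONet()

    # window 1: full training of the propagator u(t_0) -> u(t_1) (fake data)
    u0, x1, u1 = torch.randn(256, 64), torch.rand(256, 1), torch.randn(256)
    fit(model, list(model.parameters()), u0, x1, u1)

    # window 2: transfer-learn by re-training only the last branch layer,
    # i.e. a tiny fraction of the weights, on the new window's data
    for p in model.parameters():
        p.requires_grad_(False)
    head = model.branch[-1]
    head.weight.requires_grad_(True)
    head.bias.requires_grad_(True)
    u1s, x2, u2 = torch.randn(256, 64), torch.rand(256, 1), torch.randn(256)
    fit(model, [head.weight, head.bias], u1s, x2, u2)
    ```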

  5. Book ; Online: A Universal Approximation Theorem of Deep Neural Networks for Expressing Probability Distributions

    Lu, Yulong / Lu, Jianfeng

    2020  

    Abstract This paper studies the universal approximation property of deep neural networks for representing probability distributions. Given a target distribution $\pi$ and a source distribution $p_z$ both defined on $\mathbb{R}^d$, we prove under some assumptions that there exists a deep neural network $g:\mathbb{R}^d\rightarrow \mathbb{R}$ with ReLU activation such that the push-forward measure $(\nabla g)_\# p_z$ of $p_z$ under the map $\nabla g$ is arbitrarily close to the target measure $\pi$. The closeness is measured by three classes of integral probability metrics between probability distributions: the $1$-Wasserstein distance, maximum mean discrepancy (MMD), and kernelized Stein discrepancy (KSD). We prove upper bounds for the size (width and depth) of the deep neural network in terms of the dimension $d$ and the approximation error $\varepsilon$ with respect to the three discrepancies. In particular, the size of the neural network can grow exponentially in $d$ when the $1$-Wasserstein distance is used as the discrepancy, whereas for both MMD and KSD the size of the neural network depends on $d$ at most polynomially. Our proof relies on convergence estimates of empirical measures under the aforementioned discrepancies and on semi-discrete optimal transport.

    Comment: Accepted in the Thirty-fourth Conference on Neural Information Processing Systems (NeurIPS 2020)
    Keywords Computer Science - Machine Learning ; Mathematics - Numerical Analysis ; Mathematics - Statistics Theory ; Statistics - Machine Learning
    Subject code 519
    Publishing date 2020-04-19
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
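
    A minimal sketch of the push-forward construction in the abstract: transport samples of p_z through the gradient map ∇g of a scalar network and measure closeness with a Gaussian-kernel MMD estimate. The network, target, and bandwidth are illustrative assumptions, not the paper's construction.

    ```python
    # Sketch of the theorem's push-forward map (ASSUMED setup): transport
    # z ~ p_z through (grad g)(z) and compare to target samples via a
    # Gaussian-kernel MMD estimate.
    import torch
    import torch.nn as nn

    d = 2
    g = nn.Sequential(nn.Linear(d, 128), nn.ReLU(), nn.Linear(128, 1))

    def push_forward(z):
        # compute (grad g)(z) by automatic differentiation; this realizes
        # the transport map behind (nabla g)_# p_z from the abstract
        z = z.clone().requires_grad_(True)
        return torch.autograd.grad(g(z).sum(), z, create_graph=True)[0]

    def mmd2(a, b, bw=1.0):
        # biased estimate of squared maximum mean discrepancy with a
        # Gaussian kernel of (assumed) bandwidth bw
        k = lambda u, v: torch.exp(-torch.cdist(u, v) ** 2 / (2 * bw ** 2))
        return k(a, a).mean() + k(b, b).mean() - 2 * k(a, b).mean()

    z = torch.randn(512, d)                   # source p_z = N(0, I)
    target = 0.5 * torch.randn(512, d) + 2.0  # hypothetical target samples
    loss = mmd2(push_forward(z), target)      # could be minimized over g
    ```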

  6. Article ; Online: An updated patent review of glutaminase inhibitors (2019-2022).

    Wang, Danni / Li, Xiaohong / Gong, Guangyue / Lu, Yulong / Guo, Ziming / Chen, Rui / Huang, Huidan / Li, Zhiyu / Bian, Jinlei

    Expert opinion on therapeutic patents

    2023  Volume 33, Issue 1, Page(s) 17–28

    Abstract Introduction: Kidney-type glutaminase (GLS1), a key enzyme controlling the hydrolysis of glutamine to glutamate to resolve the 'glutamine addiction' of cancer cells, has been shown to play a central role in supporting cancer growth and proliferation. Therefore, the inhibition of GLS1 as a novel cancer-treatment strategy is of great interest.
    Areas covered: This review covers recent patents (2019-present) involving GLS1 inhibitors, focusing on their chemical structures, molecular mechanisms of action, pharmacokinetic properties, and potential clinical applications.
    Expert opinion: Currently, despite significant efforts, the search for potent GLS1 inhibitors has not resulted in the development of compounds for therapeutic applications. Most recent patents and literature focus on the GLS1 inhibitors IPN60090 and DRP104, which have entered clinical trials. While other patent disclosures during this period have not generated any drug candidates, the clinical updates will inform the potential of these inhibitors as promising therapeutic agents, either as single agents or in combination interventions.
    MeSH term(s) Humans ; Glutamine ; Glutaminase ; Patents as Topic ; Enzyme Inhibitors/pharmacology ; Neoplasms
    Chemical Substances Glutamine (0RH81L854J) ; Glutaminase (EC 3.5.1.2) ; Enzyme Inhibitors
    Language English
    Publishing date 2023-02-02
    Publishing country England
    Document type Review ; Journal Article
    ZDB-ID 1186201-4
    ISSN (online) 1744-7674
    ISSN 0962-2594 ; 1354-3776
    DOI 10.1080/13543776.2023.2173573
    Database MEDical Literature Analysis and Retrieval System OnLINE

  7. Book ; Online: On the Representation of Solutions to Elliptic PDEs in Barron Spaces

    Chen, Ziang / Lu, Jianfeng / Lu, Yulong

    2021  

    Abstract Numerical solutions to high-dimensional partial differential equations (PDEs) based on neural networks have seen exciting developments. This paper derives complexity estimates of the solutions of $d$-dimensional second-order elliptic PDEs in the Barron space, that is, the set of functions admitting an integral representation by certain parametric ridge functions against a probability measure on the parameters. We prove, under some appropriate assumptions, that if the coefficients and the source term of the elliptic PDE lie in Barron spaces, then the solution of the PDE is $\epsilon$-close with respect to the $H^1$ norm to a Barron function. Moreover, we prove dimension-explicit bounds for the Barron norm of this approximate solution, depending at most polynomially on the dimension $d$ of the PDE. As a direct consequence of the complexity estimates, the solution of the PDE can be approximated on any bounded domain by a two-layer neural network with respect to the $H^1$ norm with a dimension-explicit convergence rate.
    Keywords Mathematics - Numerical Analysis ; Computer Science - Machine Learning ; Mathematics - Analysis of PDEs
    Subject code 515 ; 518
    Publishing date 2021-06-14
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
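
    For orientation, one standard integral representation of a Barron function, matching the description in the abstract, is sketched below; normalizations of the Barron norm differ between references, so treat this as an assumed convention.

    ```latex
    % A standard integral representation of a Barron function
    % (normalizations are an assumed convention, not the paper's):
    \[
      f(x) = \int_{\mathbb{R} \times \mathbb{R}^d \times \mathbb{R}}
               a\, \sigma\!\big( w^{\top} x + b \big)\,
               \rho(\mathrm{d}a\, \mathrm{d}w\, \mathrm{d}b),
    \]
    % where sigma is the ridge activation (e.g. ReLU) and rho is a
    % probability measure on the parameters (a, w, b). The Barron norm is
    % then an infimum over representing measures, e.g.
    \[
      \|f\|_{\mathcal{B}}
      = \inf_{\rho}\ \mathbb{E}_{\rho}\big[\, |a|\,(\|w\|_{1} + |b|)\, \big].
    \]
    ```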

  8. Book ; Online: A Priori Generalization Analysis of the Deep Ritz Method for Solving High Dimensional Elliptic Equations

    Lu, Jianfeng / Lu, Yulong / Wang, Min

    2021  

    Abstract This paper concerns the a priori generalization analysis of the Deep Ritz Method (DRM) [W. E and B. Yu, 2017], a popular neural-network-based method for solving high dimensional partial differential equations. We derive the generalization error bounds of two-layer neural networks in the framework of the DRM for solving two prototype elliptic PDEs: Poisson equation and static Schrödinger equation on the $d$-dimensional unit hypercube. Specifically, we prove that the convergence rates of generalization errors are independent of the dimension $d$, under the a priori assumption that the exact solutions of the PDEs lie in a suitable low-complexity space called spectral Barron space. Moreover, we give sufficient conditions on the forcing term and the potential function which guarantee that the solutions are spectral Barron functions. We achieve this by developing a new solution theory for the PDEs on the spectral Barron space, which can be viewed as an analog of the classical Sobolev regularity theory for PDEs.
    Keywords Mathematics - Numerical Analysis ; Computer Science - Machine Learning ; Mathematics - Analysis of PDEs ; Mathematics - Statistics Theory ; Statistics - Machine Learning
    Subject code 518
    Publishing date 2021-01-05
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
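
    The variational energies behind the Deep Ritz Method for the two model problems are sketched below in a standard formulation; the exact boundary treatment and normalization are assumptions here, not necessarily those of the paper.

    ```latex
    % Standard variational energies for the two model problems on
    % Omega = [0,1]^d (boundary treatment is an assumption here):
    \[
      \mathcal{E}_{\mathrm{Poisson}}(u)
        = \int_{\Omega} \Big( \tfrac{1}{2}\, |\nabla u|^{2} - f\, u \Big)\, \mathrm{d}x,
      \qquad
      \mathcal{E}_{\mathrm{Schr}}(u)
        = \int_{\Omega} \Big( \tfrac{1}{2}\, |\nabla u|^{2}
            + \tfrac{1}{2}\, V\, u^{2} - f\, u \Big)\, \mathrm{d}x.
    \]
    % DRM parametrizes u by a neural network and replaces the integrals by
    % Monte Carlo averages over uniform samples in Omega; the
    % generalization error is the gap between sampled and exact energies.
    ```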

  9. Article ; Online: Reduced PDZRN4 promotes breast cancer progression and predicts poor prognosis.

    Lu, Yu-Long / Yang, Xin / Liu, Ya-Kui

    International journal of clinical and experimental pathology

    2019  Volume 12, Issue 1, Page(s) 142–153

    Abstract Breast cancer (BC) is one of the most lethal types of cancer throughout the world due to its proliferation and invasion. PDZ domain containing ring finger 4 (PDZRN4) belongs to the LNX family, which has E3 ubiquitin ligase activity and is involved in the progression of cancer. However, the role of PDZRN4 in the progression of BC remains unknown. In the present study, the public database Oncomine was used to detect PDZRN4 expression for primary screening. BC tissues and matched normal tissues were collected for detection of expression and cohort analysis. BC cells were used for invasion and proliferation function tests.
    Language English
    Publishing date 2019-01-01
    Publishing country United States
    Document type Journal Article
    ZDB-ID 2418306-4
    ISSN (online) 1936-2625
    Database MEDical Literature Analysis and Retrieval System OnLINE

  10. Book ; Online: Accelerating Langevin Sampling with Birth-death

    Lu, Yulong / Lu, Jianfeng / Nolen, James

    2019  

    Abstract A fundamental problem in Bayesian inference and statistical machine learning is to efficiently sample from multimodal distributions. Due to metastability, multimodal distributions are difficult to sample using standard Markov chain Monte Carlo methods. We propose a new sampling algorithm based on a birth-death mechanism to accelerate the mixing of Langevin diffusion. Our algorithm is motivated by its mean field partial differential equation (PDE), which is a Fokker-Planck equation supplemented by a nonlocal birth-death term. This PDE can be viewed as a gradient flow of the Kullback-Leibler divergence with respect to the Wasserstein-Fisher-Rao metric. We prove that under some assumptions the asymptotic convergence rate of the nonlocal PDE is independent of the potential barrier, in contrast to the exponential dependence in the case of the Langevin diffusion. We illustrate the efficiency of the birth-death accelerated Langevin method through several analytical examples and numerical experiments.
    Keywords Statistics - Machine Learning ; Computer Science - Machine Learning ; Mathematics - Analysis of PDEs ; Mathematics - Statistics Theory
    Subject code 519
    Publishing date 2019-05-23
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
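
    A minimal sketch of one plausible discretization of the birth-death mechanism described in the abstract: Langevin particles augmented with kill/duplicate moves driven by log(ρ/π). The double-well target, kernel density estimator, bandwidth, and resampling rule are assumptions, not the paper's exact scheme.

    ```python
    # One PLAUSIBLE discretization of birth-death accelerated Langevin for
    # pi(x) ~ exp(-V(x)); the double-well V, kernel density estimator,
    # bandwidth, and resampling rule are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    N, dt, steps, bw = 500, 1e-2, 5000, 0.3

    V = lambda x: (x**2 - 1.0) ** 2        # double-well: modes near +1, -1
    dV = lambda x: 4.0 * x * (x**2 - 1.0)

    x = rng.normal(loc=-1.0, scale=0.1, size=N)  # start trapped in one mode

    for _ in range(steps):
        # overdamped Langevin step (Euler-Maruyama)
        x += -dV(x) * dt + np.sqrt(2 * dt) * rng.normal(size=N)

        # kernel density estimate of the current particle law rho
        diff = (x[:, None] - x[None, :]) / bw
        rho = np.exp(-0.5 * diff**2).mean(axis=1) / (bw * np.sqrt(2 * np.pi))

        # birth-death rate log(rho / pi), centered so mass is conserved
        beta = np.log(rho) + V(x)
        beta -= beta.mean()
        for i in range(N):
            if rng.random() < 1.0 - np.exp(-abs(beta[i]) * dt):
                j = rng.integers(N)
                if beta[i] > 0:
                    x[i] = x[j]  # kill particle i, clone a random survivor
                else:
                    x[j] = x[i]  # duplicate particle i over a random one
    ```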
