LIVIVO - The Search Portal for Life Sciences

Search results

Results 1 - 10 of 38

  1. Article ; Online: Innovative Practice of Music Education in the Universities in the Context of 5G Network.

    Huang, Shaohan

    Publication status: RETRACTED

    Computational intelligence and neuroscience

    2022  Volume 2022, Page(s) 3451422

    Abstract Music education is among the most significant subjects covered in providing high-quality education in Chinese universities and colleges. It is critical to providing high-quality education to students and contributes significantly to the development of students' creative motivation, inventive capacity, and personality. Music education provides excellent outcomes in music instruction and fosters students' original thinking and comprehensive abilities, and therefore supports the overall development of high-quality education. With the development of the educational system, it is becoming more vital to teach students a high level of musical literacy. With the advent of 5G mobile communication, 5G will become one of the core technologies in Chinese music education, providing an innovative framework for music education. In this study, a novel music education model is proposed using the GTZAN dataset, which comprises 100 samples for each of ten music genres. The dataset is normalized to prepare it for further processing, and the characteristics of each song are retrieved using spectrum-based feature extraction (SBF). Bidirectional recurrent neural networks (Bi-RNN) are used for classification. An improved TCP congestion control algorithm (ITCCA) is proposed for efficient data transmission over 5G networks, and the honey bee optimization algorithm is employed to optimize the performance of the transmission protocol. The performance of the proposed model is examined and contrasted with that of currently used approaches; the proposed model shows high performance in terms of throughput, average delay, and packet delivery ratio. The model has the potential to integrate 5G technologies and music education and to provide students with rich and diversified teaching materials and flexible instructional formats.
    MeSH term(s) Animals ; Humans ; Music ; Students ; Universities
    Language English
    Publishing date 2022-06-14
    Publishing country United States
    Document type Journal Article ; Retracted Publication
    ZDB-ID 2388208-6
    ISSN (online) 1687-5273
    DOI 10.1155/2022/3451422
    Database MEDLINE (Medical Literature Analysis and Retrieval System Online)
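
    As a hedged illustration only: the abstract above describes a pipeline of spectrum-based feature extraction followed by a bidirectional RNN classifier over the ten GTZAN genres. The sketch below shows what such a generic pipeline can look like; it is not the retracted paper's implementation, and the use of torchaudio, the hyperparameters, and the layer sizes are all assumptions.

```python
# Generic sketch (assumptions noted above): log-mel spectrogram features
# feeding a bidirectional GRU classifier over the 10 GTZAN genres.
import torch
import torch.nn as nn
import torchaudio

N_GENRES = 10          # GTZAN defines 10 genres, 100 clips each
SAMPLE_RATE = 22050    # GTZAN clips are 30 s at 22.05 kHz

# Spectrum-based feature extraction: waveform -> log-mel spectrogram.
melspec = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=2048, hop_length=512, n_mels=64
)

class BiRNNClassifier(nn.Module):
    def __init__(self, n_mels=64, hidden=128, n_classes=N_GENRES):
        super().__init__()
        self.rnn = nn.GRU(n_mels, hidden, num_layers=2,
                          batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, spec):               # spec: (batch, n_mels, frames)
        x = spec.transpose(1, 2)           # -> (batch, frames, n_mels)
        out, _ = self.rnn(x)               # bidirectional GRU over time
        return self.head(out.mean(dim=1))  # average over time, then classify

# Usage with a dummy 30-second clip (replace with real GTZAN audio).
wave = torch.randn(1, SAMPLE_RATE * 30)
features = torch.log1p(melspec(wave))      # (1, 64, frames)
logits = BiRNNClassifier()(features)       # (1, 10) genre logits
```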

  2. Article ; Online: Clinical and genetic analysis of pseudohypoparathyroidism complicated by hypokalemia: a case report and review of the literature.

    Huang, Shaohan / He, Yingzi / Lin, Xihua / Sun, Shuiya / Zheng, Fenping

    BMC endocrine disorders

    2022  Volume 22, Issue 1, Page(s) 98

    Abstract Background: Pseudohypoparathyroidism (PHP) encompasses a highly heterogeneous group of disorders, characterized by parathyroid hormone (PTH) resistance caused by mutations in the GNAS gene or other upstream targets. Here, we investigate the characteristics of a female patient diagnosed with PHP complicated with hypokalemia, and her family members.
    Case presentation and gene analysis: A 27-year-old female patient occasionally exhibited asymptomatic hypocalcemia and hypokalemia during her pregnancy 1 year ago. Seven months after delivery, she experienced tetany and dysphonia with diarrhea. Tetany symptoms were relieved after intravenous calcium gluconate supplementation, and she was then transferred to our hospital. Laboratory assessments of the patient revealed hypokalemia, hypocalcemia and hyperphosphatemia despite elevated PTH levels. CT scanning of the brain revealed globus pallidus calcification. Possible mutations in GNAS and hypokalemia-related genes were identified using whole-exome sequencing (WES), exon copies of STX16 were analyzed by multiplex ligation-dependent probe amplification (MLPA), and the methylation status of GNAS in three differentially methylated regions (DMRs) was analyzed by methylation-specific polymerase chain reaction, followed by confirmation with gene sequencing. The patient was clinically diagnosed with PHP-1b. Loss of methylation in the A/B region and hypermethylation in the NESP55 region were detected. No other mutations in GNAS or hypokalemia-related genes and no deletions of STX16 exons were detected. A negative family history and abnormal DMRs in GNAS led to a diagnosis of sporadic PHP-1b in the patient.
    Conclusions: Hypokalemia is a rare disorder associated with PHP-1b. Analysis of genetic and epigenetic mutations can aid in the diagnosis and accurate subtyping of PHP.
    MeSH term(s) Adult ; Chromogranins/genetics ; Female ; GTP-Binding Protein alpha Subunits, Gs/genetics ; Humans ; Hypocalcemia/genetics ; Hypokalemia/genetics ; Pseudohypoparathyroidism/complications ; Pseudohypoparathyroidism/diagnosis ; Pseudohypoparathyroidism/genetics ; Tetany
    Chemical Substances Chromogranins ; GTP-Binding Protein alpha Subunits, Gs (EC 3.6.5.1)
    Language English
    Publishing date 2022-04-11
    Publishing country England
    Document type Case Reports ; Journal Article ; Review
    ZDB-ID 2091323-0
    ISSN (online) 1472-6823
    DOI 10.1186/s12902-022-01011-9
    Database MEDLINE (Medical Literature Analysis and Retrieval System Online)

  3. Book ; Online: MoEC

    Xie, Yuan / Huang, Shaohan / Chen, Tianyu / Wei, Furu

    Mixture of Expert Clusters

    2022  

    Abstract Sparse Mixture of Experts (MoE) has received great interest due to its promising scaling capability with affordable computational overhead. MoE converts dense layers into sparse experts and utilizes a gated routing network to make experts conditionally activated. However, as the number of experts grows, MoE models with outrageously large parameter counts suffer from overfitting and sparse data allocation. Such problems are especially severe on tasks with limited data, hindering MoE models from improving performance by scaling up. In this work, we propose Mixture of Expert Clusters - a general approach to enable expert layers to learn more diverse and appropriate knowledge by imposing variance-based constraints on the routing stage. We further propose a cluster-level expert dropout strategy specifically designed for the expert cluster structure. Our experiments reveal that MoEC can improve performance on machine translation and natural language understanding tasks and raise the performance upper bound for scaling up experts under limited data. We also verify that MoEC plays a positive role in mitigating overfitting and sparse data allocation.
    Keywords Computer Science - Computation and Language ; Computer Science - Machine Learning
    Subject code 004
    Publishing date 2022-07-19
    Publishing country United States
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
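
    A minimal sketch of the core idea summarized in the abstract above: a top-1 gated mixture-of-experts feed-forward layer whose experts are grouped into clusters, with a variance-based auxiliary term computed over the routing probabilities of each cluster. The cluster layout, the exact form of the constraint, and all sizes are illustrative assumptions, not the MoEC implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ClusteredMoELayer(nn.Module):
    """Top-1 gated MoE feed-forward layer with experts grouped into clusters."""
    def __init__(self, d_model=256, n_experts=8, cluster_size=4):
        super().__init__()
        assert n_experts % cluster_size == 0
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts))
        self.router = nn.Linear(d_model, n_experts)
        self.cluster_size = cluster_size

    def forward(self, x):                          # x: (tokens, d_model)
        probs = F.softmax(self.router(x), dim=-1)  # routing probabilities
        top1 = probs.argmax(dim=-1)                # hard top-1 assignment
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top1 == i
            if mask.any():
                gate = probs[mask, i].unsqueeze(-1)   # keep router gradients
                out[mask] = gate * expert(x[mask])
        # Variance-based routing constraint (one illustrative formulation):
        # an auxiliary term over each cluster's routing probabilities, to be
        # scaled and added to the task loss.
        per_cluster = probs.view(probs.size(0), -1, self.cluster_size)
        aux_loss = per_cluster.var(dim=-1).mean()
        return out, aux_loss

layer = ClusteredMoELayer()
y, aux = layer(torch.randn(16, 256))               # y: (16, 256), aux: scalar
```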

  4. Article ; Online: Canagliflozin ameliorates the development of NAFLD by preventing NLRP3-mediated pyroptosis through FGF21-ERK1/2 pathway.

    Huang, Shaohan / Wu, Beibei / He, Yingzi / Qiu, Ruojun / Yang, Tian / Wang, Shuo / Lei, Yongzhen / Li, Hong / Zheng, Fenping

    Hepatology communications

    2023  Volume 7, Issue 3, Page(s) e0045

    Abstract Recent studies have suggested that sodium-glucose co-transporter 2 (SGLT2) inhibitors go beyond their glycemic advantages to ameliorate the development of NAFLD. However, little research has been done on the underlying mechanisms. Here, we investigated the effect of canagliflozin (CANA), an SGLT2 inhibitor, on the progression of NAFLD and explored the molecular mechanisms. Our findings showed that CANA-treated ob/ob and diabetic mice developed improved glucose and insulin tolerance, although their body weights were comparable to or even increased compared with the controls. The CANA treatment ameliorated hepatic steatosis and lipid accumulation of free fatty acid-treated AML12 cells, accompanied by decreased lipogenic gene expression and increased fatty acid β-oxidation-related gene expression. Furthermore, inflammation and fibrosis genes decreased in the livers of CANA-treated ob/ob and diabetic mice. FGF21 and its downstream ERK1/2/AMPK signaling decreased, whereas NLRP3-mediated pyroptosis increased, in the livers of ob/ob and diabetic mice, and this was reversed by the CANA treatment. In addition, blocking FGF21 or ERK1/2 activity antagonized the effects of CANA on NLRP3-mediated pyroptosis in lipopolysaccharide plus nigericin-treated J774A.1 cells. We conclude that CANA treatment alleviated insulin resistance and the progression of NAFLD in ob/ob and diabetic mice independent of body weight change. CANA protected against the progression of NAFLD by inhibiting NLRP3-mediated pyroptosis and enhancing FGF21-ERK1/2 pathway activity in the liver. These findings suggest the therapeutic potential of SGLT2 inhibitors in the treatment of NAFLD.
    MeSH term(s) Mice ; Animals ; Non-alcoholic Fatty Liver Disease/drug therapy ; Non-alcoholic Fatty Liver Disease/complications ; Canagliflozin/pharmacology ; Canagliflozin/therapeutic use ; NLR Family, Pyrin Domain-Containing 3 Protein/genetics ; NLR Family, Pyrin Domain-Containing 3 Protein/metabolism ; Diabetes Mellitus, Experimental/complications ; Diabetes Mellitus, Experimental/drug therapy ; MAP Kinase Signaling System ; Pyroptosis ; Blood Glucose/metabolism ; Insulin ; Glucose ; Sodium
    Chemical Substances Canagliflozin (0SAC974Z85) ; fibroblast growth factor 21 ; NLR Family, Pyrin Domain-Containing 3 Protein ; Blood Glucose ; Insulin ; Glucose (IY9XDZ35W2) ; Sodium (9NEZ333N27) ; Nlrp3 protein, mouse
    Language English
    Publishing date 2023-02-09
    Publishing country United States
    Document type Journal Article ; Research Support, Non-U.S. Gov't
    ISSN (online) 2471-254X
    DOI 10.1097/HC9.0000000000000045
    Database MEDLINE (Medical Literature Analysis and Retrieval System Online)

  5. Book ; Online: Learning Music Sequence Representation from Text Supervision

    Chen, Tianyu / Xie, Yuan / Zhang, Shuai / Huang, Shaohan / Zhou, Haoyi / Li, Jianxin

    2023  

    Abstract Music representation learning is notoriously difficult because of the complex human-related concepts contained in the sequence of numerical signals. To excavate better MUsic SEquence Representation from labeled audio, we propose a novel text-supervision pre-training method, namely MUSER. MUSER adopts an audio-spectrum-text tri-modal contrastive learning framework, where the text input can be any form of meta-data with the help of text templates, while the spectrum is derived from an audio sequence. Our experiments reveal that MUSER can be more flexibly adapted to downstream tasks compared with current data-hungry pre-training methods, and it requires only 0.056% of pre-training data to achieve state-of-the-art performance.
    Keywords Computer Science - Sound ; Computer Science - Machine Learning ; Electrical Engineering and Systems Science - Audio and Speech Processing
    Publishing date 2023-05-31
    Publishing country United States
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
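
    A minimal sketch of the tri-modal contrastive idea described above: pairwise InfoNCE terms pull matching audio, spectrum, and text embeddings together across a batch. The stand-in encoders (random embeddings here) and the symmetric InfoNCE formulation are assumptions for illustration; this is not the MUSER implementation.

```python
import torch
import torch.nn.functional as F

def info_nce(a, b, temperature=0.07):
    """Symmetric InfoNCE between two batches of L2-normalized embeddings."""
    a, b = F.normalize(a, dim=-1), F.normalize(b, dim=-1)
    logits = a @ b.t() / temperature            # (batch, batch) similarities
    targets = torch.arange(a.size(0))           # matching pairs on the diagonal
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

batch, dim = 8, 128
audio_emb    = torch.randn(batch, dim)   # stand-in for an audio-sequence encoder
spectrum_emb = torch.randn(batch, dim)   # stand-in for a spectrogram encoder
text_emb     = torch.randn(batch, dim)   # stand-in for a text encoder fed templated meta-data

# Tri-modal objective: sum of the pairwise contrastive terms.
loss = (info_nce(audio_emb, text_emb) +
        info_nce(spectrum_emb, text_emb) +
        info_nce(audio_emb, spectrum_emb))
```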

  6. Book ; Online: Kosmos-G

    Pan, Xichen / Dong, Li / Huang, Shaohan / Peng, Zhiliang / Chen, Wenhu / Wei, Furu

    Generating Images in Context with Multimodal Large Language Models

    2023  

    Abstract Text-to-image (T2I) and vision-language-to-image (VL2I) generation have made significant strides in recent years. However, generation from generalized vision-language inputs, especially those involving multiple images, remains under-explored. This paper presents Kosmos-G, a model that leverages the advanced perception capabilities of Multimodal Large Language Models (MLLMs) to tackle this challenge. Our approach aligns the output space of the MLLM with CLIP using the textual modality as an anchor and performs compositional instruction tuning on curated data. Kosmos-G demonstrates a unique capability of zero-shot multi-entity subject-driven generation. Notably, the score distillation instruction tuning requires no modifications to the image decoder. This allows for a seamless substitution of CLIP and effortless integration with a myriad of U-Net techniques, ranging from fine-grained controls to personalized image decoder variants. We posit Kosmos-G as an initial attempt towards the goal of "image as a foreign language in image generation."

    Comment: Code: https://aka.ms/Kosmos-G Project Page: https://xichenpan.github.io/kosmosg
    Keywords Computer Science - Computer Vision and Pattern Recognition ; Computer Science - Computation and Language
    Subject code 004 ; 121
    Publishing date 2023-10-04
    Publishing country United States
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
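
    A hedged sketch of the "align the MLLM output space with CLIP using the textual modality as an anchor" idea mentioned above: a small projection head is trained so that the MLLM's pooled output for a caption matches the frozen CLIP text embedding of the same caption. The dimensions, module structure, and cosine loss are illustrative assumptions, not the Kosmos-G implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

d_mllm, d_clip = 2048, 768        # assumed hidden sizes for illustration

aligner = nn.Sequential(          # maps MLLM output space -> CLIP text space
    nn.Linear(d_mllm, 1024), nn.GELU(), nn.Linear(1024, d_clip)
)

def alignment_loss(mllm_out, clip_text_emb):
    """mllm_out: (batch, d_mllm) pooled MLLM output for a caption;
    clip_text_emb: (batch, d_clip) frozen CLIP text embedding of the same caption."""
    pred = F.normalize(aligner(mllm_out), dim=-1)
    target = F.normalize(clip_text_emb, dim=-1)
    return 1.0 - (pred * target).sum(dim=-1).mean()   # cosine-distance loss

# Dummy usage; in practice the inputs come from the MLLM and a frozen CLIP text encoder.
loss = alignment_loss(torch.randn(4, d_mllm), torch.randn(4, d_clip))
```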

  7. Book ; Online: Kosmos-2

    Peng, Zhiliang / Wang, Wenhui / Dong, Li / Hao, Yaru / Huang, Shaohan / Ma, Shuming / Wei, Furu

    Grounding Multimodal Large Language Models to the World

    2023  

    Abstract We introduce Kosmos-2, a Multimodal Large Language Model (MLLM), enabling new capabilities of perceiving object descriptions (e.g., bounding boxes) and grounding text to the visual world. Specifically, we represent referring expressions as links in Markdown, i.e., "[text span](bounding boxes)", where object descriptions are sequences of location tokens. Together with multimodal corpora, we construct large-scale data of grounded image-text pairs (called GrIT) to train the model. In addition to the existing capabilities of MLLMs (e.g., perceiving general modalities, following instructions, and performing in-context learning), Kosmos-2 integrates the grounding capability into downstream applications. We evaluate Kosmos-2 on a wide range of tasks, including (i) multimodal grounding, such as referring expression comprehension and phrase grounding, (ii) multimodal referring, such as referring expression generation, (iii) perception-language tasks, and (iv) language understanding and generation. This work lays out the foundation for the development of Embodiment AI and sheds light on the big convergence of language, multimodal perception, action, and world modeling, which is a key step toward artificial general intelligence. Code and pretrained models are available at https://aka.ms/kosmos-2.

    Comment: 20 pages
    Keywords Computer Science - Computation and Language ; Computer Science - Computer Vision and Pattern Recognition
    Subject code 401
    Publishing date 2023-06-26
    Publishing country United States
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
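
    A minimal sketch of the grounding format described above: a bounding box is discretized into location tokens and attached, Markdown-link style, to the text span it grounds. The bin count and token naming are illustrative assumptions, not the exact Kosmos-2 vocabulary.

```python
def box_to_location_tokens(box, image_size, bins=32):
    """box = (x0, y0, x1, y1) in pixels; returns tokens for the two box corners."""
    w, h = image_size
    tokens = []
    for x, y in ((box[0], box[1]), (box[2], box[3])):
        col = min(int(x / w * bins), bins - 1)
        row = min(int(y / h * bins), bins - 1)
        tokens.append(f"<loc_{row * bins + col}>")   # one token per grid cell
    return "".join(tokens)

def ground_span(text_span, box, image_size):
    """Render a referring expression as a Markdown-style link to its box."""
    return f"[{text_span}]({box_to_location_tokens(box, image_size)})"

# e.g. grounding the phrase "a snowman" to a box in a 640x480 image:
print(ground_span("a snowman", (120, 80, 360, 420), (640, 480)))
# -> [a snowman](<loc_166><loc_914>)
```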

  8. Article ; Online: Mice harboring a R133L heterozygous mutation in LMNA exhibited ectopic lipid accumulation, aging, and mitochondrial dysfunction in adipose tissue.

    Qiu, Ruojun / Wang, Shuo / Lin, Dingyi / He, Yingzi / Huang, Shaohan / Wu, Beibei / Li, Hong / Wang, Min / Zheng, Fenping

    FASEB journal : official publication of the Federation of American Societies for Experimental Biology

    2022  Volume 37, Issue 2, Page(s) e22730

    Abstract The LMNA gene encodes the nuclear envelope proteins lamin A and C (lamin A/C). A novel R133L heterozygous mutation in the LMNA gene causes atypical progeria syndrome (APS). However, the underlying mechanism remains unclear. Here, we used transgenic mice (Lmna ...
    MeSH term(s) Animals ; Mice ; Adipose Tissue/metabolism ; Aging/genetics ; Insulin Resistance ; Lamin Type A/genetics ; Lamin Type A/metabolism ; Lipids ; Mitochondria/genetics ; Mitochondria/metabolism ; Mutation ; Progeria/genetics ; Progeria/metabolism
    Chemical Substances Lamin Type A ; Lipids ; Lmna protein, mouse
    Language English
    Publishing date 2022-12-28
    Publishing country United States
    Document type Journal Article ; Research Support, Non-U.S. Gov't
    ZDB-ID 639186-2
    ISSN (online) 1530-6860
    ISSN 0892-6638
    DOI 10.1096/fj.202201252RR
    Database MEDLINE (Medical Literature Analysis and Retrieval System Online)

  9. Book ; Online: Kformer

    Yao, Yunzhi / Huang, Shaohan / Dong, Li / Wei, Furu / Chen, Huajun / Zhang, Ningyu

    Knowledge Injection in Transformer Feed-Forward Layers

    2022  

    Abstract Recent years have witnessed a diverse set of knowledge injection models for pre-trained language models (PTMs); however, most previous studies neglect the PTMs' own ability to exploit the large amount of implicit knowledge stored in their parameters. A recent study has observed knowledge neurons in the Feed Forward Network (FFN), which are responsible for expressing factual knowledge. In this work, we propose a simple model, Kformer, which takes advantage of both the knowledge stored in PTMs and external knowledge via knowledge injection in Transformer FFN layers. Empirical results on two knowledge-intensive tasks, commonsense reasoning (i.e., SocialIQA) and medical question answering (i.e., MedQA-USMLE), demonstrate that Kformer can yield better performance than other knowledge injection techniques such as concatenation or attention-based injection. We think the proposed simple model and empirical findings may help the community develop more powerful knowledge injection methods. Code is available at https://github.com/zjunlp/Kformer.

    Comment: Accepted by NLPCC 2022
    Keywords Computer Science - Computation and Language ; Computer Science - Artificial Intelligence ; Computer Science - Databases ; Computer Science - Information Retrieval ; Computer Science - Machine Learning
    Subject code 004
    Publishing date 2022-01-14
    Publishing country United States
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
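
    A minimal sketch of FFN-layer knowledge injection in the spirit of the abstract above: retrieved knowledge embeddings are projected and used as extra key/value slots alongside the feed-forward layer's own weights. The dimensions and projection scheme are assumptions for illustration, not the released Kformer code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KnowledgeFFN(nn.Module):
    def __init__(self, d_model=256, d_ff=1024, d_know=384):
        super().__init__()
        self.w1 = nn.Linear(d_model, d_ff)         # ordinary FFN "keys"
        self.w2 = nn.Linear(d_ff, d_model)         # ordinary FFN "values"
        self.proj_k = nn.Linear(d_know, d_model)   # knowledge -> key space
        self.proj_v = nn.Linear(d_know, d_model)   # knowledge -> value space

    def forward(self, x, knowledge):
        # x: (batch, seq, d_model); knowledge: (n_facts, d_know)
        k = self.proj_k(knowledge)                 # (n_facts, d_model)
        v = self.proj_v(knowledge)                 # (n_facts, d_model)
        hidden = F.gelu(self.w1(x))                # activations on FFN keys
        know_hidden = F.gelu(x @ k.t())            # activations on knowledge keys
        return self.w2(hidden) + know_hidden @ v   # ordinary + injected values

ffn = KnowledgeFFN()
out = ffn(torch.randn(2, 10, 256), torch.randn(5, 384))   # 5 retrieved facts
```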

  10. Book ; Online: DeepNet

    Wang, Hongyu / Ma, Shuming / Dong, Li / Huang, Shaohan / Zhang, Dongdong / Wei, Furu

    Scaling Transformers to 1,000 Layers

    2022  

    Abstract In this paper, we propose a simple yet effective method to stabilize extremely deep Transformers. Specifically, we introduce a new normalization function (DeepNorm) to modify the residual connection in the Transformer, accompanied by a theoretically derived initialization. In-depth theoretical analysis shows that model updates can be bounded in a stable way. The proposed method combines the best of both worlds, i.e., the good performance of Post-LN and the stable training of Pre-LN, making DeepNorm a preferred alternative. We successfully scale Transformers up to 1,000 layers (i.e., 2,500 attention and feed-forward network sublayers) without difficulty, which is one order of magnitude deeper than previous deep Transformers. Remarkably, on a multilingual benchmark with 7,482 translation directions, our 200-layer model with 3.2B parameters significantly outperforms the 48-layer state-of-the-art model with 12B parameters by 5 BLEU points, indicating a promising scaling direction.

    Comment: Work in progress
    Keywords Computer Science - Computation and Language ; Computer Science - Machine Learning
    Subject code 006
    Publishing date 2022-03-01
    Publishing country United States
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
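
    A minimal sketch of the DeepNorm residual described above, x_{l+1} = LayerNorm(alpha * x_l + G(x_l)), with certain sublayer weights scaled by beta at initialization. The alpha/beta values below follow one published DeepNet setting for an N-layer model and should be treated as assumptions; the paper derives different constants for encoder-decoder configurations.

```python
import torch
import torch.nn as nn

class DeepNormBlock(nn.Module):
    """One feed-forward sublayer wrapped in a DeepNorm residual connection."""
    def __init__(self, d_model, n_layers):
        super().__init__()
        self.alpha = (2 * n_layers) ** 0.25        # residual scaling
        beta = (8 * n_layers) ** -0.25             # initialization scaling
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norm = nn.LayerNorm(d_model)
        # Scale the feed-forward weights by beta at initialization.
        for lin in (self.ffn[0], self.ffn[2]):
            nn.init.xavier_normal_(lin.weight, gain=beta)

    def forward(self, x):
        return self.norm(self.alpha * x + self.ffn(x))   # DeepNorm residual

block = DeepNormBlock(d_model=512, n_layers=1000)        # the "1,000-layer" regime
y = block(torch.randn(4, 16, 512))
```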
