LIVIVO - The Search Portal for Life Sciences

Search results

Results 1-10 of 27

  1. Article: Prevalence and Factors Associated with BRCA1/2 Gene Mutation in Chinese Populations with Breast Cancer.

    Huang, Guoding / Lu, Hongquan / Chen, Qizhu / Huang, Xinting

    International journal of general medicine

    2022  Volume 15, Page(s) 6783–6789

    Abstract Objective: We aimed to evaluate the prevalence of BRCA1 and BRCA2 mutations in Chinese populations with breast cancer. Factors associated with BRCA1 and BRCA2 mutations are also evaluated.
    Methods: This was a cross-sectional study, and patients with breast cancer were included. Data on clinical characteristics, information of breast cancer, and BRCA1 and BRCA2 mutations were extracted. Patients were divided into the carrier and noncarrier groups.
    Results: A total of 368 patients were included. Compared to the noncarrier group (n = 240), patients in the carrier group (n = 128) were younger and more likely to have breast cancer at age <40 years. Of the 128 patients in the carrier group, 58 had a BRCA1 mutation and 70 had a BRCA2 mutation. Among patients with early-onset breast cancer, there was no difference in the prevalence of BRCA1 and BRCA2 mutations (20.7% vs 17.1%, P = 0.35). Among patients with a family history of breast/ovarian cancer, BRCA2 mutation was more prevalent than BRCA1 mutation (54.3% vs 44.8%, P = 0.01), and among patients with triple-negative breast cancer, BRCA1 mutation was more prevalent than BRCA2 mutation (34.5% vs 28.6%, P = 0.04). After adjusting for covariates, factors associated with BRCA1 mutation included breast cancer diagnosed at <40 years, tumor size >2 cm, and lymph node metastasis; factors associated with BRCA2 mutation included age, tumor size >2 cm, and triple-negative breast cancer.
    Conclusion: The prevalence of BRCA1 and BRCA2 mutations varied across the three subgroups, and the factors associated with BRCA1 and BRCA2 mutations differed.
    Language English
    Publishing date 2022-08-24
    Publishing country New Zealand
    Document type Journal Article
    ZDB-ID 2452220-X
    ISSN 1178-7074
    DOI 10.2147/IJGM.S378706
    Database MEDLINE (Medical Literature Analysis and Retrieval System Online)

  2. Article: The Role of Indoleamine 2, 3-Dioxygenase 1 in Regulating Tumor Microenvironment.

    Huang, Xinting / Zhang, Feng / Wang, Xiaobo / Liu, Ke

    Cancers

    2022  Volume 14, Issue 11

    Abstract Indoleamine 2,3-dioxygenase 1 (IDO1) is a rate-limiting enzyme that metabolizes the essential amino acid tryptophan (Trp) into kynurenine (Kyn), and it promotes immunosuppressive effects by regulating the consumption of Trp and the accumulation of Kyn in the tumor microenvironment (TME). Recent studies have shown that the main cellular components of the TME interact with each other through this pathway to promote the formation of an immunosuppressive tumor microenvironment. Here, we review the role of the immunosuppression mechanisms mediated by the IDO1 pathway in tumor growth. We discuss obstacles encountered in using IDO1 as a new tumor immunotherapy target, as well as current clinical research progress.
    Language English
    Publishing date 2022-06-01
    Publishing country Switzerland
    Document type Journal Article ; Review
    ZDB-ID 2527080-1
    ISSN 2072-6694
    DOI 10.3390/cancers14112756
    Database MEDLINE (Medical Literature Analysis and Retrieval System Online)

  3. Article ; Online: How Late-Life Working Affects Depression Among Retirement-Aged Workers? An Examination of the Influence Paths of Job-Related (Non-Job-Related) Physical Activity and Social Contact.

    Li, Jiannan / Yuan, Bocong / Lan, Junbang / Huang, Xinting

    Journal of occupational and environmental medicine

    2022  Volume 64, Issue 8, Page(s) e435–e442

    Abstract Purpose: This study investigates the pathways through which late career participation affects depression among older workers.
    Method: Data from the China Health and Retirement Longitudinal Study (2018) are used. Those who have reached the statutory retirement age in China (>60 years for males/>55 years for females) are investigated.
    Results: Late career participation may positively affect job-related physical activity and social contact (2.110 and 0.028, P < 0.01) and negatively affect non-job-related physical activity (-0.343, P < 0.01). Besides, job-related physical activity may exacerbate depression symptoms among older workers (0.017, P < 0.01), whereas non-job-related physical activity and social contact may alleviate it (-0.015 and -0.038, P < 0.01).
    Conclusions: Late career participation could be associated with depression through different pathways involving job-related (and non-job-related) physical activity and social contact. The overall impact of late career participation on depression would depend on which influence pathway is dominant.
    MeSH term(s) Aged ; China ; Exercise ; Female ; Humans ; Longitudinal Studies ; Male ; Middle Aged ; Occupations ; Retirement
    Language English
    Publishing date 2022-06-11
    Publishing country United States
    Document type Journal Article ; Research Support, Non-U.S. Gov't
    ZDB-ID 1223932-x
    ISSN (online) 1536-5948
    ISSN 1076-2752
    DOI 10.1097/JOM.0000000000002572
    Database MEDLINE (Medical Literature Analysis and Retrieval System Online)

  4. Article: Socioeconomic disadvantages and vulnerability to the pandemic among children and youth: A macro-level investigation of American counties.

    Yuan, Bocong / Huang, Xinting / Li, Jiannan / He, Longtao

    Children and youth services review

    2022  Volume 136, Page(s) 106429

    Abstract This study intends to reveal the underlying structural inequity in vulnerability to infection during the novel coronavirus disease pandemic among children and youth. Using multi-source data from the New York Times novel coronavirus disease tracking project and the County Health Rankings & Roadmaps program, this study shows that children and youth in socioeconomically disadvantaged circumstances face a disproportionate risk of infection in this pandemic. At the county level, socioeconomic disadvantages (i.e., single-parent families, low birthweight, severe housing problems) contribute to the confirmed cases and deaths of the novel coronavirus disease. Policymakers should pay more attention to this vulnerable group and implement more targeted and effective epidemic prevention and control.
    Language English
    Publishing date 2022-02-23
    Publishing country United States
    Document type Journal Article
    ISSN 0190-7409
    DOI 10.1016/j.childyouth.2022.106429
    Database MEDLINE (Medical Literature Analysis and Retrieval System Online)

  5. Book ; Online: Longer Fixations, More Computation

    Huang, Xinting / Wan, Jiajing / Kritikos, Ioannis / Hollenstein, Nora

    Gaze-Guided Recurrent Neural Networks

    2023  

    Abstract Humans read texts at a varying pace, while machine learning models treat each token in the same way in terms of computational process. Therefore, we ask: does it help to make models act more like humans? In this paper, we convert this intuition into a set of novel models with fixation-guided parallel RNNs or layers and conduct various experiments on language modeling and sentiment analysis tasks to test their effectiveness, thus providing empirical validation for this intuition. Our proposed models achieve good performance on the language modeling task, considerably surpassing the baseline model. In addition, we find that, interestingly, the fixation duration predicted by neural networks bears some resemblance to human fixations. Without any explicit guidance, the model makes choices similar to those of humans. We also investigate the reasons for the differences between them, which explain why "model fixations" are often more suitable than human fixations when used to guide language models.
    Keywords Computer Science - Computation and Language
    Publishing date 2023-10-31
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

  6. Book ; Online: Pre-training Multi-party Dialogue Models with Latent Discourse Inference

    Li, Yiyang / Huang, Xinting / Bi, Wei / Zhao, Hai

    2023  

    Abstract Multi-party dialogues are more difficult for models to understand than one-to-one two-party dialogues, since they involve multiple interlocutors, resulting in interweaving reply-to relations and information flows. To step over these obstacles, an effective way is to pre-train a model that understands the discourse structure of multi-party dialogues, namely, to whom each utterance is replying. However, due to the lack of explicitly annotated discourse labels in multi-party dialogue corpora, previous works fail to scale up the pre-training process, leaving the unlabeled multi-party conversational data unused. To fully utilize the unlabeled data, we propose to treat the discourse structures as latent variables, then jointly infer them and pre-train the discourse-aware model with unsupervised latent variable inference methods. Experiments on multiple downstream tasks show that our pre-trained model outperforms strong baselines by large margins and achieves state-of-the-art (SOTA) results, justifying the effectiveness of our method. The official implementation of this paper is available at https://github.com/EricLee8/MPD_EMVI.

    Comment: Accepted by ACL 2023
    Keywords Computer Science - Computation and Language
    Subject code 006
    Publishing date 2023-05-24
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

  7. Book ; Online: SEGO

    Zhao, Xueliang / Huang, Xinting / Bi, Wei / Kong, Lingpeng

    Sequential Subgoal Optimization for Mathematical Problem-Solving

    2023  

    Abstract Large Language Models (LLMs) have driven substantial progress in artificial intelligence in recent years, exhibiting impressive capabilities across a wide range of tasks, including mathematical problem-solving. Inspired by the success of subgoal-based methods, we propose a novel framework called SEquential subGoal Optimization (SEGO) to enhance LLMs' ability to solve mathematical problems. By establishing a connection between the subgoal breakdown process and the probability of solving problems, SEGO aims to identify better subgoals with theoretical guarantees. Addressing the challenge of identifying suitable subgoals in a large solution space, our framework generates problem-specific subgoals and adjusts them according to carefully designed criteria. Incorporating these optimized subgoals into the policy model training leads to significant improvements in problem-solving performance. We validate SEGO's efficacy through experiments on two benchmarks, GSM8K and MATH, where our approach outperforms existing methods, highlighting the potential of SEGO in AI-driven mathematical problem-solving. Data and code associated with this paper will be available at https://github.com/zhaoxlpku/SEGO.

    Comment: Preprint
    Keywords Computer Science - Computation and Language
    Subject code 006
    Publishing date 2023-10-19
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

  8. Book ; Online: Inferflow

    Shi, Shuming / Zhao, Enbo / Cai, Deng / Cui, Leyang / Huang, Xinting / Li, Huayang

    an Efficient and Highly Configurable Inference Engine for Large Language Models

    2024  

    Abstract We present Inferflow, an efficient and highly configurable inference engine for large language models (LLMs). With Inferflow, users can serve most of the common transformer models by simply modifying some lines in the corresponding configuration files, without writing a single line of source code. Compared with most existing inference engines, Inferflow has some key features. First, by implementing a modular framework of atomic building blocks and technologies, Inferflow is compositionally generalizable to new models. Second, 3.5-bit quantization is introduced in Inferflow as a tradeoff between 3-bit and 4-bit quantization. Third, hybrid model partitioning for multi-GPU inference is introduced in Inferflow to better balance inference speed and throughput than the existing partition-by-layer and partition-by-tensor strategies.

    Comment: Technical report of Inferflow
    Keywords Computer Science - Computation and Language
    Publishing date 2024-01-16
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

  9. Book ; Online: Mitigating Hallucinations of Large Language Models via Knowledge Consistent Alignment

    Wan, Fanqi / Huang, Xinting / Cui, Leyang / Quan, Xiaojun / Bi, Wei / Shi, Shuming

    2024  

    Abstract While Large Language Models (LLMs) have proven to be exceptional on a variety of tasks after alignment, they may still confidently produce responses that contradict the context or world knowledge, a phenomenon known as "hallucination". In this paper, we demonstrate that reducing the inconsistency between the external knowledge encapsulated in the training data and the intrinsic knowledge inherited from the pretraining corpus can mitigate hallucination in alignment. Specifically, we introduce a novel knowledge consistent alignment (KCA) approach, which involves automatically formulating examinations based on external knowledge to assess the comprehension of LLMs. For data encompassing knowledge inconsistency, KCA implements several simple yet efficient strategies for processing. We illustrate the superior performance of the proposed KCA approach in mitigating hallucinations across six benchmarks using LLMs of different backbones and scales. Furthermore, we confirm the correlation between knowledge inconsistency and hallucination, signifying the effectiveness of reducing knowledge inconsistency in alleviating hallucinations. Our code, model weights, and data are public at https://github.com/fanqiwan/KCA.

    Comment: Work in progress
    Keywords Computer Science - Computation and Language
    Subject code 004
    Publishing date 2024-01-19
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

  10. Book ; Online: KaLM at SemEval-2020 Task 4

    Wan, Jiajing / Huang, Xinting

    Knowledge-aware Language Models for Comprehension And Generation

    2020  

    Abstract This paper presents our strategies in SemEval 2020 Task 4: Commonsense Validation and Explanation. We propose a novel way to search for evidence and choose different large-scale pre-trained models as the backbone for the three subtasks. The results show that our evidence-searching approach improves model performance on the commonsense explanation task. Our team ranked 2nd in subtask C according to the human evaluation score.

    Comment: 6 pages, 1 figure
    Keywords Computer Science - Computation and Language ; I.2.7
    Publishing date 2020-05-24
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)
