LIVIVO - The Search Portal for Life Sciences

Search results

Results 1–9 of 9

  1. Article ; Online: Contrastive encoder pre-training-based clustered federated learning for heterogeneous data.

    Tun, Ye Lin / Nguyen, Minh N H / Thwal, Chu Myaet / Choi, Jinwoo / Hong, Choong Seon

    Neural networks : the official journal of the International Neural Network Society

    2023  Volume 165, Page(s) 689–704

    Abstract Federated learning (FL) is a promising approach that enables distributed clients to collaboratively train a global model while preserving their data privacy. However, FL often suffers from data heterogeneity problems, which can significantly affect its performance. To address this, clustered federated learning (CFL) has been proposed to construct personalized models for different client clusters. One effective client clustering strategy is to allow clients to choose their own local models from a model pool based on their performance. However, without pre-trained model parameters, such a strategy is prone to clustering failure, in which all clients choose the same model. Unfortunately, collecting a large amount of labeled data for pre-training can be costly and impractical in distributed environments. To overcome this challenge, we leverage self-supervised contrastive learning to exploit unlabeled data for the pre-training of FL systems. Together, self-supervised pre-training and client clustering can be crucial components for tackling the data heterogeneity issues of FL. Leveraging these two crucial strategies, we propose contrastive pre-training-based clustered federated learning (CP-CFL) to improve the model convergence and overall performance of FL systems. In this work, we demonstrate the effectiveness of CP-CFL through extensive experiments in heterogeneous FL settings, and present various interesting observations.
    MeSH term(s) Humans ; Learning ; Cluster Analysis ; Privacy
    Language English
    Publishing date 2023-06-10
    Publishing country United States
    Document type Journal Article
    ZDB-ID 740542-x
    ISSN (online) 1879-2782
    ISSN (print) 0893-6080
    DOI 10.1016/j.neunet.2023.06.010
    Database MEDical Literature Analysis and Retrieval System OnLINE

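The pool-based clustering strategy described in the abstract above, where each client adopts whichever pool model performs best on its local data, can be sketched in a few lines. The linear toy models, the MSE loss, and the client datasets below are hypothetical stand-ins for illustration, not the CP-CFL implementation.

```python
# Toy sketch of pool-based client clustering (the CFL step that CP-CFL builds on).
# Models are scalar weights of a linear predictor y = w * x; data is synthetic.

def mse(model, data):
    """Mean squared error of a linear model y = model * x on (x, y) pairs."""
    return sum((model * x - y) ** 2 for x, y in data) / len(data)

def assign_clusters(model_pool, client_datasets):
    """Each client picks the pool model with the lowest local loss;
    clients choosing the same model form one cluster."""
    clusters = {m: [] for m in range(len(model_pool))}
    for cid, data in enumerate(client_datasets):
        losses = [mse(w, data) for w in model_pool]
        best = min(range(len(model_pool)), key=losses.__getitem__)
        clusters[best].append(cid)
    return clusters

# Two latent client groups: data following y ~ 2x and data following y ~ -x.
clients = [
    [(1.0, 2.0), (2.0, 4.1)],    # follows w ~ 2
    [(1.0, 1.9), (3.0, 6.0)],    # follows w ~ 2
    [(1.0, -1.0), (2.0, -2.1)],  # follows w ~ -1
]
clusters = assign_clusters([2.0, -1.0], clients)
print(clusters)  # {0: [0, 1], 1: [2]}
```

With an undifferentiated pool (e.g. identical randomly initialized weights), every client would report the same best model; that is the clustering-failure mode the abstract motivates contrastive pre-training to avoid.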
  2. Article ; Online: OnDev-LCT: On-Device Lightweight Convolutional Transformers towards federated learning.

    Thwal, Chu Myaet / Nguyen, Minh N H / Tun, Ye Lin / Kim, Seong Tae / Thai, My T / Hong, Choong Seon

    Neural networks : the official journal of the International Neural Network Society

    2023  Volume 170, Page(s) 635–649

    Abstract Federated learning (FL) has emerged as a promising approach to collaboratively train machine learning models across multiple edge devices while preserving privacy. The success of FL hinges on the efficiency of participating models and their ability to handle the unique challenges of distributed learning. While several variants of Vision Transformer (ViT) have shown great potential as alternatives to modern convolutional neural networks (CNNs) for centralized training, the unprecedented size and higher computational demands hinder their deployment on resource-constrained edge devices, challenging their widespread application in FL. Since client devices in FL typically have limited computing resources and communication bandwidth, models intended for such devices must strike a balance between model size, computational efficiency, and the ability to adapt to the diverse and non-IID data distributions encountered in FL. To address these challenges, we propose OnDev-LCT: Lightweight Convolutional Transformers for On-Device vision tasks with limited training data and resources. Our models incorporate image-specific inductive biases through the LCT tokenizer by leveraging efficient depthwise separable convolutions in residual linear bottleneck blocks to extract local features, while the multi-head self-attention (MHSA) mechanism in the LCT encoder implicitly facilitates capturing global representations of images. Extensive experiments on benchmark image datasets indicate that our models outperform existing lightweight vision models while having fewer parameters and lower computational demands, making them suitable for FL scenarios with data heterogeneity and communication bottlenecks.
    MeSH term(s) Humans ; Benchmarking ; Communication ; Machine Learning ; Neural Networks, Computer ; Privacy
    Language English
    Publishing date 2023-11-23
    Publishing country United States
    Document type Journal Article
    ZDB-ID 740542-x
    ISSN (online) 1879-2782
    ISSN (print) 0893-6080
    DOI 10.1016/j.neunet.2023.11.044
    Database MEDical Literature Analysis and Retrieval System OnLINE

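The efficiency claim behind the depthwise separable convolutions in OnDev-LCT's bottleneck blocks reduces to parameter arithmetic: a standard k x k convolution couples every input channel to every output channel, while the depthwise-separable factorization splits this into a per-channel spatial filter plus a 1 x 1 pointwise mix. The layer sizes below are illustrative, not taken from the paper.

```python
def conv_params(k, c_in, c_out):
    """Weight count of a standard k x k convolution (bias omitted)."""
    return k * k * c_in * c_out

def dw_separable_params(k, c_in, c_out):
    """Depthwise k x k conv (one spatial filter per input channel)
    followed by a 1 x 1 pointwise convolution."""
    return k * k * c_in + c_in * c_out

k, c_in, c_out = 3, 64, 128
std = conv_params(k, c_in, c_out)          # 73728
sep = dw_separable_params(k, c_in, c_out)  # 8768
print(std, sep, round(std / sep, 1))       # roughly 8.4x fewer parameters
```

The savings grow with channel width, which is why the factorization suits the resource-constrained edge devices the abstract targets.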
  3. Book ; Online: Federated Learning based Energy Demand Prediction with Clustered Aggregation

    Tun, Ye Lin / Thar, Kyi / Thwal, Chu Myaet / Hong, Choong Seon

    2022  

    Abstract To reduce negative environmental impacts, power stations and energy grids need to optimize the resources required for power production. Thus, predicting the energy consumption of clients is becoming an important part of every energy management system. Energy usage information collected by the clients' smart homes can be used to train a deep neural network to predict the future energy demand. Collecting data from a large number of distributed clients for centralized model training is expensive in terms of communication resources. To take advantage of distributed data in edge systems, centralized training can be replaced by federated learning where each client only needs to upload model updates produced by training on its local data. These model updates are aggregated into a single global model by the server. But since different clients can have different attributes, model updates can have diverse weights and as a result, it can take a long time for the aggregated global model to converge. To speed up the convergence process, we can apply clustering to group clients based on their properties and aggregate model updates from the same cluster together to produce a cluster specific global model. In this paper, we propose a recurrent neural network based energy demand predictor, trained with federated learning on clustered clients to take advantage of distributed data and speed up the convergence process.

    Comment: Accepted by BigComp 2021
    Keywords Computer Science - Machine Learning ; Electrical Engineering and Systems Science - Signal Processing
    Subject code 006
    Publishing date 2022-10-27
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

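The clustered aggregation described in the abstract, averaging model updates only within each cluster so that each cluster gets its own global model, can be sketched as plain weighted averaging. Flat weight vectors stand in for real model parameters here; the cluster assignment and dataset sizes are hypothetical.

```python
def fedavg(updates, sizes):
    """Dataset-size-weighted average of client weight vectors (FedAvg)."""
    total = sum(sizes)
    dim = len(updates[0])
    return [sum(u[i] * n for u, n in zip(updates, sizes)) / total
            for i in range(dim)]

def clustered_aggregate(client_updates, client_sizes, clusters):
    """One cluster-specific global model per group of similar clients."""
    return {cid: fedavg([client_updates[i] for i in members],
                        [client_sizes[i] for i in members])
            for cid, members in clusters.items()}

updates = [[1.0, 0.0], [3.0, 0.0], [10.0, 2.0]]  # toy per-client weights
sizes = [100, 100, 200]                          # local dataset sizes
models = clustered_aggregate(updates, sizes, {"A": [0, 1], "B": [2]})
print(models)  # {'A': [2.0, 0.0], 'B': [10.0, 2.0]}
```

Averaging the outlier client 2 together with clients 0 and 1 would drag a single global model toward neither group, which is the slow-convergence problem the abstract attributes to heterogeneous client attributes.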
  4. Book ; Online: Contrastive encoder pre-training-based clustered federated learning for heterogeneous data

    Tun, Ye Lin / Nguyen, Minh N. H. / Thwal, Chu Myaet / Choi, Jinwoo / Hong, Choong Seon

    2023  

    Abstract Federated learning (FL) is a promising approach that enables distributed clients to collaboratively train a global model while preserving their data privacy. However, FL often suffers from data heterogeneity problems, which can significantly affect its performance. To address this, clustered federated learning (CFL) has been proposed to construct personalized models for different client clusters. One effective client clustering strategy is to allow clients to choose their own local models from a model pool based on their performance. However, without pre-trained model parameters, such a strategy is prone to clustering failure, in which all clients choose the same model. Unfortunately, collecting a large amount of labeled data for pre-training can be costly and impractical in distributed environments. To overcome this challenge, we leverage self-supervised contrastive learning to exploit unlabeled data for the pre-training of FL systems. Together, self-supervised pre-training and client clustering can be crucial components for tackling the data heterogeneity issues of FL. Leveraging these two crucial strategies, we propose contrastive pre-training-based clustered federated learning (CP-CFL) to improve the model convergence and overall performance of FL systems. In this work, we demonstrate the effectiveness of CP-CFL through extensive experiments in heterogeneous FL settings, and present various interesting observations.

    Comment: Published in Neural Networks
    Keywords Computer Science - Machine Learning ; Computer Science - Distributed, Parallel, and Cluster Computing
    Subject code 006
    Publishing date 2023-11-28
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

  5. Book ; Online: Federated Learning with Intermediate Representation Regularization

    Tun, Ye Lin / Thwal, Chu Myaet / Park, Yu Min / Park, Seong-Bae / Hong, Choong Seon

    2022  

    Abstract In contrast to centralized model training that involves data collection, federated learning (FL) enables remote clients to collaboratively train a model without exposing their private data. However, model performance usually degrades in FL due to the heterogeneous data generated by clients of diverse characteristics. One promising strategy to maintain good performance is by limiting the local training from drifting far away from the global model. Previous studies accomplish this by regularizing the distance between the representations learned by the local and global models. However, they only consider representations from the early layers of a model or the layer preceding the output layer. In this study, we introduce FedIntR, which provides a more fine-grained regularization by integrating the representations of intermediate layers into the local training process. Specifically, FedIntR computes a regularization term that encourages the closeness between the intermediate layer representations of the local and global models. Additionally, FedIntR automatically determines the contribution of each layer's representation to the regularization term based on the similarity between local and global representations. We conduct extensive experiments on various datasets to show that FedIntR can achieve equivalent or higher performance compared to the state-of-the-art approaches. Our code is available at https://github.com/YLTun/FedIntR.

    Comment: IEEE BigComp 2023
    Keywords Computer Science - Machine Learning ; Computer Science - Artificial Intelligence
    Subject code 006
    Publishing date 2022-10-27
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

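The FedIntR regularizer described in the abstract penalizes the distance between intermediate-layer representations of the local and global models, with each layer's contribution set by local-global similarity. The sketch below uses a softmax over cosine similarities for that weighting; the abstract does not specify the exact weighting function, so treat this choice, and the toy vector representations, as assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two flat representation vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def intermediate_reg(local_reps, global_reps):
    """Similarity-weighted squared distance between per-layer
    representations. Layers whose local and global representations
    are more similar receive larger weight (softmax over cosines)."""
    sims = [cosine(l, g) for l, g in zip(local_reps, global_reps)]
    exps = [math.exp(s) for s in sims]
    weights = [e / sum(exps) for e in exps]
    dists = [sum((a - b) ** 2 for a, b in zip(l, g))
             for l, g in zip(local_reps, global_reps)]
    return sum(w * d for w, d in zip(weights, dists))

# Identical representations incur no penalty; drift is penalized.
print(intermediate_reg([[1.0, 0.0], [0.0, 1.0]],
                       [[1.0, 0.0], [0.0, 1.0]]))  # 0.0
```

In local training this term would be added to the task loss, pulling each intermediate layer back toward its global counterpart rather than only the final representation.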
  6. Book ; Online: An Efficient Federated Learning Framework for Training Semantic Communication System

    Nguyen, Loc X. / Le, Huy Q. / Tun, Ye Lin / Aung, Pyae Sone / Tun, Yan Kyaw / Han, Zhu / Hong, Choong Seon

    2023  

    Abstract Semantic communication has emerged as a pillar for the next generation of communication systems due to its capabilities in alleviating data redundancy. Most semantic communication systems are built upon advanced deep learning models whose training performance heavily relies on data availability. Existing studies often make unrealistic assumptions of a readily accessible data source, where in practice, data is mainly created on the client side. Due to privacy and security concerns, the transmission of data is restricted, which is necessary for conventional centralized training schemes. To address this challenge, we explore semantic communication in a federated learning (FL) setting that utilizes client data without leaking privacy. Additionally, we design our system to tackle the communication overhead by reducing the quantity of information delivered in each global round. In this way, we can save significant bandwidth for resource-limited devices and reduce overall network traffic. Finally, we introduce a mechanism to aggregate the global model from clients, called FedLol. Extensive simulation results demonstrate the effectiveness of our proposed technique compared to baseline methods.

    Comment: 5 pages, 3 figures
    Keywords Computer Science - Machine Learning
    Subject code 006
    Publishing date 2023-10-19
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

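The abstract names its aggregation mechanism, FedLol, without specifying it. Purely as a generic illustration of loss-aware aggregation, and explicitly not the paper's actual rule, one can weight each client's parameters by inverse local loss so that better-fitting clients contribute more to the global model.

```python
def loss_weighted_aggregate(client_params, client_losses):
    """Generic loss-aware aggregation: clients with lower local loss get
    proportionally larger weight (weights = 1/loss, normalized).
    Illustrative sketch only; not the paper's FedLol rule."""
    inv = [1.0 / l for l in client_losses]
    total = sum(inv)
    dim = len(client_params[0])
    return [sum(p[i] * w for p, w in zip(client_params, inv)) / total
            for i in range(dim)]

params = [[1.0], [3.0]]
losses = [0.5, 1.0]  # the first client fits its local data twice as well
print(loss_weighted_aggregate(params, losses))  # ~[1.667]
```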
  7. Book ; Online: Swin Transformer-Based Dynamic Semantic Communication for Multi-User with Different Computing Capacity

    Nguyen, Loc X. / Tun, Ye Lin / Tun, Yan Kyaw / Nguyen, Minh N. H. / Zhang, Chaoning / Han, Zhu / Hong, Choong Seon

    2023  

    Abstract Semantic communication has gained significant attention from researchers as a promising technique to replace conventional communication in the next generation of communication systems, primarily due to its ability to reduce communication costs. However, little literature has studied its effectiveness in multi-user scenarios, particularly when there are variations in the model architectures used by users and their computing capacities. To address this issue, we explore a semantic communication system that caters to multiple users with different model architectures by using a multi-purpose transmitter at the base station (BS). Specifically, the BS in the proposed framework employs semantic and channel encoders to encode the image for transmission, while the receiver utilizes its local channel and semantic decoder to reconstruct the original image. Our joint source-channel encoder at the BS can effectively extract and compress semantic features for specific users by considering the signal-to-noise ratio (SNR) and computing capacity of the user. Based on the network status, the joint source-channel encoder at the BS can adaptively adjust the length of the transmitted signal. A longer signal ensures more information for high-quality image reconstruction for the user, while a shorter signal helps avoid network congestion. In addition, we propose a hybrid loss function for training, which enhances the perceptual details of reconstructed images. Finally, we conduct a series of extensive evaluations and ablation studies to validate the effectiveness of the proposed system.

    Comment: 14 pages, 10 figures
    Keywords Computer Science - Information Theory
    Subject code 003
    Publishing date 2023-07-07
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

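The adaptive behavior the abstract describes, a longer transmitted signal when channel and receiver allow it and a shorter one to avoid congestion, can be illustrated with a toy length policy. The SNR range, the linear scaling, and the symbol budgets below are illustrative assumptions, not the paper's encoder.

```python
def signal_length(snr_db, capacity, base_len=256, max_len=1024):
    """Toy policy: scale the transmitted-symbol budget with channel
    quality (SNR) and the receiver's computing capacity in [0, 1].
    All thresholds and the linear scaling are assumed for illustration."""
    quality = max(0.0, min(1.0, (snr_db + 5) / 25))  # map -5..20 dB to 0..1
    budget = base_len + (max_len - base_len) * quality * capacity
    return int(budget)

print(signal_length(20, 1.0))   # 1024: good channel, capable receiver
print(signal_length(-5, 1.0))   # 256:  poor channel, fall back to minimum
```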
  8. Book ; Online: Federated Learning with Diffusion Models for Privacy-Sensitive Vision Tasks

    Tun, Ye Lin / Thwal, Chu Myaet / Yoon, Ji Su / Kang, Sun Moo / Zhang, Chaoning / Hong, Choong Seon

    2023  

    Abstract Diffusion models have shown great potential for vision-related tasks, particularly for image generation. However, their training is typically conducted in a centralized manner, relying on data collected from publicly available sources. This approach may not be feasible or practical in many domains, such as the medical field, which involves privacy concerns over data collection. Despite the challenges associated with privacy-sensitive data, such domains could still benefit from valuable vision services provided by diffusion models. Federated learning (FL) plays a crucial role in enabling decentralized model training without compromising data privacy. Instead of collecting data, an FL system gathers model parameters, effectively safeguarding the private data of different parties involved. This makes FL systems vital for managing decentralized learning tasks, especially in scenarios where privacy-sensitive data is distributed across a network of clients. Nonetheless, FL presents its own set of challenges due to its distributed nature and privacy-preserving properties. Therefore, in this study, we explore the FL strategy to train diffusion models, paving the way for the development of federated diffusion models. We conduct experiments on various FL scenarios, and our findings demonstrate that federated diffusion models have great potential to deliver vision services to privacy-sensitive domains.
    Keywords Computer Science - Machine Learning ; Computer Science - Cryptography and Security
    Publishing date 2023-11-28
    Publishing country us
    Document type Book ; Online
    Database BASE - Bielefeld Academic Search Engine (life sciences selection)

  9. Article ; Online: Losing a jewel-Rapid declines in Myanmar's intact forests from 2002-2014.

    Bhagwat, Tejas / Hess, Andrea / Horning, Ned / Khaing, Thiri / Thein, Zaw Min / Aung, Kyaw Moe / Aung, Kyaw Htet / Phyo, Paing / Tun, Ye Lin / Oo, Aung Htat / Neil, Anthony / Thu, Win Myo / Songer, Melissa / LaJeunesse Connette, Katherine / Bernd, Asja / Huang, Qiongyu / Connette, Grant / Leimgruber, Peter

    PloS one

    2017  Volume 12, Issue 5, Page(s) e0176364

    Abstract New and rapid political and economic changes in Myanmar are increasing the pressures on the country's forests. Yet, little is known about the past and current condition of these forests and how fast they are declining. We mapped forest cover in Myanmar through a consortium of international organizations and environmental non-governmental groups, using freely-available public domain data and open source software tools. We used Landsat satellite imagery to assess the condition and spatial distribution of Myanmar's intact and degraded forests with special focus on changes in intact forest between 2002 and 2014. We found that forests cover 42,365,729 ha or 63% of Myanmar, making it one of the most forested countries in the region. However, severe logging, expanding plantations, and degradation pose increasing threats. Only 38% of the country's forests can be considered intact with canopy cover >80%. Between 2002 and 2014, intact forests declined at a rate of 0.94% annually, totaling more than 2 million ha forest loss. Losses can be extremely high locally and we identified 9 townships as forest conversion hotspots. We also delineated 13 large (>100,000 ha) and contiguous intact forest landscapes, which are dispersed across Myanmar. The Northern Forest Complex supports four of these landscapes, totaling over 6.1 million ha of intact forest, followed by the Southern Forest Complex with three landscapes, comprising 1.5 million ha. These remaining contiguous forest landscape should have high priority for protection. Our project demonstrates how open source data and software can be used to develop and share critical information on forests when such data are not readily available elsewhere. We provide all data, code, and outputs freely via the internet at (for scripts: https://bitbucket.org/rsbiodiv/; for the data: http://geonode.themimu.info/layers/geonode%3Amyan_lvl2_smoothed_dec2015_resamp).
    MeSH term(s) Conservation of Natural Resources ; Forests ; Myanmar ; Satellite Imagery ; Trees/physiology
    Language English
    Publishing date 2017-05-17
    Publishing country United States
    Document type Journal Article
    ISSN (online) 1932-6203
    DOI 10.1371/journal.pone.0176364
    Database MEDical Literature Analysis and Retrieval System OnLINE

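The headline figures in the abstract can be sanity-checked with simple arithmetic: the stated forest area and 63% cover imply the country's total land area, and a 0.94% annual decline compounded over 2002-2014 implies how large the 2002 intact area must have been for the loss to exceed 2 million ha. The compounding convention is an assumption; the abstract does not state whether the rate is simple or compound.

```python
forest_ha = 42_365_729
forest_frac = 0.63
country_ha = forest_ha / forest_frac      # implied total land area, ~67.2 Mha
intact_2014 = 0.38 * forest_ha            # intact forest in 2014, ~16.1 Mha

rate, years = 0.0094, 12                  # 0.94% per year, 2002-2014
loss_frac = 1 - (1 - rate) ** years       # ~10.7% of the 2002 intact area
min_2002_intact = 2_000_000 / loss_frac   # 2002 area needed for >2 Mha loss

print(round(country_ha / 1e6, 1))         # 67.2
print(round(loss_frac, 3))                # 0.107
print(round(min_2002_intact / 1e6, 1))    # 18.7
```

The implied total of about 67.2 million ha matches Myanmar's land area, and the compounding arithmetic shows the 2002 intact baseline must have been larger than the 2014 figure of roughly 16.1 Mha, consistent with an ongoing decline.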