LIVIVO - The Search Portal for Life Sciences

Search results

Hits 1–2 of 2 total

Search options

  1. Article; online: SeReNe: Sensitivity-Based Regularization of Neurons for Structured Sparsity in Neural Networks.

    Tartaglione, Enzo / Bragagnolo, Andrea / Odierna, Francesco / Fiandrotti, Attilio / Grangetto, Marco

    IEEE transactions on neural networks and learning systems

    2022, Volume 33, Issue 12, pages 7237–7250

    Abstract: Deep neural networks include millions of learnable parameters, making their deployment over resource-constrained devices problematic. Sensitivity-based regularization of neurons (SeReNe) is a method for learning sparse topologies with a structure, exploiting neural sensitivity as a regularizer. We define the sensitivity of a neuron as the variation of the network output with respect to the variation of the activity of the neuron. The lower the sensitivity of a neuron, the less the network output is perturbed if the neuron output changes. By including the neuron sensitivity in the cost function as a regularization term, we are able to prune neurons with low sensitivity. As entire neurons are pruned rather than single parameters, practical network footprint reduction becomes possible. Our experimental results on multiple network architectures and datasets yield competitive compression ratios with respect to state-of-the-art references.
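    The idea in the abstract can be illustrated with a minimal sketch (not the authors' implementation). For a one-hidden-layer ReLU network y = w2 · relu(W1 x), the derivative of the output with respect to neuron i's pre-activation reduces to |w2_i| when the neuron is active; averaging this per-neuron sensitivity over a batch gives a score that is added to the loss as a regularizer and later used to prune whole neurons. All names, the batch size, and the pruning threshold here are illustrative assumptions; the paper defines sensitivity with respect to the neuron's activity, which this toy linear-output case approximates.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy one-hidden-layer regression net: y = w2 @ relu(W1 @ x).
    W1 = rng.normal(size=(8, 4))   # hidden-layer weights (8 neurons, 4 inputs)
    w2 = rng.normal(size=8)        # linear output weights

    def forward(x):
        z = W1 @ x                  # pre-activations
        h = np.maximum(z, 0.0)      # ReLU activations
        return w2 @ h, z

    def neuron_sensitivity(x):
        # |dy/dz_i| = |w2_i| * relu'(z_i): how much the output moves
        # if neuron i's activity changes for this input.
        _, z = forward(x)
        return np.abs(w2) * (z > 0)

    # Average sensitivity over a small batch as a per-neuron score.
    X = rng.normal(size=(32, 4))
    sens = np.mean([neuron_sensitivity(x) for x in X], axis=0)

    # Regularized loss for one sample: task loss + lambda * total sensitivity.
    lam = 1e-2
    x, t = X[0], 1.0
    y, _ = forward(x)
    loss = 0.5 * (y - t) ** 2 + lam * sens.sum()

    # Structured pruning: drop whole neurons whose sensitivity is low
    # (threshold 0.05 is an arbitrary illustrative choice).
    keep = sens > 0.05
    W1_pruned, w2_pruned = W1[keep], w2[keep]
    print(f"kept {int(keep.sum())} of {len(w2)} neurons")
    ```

    Because entire rows of W1 (and the matching entries of w2) are removed, the pruned network is genuinely smaller, which is the structured-sparsity point the abstract makes.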
    MeSH term(s) Neural Networks, Computer ; Data Compression/methods ; Algorithms ; Neurons
    Language English
    Publication date 2022-11-30
    Country of publication United States
    Document type Journal Article
    ISSN 2162-237X
    ISSN (online) 2162-2388
    DOI 10.1109/TNNLS.2021.3084527
    Data source MEDical Literature Analysis and Retrieval System OnLINE

  2. Book; online: SeReNe

    Tartaglione, Enzo / Bragagnolo, Andrea / Odierna, Francesco / Fiandrotti, Attilio / Grangetto, Marco

    Sensitivity-based Regularization of Neurons for Structured Sparsity in Neural Networks

    2021  

    Abstract: Deep neural networks include millions of learnable parameters, making their deployment over resource-constrained devices problematic. SeReNe (Sensitivity-based Regularization of Neurons) is a method for learning sparse topologies with a structure, exploiting neural sensitivity as a regularizer. We define the sensitivity of a neuron as the variation of the network output with respect to the variation of the activity of the neuron. The lower the sensitivity of a neuron, the less the network output is perturbed if the neuron output changes. By including the neuron sensitivity in the cost function as a regularization term, we are able to prune neurons with low sensitivity. As entire neurons are pruned rather than single parameters, practical network footprint reduction becomes possible. Our experimental results on multiple network architectures and datasets yield competitive compression ratios with respect to state-of-the-art references.
    Keywords Computer Science - Machine Learning ; Computer Science - Artificial Intelligence ; Statistics - Machine Learning
    Subject/category (code) 006
    Publication date 2021-02-07
    Country of publication us
    Document type Book; online
    Data source BASE - Bielefeld Academic Search Engine (life sciences selection)
