| Article Type | |
| Article Subtype | Full-length article published in an SSCI-, AHCI-, SCI-, or SCI-Expanded-indexed journal |
| Journal Name | The Journal of Supercomputing |
| Journal ISSN | 1573-0484 (WoS- and Scopus-indexed journal) |
| Journal Indexed In | SCI-Expanded |
| Journal Quartile | Q2 |
| Article Language | English |
| Publication Date | 05-2025 |
| Volume | 81 |
| Issue | 8 |
| DOI | 10.1007/s11227-025-07347-y |
| Article Link | https://doi.org/10.1007/s11227-025-07347-y |
| Abstract |
| Convolutional neural networks (CNNs) have become very popular, as they can successfully solve problems in many areas by obtaining representations of input data at different layers with tuned hyperparameters. A CNN’s hyperparameters include design parameters (DPs), which describe the depth of the CNN and order of layers; layer parameters (LPs), which are used for each CNN layer; and training parameters, which are used for training the CNN. The performance of CNNs depends on these hyperparameters, but setting them properly remains a very difficult and important problem. Although there are studies in the literature that optimize each of these three parameter groups separately, there is a lack of methodologies for simultaneous optimization of DPs and LPs in a nested framework. This study proposes a novel method called SwarmCNN, which combines particle swarm optimization and artificial bee colony … |
| Keywords |
| ABC | CIFAR-10 | Hyperparameter optimization | PSO | MNIST | Neural architecture search |
| Journal Name | JOURNAL OF SUPERCOMPUTING |
| Publisher | Springer Netherlands |
| Open Access | No |
| ISSN | 0920-8542 |
| E-ISSN | 1573-0484 |
| CiteScore | 7.1 |
| SJR | 0.716 |
| SNIP | 1.181 |
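
The abstract above describes a nested swarm search: design parameters (DPs) such as network depth are explored in an outer loop, while layer parameters (LPs) for each candidate architecture are tuned in an inner loop, combining particle swarm optimization (PSO) and an artificial bee colony (ABC). The sketch below is only a minimal illustration of that nesting idea, not the paper's SwarmCNN implementation: CNN training is replaced by a toy surrogate fitness, PSO is arbitrarily assigned to the outer DP search and ABC to the inner LP search, and all ranges, coefficients, and names (`surrogate_fitness`, `abc_optimize_lps`, `pso_optimize_dps`) are assumptions made for this example.

```python
import random

# Toy surrogate standing in for CNN validation accuracy (an assumption for
# this sketch): prefers a depth near 4 and filter counts near 64.
def surrogate_fitness(depth, layer_params):
    score = -abs(depth - 4)
    score -= sum(abs(f - 64) / 64.0 for f in layer_params) / max(len(layer_params), 1)
    return score

# Inner loop: a small ABC-style search over layer parameters (LPs),
# here just the number of filters per convolutional layer.
def abc_optimize_lps(depth, food_sources=5, cycles=10):
    sources = [[random.randint(8, 128) for _ in range(depth)] for _ in range(food_sources)]
    best = max(sources, key=lambda s: surrogate_fitness(depth, s))
    for _ in range(cycles):
        for i, src in enumerate(sources):
            # Employed-bee step: perturb one dimension toward a random neighbour.
            j = random.randrange(depth)
            partner = random.choice(sources)
            candidate = src[:]
            step = random.uniform(-1, 1) * (src[j] - partner[j])
            candidate[j] = max(8, min(128, int(src[j] + step)))
            if surrogate_fitness(depth, candidate) > surrogate_fitness(depth, src):
                sources[i] = candidate
        best = max(sources + [best], key=lambda s: surrogate_fitness(depth, s))
    return best, surrogate_fitness(depth, best)

# Outer loop: a PSO-style search over one design parameter (DP), the depth.
# Each particle's fitness comes from running the nested ABC search.
def pso_optimize_dps(particles=4, iterations=5):
    positions = [random.randint(2, 8) for _ in range(particles)]
    velocities = [0.0] * particles
    personal_best = positions[:]
    personal_fit = []
    for d in positions:
        _, fit = abc_optimize_lps(d)
        personal_fit.append(fit)
    g_idx = max(range(particles), key=lambda i: personal_fit[i])
    global_best, global_fit = personal_best[g_idx], personal_fit[g_idx]
    for _ in range(iterations):
        for i in range(particles):
            r1, r2 = random.random(), random.random()
            velocities[i] = (0.5 * velocities[i]
                             + 1.5 * r1 * (personal_best[i] - positions[i])
                             + 1.5 * r2 * (global_best - positions[i]))
            positions[i] = max(2, min(8, int(round(positions[i] + velocities[i]))))
            _, fit = abc_optimize_lps(positions[i])   # nested ABC call per particle
            if fit > personal_fit[i]:
                personal_best[i], personal_fit[i] = positions[i], fit
            if fit > global_fit:
                global_best, global_fit = positions[i], fit
    return global_best, global_fit

if __name__ == "__main__":
    depth, fit = pso_optimize_dps()
    print(f"best depth={depth}, surrogate fitness={fit:.3f}")
```

Running the script prints the best depth found by the outer PSO loop together with its surrogate fitness; in a realistic setting the surrogate would be replaced by validation accuracy obtained by training the corresponding CNN on a dataset such as MNIST or CIFAR-10.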