{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,25]],"date-time":"2026-03-25T18:59:42Z","timestamp":1774465182357,"version":"3.50.1"},"reference-count":60,"publisher":"Association for Computing Machinery (ACM)","issue":"2","license":[{"start":{"date-parts":[[2025,5,30]],"date-time":"2025-05-30T00:00:00Z","timestamp":1748563200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by-sa\/4.0\/"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["ACM Trans. Model. Perform. Eval. Comput. Syst."],"published-print":{"date-parts":[[2025,6,30]]},"abstract":"<jats:p>\n            Federated Learning (FL) enables collaborative model training across decentralized edge devices while preserving data privacy. However, existing FL methods often assume clean annotated datasets, which are impractical for resource-constrained edge devices. In reality, noisy labels are prevalent, posing significant challenges to FL performance. Prior approaches attempt label correction and robust training techniques but exhibit limited efficacy, particularly under high noise levels. This article introduces ClipFL (\n            <jats:italic>F<\/jats:italic>\n            ederated\n            <jats:italic>L<\/jats:italic>\n            earning\n            <jats:italic>Cli<\/jats:italic>\n            ent\n            <jats:italic>P<\/jats:italic>\n            runing), a novel framework addressing noisy labels from a fresh perspective. ClipFL identifies and excludes noisy clients based on their performance on a clean validation dataset, tracked using a Noise Candidacy Score (NCS). The framework includes three phases: pre-client pruning to identify potential noisy clients and calculate their NCS, client pruning to exclude a percentage of clients with the highest NCS, and post-client pruning for fine-tuning the global model with standard FL on clean clients. 
Empirical evaluation demonstrates ClipFL\u2019s efficacy across diverse datasets and noise levels, achieving accurate noisy client identification, superior performance, faster convergence, and reduced communication costs compared to state-of-the-art FL methods. Our code is available at\u00a0\n            <jats:ext-link xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\" xlink:href=\"https:\/\/github.com\/MMorafah\/ClipFL\">https:\/\/github.com\/MMorafah\/ClipFL<\/jats:ext-link>\n            .\n          <\/jats:p>","DOI":"10.1145\/3706058","type":"journal-article","created":{"date-parts":[[2024,11,27]],"date-time":"2024-11-27T09:52:04Z","timestamp":1732701124000},"page":"1-25","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":3,"title":["Federated Learning Client Pruning for Noisy Labels"],"prefix":"10.1145","volume":"10","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-2518-071X","authenticated-orcid":false,"given":"Mahdi","family":"Morafah","sequence":"first","affiliation":[{"name":"Electrical and Computer Engineering, University of California San Diego, La Jolla, United States"}]},{"ORCID":"https:\/\/orcid.org\/0009-0007-1583-1375","authenticated-orcid":false,"given":"Hojin","family":"Chang","sequence":"additional","affiliation":[{"name":"Electrical and Computer Engineering, University of California San Diego, La Jolla, United States"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3957-7061","authenticated-orcid":false,"given":"Chen","family":"Chen","sequence":"additional","affiliation":[{"name":"Department of Computer Science, University of Central Florida, Orlando, United States"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-0965-7247","authenticated-orcid":false,"given":"Bill","family":"Lin","sequence":"additional","affiliation":[{"name":"Electrical and Computer Engineering, University of California San Diego, La Jolla, United 
States"}]}],"member":"320","published-online":{"date-parts":[[2025,5,30]]},"reference":[{"key":"e_1_3_2_2_2","article-title":"Synthetic data aided federated learning using foundation models","author":"Abacha Fatima","year":"2024","unstructured":"Fatima Abacha, Sin G. Teo, Lucas C. Cordeiro, and Mustafa A. Mustafa. 2024. Synthetic data aided federated learning using foundation models. In Proceedings of the International Workshop on Federated Learning in the Age of Foundation Models in Conjunction with IJCAI 2024 (FL@FM-IJCAI \u201924).","journal-title":"International Workshop on Federated Learning in the Age of Foundation Models in Conjunction with IJCAI 2024 (FL@FM-IJCAI \u201924)."},{"key":"e_1_3_2_3_2","doi-asserted-by":"publisher","DOI":"10.1109\/TKDE.2024.3361474"},{"key":"e_1_3_2_4_2","volume-title":"Proceedings of the 11th International Conference on Learning Representations","author":"Chen Hong-You","year":"2023","unstructured":"Hong-You Chen, Cheng-Hao Tu, Ziwei Li, Han Wei Shen, and Wei-Lun Chao. 2023. On the importance and applicability of pre-training for federated learning. In Proceedings of the 11th International Conference on Learning Representations. https:\/\/openreview.net\/forum?id=fWWFv--P0xP"},{"key":"e_1_3_2_5_2","first-page":"2089","volume-title":"Proceedings of the International Conference on Machine Learning","author":"Collins Liam","year":"2021","unstructured":"Liam Collins, Hamed Hassani, Aryan Mokhtari, and Sanjay Shakkottai. 2021. Exploiting shared representations for personalized federated learning. In Proceedings of the International Conference on Machine Learning. 
2089\u20132099."},{"key":"e_1_3_2_6_2","doi-asserted-by":"publisher","DOI":"10.1038\/s41591-021-01506-3"},{"key":"e_1_3_2_7_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR.2009.5206848"},{"key":"e_1_3_2_8_2","unstructured":"Alexey Dosovitskiy Lucas Beyer Alexander Kolesnikov Dirk Weissenborn Xiaohua Zhai Thomas Unterthiner Mostafa Dehghani Matthias Minderer Georg Heigold Sylvain Gelly et\u00a0al. 2021. An image is worth 16x16 words: Transformers for image recognition at scale. arXiv:2010.11929 [cs.CV] (2021)."},{"key":"e_1_3_2_9_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR52688.2022.00983"},{"key":"e_1_3_2_10_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR52688.2022.00983"},{"key":"e_1_3_2_11_2","article-title":"On the convergence of local descent methods in federated learning","author":"Haddadpour Farzin","year":"2019","unstructured":"Farzin Haddadpour and Mehrdad Mahdavi. 2019. On the convergence of local descent methods in federated learning. arXiv preprint arXiv:1910.14425 (2019).","journal-title":"arXiv preprint arXiv:1910.14425"},{"key":"e_1_3_2_12_2","article-title":"A survey of label-noise representation learning: Past, present and future","author":"Han Bo","year":"2020","unstructured":"Bo Han, Quanming Yao, Tongliang Liu, Gang Niu, Ivor W. Tsang, James T. Kwok, and Masashi Sugiyama. 2020. A survey of label-noise representation learning: Past, present and future. arXiv preprint arXiv:2011.04406 (2020).","journal-title":"arXiv preprint arXiv:2011.04406"},{"key":"e_1_3_2_13_2","article-title":"Federated learning for mobile keyboard prediction","author":"Hard Andrew","year":"2018","unstructured":"Andrew Hard, Kanishka Rao, Rajiv Mathews, Swaroop Ramaswamy, Fran\u00e7oise Beaufays, Sean Augenstein, Hubert Eichner, Chlo\u00e9 Kiddon, and Daniel Ramage. 2018. Federated learning for mobile keyboard prediction. 
arXiv preprint arXiv:1811.03604 (2018).","journal-title":"arXiv preprint arXiv:1811.03604"},{"key":"e_1_3_2_14_2","article-title":"Measuring the effects of non-identical data distribution for federated visual classification","author":"Hsu Tzu-Ming Harry","year":"2019","unstructured":"Tzu-Ming Harry Hsu, Hang Qi, and Matthew Brown. 2019. Measuring the effects of non-identical data distribution for federated visual classification. arXiv preprint arXiv:1909.06335 (2019).","journal-title":"arXiv preprint arXiv:1909.06335"},{"key":"e_1_3_2_15_2","doi-asserted-by":"publisher","DOI":"10.1145\/3511808.3557475"},{"key":"e_1_3_2_16_2","doi-asserted-by":"publisher","DOI":"10.1561\/2200000083"},{"key":"e_1_3_2_17_2","article-title":"Mime: Mimicking centralized stochastic algorithms in federated learning","author":"Karimireddy Sai Praneeth","year":"2020","unstructured":"Sai Praneeth Karimireddy, Martin Jaggi, Satyen Kale, Mehryar Mohri, Sashank J. Reddi, Sebastian U. Stich, and Ananda Theertha Suresh. 2020. Mime: Mimicking centralized stochastic algorithms in federated learning. arXiv preprint arXiv:2008.03606 (2020).","journal-title":"arXiv preprint arXiv:2008.03606"},{"key":"e_1_3_2_18_2","first-page":"5132","volume-title":"Proceedings of the International Conference on Machine Learning","author":"Karimireddy Sai Praneeth","year":"2020","unstructured":"Sai Praneeth Karimireddy, Satyen Kale, Mehryar Mohri, Sashank Reddi, Sebastian Stich, and Ananda Theertha Suresh. 2020. Scaffold: Stochastic controlled averaging for federated learning. In Proceedings of the International Conference on Machine Learning. 5132\u20135143."},{"key":"e_1_3_2_19_2","first-page":"19184","article-title":"Federated hyperparameter tuning: Challenges, baselines, and connections to weight-sharing","volume":"34","author":"Khodak Mikhail","year":"2021","unstructured":"Mikhail Khodak, Renbo Tu, Tian Li, Liam Li, Maria-Florina F. Balcan, Virginia Smith, and Ameet Talwalkar. 2021. 
Federated hyperparameter tuning: Challenges, baselines, and connections to weight-sharing. Advances in Neural Information Processing Systems 34 (2021), 19184\u201319197.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_2_20_2","unstructured":"Alex Krizhevsky. 2009. Learning Multiple Layers of Features from Tiny Images. University of Toronto."},{"key":"e_1_3_2_21_2","doi-asserted-by":"publisher","DOI":"10.1145\/3447993.3483278"},{"key":"e_1_3_2_22_2","doi-asserted-by":"publisher","DOI":"10.1145\/3485730.3485929"},{"key":"e_1_3_2_23_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR46437.2021.01057"},{"key":"e_1_3_2_24_2","first-page":"429","article-title":"Federated optimization in heterogeneous networks","volume":"2","author":"Li Tian","year":"2020","unstructured":"Tian Li, Anit Kumar Sahu, Manzil Zaheer, Maziar Sanjabi, Ameet Talwalkar, and Virginia Smith. 2020. Federated optimization in heterogeneous networks. Proceedings of Machine Learning and Systems 2 (2020), 429\u2013450.","journal-title":"Proceedings of Machine Learning and Systems"},{"key":"e_1_3_2_25_2","first-page":"1227","volume-title":"Proceedings of the 2019 53rd Asilomar Conference on Signals, Systems, and Computers","author":"Li Tian","year":"2019","unstructured":"Tian Li, Anit Kumar Sahu, Manzil Zaheer, Maziar Sanjabi, Ameet Talwalkar, and Virginia Smith. 2019. FedDANE: A federated Newton-type method. In Proceedings of the 2019 53rd Asilomar Conference on Signals, Systems, and Computers. IEEE, 1227\u20131231."},{"key":"e_1_3_2_26_2","first-page":"10082","volume-title":"Proceedings of the IEEE\/CVF Conference on Computer Vision and Pattern Recognition","author":"Li Xin-Chun","year":"2022","unstructured":"Xin-Chun Li, Yi-Chu Xu, Shaoming Song, Bingshuai Li, Yinchuan Li, Yunfeng Shao, and De-Chuan Zhan. 2022. Federated learning with position-aware neurons. In Proceedings of the IEEE\/CVF Conference on Computer Vision and Pattern Recognition. 
10082\u201310091."},{"key":"e_1_3_2_27_2","article-title":"FedNoisy: Federated noisy label learning benchmark","author":"Liang Siqi","year":"2023","unstructured":"Siqi Liang, Jintao Huang, Junyuan Hong, Dun Zeng, Jiayu Zhou, and Zenglin Xu. 2023. FedNoisy: Federated noisy label learning benchmark. arXiv preprint arXiv:2306.11650 (2023).","journal-title":"arXiv preprint arXiv:2306.11650"},{"key":"e_1_3_2_28_2","first-page":"2351","article-title":"Ensemble distillation for robust model fusion in federated learning","volume":"33","author":"Lin Tao","year":"2020","unstructured":"Tao Lin, Lingjing Kong, Sebastian U. Stich, and Martin Jaggi. 2020. Ensemble distillation for robust model fusion in federated learning. Advances in Neural Information Processing Systems 33 (2020), 2351\u20132363.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_2_29_2","first-page":"13857","volume-title":"Proceedings of the International Conference on Machine Learning","author":"Liu Chang","year":"2022","unstructured":"Chang Liu, Chenfei Lou, Runzhong Wang, Alan Yuhan Xi, Li Shen, and Junchi Yan. 2022. Deep neural network fusion via graph matching with applications to model ensemble and federated learning. In Proceedings of the International Conference on Machine Learning. 13857\u201313869."},{"key":"e_1_3_2_30_2","first-page":"1273","volume-title":"Artificial Intelligence and Statistics","author":"McMahan Brendan","year":"2017","unstructured":"Brendan McMahan, Eider Moore, Daniel Ramage, Seth Hampson, and Blaise Aguera y Arcas. 2017. Communication-efficient learning of deep networks from decentralized data. In Artificial Intelligence and Statistics. 
PMLR, 1273\u20131282."},{"key":"e_1_3_2_31_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR52688.2022.00821"},{"key":"e_1_3_2_32_2","article-title":"Stable diffusion-based data augmentation for federated learning with non-IID data","author":"Morafah Mahdi","year":"2024","unstructured":"Mahdi Morafah, Matthias Reisser, Bill Lin, and Christos Louizos. 2024. Stable diffusion-based data augmentation for federated learning with non-IID data. In Proceedings of the International Workshop on Federated Foundation Models for the Web 2024 (FL@FM-TheWebConf \u201924).","journal-title":"International Workshop on Federated Foundation Models for the Web 2024 (FL@FM-TheWebConf \u201924)."},{"key":"e_1_3_2_33_2","unstructured":"Mahdi Morafah Weijia Wang and Bill Lin. 2023. A practical recipe for federated learning under statistical heterogeneity experimental design. arXiv:2307.15245 [cs.LG] (2023)."},{"key":"e_1_3_2_34_2","doi-asserted-by":"publisher","DOI":"10.1145\/3394486.3403176"},{"key":"e_1_3_2_35_2","article-title":"Where to begin? Exploring the impact of pre-training and initialization in federated learning","author":"Nguyen John","year":"2023","unstructured":"John Nguyen, Kshitiz Malik, Maziar Sanjabi, and Michael Rabbat. 2023. Where to begin? Exploring the impact of pre-training and initialization in federated learning. In Proceedings of the International Conference on Learning Representations.","journal-title":"International Conference on Learning Representations."},{"key":"e_1_3_2_36_2","unstructured":"Adam Paszke Sam Gross Francisco Massa Adam Lerer James Bradbury Gregory Chanan Trevor Killeen Zeming Lin Natalia Gimelshein Luca Antiga et\u00a0al. 2019. PyTorch: An imperative style high-performance deep learning library. 
arXiv:1912.01703 [cs.LG] (2019)."},{"key":"e_1_3_2_37_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR52688.2022.00982"},{"key":"e_1_3_2_38_2","article-title":"Adaptive federated optimization","author":"Reddi Sashank","year":"2020","unstructured":"Sashank Reddi, Zachary Charles, Manzil Zaheer, Zachary Garrett, Keith Rush, Jakub Kone\u010dn\u00fd, Sanjiv Kumar, and H. Brendan McMahan. 2020. Adaptive federated optimization. arXiv preprint arXiv:2003.00295 (2020).","journal-title":"arXiv preprint arXiv:2003.00295"},{"key":"e_1_3_2_39_2","unstructured":"Sashank Reddi Zachary Charles Manzil Zaheer Zachary Garrett Keith Rush Jakub Kone\u010dn\u00fd Sanjiv Kumar and H. Brendan McMahan. 2021. Adaptive federated optimization. arXiv:2003.00295 [cs.LG] (2021)."},{"key":"e_1_3_2_40_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR52688.2022.01042"},{"key":"e_1_3_2_41_2","doi-asserted-by":"publisher","DOI":"10.1109\/TNNLS.2021.3129371"},{"key":"e_1_3_2_42_2","first-page":"22045","article-title":"Model fusion via optimal transport","volume":"33","author":"Singh Sidak Pal","year":"2020","unstructured":"Sidak Pal Singh and Martin Jaggi. 2020. Model fusion via optimal transport. Advances in Neural Information Processing Systems 33 (2020), 22045\u201322055.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_2_43_2","article-title":"Learning from noisy labels with deep neural networks: A survey","author":"Song Hwanjun","year":"2023","unstructured":"Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, and Jae-Gil Lee. 2023. Learning from noisy labels with deep neural networks: A survey. IEEE Transactions on Neural Networks and Learning Systems 34, 11 (2023), 8135\u20138153.","journal-title":"IEEE Transactions on Neural Networks and Learning Systems"},{"key":"e_1_3_2_44_2","unstructured":"Andreas Steiner Alexander Kolesnikov Xiaohua Zhai Ross Wightman Jakob Uszkoreit and Lucas Beyer. 2022. How to train your ViT? 
Data augmentation and regularization in Vision Transformers. arXiv:2106.10270 [cs.CV] (2022)."},{"key":"e_1_3_2_45_2","first-page":"2118","volume-title":"Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP \u201920)","author":"Sui Dianbo","year":"2020","unstructured":"Dianbo Sui, Yubo Chen, Jun Zhao, Yantao Jia, Yuantao Xie, and Weijian Sun. 2020. FedED: Federated learning via ensemble distillation for medical relation extraction. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP \u201920). 2118\u20132128."},{"key":"e_1_3_2_46_2","doi-asserted-by":"publisher","DOI":"10.1109\/TNNLS.2022.3160699"},{"key":"e_1_3_2_47_2","first-page":"5020","volume-title":"Proceedings of the 2020 25th International Conference on Pattern Recognition (ICPR \u201921)","author":"Tuor Tiffany","year":"2021","unstructured":"Tiffany Tuor, Shiqiang Wang, Bong Jun Ko, Changchang Liu, and Kin K. Leung. 2021. Overcoming noisy and irrelevant data in federated learning. In Proceedings of the 2020 25th International Conference on Pattern Recognition (ICPR \u201921). IEEE, 5020\u20135027."},{"key":"e_1_3_2_48_2","first-page":"27","volume-title":"Proceedings of the 2021 IEEE 41st International Conference on Distributed Computing Systems Workshops (ICDCSW \u201921)","author":"Vahidian Saeed","year":"2021","unstructured":"Saeed Vahidian, Mahdi Morafah, and Bill Lin. 2021. Personalized federated learning by structured and unstructured pruning under data heterogeneity. In Proceedings of the 2021 IEEE 41st International Conference on Distributed Computing Systems Workshops (ICDCSW \u201921). IEEE, 27\u201334."},{"key":"e_1_3_2_49_2","article-title":"Federated learning with matched averaging","author":"Wang Hongyi","year":"2020","unstructured":"Hongyi Wang, Mikhail Yurochkin, Yuekai Sun, Dimitris Papailiopoulos, and Yasaman Khazaeni. 2020. Federated learning with matched averaging. 
arXiv preprint arXiv:2002.06440 (2020).","journal-title":"arXiv preprint arXiv:2002.06440"},{"key":"e_1_3_2_50_2","first-page":"7611","article-title":"Tackling the objective inconsistency problem in heterogeneous federated optimization","volume":"33","author":"Wang Jianyu","year":"2020","unstructured":"Jianyu Wang, Qinghua Liu, Hao Liang, Gauri Joshi, and H. Vincent Poor. 2020. Tackling the objective inconsistency problem in heterogeneous federated optimization. Advances in Neural Information Processing Systems 33 (2020), 7611\u20137623.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_2_51_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR42600.2020.01374"},{"key":"e_1_3_2_52_2","unstructured":"Ross Wightman. 2019. PyTorch Image Models. Retrieved November 29 2024 from https:\/\/github.com\/rwightman\/pytorch-image-models"},{"key":"e_1_3_2_53_2","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR52688.2022.00994"},{"key":"e_1_3_2_54_2","doi-asserted-by":"publisher","DOI":"10.1145\/3626235"},{"key":"e_1_3_2_55_2","doi-asserted-by":"publisher","DOI":"10.1109\/TVT.2021.3131852"},{"key":"e_1_3_2_56_2","doi-asserted-by":"publisher","DOI":"10.1109\/MIS.2022.3151466"},{"key":"e_1_3_2_57_2","doi-asserted-by":"crossref","unstructured":"Yiqiang Chen Xiaodong Yang Xin Qin Han Yu Biao Chen and Zhiqi Shen. 2020. Focus: Dealing with label quality disparity in federated learning. arXiv preprint arXiv:2001.11359 (2020).","DOI":"10.1007\/978-3-030-63076-8_8"},{"key":"e_1_3_2_58_2","doi-asserted-by":"publisher","DOI":"10.1145\/3519311"},{"key":"e_1_3_2_59_2","article-title":"Text-to-image diffusion model in generative AI: A survey","author":"Zhang Chenshuang","year":"2023","unstructured":"Chenshuang Zhang, Chaoning Zhang, Mengchun Zhang, and In So Kweon. 2023. Text-to-image diffusion model in generative AI: A survey. 
arXiv preprint arXiv:2303.07909 (2023).","journal-title":"arXiv preprint arXiv:2303.07909"},{"key":"e_1_3_2_60_2","unstructured":"Hongyi Zhang Moustapha Cisse Yann N. Dauphin and David Lopez-Paz. 2018. mixup: Beyond empirical risk minimization. arXiv:1710.09412 [cs.LG] (2018)."},{"key":"e_1_3_2_61_2","article-title":"Federated learning with Non-IID data","author":"Zhao Yue","year":"2018","unstructured":"Yue Zhao, Meng Li, Liangzhen Lai, Naveen Suda, Damon Civin, and Vikas Chandra. 2018. Federated learning with Non-IID data. arXiv preprint arXiv:1806.00582 (2018).","journal-title":"arXiv preprint arXiv:1806.00582"}],"container-title":["ACM Transactions on Modeling and Performance Evaluation of Computing Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3706058","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3706058","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,19]],"date-time":"2025-06-19T01:18:13Z","timestamp":1750295893000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3706058"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,5,30]]},"references-count":60,"journal-issue":{"issue":"2","published-print":{"date-parts":[[2025,6,30]]}},"alternative-id":["10.1145\/3706058"],"URL":"https:\/\/doi.org\/10.1145\/3706058","relation":{},"ISSN":["2376-3639","2376-3647"],"issn-type":[{"value":"2376-3639","type":"print"},{"value":"2376-3647","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,5,30]]},"assertion":[{"value":"2024-04-22","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2024-10-25","order":2,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication 
History"}},{"value":"2025-05-30","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}