Instrumentation & Measurement Magazine 26-2 - 27

use of DTN in place of LN to strengthen the focus on local defect information and improve accuracy. DHT performs well against other advanced models and achieves state-of-the-art accuracy on industrial surface-defect image classification.
One limitation of this work is that the artificially designed defect library differs from real defects, so the model needs further fine-tuning to perform well on real defect types and sizes. Future research can evaluate the validity of the proposed model on other defect features, such as actual defects in chemical pressure vessels or related equipment. Moreover, as the generality of deep learning algorithms has become a research hotspot, future research can also examine how this algorithm performs on other vision tasks, for example, whether the proposed method can be applied to medical image slices to detect anatomical pathological deformations or disorders.
Acknowledgment
This work was supported by the Open Fund (No. OGE20210115) of the Key Laboratory of Oil and Gas Equipment, Ministry of Education (Southwest Petroleum University); the Innovative Entrepreneurial Project (No. CXCY-2021-22) of the China Occupational Safety and Health Association; the Science and Technology Planning Project (No. SCSJZ2022007) of the Sichuan Market Supervision Administration; the School Qing Miao Project (No. QM2021071) of Chengdu Technological University; and the School Project (No. 114/205190) of Chengdu Technological University.
April 2023