Instrumentation & Measurement Magazine 26-2 - 41

RGB image as the input of the CNN. It truncated a segment of 4×n
time-series data from the original vibration signal and generated
three channels of components for constructing the RGB
image input through the differential method. A similar methodology
was proposed by Serap [5]. By fusing the 14-channel
emotional EEG series, the reconstructed phase-space trajectory
matrix was decomposed into a linear combination of
a finite number of signal harmonics by Principal Component
Analysis (PCA). The noise components were then reduced, and
the main harmonics related to audio-visual evoked potentials
were highlighted effectively. These methods show that multidimensional
data help retain more fundamental
information, allowing the feature extraction capability of the CNN
to be fully utilized.
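As one concrete reading of the PCA step summarized above, the trajectory matrix can be built by time-delay embedding, rank-reduced by SVD, and mapped back to a series by diagonal averaging (an SSA-style scheme). This is a minimal sketch, not the exact procedure of [5]; the window length and the number of retained components are illustrative choices.

```python
import numpy as np

def pca_denoise(signal, window=20, n_keep=3):
    """PCA denoising of a phase-space (trajectory) matrix.
    window and n_keep are illustrative, not values from the paper."""
    N = len(signal)
    K = N - window + 1
    # Trajectory matrix: delayed copies of the signal as columns.
    X = np.column_stack([signal[i:i + K] for i in range(window)])
    # PCA via SVD: the leading singular vectors carry the main harmonics.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[n_keep:] = 0.0                     # drop the noise components
    Xr = (U * s) @ Vt
    # Diagonal averaging maps the rank-reduced matrix back to a series.
    out = np.zeros(N)
    cnt = np.zeros(N)
    for j in range(window):
        out[j:j + K] += Xr[:, j]
        cnt[j:j + K] += 1
    return out / cnt

t = np.linspace(0, 1, 400)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=t.size)
den = pca_denoise(noisy)
```

Keeping only the top few components suppresses the broadband noise while the dominant harmonics, which span the leading singular directions, survive the reconstruction.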
It should be noted that, because of the intrinsic feed-forward
serial structure of deep learning networks, global information
is attenuated as the network deepens. The performance of deep learning
models depends highly on both a large amount of data, i.e.,
the number of features, and the capability of the feature extraction
methods in applications. Consequently, the fault diagnosis
performance is negatively affected by insufficient feature
extraction when the fault signals involve environmental
noise and cross-domain conditions. To supplement
the global feature information, Xu et al. [6] developed a hybrid
deep-learning model based on the convolutional neural network
(CNN) model and deep forest (gcForest) by use of the
time-frequency images of the vibration signals. By virtue
of the parallel structure of the residual network (ResNet),
Liang et al. [7] employed a squeeze-and-excitation attention
module to suppress different environmental noises, where
dilated convolution was added into the residual network to
construct a multi-level connection. It was proved that ResNet
was competitive in handling both the global feature enhancement
and the real-time demands of the diagnosis model, although
it works only under the condition that the source domain
and the target domain share the same distribution. However, as
for the cross-domain fault diagnosis task, the diagnosis performance
is largely governed by the feature distinguishability
in addition to the global information. Li et al. [8] discussed the
generalization degradation of diagnosis models based on
low-level features for cross-domain tasks. The results
indicated that the feature distance at the top layers was much
larger than that at the bottom layers, and that fine-tuning a pretrained
model gave more competitive generalization ability
than a randomly initialized model in cross-domain tasks.
In this paper, inspired by the global feature reinforcement
of the parallel structure, the ResNet is adopted as the backbone
of the diagnosis model for the mechanical fault. At the
same time, a dense connection is introduced into the feature
extraction layers. It is characterized by a parallel architecture
and is capable of passing features forward among the
extraction layers, so that the features of preceding layers are
incorporated in a skipping fashion into subsequent layers.
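The dense, skipping connection described above can be sketched in a few lines: each layer consumes the concatenation of the input and all preceding layers' outputs, so early global features reach the deeper layers directly. This is a toy NumPy illustration of the connection pattern only; the layer function, feature sizes, and weights are hypothetical, not the RGB-DResNet layers themselves.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w):
    """One toy extraction layer: linear map followed by ReLU."""
    return np.maximum(w @ x, 0.0)

def dense_forward(x, weights):
    """Dense (skipping) connection: every layer sees the input plus
    ALL preceding layers' outputs, concatenated."""
    feats = [x]
    for w in weights:
        h = layer(np.concatenate(feats), w)
        feats.append(h)
    return np.concatenate(feats)

d = 8                        # toy feature size
widths = [d, 2 * d, 3 * d]   # input width grows as features accumulate
weights = [rng.normal(size=(d, win)) * 0.1 for win in widths]
out = dense_forward(rng.normal(size=d), weights)
```

Because the output concatenates every intermediate feature, no layer's information is lost to the serial attenuation discussed earlier; the price is that each layer's input width grows with depth.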
To combat the distribution difference between the source domain
and the target domain, the STI-based RGB input image
transformation and a transfer learning strategy are employed.
Then the derived RGB-DResNet model makes full use of the
data in the source domain to obtain a better feature learning
ability. At the same time, only a small amount of labeled sample
data is needed to adapt to the feature distribution of the
target domain under various working conditions.
The remainder of the paper includes a description of the
RGB input image formation through the STI image conversion,
and gives the Dense ResNet model with the RGB input
image for the enhancement of the global feature extraction.
Then, the RGB-DResNet diagnosis model for the cross-domain
mechanical fault is detailed, and the cross-domain validation
experiments of the proposed model are carried out under various
working conditions, followed by the conclusions and discussion
on possible future work.
RGB Input Image Transformation
For the ResNet-based diagnosis model of mechanical
faults, the original one-dimensional signal is converted into
two-dimensional images as the input of the deep learning
diagnosis model. Considering that the commonly
used time-frequency distribution images rely on manually extracted
features and cause information loss in processing
the original signal, an STI image conversion is adopted for the
generation of the RGB input image. It is realized by a differential
operation on the original one-dimensional time-domain
signal and described as:

PR(i1, i2) = [S1(i1) - S2(i2) + max(X) - min(X)] / [2(max(X) - min(X))]
PG(i2, i3) = [S2(i2) - S3(i3) + max(X) - min(X)] / [2(max(X) - min(X))]     (1)
PB(i3, i4) = [S3(i3) - S4(i4) + max(X) - min(X)] / [2(max(X) - min(X))]

where PR(i1, i2), PG(i2, i3), and PB(i3, i4) are the R, G, and B channel
components of the RGB image, respectively. X is a set of sampling
points intercepted from the original mechanical signal
with a length of L = 4×n. S1(i1), S2(i2), S3(i3), and S4(i4) are the
subsets of X, i.e.:

S1(i1) = X(i), where i = 1, 2, ..., n
S2(i2) = X(i), where i = n+1, n+2, ..., 2n     (2)
S3(i3) = X(i), where i = 2n+1, 2n+2, ..., 3n
S4(i4) = X(i), where i = 3n+1, 3n+2, ..., 4n

where ij = 1, 2, ..., n with j = 1, 2, 3, 4 indexes the components
in the subsets Sj(ij). Through random selection of the
starting point used to truncate the original signal, multiple signal
segments are obtained and transformed into corresponding
images.
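The transformation can be sketched directly from Eqs. (1) and (2): split a length-4n segment into four subsets and form the normalized pairwise differences as the three channels. This is a sketch reconstructed from the equations in the text, with a toy sine segment standing in for a real vibration signal.

```python
import numpy as np

def sti_rgb(x):
    """STI-style RGB image from a 1-D segment of length L = 4*n,
    following Eqs. (1)-(2): four subsets, normalized differences
    as the R, G, B channels."""
    x = np.asarray(x, dtype=float)
    n = x.size // 4
    s1, s2, s3, s4 = x[:n], x[n:2*n], x[2*n:3*n], x[3*n:]
    rng_x = x.max() - x.min()
    def chan(a, b):
        # (a_i - b_j + max(X) - min(X)) / (2(max(X) - min(X))), in [0, 1]
        return (a[:, None] - b[None, :] + rng_x) / (2.0 * rng_x)
    # Stack the three n-by-n difference maps as R, G, B channels.
    return np.stack([chan(s1, s2), chan(s2, s3), chan(s3, s4)], axis=-1)

segment = np.sin(np.linspace(0, 8 * np.pi, 4 * 64))  # toy vibration segment
img = sti_rgb(segment)  # n-by-n RGB image with values in [0, 1]
```

Sliding the starting point of the intercepted segment, as the text describes, yields multiple such images from one recording.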
The derived RGB image retains all characteristics of the
original fault signal. On the one hand, the RGB image includes
both the shape information and the relationships between the
various points in the original time series. That is, every point in
the image from the upper left to the lower right is the normalized
difference between two sampling points of the original time series.
