Show simple item record

Author: Heidari, Hanif
Author: Velichko, Andrei
Author: Murugappan, Murugappan
Author: Chowdhury, Muhammad E. H.
Available date: 2023-04-17T06:57:40Z
Publication Date: 2023
Publication Name: Nonlinear Dynamics
Resource: Scopus
URI: http://dx.doi.org/10.1007/s11071-023-08298-w
URI: http://hdl.handle.net/10576/41929
Abstract: Entropy is a fundamental concept in the field of information theory. During measurement, conventional entropy measures are susceptible to length and amplitude changes in time series. A new entropy metric, neural network entropy (NNetEn), has been developed to overcome these limitations. NNetEn entropy is computed using a modified LogNNet neural network classification model. The algorithm contains a reservoir matrix of N = 19,625 elements that must be filled with the given data. A substantial number of practical time series have fewer elements than 19,625. The contribution of this paper is threefold. Firstly, this work investigates different methods of filling the reservoir with time series (signal) elements. The reservoir filling method determines the accuracy of the entropy estimation by convolution of the study time series and LogNNet test data. The present study proposes six methods for filling the reservoir for time series of any length 5 ≤ N ≤ 19,625. Two of them (Method 3 and Method 6) employ the novel approach of stretching the time series to create intermediate elements that complement it, but do not change its dynamics. The most reliable methods for short time series are Method 3 and Method 5. The second part of the study examines the influence of noise and constant bias on entropy values. In addition to external noise, the hyperparameter (bias) used in entropy calculation also plays a critical role. Our study examines three different time series data types (chaotic, periodic, and binary) with different dynamic properties, Signal-to-Noise Ratio (SNR), and offsets. The NNetEn entropy calculation errors are less than 10% when SNR is greater than 30 dB, and entropy decreases with an increase in the bias component. The third part of the article analyzes real biosignal EEG data collected from emotion recognition experiments. The NNetEn measures show robustness under low-amplitude noise using various filters. Thus, NNetEn measures entropy effectively when applied to real-world environments with ambient noise, white noise, and 1/f noise.
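The "stretching" idea described in the abstract (Methods 3 and 6) can be illustrated with a minimal sketch: a short time series is expanded to the fixed reservoir length by inserting interpolated intermediate elements between the original samples, preserving the signal's shape. The function name, the use of linear interpolation, and the example series below are illustrative assumptions; the exact NNetEn filling rules are defined in the paper itself.

```python
import numpy as np

def stretch_to_reservoir(signal, reservoir_size=19625):
    """Stretch a short time series to a fixed reservoir length by linear
    interpolation, adding intermediate elements without altering the
    overall dynamics (a sketch of the 'stretching' approach; not the
    paper's exact algorithm)."""
    signal = np.asarray(signal, dtype=float)
    if len(signal) < 2:
        raise ValueError("need at least two samples to interpolate")
    # Map the original sample positions and the reservoir positions
    # onto the same unit interval, then interpolate.
    old_x = np.linspace(0.0, 1.0, len(signal))
    new_x = np.linspace(0.0, 1.0, reservoir_size)
    return np.interp(new_x, old_x, signal)

# Example: a 100-sample periodic signal stretched to fill the reservoir.
series = np.sin(np.linspace(0, 4 * np.pi, 100))
filled = stretch_to_reservoir(series)
print(filled.shape)  # (19625,)
```

Linear interpolation keeps the first and last samples unchanged and adds only in-between values, which is the property the abstract emphasizes: the series is complemented, not dynamically altered.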
Language: en
Publisher: Springer Science and Business Media B.V.
Subjects: EEG; Entropy; Neural network; Neural network entropy; NNetEn; Offset; Short length signal; Signal-to-noise ratio; Time series
Title: Novel techniques for improving NNetEn entropy calculation for short and noisy time series
Type: Article


Files in this item

There are no files associated with this item.
