Base Entropies
Functions for estimating the entropy of a single univariate time series.
The following functions also serve as the base entropy methods used by the multiscale entropy functions.
- ApEn(Sig, varargin)
ApEn estimates the approximate entropy of a univariate data sequence.
[Ap, Phi] = ApEn(Sig)
Returns the approximate entropy estimates (Ap) and the log-average number of matched vectors (Phi) for m = [0, 1, 2], estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, radius distance threshold = 0.2*SD(Sig), logarithm = natural.

[Ap, Phi] = ApEn(Sig, name, value, …)
Returns the approximate entropy estimates (Ap) of the data sequence (Sig) for dimensions = [0, 1, …, m] using the specified name/value pair arguments:
- m - Embedding Dimension, a positive integer
- tau - Time Delay, a positive integer
- r - Radius Distance Threshold, a positive scalar
- Logx - Logarithm base, a positive scalar
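For illustration, a minimal ApEn usage sketch; EntropyHub is assumed to be installed and on the MATLAB path, and the white-noise signal and parameter values below are arbitrary.

Sig = randn(1, 1000);                                 % arbitrary test sequence
[Ap, Phi] = ApEn(Sig);                                % defaults: m = 2, tau = 1, r = 0.2*SD(Sig)
[Ap2, Phi2] = ApEn(Sig, 'm', 3, 'r', 0.15*std(Sig));  % custom embedding dimension and radius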
- See also:
XApEn, SampEn, MSEn, FuzzEn, PermEn, CondEn, DispEn
- References:
- [1] Steven M. Pincus,
“Approximate entropy as a measure of system complexity.” Proceedings of the National Academy of Sciences 88.6 (1991): 2297-2301.
- AttnEn(Sig, varargin)
AttnEn estimates the attention entropy of a univariate data sequence.
[Attn] = AttnEn(Sig)
Returns the attention entropy (Attn) calculated as the average of the sub-entropies (Hxx, Hxn, Hnn, Hnx) estimated from the data sequence (Sig) using a base-2 logarithm.

[Attn, Hxx, Hnn, Hxn, Hnx] = AttnEn(Sig, 'Logx', value)
Returns the attention entropy (Attn) and the sub-entropies (Hxx, Hxn, Hnn, Hnx) from the data sequence (Sig), where:
- Hxx - entropy of local-maxima intervals
- Hnn - entropy of local-minima intervals
- Hxn - entropy of intervals between local maxima and subsequent minima
- Hnx - entropy of intervals between local minima and subsequent maxima
with the following name/value pair argument:
- Logx - Logarithm base, a positive scalar (enter 0 for natural log)
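A brief AttnEn usage sketch with an arbitrary test signal (EntropyHub assumed installed):

Sig = randn(1, 2000);                            % arbitrary test sequence
[Attn, Hxx, Hnn, Hxn, Hnx] = AttnEn(Sig);        % base-2 logarithm by default
Attn2 = AttnEn(Sig, 'Logx', 0);                  % 0 selects the natural logarithm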
- See also:
EnofEn, SpecEn, XSpecEn, PermEn, MSEn
- References:
- [1] Jiawei Yang, et al.,
“Classification of Interbeat Interval Time-series Using Attention Entropy.” IEEE Transactions on Affective Computing (2020)
- BubbEn(Sig, varargin)
BubbEn estimates the bubble entropy of a univariate data sequence.
[Bubb, H] = BubbEn(Sig)
Returns the bubble entropy (Bubb) and the conditional Rényi entropy (H) estimates from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, logarithm = natural.

[Bubb, H] = BubbEn(Sig, name, value, …)
Returns the bubble entropy (Bubb) estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, an integer > 1. BubbEn returns estimates for each dimension [2, …, m]
- tau - Time Delay, a positive integer
- Logx - Logarithm base, a positive scalar
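A short BubbEn sketch with an arbitrary sequence (EntropyHub assumed installed):

Sig = randn(1, 1500);                      % arbitrary test sequence
[Bubb, H] = BubbEn(Sig);                   % defaults: m = 2, tau = 1
Bubb2 = BubbEn(Sig, 'm', 5, 'tau', 2);     % returns estimates for dimensions [2, ..., 5]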
- See also:
PhasEn, MSEn
- References:
- [1] George Manis, M.D. Aktaruzzaman and Roberto Sassi,
“Bubble entropy: An entropy almost free of parameters.” IEEE Transactions on Biomedical Engineering 64.11 (2017): 2711-2718.
- CondEn(Sig, varargin)
CondEn estimates the corrected conditional entropy of a univariate data sequence.
[Cond, SEw, SEz] = CondEn(Sig)
Returns the corrected conditional entropy estimates (Cond) and the corresponding Shannon entropies (m: SEw, m+1: SEz) for m = [1, 2] estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, symbols = 6, logarithm = natural, normalisation = false.
Note: CondEn(m=1) returns the Shannon entropy of Sig.

[Cond, SEw, SEz] = CondEn(Sig, name, value, …)
Returns the corrected conditional entropy estimates (Cond) from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, an integer > 1
- tau - Time Delay, a positive integer
- c - Number of symbols, an integer > 1
- Logx - Logarithm base, a positive scalar
- Norm - Normalisation of Cond value, a boolean:
  [false] no normalisation - default
  [true] normalises w.r.t. the Shannon entropy of the data sequence Sig
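A minimal CondEn sketch with an arbitrary sequence and illustrative parameter choices:

Sig = randn(1, 1000);                                % arbitrary test sequence
[Cond, SEw, SEz] = CondEn(Sig);                      % defaults: m = 2, tau = 1, c = 6
Cond2 = CondEn(Sig, 'm', 3, 'c', 8, 'Norm', true);   % normalised w.r.t. the Shannon entropy of Sig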
- See also:
XCondEn, MSEn, PermEn, DistEn, XPermEn
- References:
- [1] Alberto Porta, et al.,
“Measuring regularity by means of a corrected conditional entropy in sympathetic outflow.” Biological cybernetics 78.1 (1998): 71-78.
- CoSiEn(Sig, varargin)
CoSiEn estimates the cosine similarity entropy of a univariate data sequence.
[CoSi, Bm] = CoSiEn(Sig)
Returns the cosine similarity entropy (CoSi) and the corresponding global probabilities (Bm) estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, angular threshold = 0.1, logarithm = base 2.

[CoSi, Bm] = CoSiEn(Sig, name, value, …)
Returns the cosine similarity entropy (CoSi) estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, an integer > 1
- tau - Time Delay, a positive integer
- r - Angular threshold, a value in the range 0 < r < 1
- Logx - Logarithm base, a positive scalar (enter 0 for natural log)
- Norm - Normalisation of Sig, one of the following integers:
  [0] no normalisation - default
  [1] remove median(Sig) to get a zero-median series
  [2] remove mean(Sig) to get a zero-mean series
  [3] normalises Sig w.r.t. SD(Sig)
  [4] normalises Sig values to the range [-1 1]
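A minimal CoSiEn sketch (arbitrary signal and parameter values):

Sig = randn(1, 1000);                                % arbitrary test sequence
[CoSi, Bm] = CoSiEn(Sig);                            % defaults: m = 2, tau = 1, r = 0.1
CoSi2 = CoSiEn(Sig, 'm', 3, 'r', 0.05, 'Norm', 3);   % normalise Sig w.r.t. its standard deviation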
- See also:
PhasEn, SlopEn, GridEn, MSEn, hMSEn
- References:
- [1] Theerasak Chanwimalueang and Danilo Mandic,
“Cosine similarity entropy: Self-correlation-based complexity analysis of dynamical systems.” Entropy 19.12 (2017): 652.
- DistEn(Sig, varargin)
DistEn estimates the distribution entropy of a univariate data sequence.
[Dist, Ppi] = DistEn(Sig)
Returns the distribution entropy estimate (Dist) and the corresponding distribution probabilities (Ppi) estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, binning method = 'Sturges', logarithm = base 2, normalisation = w.r.t. # of histogram bins.

[Dist, Ppi] = DistEn(Sig, name, value, …)
Returns the distribution entropy estimate (Dist) from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, a positive integer
- tau - Time Delay, a positive integer
- Bins - Histogram bin selection method for the distance distribution, either an integer > 1 indicating the number of bins, or one of the following strings: {'sturges', 'sqrt', 'rice', 'doanes'} [default: 'sturges']
- Logx - Logarithm base, a positive scalar (enter 0 for natural log)
- Norm - Normalisation of Dist value, a boolean:
  [false] no normalisation
  [true] normalises w.r.t. # of histogram bins (default)
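A brief DistEn sketch (arbitrary signal; the bin count is chosen only for illustration):

Sig = randn(1, 1000);                               % arbitrary test sequence
[Dist, Ppi] = DistEn(Sig);                          % defaults: m = 2, tau = 1, 'sturges' binning
Dist2 = DistEn(Sig, 'Bins', 64, 'Norm', false);     % fixed number of histogram bins, unnormalised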
- See also:
XDistEn, DistEn2D, MSEn, K2En
- References:
- [1] Li, Peng, et al.,
“Assessing the complexity of short-term heartbeat interval series by distribution entropy.” Medical & biological engineering & computing 53.1 (2015): 77-87.
- DispEn(Sig, varargin)
DispEn estimates the dispersion entropy of a univariate data sequence.
[Dispx, RDE] = DispEn(Sig)
Returns the dispersion entropy (Dispx) and the reverse dispersion entropy (RDE) estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, symbols = 3, logarithm = natural, data transform = normalised cumulative density function (ncdf).

[Dispx, RDE] = DispEn(Sig, name, value, …)
Returns the dispersion entropy (Dispx) and the reverse dispersion entropy (RDE) estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, a positive integer
- tau - Time Delay, a positive integer
- c - Number of symbols, an integer > 1
- Typex - Type of data-to-symbolic sequence transform, one of the following: {'linear', 'kmeans', 'ncdf', 'finesort', 'equal'}. See the EntropyHub Guide for more info on these transforms.
- Logx - Logarithm base, a positive scalar
- Fluct - When Fluct == true, DispEn returns the fluctuation-based dispersion entropy [default: false]
- Norm - Normalisation of Dispx and RDE values, a boolean:
  [false] no normalisation - default
  [true] normalises w.r.t. the number of possible dispersion patterns (c^m, or (2c-1)^(m-1) if Fluct == true)
- rho - If Typex == 'finesort', rho is the tuning parameter (default: 1)
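A short DispEn sketch (arbitrary signal; the symbol count and transform are illustrative):

Sig = randn(1, 1000);                          % arbitrary test sequence
[Dispx, RDE] = DispEn(Sig);                    % defaults: m = 2, tau = 1, c = 3, ncdf transform
[Dispx2, RDE2] = DispEn(Sig, 'c', 5, 'Typex', 'kmeans', 'Fluct', true, 'Norm', true);
% fluctuation-based variant, normalised w.r.t. the (2c-1)^(m-1) possible patterns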
- See also:
PermEn, SyDyEn, MSEn.
- References:
- [1] Mostafa Rostaghi and Hamed Azami,
“Dispersion entropy: A measure for time-series analysis.” IEEE Signal Processing Letters 23.5 (2016): 610-614.
- [2] Hamed Azami and Javier Escudero,
“Amplitude-and fluctuation-based dispersion entropy.” Entropy 20.3 (2018): 210.
- [3] Li Yuxing, Xiang Gao and Long Wang,
“Reverse dispersion entropy: A new complexity measure for sensor signal.” Sensors 19.23 (2019): 5203.
- [4] Wenlong Fu, et al.,
“Fault diagnosis for rolling bearings based on fine-sorted dispersion entropy and SVM optimized with mutation SCA-PSO.” Entropy 21.4 (2019): 404.
- EnofEn(Sig, varargin)
EnofEn estimates the entropy of entropy from a univariate data sequence.
[EoE, AvEn, S2] = EnofEn(Sig)
Returns the entropy of entropy (EoE), the average Shannon entropy (AvEn), and the number of levels (S2) across all windows estimated from the data sequence (Sig) using the default parameters: window length (samples) = 10, slices = 10, logarithm = natural, heartbeat interval range (xmin, xmax) = [min(Sig) max(Sig)].

[EoE, AvEn, S2] = EnofEn(Sig, name, value, …)
Returns the entropy of entropy (EoE) estimated from the data sequence (Sig) using the specified name/value pair arguments:
- tau - Window length, an integer > 1
- S - Number of slices, an integer > 1
- Xrange - The min and max heartbeat interval, a two-element vector where X(1) < X(2)
- Logx - Logarithm base, a positive scalar
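A minimal EnofEn sketch; the positive-valued test sequence mimics an interval series and is purely illustrative:

Sig = abs(randn(1, 1000)) + 0.5;               % arbitrary positive (interval-like) sequence
[EoE, AvEn, S2] = EnofEn(Sig);                 % defaults: window length = 10, slices = 10
EoE2 = EnofEn(Sig, 'tau', 20, 'S', 5, 'Xrange', [min(Sig) max(Sig)]);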
- See also:
SampEn, MSEn
- References:
- [1] Chang Francis Hsu, et al.,
“Entropy of entropy: Measurement of dynamical complexity for biological systems.” Entropy 19.10 (2017): 550.
- FuzzEn(Sig, varargin)
FuzzEn estimates the fuzzy entropy of a univariate data sequence.
[Fuzz, Ps1, Ps2] = FuzzEn(Sig)
Returns the fuzzy entropy estimates (Fuzz) and the average fuzzy distances (m: Ps1, m+1: Ps2) for m = [1, 2] estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, fuzzy function = 'default', fuzzy function parameters = [0.2, 2], logarithm = natural.

[Fuzz, Ps1, Ps2] = FuzzEn(Sig, name, value, …)
Returns the fuzzy entropy estimates (Fuzz) for dimensions = [1, …, m] estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, a positive integer [default: 2]
- tau - Time Delay, a positive integer [default: 1]
- Fx - Fuzzy function name, one of the following strings: {'sigmoid', 'modsampen', 'default', 'gudermannian', 'linear'}
- r - Fuzzy function parameters, a scalar or a two-element vector of positive values (default: [.2 2]). The r parameters for each fuzzy function are defined as follows:
  - sigmoid:
    r(1) = divisor of the exponential argument
    r(2) = value subtracted from argument (pre-division)
  - modsampen:
    r(1) = divisor of the exponential argument
    r(2) = value subtracted from argument (pre-division)
  - default:
    r(1) = divisor of the exponential argument
    r(2) = argument exponent (pre-division)
  - gudermannian:
    r = a scalar whose value is the numerator of the argument to the gudermannian function: GD(x) = atan(tanh(r/x)). GD(x) is normalised to have a maximum value of 1.
  - linear:
    r = an integer value. When r == 0, the argument of the exponential function is normalised between [0 1]. When r == 1, the minimum value of the exponential argument is set to 0.
- Logx - Logarithm base, a positive scalar [default: natural]
For further information on the name/value pair arguments, see the EntropyHub guide.
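A brief FuzzEn sketch; the fuzzy function and parameter values are chosen only for illustration:

Sig = randn(1, 1000);                                  % arbitrary test sequence
[Fuzz, Ps1, Ps2] = FuzzEn(Sig);                        % defaults: m = 2, Fx = 'default', r = [.2 2]
Fuzz2 = FuzzEn(Sig, 'm', 3, 'Fx', 'sigmoid', 'r', [0.25 3]);
% with 'sigmoid', r(1) divides the exponential argument and r(2) is subtracted pre-division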
- See also:
SampEn, ApEn, PermEn, DispEn, XFuzzEn, FuzzEn2D, MSEn.
- References:
- [1] Weiting Chen, et al.
“Characterization of surface EMG signal based on fuzzy entropy.” IEEE Transactions on neural systems and rehabilitation engineering 15.2 (2007): 266-272.
- [2] Hong-Bo Xie, Wei-Xing He, and Hui Liu
“Measuring time series regularity using nonlinear similarity-based sample entropy.” Physics Letters A 372.48 (2008): 7140-7146.
- GridEn(Sig, varargin)
GridEn estimates the gridded distribution entropy of a univariate data sequence.
[GDE, GDR] = GridEn(Sig)
Returns the gridded distribution entropy (GDE) and the gridded distribution rate (GDR) estimated from the data sequence (Sig) using the default parameters: grid coarse-grain = 3, time delay = 1, logarithm = natural.

[GDE, GDR, PIx, GIx, SIx, AIx] = GridEn(Sig)
In addition to GDE and GDR, GridEn returns the following indices estimated from the data sequence (Sig) using the default parameters:
- PIx - Percentage of points below the line of identity (LI)
- GIx - Proportion of point distances above the LI
- SIx - Ratio of phase angles (w.r.t. LI) of the points above the LI
- AIx - Ratio of the cumulative area of sectors of points above the LI

[GDE, GDR, …] = GridEn(Sig, name, value, …)
Returns the gridded distribution entropy (GDE) estimate of the data sequence (Sig) using the specified name/value pair arguments:
- m - Grid coarse-grain (m x m sectors), an integer > 1
- tau - Time Delay, a positive integer
- Logx - Logarithm base, a positive scalar
- Plotx - When Plotx == true, returns the gridded Poincaré plot and a bivariate histogram of the grid point distribution (default: false)
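A minimal GridEn sketch with an arbitrary sequence:

Sig = randn(1, 1000);                                 % arbitrary test sequence
[GDE, GDR] = GridEn(Sig);                             % defaults: m = 3, tau = 1
[GDE2, GDR2, PIx, GIx, SIx, AIx] = GridEn(Sig, 'm', 5, 'Plotx', false);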
- See also:
PhasEn, CoSiEn, SlopEn, BubbEn, MSEn
- References:
- [1] Chang Yan, et al.,
“Novel gridded descriptors of Poincare plot for analyzing heartbeat interval time-series.” Computers in biology and medicine 109 (2019): 280-289.
- [2] Chang Yan, et al.
“Area asymmetry of heart rate variability signal.” Biomedical engineering online 16.1 (2017): 1-14.
- [3] Alberto Porta, et al.,
“Temporal asymmetries of short-term heart period variability are linked to autonomic regulation.” American Journal of Physiology-Regulatory, Integrative and Comparative Physiology 295.2 (2008): R550-R557.
- [4] C.K. Karmakar, A.H. Khandoker and M. Palaniswami,
“Phase asymmetry of heart rate variability signal.” Physiological measurement 36.2 (2015): 303.
- IncrEn(Sig, varargin)
IncrEn estimates the increment entropy of a univariate data sequence.
[Incr] = IncrEn(Sig)
Returns the increment entropy (Incr) estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, quantifying resolution = 4, logarithm = base 2.

[Incr] = IncrEn(Sig, name, value, …)
Returns the increment entropy (Incr) estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, an integer > 1
- tau - Time Delay, a positive integer
- R - Quantifying resolution, a positive integer
- Logx - Logarithm base, a positive scalar (enter 0 for natural log)
- Norm - Normalisation of IncrEn value, a boolean:
  [false] no normalisation - default
  [true] normalises w.r.t. embedding dimension (m-1)
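A short IncrEn sketch (arbitrary signal; the resolution value is chosen for illustration):

Sig = randn(1, 1000);                                 % arbitrary test sequence
Incr = IncrEn(Sig);                                   % defaults: m = 2, tau = 1, R = 4
Incr2 = IncrEn(Sig, 'm', 3, 'R', 2, 'Norm', true);    % normalised w.r.t. m-1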
- See also:
PermEn, SyDyEn, MSEn
- References:
- [1] Xiaofeng Liu, et al.,
“Increment entropy as a measure of complexity for time series.” Entropy 18.1 (2016): 22. See also the correction: “Correction on Liu, X.; Jiang, A.; Xu, N.; Xue, J. - Increment Entropy as a Measure of Complexity for Time Series, Entropy 2016, 18, 22.” Entropy 18.4 (2016): 133.
- [2] Xiaofeng Liu, et al.,
“Appropriate use of the increment entropy for electrophysiological time series.” Computers in biology and medicine 95 (2018): 13-23.
- K2En(Sig, varargin)
K2En estimates the Kolmogorov (K2) entropy of a univariate data sequence.
[K2, Ci] = K2En(Sig)
Returns the Kolmogorov entropy estimates (K2) and the correlation integrals (Ci) for m = [1, 2] estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, distance threshold (r) = 0.2*SD(Sig), logarithm = natural.

[K2, Ci] = K2En(Sig, name, value, …)
Returns the Kolmogorov entropy estimates (K2) for dimensions = [1, …, m] estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, a positive integer
- tau - Time Delay, a positive integer
- r - Radius Distance Threshold, a positive scalar
- Logx - Logarithm base, a positive scalar
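A minimal K2En sketch with arbitrary parameter choices:

Sig = randn(1, 1000);                                 % arbitrary test sequence
[K2, Ci] = K2En(Sig);                                 % defaults: m = 2, tau = 1, r = 0.2*SD(Sig)
[K2b, Cib] = K2En(Sig, 'm', 4, 'r', 0.25*std(Sig));   % estimates for dimensions [1, ..., 4]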
- See also:
DistEn, XK2En, MSEn
- References:
- [1] Peter Grassberger and Itamar Procaccia,
“Estimation of the Kolmogorov entropy from a chaotic signal.” Physical review A 28.4 (1983): 2591.
- [2] Lin Gao, Jue Wang and Longwei Chen,
“Event-related desynchronization and synchronization quantification in motor-related EEG by Kolmogorov entropy.” Journal of Neural Engineering 10.3 (2013): 03602.
- PermEn(Sig, varargin)
PermEn estimates the permutation entropy of a univariate data sequence.
[Perm, Pnorm, cPE] = PermEn(Sig)
Returns the permutation entropy estimates (Perm), the normalised permutation entropy (Pnorm) and the conditional permutation entropy (cPE) for m = [1, 2] estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, logarithm = base 2, normalisation = w.r.t. # of symbols (m-1).
Note: using the standard PermEn estimation, Perm = 0 when m = 1. It is recommended that the signal length N > 5m! (see [8] and Amigó et al., Europhys. Lett. 83:60005, 2008).

[Perm, Pnorm, cPE] = PermEn(Sig, m)
Returns the permutation entropy estimates (Perm) estimated from the data sequence (Sig) for embedding dimensions = [1, …, m], with the other default parameters as listed above.

[Perm, Pnorm, cPE] = PermEn(Sig, name, value, …)
Returns the permutation entropy estimates (Perm) for dimensions = [1, …, m] estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, an integer > 1
- tau - Time Delay, a positive integer
- Logx - Logarithm base, a positive scalar (enter 0 for natural log)
- Norm - Normalisation of the Pnorm value, a boolean:
  [false] normalises w.r.t. log(# of permutation symbols [m-1]) - default
  [true] normalises w.r.t. log(# of all possible permutations [m!])
  Note: Normalised permutation entropy is undefined for m = 1.
  Note: When Typex = 'uniquant' and Norm = true, normalisation is calculated w.r.t. log(tpx^m).
- Typex - Permutation entropy variation, one of the following: {'uniquant', 'finegrain', 'modified', 'ampaware', 'weighted', 'edge'}
- tpx - Tuning parameter for the associated permutation entropy variation:
  [uniquant] tpx is the L parameter, an integer > 1 (default = 4)
  [finegrain] tpx is the alpha parameter, a positive scalar (default = 1)
  [ampaware] tpx is the A parameter, a value in range [0 1] (default = 0.5)
  [edge] tpx is the r sensitivity parameter, a scalar > 0 (default = 1)
See the EntropyHub guide for more info on these permutation entropy variants.
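A short PermEn sketch; the variant and tuning parameter below are illustrative only:

Sig = randn(1, 1000);                                 % arbitrary test sequence
[Perm, Pnorm, cPE] = PermEn(Sig);                     % defaults: m = 2, tau = 1, base-2 logarithm
[Perm2, Pnorm2] = PermEn(Sig, 4);                     % estimates for dimensions [1, ..., 4]
Perm3 = PermEn(Sig, 'm', 3, 'Typex', 'ampaware', 'tpx', 0.5);   % amplitude-aware variant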
- See also:
XPermEn, MSEn, XMSEn, SampEn, ApEn, CondEn
- References:
- [1] Christoph Bandt and Bernd Pompe,
“Permutation entropy: A natural complexity measure for time series.” Physical Review Letters, 88.17 (2002): 174102.
- [2] Xiao-Feng Liu, and Wang Yue,
“Fine-grained permutation entropy as a measure of natural complexity for time series.” Chinese Physics B 18.7 (2009): 2690.
- [3] Chunhua Bian, et al.,
“Modified permutation-entropy analysis of heartbeat dynamics.” Physical Review E 85.2 (2012) : 021906
- [4] Bilal Fadlallah, et al.,
“Weighted-permutation entropy: A complexity measure for time series incorporating amplitude information.” Physical Review E 87.2 (2013): 022911.
- [5] Hamed Azami and Javier Escudero,
“Amplitude-aware permutation entropy: Illustration in spike detection and signal segmentation.” Computer methods and programs in biomedicine, 128 (2016): 40-51.
- [6] Zhiqiang Huo, et al.,
“Edge Permutation Entropy: An Improved Entropy Measure for Time-Series Analysis,” 45th Annual Conference of the IEEE Industrial Electronics Soc, (2019), 5998-6003
- [7] Zhe Chen, et al.
“Improved permutation entropy for measuring complexity of time series under noisy condition.” Complexity 1403829 (2019).
- [8] Maik Riedl, Andreas Müller, and Niels Wessel,
“Practical considerations of permutation entropy.” The European Physical Journal Special Topics 222.2 (2013): 249-262.
- PhasEn(Sig, varargin)
PhasEn estimates the phase entropy of a univariate data sequence.
[Phas] = PhasEn(Sig)
Returns the phase entropy (Phas) estimate of the data sequence (Sig) using the default parameters: angular partitions = 4, time delay = 1, logarithm = natural, normalisation = true.

[Phas] = PhasEn(Sig, name, value, …)
Returns the phase entropy (Phas) estimate of the data sequence (Sig) using the specified name/value pair arguments:
- K - Angular partitions (coarse graining), an integer > 1
  Note: Division of partitions begins along the positive x-axis. As this point is somewhat arbitrary, it is recommended to use even-numbered (preferably multiples of 4) partitions for the sake of symmetry.
- tau - Time Delay, a positive integer
- Logx - Logarithm base, a positive scalar
- Norm - Normalisation of the Phas value, a boolean:
  [false] no normalisation
  [true] normalises w.r.t. the # of partitions (Log(K)) (default)
- Plotx - When Plotx == true, returns the Poincaré plot (default: false)
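A minimal PhasEn sketch with an arbitrary sequence and partition count:

Sig = randn(1, 1000);                                 % arbitrary test sequence
Phas = PhasEn(Sig);                                   % defaults: K = 4, tau = 1, Norm = true
Phas2 = PhasEn(Sig, 'K', 8, 'tau', 2, 'Plotx', false);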
- See also:
SampEn, ApEn, GridEn, MSEn, SlopEn, CoSiEn, BubbEn
- References:
- [1] Ashish Rohila and Ambalika Sharma,
“Phase entropy: a new complexity measure for heart rate variability.” Physiological measurement 40.10 (2019): 105006.
- SampEn(Sig, varargin)
SampEn estimates the sample entropy of a univariate data sequence.
[Samp, A, B] = SampEn(Sig)
Returns the sample entropy estimates (Samp) and the number of matched state vectors (m: B, m+1: A) for m = [0, 1, 2] estimated from the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, radius threshold = 0.2*SD(Sig), logarithm = natural.

[Samp, A, B] = SampEn(Sig, name, value, …)
Returns the sample entropy estimates (Samp) for dimensions = [0, 1, …, m] estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, a positive integer
- tau - Time Delay, a positive integer
- r - Radius Distance Threshold, a positive scalar
- Logx - Logarithm base, a positive scalar
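A minimal SampEn sketch (arbitrary signal; the radius is scaled to the signal's standard deviation):

Sig = randn(1, 1000);                                 % arbitrary test sequence
[Samp, A, B] = SampEn(Sig);                           % defaults: m = 2, tau = 1, r = 0.2*SD(Sig)
Samp2 = SampEn(Sig, 'm', 3, 'r', 0.15*std(Sig));      % estimates for dimensions [0, 1, ..., 3]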
- See also:
ApEn, FuzzEn, PermEn, CondEn, XSampEn, SampEn2D, MSEn.
- References:
- [1] Joshua S Richman and J. Randall Moorman.
“Physiological time-series analysis using approximate entropy and sample entropy.” American Journal of Physiology-Heart and Circulatory Physiology (2000).
- SlopEn(Sig, varargin)
SlopEn estimates the slope entropy of a univariate data sequence.
[Slop] = SlopEn(Sig)
Returns the slope entropy (Slop) estimates for embedding dimensions [2, …, m] of the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, angular thresholds = [5 45], logarithm = base 2.

[Slop] = SlopEn(Sig, name, value, …)
Returns the slope entropy (Slop) estimate of the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, an integer > 1. SlopEn returns estimates for each dimension [2, …, m]
- tau - Time Delay, a positive integer
- Lvls - Angular thresholds, a vector of monotonically increasing values in the range [0 90] degrees
- Logx - Logarithm base, a positive scalar (enter 0 for natural log)
- Norm - Normalisation of Slop value, a boolean:
  [false] no normalisation
  [true] normalises w.r.t. the number of patterns found (default)
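A brief SlopEn sketch; the threshold levels below are illustrative:

Sig = randn(1, 1000);                                 % arbitrary test sequence
Slop = SlopEn(Sig);                                   % defaults: m = 2, Lvls = [5 45]
Slop2 = SlopEn(Sig, 'm', 4, 'Lvls', [5 30 60], 'Norm', true);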
- See also:
PhasEn, GridEn, MSEn, CoSiEn, SampEn, ApEn
- References:
- [1] David Cuesta-Frau,
“Slope Entropy: A New Time Series Complexity Estimator Based on Both Symbolic Patterns and Amplitude Information.” Entropy 21.12 (2019): 1167.
- SpecEn(Sig, varargin)
SpecEn estimates the spectral entropy of a univariate data sequence.
[Spec, BandEn] = SpecEn(Sig)
Returns the spectral entropy estimate of the full spectrum (Spec) and the within-band entropy (BandEn) estimated from the data sequence (Sig) using the default parameters: N-point FFT = length(Sig)*2 + 1, normalised band edge frequencies = [0 1], logarithm = natural, normalisation = w.r.t. # of spectrum/band values.

[Spec, BandEn] = SpecEn(Sig, name, value, …)
Returns the spectral entropy (Spec) and the within-band entropy (BandEn) estimate for the data sequence (Sig) using the specified name/value pair arguments:
- N - Resolution of the spectrum (N-point FFT), an integer > 1
- Freqs - Normalised spectrum band edge frequencies, a 2 element vector with values in the range [0 1], where 1 corresponds to the Nyquist frequency (Fs/2).
  Note: When no band frequencies are entered, BandEn == SpecEn
- Logx - Logarithm base, a positive scalar (default: natural log)
- Norm - Normalisation of Spec value, a boolean:
  [false] no normalisation
  [true] normalises w.r.t. # of spectrum/band frequency values (default)
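A short SpecEn sketch; the band edge frequencies are arbitrary:

Sig = randn(1, 1000);                                 % arbitrary test sequence
[Spec, BandEn] = SpecEn(Sig);                         % full-spectrum entropy; BandEn == Spec here
[Spec2, BandEn2] = SpecEn(Sig, 'Freqs', [0.1 0.4], 'Norm', true);   % entropy within a normalised band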
- See also:
XSpecEn, fft, periodogram
- References:
- [1] G.E. Powell and I.C. Percival,
“A spectral entropy method for distinguishing regular and irregular motion of Hamiltonian systems.” Journal of Physics A: Mathematical and General 12.11 (1979): 2053.
- [2] Tsuyoshi Inouye, et al.,
“Quantification of EEG irregularity by use of the entropy of the power spectrum.” Electroencephalography and clinical neurophysiology 79.3 (1991): 204-210.
- SyDyEn(Sig, varargin)
SyDyEn estimates the symbolic dynamic entropy of a univariate data sequence.
[SyDy, Zt] = SyDyEn(Sig)
Returns the symbolic dynamic entropy (SyDy) and the symbolic sequence (Zt) of the data sequence (Sig) using the default parameters: embedding dimension = 2, time delay = 1, symbols = 3, logarithm = natural, symbolic partition type = maximum entropy partitioning (MEP), normalisation = w.r.t. # of possible vector permutations (c^m+1).

[SyDy, Zt] = SyDyEn(Sig, name, value, …)
Returns the symbolic dynamic entropy (SyDy) and the symbolic sequence (Zt) estimated from the data sequence (Sig) using the specified name/value pair arguments:
- m - Embedding Dimension, a positive integer
- tau - Time Delay, a positive integer
- c - Number of symbols, an integer > 1
- Typex - Type of symbolic sequence partitioning, one of the following: {'linear', 'uniform', 'MEP' (default), 'kmeans'}
- Logx - Logarithm base, a positive scalar
- Norm - Normalisation of SyDyEn value, a boolean:
  [false] no normalisation
  [true] normalises w.r.t. # of possible vector permutations (c^m+1) - default
See the EntropyHub guide for more info.
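A minimal SyDyEn sketch; the symbol count and partitioning below are illustrative:

Sig = randn(1, 1000);                                 % arbitrary test sequence
[SyDy, Zt] = SyDyEn(Sig);                             % defaults: m = 2, tau = 1, c = 3, 'MEP' partitioning
SyDy2 = SyDyEn(Sig, 'c', 4, 'Typex', 'kmeans', 'Norm', true);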
- See also:
DispEn, PermEn, CondEn, SampEn, MSEn.
- References:
- [1] Yongbo Li, et al.,
“A fault diagnosis scheme for planetary gearboxes using modified multi-scale symbolic dynamic entropy and mRMR feature selection.” Mechanical Systems and Signal Processing 91 (2017): 295-312.
- [2] Jian Wang, et al.,
“Fault feature extraction for multiple electrical faults of aviation electro-mechanical actuator based on symbolic dynamics entropy.” IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), 2015.
- [3] Venkatesh Rajagopalan and Asok Ray,
“Symbolic time series analysis via wavelet-based partitioning.” Signal processing 86.11 (2006): 3309-3320.