EntropyHub

Information and uncertainty can be regarded as two sides of the same coin: the more uncertainty there is, the more information we gain by removing that uncertainty. In the context of information and probability theory, entropy quantifies that uncertainty.

The concept of entropy has its origins in classical physics under the second law of thermodynamics, a law considered to underpin our fundamental understanding of time in physics. Attempting to analyse the analog world around us requires that we measure time in discrete steps, but doing so compromises our ability to measure entropy accurately. Various measures have been derived to estimate entropy (uncertainty) from discrete time series, each seeking to best capture the uncertainty of the system under examination. This has resulted in many entropy statistics, ranging from approximate entropy and sample entropy to multiscale sample entropy and refined-composite multiscale cross-sample entropy.

As the number of statistical entropy measures grows, it becomes increasingly difficult to identify, contrast and compare the performance of each measure. To overcome this, we have developed EntropyHub - an open-source toolkit designed to integrate the many established entropy methods into one package. The goal of EntropyHub is to provide a comprehensive set of functions with a simple and consistent syntax that allows the user to adjust parameters at the command line, enabling everything from basic to advanced entropy methods to be implemented with ease.

It is important to clarify that the entropy functions herein described estimate entropy in the context of probability theory and information theory as defined by Shannon, and not thermodynamic or other entropies from classical physics.
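As a quick illustration of that syntax, the sketch below estimates sample entropy with default settings and again with keyword arguments adjusted at the command line (the argument names here follow the package documentation; consult the help text of each function to confirm them):

```julia
using EntropyHub
using Statistics   # for std()

# A toy signal: 1000 samples of uniform random noise
X = rand(1000)

# Sample entropy with default parameters
Samp, A, B = SampEn(X)

# Same function, adjusting the embedding dimension (m), time delay (tau)
# and radius threshold (r) via keyword arguments
Samp, A, B = SampEn(X, m = 2, tau = 1, r = 0.2*std(X))
```

Every Base entropy function follows this pattern: the time series is the first argument, and all tuning parameters are optional keywords with sensible defaults.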

Installation

There are two ways to install EntropyHub for Julia.

Method 1:

  1. In Julia, open the package REPL by typing ]. The command line should appear as:

(@vX.Y) pkg>

    Where X and Y refer to your version of Julia.

  2. Type:

    add EntropyHub

    (Note: this is case sensitive)

Alternatively, one can use the Pkg module to perform the same procedure:

using Pkg

Pkg.add("EntropyHub")

Method 2:

  1. In Julia, open the package REPL by typing ]. The command line should appear as:

(@vX.Y) pkg>

    Where X and Y refer to your version of Julia.

  2. Type:

    add https://github.com/MattWillFlood/EntropyHub.jl

    (Note: this is case sensitive)

System Requirements

There are several package dependencies which will be installed alongside EntropyHub (if not already installed):

DSP, FFTW, HTTP, Random, Plots, StatsBase, StatsFuns, GroupSlices, Statistics, DelimitedFiles, Combinatorics, LinearAlgebra, Dierckx, Clustering

EntropyHub was designed using Julia 1.5 and is intended for use with Julia versions >= 1.2.

Documentation & Help

A key advantage of EntropyHub is the comprehensive documentation available to help users make the most of the toolkit.

To learn more about a specific function, one can do so easily from the command line: type ? to open the Julia help system, then type the function name.

For example:

julia> ?  
help?> SampEn	  # Documentation on sample entropy function

julia> ?  
help?> XSpecEn    # Documentation on cross-spectral entropy function

julia> ?
help?> hXMSEn     # Documentation on hierarchical multiscale cross-entropy function

All information on the EntropyHub package is detailed in the EntropyHub Guide, a .pdf document available here.

Functions

EntropyHub functions fall into 5 categories:

* Base                functions for estimating the entropy of a single univariate time series.
* Cross               functions for estimating the entropy between two univariate time series.
* Bidimensional       functions for estimating the entropy of a two-dimensional univariate matrix.
* Multiscale          functions for estimating the multiscale entropy of a single univariate time series using any of the Base entropy functions.
* Multiscale Cross    functions for estimating the multiscale entropy between two univariate time series using any of the Cross-entropy functions.
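To sketch how these categories fit together, a Base entropy function can be bundled into a multiscale object that is then passed to a Multiscale function (a hedged example; `MSobject` and the keyword names below are taken from the package documentation and should be verified with ?MSobject and ?MSEn):

```julia
using EntropyHub

X = rand(1000)

# Bundle a Base entropy function and its parameters into a multiscale object
Mobj = MSobject(SampEn, m = 2, r = 0.2)

# Estimate multiscale entropy over 5 coarse-graining scales, returning the
# entropy value at each scale (MSx) and the complexity index (CI)
MSx, CI = MSEn(X, Mobj, Scales = 5)
```

The same multiscale object can be reused with the composite (cMSEn), refined (rMSEn) and hierarchical (hMSEn) variants.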

The following tables outline the functions available in the EntropyHub package.

When new entropies are published in the scientific literature, efforts will be made to incorporate them in future releases.

Base Entropies:

Entropy Type                    Function Name
Approximate Entropy             ApEn
Sample Entropy                  SampEn
Fuzzy Entropy                   FuzzEn
Kolmogorov Entropy              K2En
Permutation Entropy             PermEn
Conditional Entropy             CondEn
Distribution Entropy            DistEn
Spectral Entropy                SpecEn
Dispersion Entropy              DispEn
Symbolic Dynamic Entropy        SyDyEn
Increment Entropy               IncrEn
Cosine Similarity Entropy       CoSiEn
Phase Entropy                   PhasEn
Slope Entropy                   SlopEn
Bubble Entropy                  BubbEn
Gridded Distribution Entropy    GridEn
Entropy of Entropy              EnofEn
Attention Entropy               AttnEn

Cross Entropies:

Entropy Type                 Function Name
Cross Sample Entropy         XSampEn
Cross Approximate Entropy    XApEn
Cross Fuzzy Entropy          XFuzzEn
Cross Permutation Entropy    XPermEn
Cross Conditional Entropy    XCondEn
Cross Distribution Entropy   XDistEn
Cross Spectral Entropy       XSpecEn
Cross Kolmogorov Entropy     XK2En
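By way of example, the cross-entropy functions compare two sequences. In this sketch the two series are passed as the columns of a single matrix, though the expected input format may differ between versions, so check ?XSampEn before use:

```julia
using EntropyHub

# Two related sequences, stacked as columns of one matrix
Y1 = rand(500)
Y2 = rand(500)
Sig = hcat(Y1, Y2)

# Cross-sample entropy between the two series
XSamp, A, B = XSampEn(Sig)
```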

Bidimensional Entropies:

Entropy Type                          Function Name
Bi-Dimensional Sample Entropy         SampEn2D
Bi-Dimensional Fuzzy Entropy          FuzzEn2D
Bi-Dimensional Distribution Entropy   DistEn2D

Multiscale Entropy Functions:

Entropy Type                                     Function Name
Multiscale Entropy                               MSEn
Composite/Refined-Composite Multiscale Entropy   cMSEn
Refined Multiscale Entropy                       rMSEn
Hierarchical Multiscale Entropy                  hMSEn

Multiscale Cross-Entropy Functions:

Entropy Type                                           Function Name
Multiscale Cross-Entropy                               XMSEn
Composite/Refined-Composite Multiscale Cross-Entropy   cXMSEn
Refined Multiscale Cross-Entropy                       rXMSEn
Hierarchical Multiscale Cross-Entropy                  hXMSEn

GitHub

https://github.com/MattWillFlood/EntropyHub.jl