Citation
Entropy as a Measure of Chaoticity in EEG

Material Information

Title:
Entropy as a Measure of Chaoticity in EEG
Creator:
Eaton, Robert
Pardalos, Panos (Mentor)
Place of Publication:
Gainesville, Fla.
Publisher:
University of Florida
Publication Date:
Language:
English

Subjects

Genre:
serial

Record Information

Source Institution:
University of Florida
Holding Location:
University of Florida
Rights Management:
All applicable rights reserved by the source institution and holding location.

Journal of Undergraduate Research

Volume 4, Issue 11



Entropy as a Measure of Chaoticity in EEG Signals

Robert Eaton


INTRODUCTION


This paper proposes the use of entropy as a measure of the disorder of electroencephalogram (EEG) signals. Currently, many medical groups quantify the EEG signal using the maximum Lyapunov exponent (Lmax) as a measure of chaoticity. The Lyapunov exponent typically measures the average rate of change in the order or disorder exhibited by a system, whereas entropy measures the uncertainty of future states of a system given information about its previous states [3].



BACKGROUND


Epilepsy



Epilepsy is defined as any of a variety of disorders marked by disturbed electrical rhythms of the central nervous system, typically manifested by convulsive attacks, or seizures. Because a significant portion of the population (0.5%-0.8%, or about 50 million people worldwide) suffers from epilepsy, there has long been a desire to predict seizures in epileptic patients. Only within the last few decades has nonlinear dynamics been applied to quantifying EEG signals. Such nonlinear systems are generally difficult to model, but an empirical measure, such as the Lyapunov exponent or entropy, can give insight into the state of the system at any point.

EEG Signals and Electrode Sites



An EEG signal consists of voltage readings taken 200 times per second, or every 5 ms (Figure 1). The readings are taken from 28 subdural electrode sites (Figure 2) located at different depths and locations throughout the brain. These electrodes monitor brain activity and output a voltage to the EEG, which is displayed as a continuous graph of voltage vs. time. One goal of using nonlinear dynamics to quantify this signal is to recognize underlying properties of the EEG that are not visually evident.




























Figure 1. EEG signal, recorded in volts, versus time. A seizure occurs between the 1850th and 1925th seconds of this recording.




Figure 2. A mapping of subdural electrode sites in the brain.




TIME SERIES



The first step in analyzing the EEG signal is to organize the sequence of voltages into a time series. A time series consists of vectors of a specified dimension, n. The elements of each vector are voltages that occur a given length of time, t, after the previous element. In other words, beginning with our sequence of EEG readings,


$x_1, x_2, x_3, \ldots, x_k, x_{k+1}, \ldots$




we designate the time series vectors as:


$X_i = \{x_i,\, x_{i+t},\, x_{i+2t},\, \ldots,\, x_{i+(n-1)t}\}$



Selection of n, t


For the selection of the dimension, n, of our time series vectors, it was necessary to choose a dimension that would correctly portray our entropy values. Choosing an n that is too small results in vectors that do not carry enough information to accurately quantify a particular state of the system. On the other hand, choosing an n that is too large produces vectors that represent too large a time span, washing out the specifics of the points contained within each vector. As a result, entropy was calculated using time series of many dimensions, and the "best" results were chosen from the collection of data.



The parameter t should be selected to be as small as possible so as to capture the shortest change present in the data. At the same time, t should be large enough to allow for independence between adjacent vectors in our time series [1]. From previous work on this topic [2], a good value for the time difference between elements in our time series vector is about 14 ms, which corresponds to a t of 3 samples (since every reading is 5 ms apart).
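As a concrete illustration, here is a minimal sketch of constructing these time series vectors in Python with NumPy (the function name and array layout are my own; the paper does not specify an implementation):

import numpy as np

def delay_vectors(signal, n=9, t=3):
    # Build the time series vectors X_i = (x_i, x_{i+t}, ..., x_{i+(n-1)t}).
    #   signal: 1-D array of EEG voltage readings (one every 5 ms)
    #   n:      dimension of each vector
    #   t:      delay between elements, in samples (t = 3 is 15 ms at 200 Hz)
    m = len(signal) - (n - 1) * t        # number of complete vectors
    return np.array([signal[i : i + n * t : t] for i in range(m)])

For a 10.24-second interval of 2048 readings with n = 9 and t = 3, this yields 2048 - 24 = 2024 vectors.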



CALCULATION OF ENTROPY


There are many different varieties of entropy, all with the same goal of measuring the uncertainty in a system. For chaotic systems, the Kolmogorov-Sinai (K-S) entropy is commonly used. For our time series analysis, the K-S entropy is calculated over time periods of 10.24 seconds, or 2048 EEG readings. In other words, every 10.24-second interval is represented by a single entropy value, and our hour-long EEG sample consists of 351 entropy values.
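As a check on these numbers: one hour of EEG at 200 readings per second is 3600 × 200 = 720,000 readings, and 720,000 / 2048 ≈ 351 complete 10.24-second intervals.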



K-S entropy, s, is traditionally calculated as follows:



$s = -P_r \ln P_r$




where our value for $P_r$ represents the probability that a given vector is significantly "close" to another vector, and $r$ represents the current time interval we are dealing with. To begin, we define a distance metric, $D$, for any vectors $X$ and $Y$ as:




$D(X,Y) = \dfrac{|x_1 - y_1| + |x_2 - y_2| + \cdots + |x_n - y_n|}{n}$


where



$X = \{x_1, x_2, \ldots, x_n\}$ and $Y = \{y_1, y_2, \ldots, y_n\}$



The distance between vectors is then calculated for each vector with respect to every other vector in our 10.24-second interval. The standard deviation, $\sigma$, of the distances between vectors is calculated, and one vector is considered "close" to another if the following relationship holds:



$D(X,Y) \le 0.2\sigma$


To calculate the probability:


$P_r = \dfrac{M_r}{M}$


$M_r$ represents the total number of vectors that are "close" to a particular vector in the time interval, and $M$ represents the total number of vector pairings possible within that time interval (so that $P_r \le 1$). We calculate s for each of the vectors in the interval and average these individual entropies to arrive at an entropy value representative of the whole interval.
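A minimal sketch of this per-interval calculation in Python with NumPy and SciPy (the function name is my own, and skipping vectors with no close partner, to avoid taking log 0, is my choice rather than something the paper specifies):

import numpy as np
from scipy.spatial.distance import pdist, squareform

def interval_entropy(vectors):
    # vectors: array of shape (M, n), the delay vectors of one
    # 10.24-second (2048-reading) interval.
    M, n = vectors.shape
    # D(X, Y) = (|x1 - y1| + ... + |xn - yn|) / n, for every pair once
    dists = pdist(vectors, metric="cityblock") / n
    sigma = dists.std()                        # std dev of pairwise distances
    close = squareform(dists) <= 0.2 * sigma   # closeness criterion
    np.fill_diagonal(close, False)             # a vector is not close to itself
    pairs = M * (M - 1) // 2                   # total possible pairings (the paper's M)
    p_r = close.sum(axis=1) / pairs            # P_r for each vector (P_r <= 1)
    p_r = p_r[p_r > 0]                         # skip vectors with no close partner
    if p_r.size == 0:
        return 0.0
    return float(np.mean(-p_r * np.log(p_r))) # average of s = -P_r ln P_r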

This value for the K-S entropy is calculated for each of the 10.24-second time intervals in our EEG sample and is plotted over time. Table 1 displays the plots of entropy vs. time for a 9-dimension time series for electrode sites 1, 7, 13, and 21; these sites were chosen because they represent different areas of the brain. It should be noted that the probabilities used to compute entropy are dimensionless, and therefore so are the values for entropy. The calculated values represent the chaoticity of our system over a given phase, in our case a time interval of 10.24 seconds: a higher value of entropy corresponds to a higher level of chaoticity (disorder), and a lower value of entropy to a higher level of order.





[Table 1: plots of entropy vs. time (seconds) for electrodes 1, 7, 13, and 21]


Table 1. Entropy readings vary from electrode to electrode, but it is visually evident that all electrodes experience the effects of the seizure (~1900 seconds), indicated by a decrease in the value of entropy.



DIMENSION DIFFERENCE


The driving force behind modeling the EEG signal as a nonlinear system is the goal of eventually being able to predict a seizure a substantial amount of time before its onset. It is commonly believed that during the period preceding a seizure, known as the preictal period, the brain begins to exhibit signs of orderliness. It is also believed that this orderliness produces an entrainment between the signals at differing electrode sites throughout the brain. During a seizure, the EEG reading displays a saturated, spiked graph (Figure 1), which by itself does not suggest that the brain is becoming more ordered. But by quantifying the signal with measures such as the Lyapunov exponent and entropy, it is possible to visualize the reduction in the chaoticity of the system. This orderliness is represented by low values of entropy (between 0 and 2) coincident with the onset of the seizure (Figure 3).



A further investigation into the entrainment of EEG signals can be made by comparing the values of entropy in an electrode for differing dimensions. If these values are relatively close to one another and not random, this could indicate an entrainment in the EEG signal well in advance of a seizure onset. To get a data set of these values, we took the entropy values corresponding to one dimension, e.g. 9, and subtracted from them the entropy values of another dimension, e.g. 8. This was done as follows, where $S_m$ and $S_n$ represent two sequences of entropy values for an EEG signal:




$S_m = \{s_{m1}, s_{m2}, \ldots, s_{m351}\}$

$S_n = \{s_{n1}, s_{n2}, \ldots, s_{n351}\}$




and




$S_{m-n} = \{s_{m1} - s_{n1},\, s_{m2} - s_{n2},\, \ldots,\, s_{m351} - s_{n351}\}$



In our case, there are 351 values in each sequence of entropies, since there are 351 10.24-second time intervals in our hour-long EEG reading. It should be noted that since we are only interested in the closeness of the two entropy data sets and not the actual values, it is irrelevant whether the differences are negative or positive.
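In code, this difference sequence is a single elementwise operation; for example (a sketch, with the sequence names and the placeholder data being my own, not the paper's):

import numpy as np

rng = np.random.default_rng(0)
s_dim9 = rng.random(351)           # placeholder: dimension-9 entropy values
s_dim8 = rng.random(351)           # placeholder: dimension-8 entropy values
s_diff = np.abs(s_dim9 - s_dim8)   # sign is irrelevant; only closeness matters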



Figure 3. The top graph represents voltage (the EEG signal) plotted against time in seconds. The bottom graph, on the same time scale, plots entropy vs. time. The values for entropy drop dramatically between 1850 and 1900 seconds, concurrent with the onset of the seizure.







T-Tests


To determine if two sets of entropy values are significantly close to one another, a T-test is calculated as follows:

$T_{ij} = \dfrac{\left|\overline{S_i - S_j}\right|}{\sigma_{ij} / \sqrt{n}}$





where $i$ and $j$ are the different dimensions of the time series being compared, $\overline{S_i - S_j}$ is the mean of the paired differences between the two entropy sequences over the sample, and $\sigma_{ij}$ is the sample standard deviation of those differences. In our case we will take our sample size, $n$, to be 30. Therefore the sequence of entropies used to derive each individual T score will have 30 elements, and we will form a series of T scores by sliding this 30-value window along the entropy sequences, where the entropy values used to calculate each T score are:




T " * 1, 2,... , 30 J,

T1 ': f2,S3,. . 1



T, + S ,^+1 '' (n+30)-1 J



We will use a critical T value of 2.045 (the two-tailed 95% value for 29 degrees of freedom); any T score falling below that level represents data sets that are "close" in our test.
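Under this paired-samples reading of the T score, the sliding-window computation might look like the following sketch in Python with NumPy (the function name and the sequence names are hypothetical):

import numpy as np

def t_scores(s_i, s_j, window=30):
    # One paired T score per 30-value window of two entropy sequences:
    # T = |mean(d)| / (std(d) / sqrt(n)), where d holds the paired differences.
    d = np.asarray(s_i) - np.asarray(s_j)
    scores = []
    for k in range(len(d) - window + 1):
        w = d[k : k + window]
        scores.append(abs(w.mean()) / (w.std(ddof=1) / np.sqrt(window)))
    return np.array(scores)

T_CRITICAL = 2.045   # two-tailed 95% point of t with 29 degrees of freedom
# With s_dim10 and s_dim9 the 351-value entropy sequences, the windows where
# the two sets are statistically "close" would be:
#   entrained = t_scores(s_dim10, s_dim9) < T_CRITICAL

For the 351-value sequences used here, this produces 351 - 30 + 1 = 322 T scores.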



RESULTS


After obtaining T values for multiple dimensions, it is possible to compare the different graphs visually and select a dimension that most accurately represents the underlying event, the seizure occurrence. Graphs of dimensions whose T values drop below the critical level are desired; these exhibit closeness between data sets. Graphs whose T values never drop below the critical point lead to the conclusion that the data sets are not significantly close to each other. Upon inspection of Table 2, we see that higher dimensions such as 9 and 10 tend to have T values that drop below the critical level, indicating a strong entrainment of the EEG signal during the periods when the T value is below the critical level. In the graphs of lower dimensions, the T values never drop below the critical value, so we cannot say that the data sets of entropy are significantly close together. Also, from the visual data we see that electrode 1 alone exhibits closeness of data across the different dimensions, whereas electrodes 7 and 13 never have T values that drop below the critical level.


[Table 2: plots of T value vs. time for dimension differences (9-8, 10-9) at electrodes 1, 7, and 13]

Table 2. The two graphs in the middle of the table represent entrainment between dimensions 9-8 and 10-9, respectively; both of these plots are taken from electrode 1. Above them are electrode 1 plots for different dimensions, and below them are plots of other electrodes for the same dimensions. This demonstrates that only these specific dimensions and electrodes are significant.


CONCLUSIONS


The primary goal of this project is to determine whether or not the empirical measure of entropy can be used to describe our nonlinear system, the EEG signal. From the data presented above, it is evident that, for certain dimensions and certain electrodes, the entropy of the system is descriptive of what we already recognize as true: that there is entrainment among electrodes in the EEG signal. With our time series analysis, entropy was determined for many different dimensions. These entropy values were compared to each other, and a T-test was performed to determine the closeness of the data. On visual inspection, we see that for dimension differences 10-9 and 9-8 the entropy sets can be considered significantly close; we cannot statistically differentiate between the two data sets. This leads to the overall conclusion that entropy, like the Lyapunov exponent, can be used to measure the chaoticity of our nonlinear system, the EEG signal.


REFERENCES


1. Iasemidis, L. D., Pardalos, P. M., Sackellares, J. C., et al. (2002): Seizure Warning Algorithm Based on Spatiotemporal Dynamics of Intracranial EEG. In: Mathematical Programming.

2. Iasemidis, L. D., Principe, J. C., Sackellares, J. C. (1999): Measurement and Quantification of Spatiotemporal Dynamics of Human Epileptic Seizures. In: Nonlinear Signal Processing in Medicine.

3. Iasemidis, L. D., Sackellares, J. C. (1996): Chaos Theory and Epilepsy. In: The Neuroscientist.

