# The Dynamics of EEG Entropy Page: 2

This **article** is part of the collection entitled UNT Scholarly Works and was provided to the Digital Library by the UNT College of Arts and Sciences.

#### Extracted Text

The following text was automatically extracted from the image on this page using optical character recognition software:

X_k(t) is constructed from the differenced data points ξ_j in the following way

$$X_k(t) = \sum_{j=k}^{k+t-1} \xi_j, \qquad k = 1, 2, \ldots, N - t + 1 \tag{1}$$

to obtain M = N − t + 1 replicas of a stochastic trajectory by means of overlapping windows of length t. This ensemble

of trajectories, generated by the EEG time series, is used to construct the histogram using the number of trajectories

falling in a specified interval to estimate the pdf p(x, t). The pdf is then used to calculate the information entropy, a

quantity introduced in discrete form for coding information by Shannon [14] and in continuous form for studying the

problem of noise and messages in electrical filters by Wiener [15]. We use the latter form here,

$$S(t) = -\int p(x, t)\, \log_2 p(x, t)\, dx. \tag{2}$$
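Putting Eqs. (1) and (2) together, the diffusion-entropy estimate can be sketched in a few lines of code. This is a minimal sketch, not the authors' implementation; the function name, bin count and histogram-based pdf estimate are our own choices:

```python
import numpy as np

def diffusion_entropy(xi, t, bins=60):
    """Diffusion entropy S(t) of a differenced series xi (Eqs. 1-2).

    Builds the M = N - t + 1 overlapping-window trajectories X_k(t),
    histograms them to estimate the pdf p(x, t), and evaluates
    S(t) = -sum p * log2(p) * dx on the histogram grid.
    """
    # cumulative sums let every window sum be a difference of two terms
    c = np.concatenate(([0.0], np.cumsum(xi)))
    X = c[t:] - c[:-t]                       # all overlapping window sums X_k(t)
    counts, edges = np.histogram(X, bins=bins, density=True)
    dx = np.diff(edges)
    mask = counts > 0                        # empty bins contribute 0 * log 0 = 0
    return -(counts[mask] * np.log2(counts[mask]) * dx[mask]).sum()
```

For uncorrelated Gaussian increments the ensemble X_k(t) is Gaussian with standard deviation √t, so S(t) increases by exactly 0.5 bits each time t is quadrupled by one power of two in σ, i.e. by 1 bit when t grows fourfold; this makes a useful sanity check for the estimator.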

The entropy S(t) of Eq. (2) assumes a simple analytical form if the pdf of the diffusion process satisfies the scaling relation:

$$p(x, t) = \frac{1}{\sigma(t)}\, F\!\left(\frac{x}{\sigma(t)}\right) \;\Rightarrow\; S(t) = C + \log_2 \sigma(t) \tag{3}$$

where $C = -\int F(y) \log_2 F(y)\, dy$ is a constant and σ(t) for a Gaussian diffusion process is the time-dependent standard deviation. More generally, an α-stable Lévy process also scales in this way, in which case σ(t) is more general than the standard deviation of the underlying process.

If the time series were scaling, as assumed in a number of analyses of EEG data [2], the 'variance' would be σ(t) ∝ t^δ and the entropy graphed versus time on log-linear graph paper would increase linearly with slope δ. Consequently, the way in which the entropy for a time series scales is indicative of the scaling behavior of the time series. Note that in a simple diffusive process this index is equal to the one obtained from the second moment, that is, δ = H. However, in general, even when there is scaling, δ ≠ H, and in the case of EEG time series we establish that there is no scaling at all.
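The Gaussian case of Eq. (3) is easy to verify numerically: rescaling the width σ of the pdf shifts the entropy by exactly log2 σ while the constant C = 0.5 log2(2πe) is untouched. A self-contained check follows; the grid size and integration range are arbitrary choices:

```python
import numpy as np

def gaussian_entropy(sigma):
    """S = -integral of p * log2(p) dx for a Gaussian pdf of width sigma,
    evaluated by a simple Riemann sum on a wide uniform grid."""
    x = np.linspace(-12 * sigma, 12 * sigma, 200001)
    dx = x[1] - x[0]
    p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return float(-(p * np.log2(p)).sum() * dx)
```

Quadrupling σ from 1 to 4 shifts the entropy by log2(4) = 2 bits, exactly as the scaling relation predicts, regardless of the value of C.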

We now consider EEG signals of 20 awake individuals in the absence of external stimulation (quiet, closed eyes). EEG signals were recorded using the 10-20 international recording scheme. For 8 individuals only the channels O1, O2, C3 and C4 were recorded; for the remaining 12 individuals all the channels are available. To have a coherent database, we restrict our analysis to the channels O1, O2, C3 and C4, which are the channels traditionally used in sleep studies. The sampling frequency of all EEG records is 250 Hz. Durations of EEG records vary from 55 s to 400 s, with an average duration of 128 s.

Fig. 1 shows the DE of the EEG increments for the somnographic channels O1, O2, C3 and C4 of a single individual. We see how for each channel the DE: 1) reaches a saturation level, 2) has an "alpha" (~7.6 Hz in the case of this individual) modulation which is attenuated with time, and 3) has a small-amplitude residual asymptotic modulation. The early-time modulation, with variable frequency in the alpha range and variable amplitude, is observed in the somnographic channels for all subjects. The saturation effect is present in all channels for all subjects, and it should be pointed out that this saturation is neither a consequence of the finite length of the time series, nor of the finite amplitude of the EEG signal. In fact, if the data points are randomly rearranged, thereby destroying any long-time correlation in the time series, the EEG entropy does not saturate. Consequently, this saturation effect is due to brain dynamics and is not an artifact of the data processing. Robinson [16] observed this saturation in the calculation of the EEG second moment and interpreted it as being due to dendritic filtering. The inset in Fig. 1 depicts the pdfs p_sat(x), after the entropy saturation is attained. These distributions have approximately exponential tails.
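The shuffling argument can be reproduced on a stand-in signal. Below, an Ornstein-Uhlenbeck series plays the role of the correlated, bounded EEG record: the window sums of its increments telescope to x(k+t) − x(k), whose variance saturates, while shuffling the increments destroys the correlations and restores unbounded 0.5 log2 t growth. This is a sketch with made-up parameters, not the authors' data or code:

```python
import numpy as np

def entropy_of_windows(xi, t, bins=60):
    """Histogram estimate of S(t) from overlapping window sums (Eqs. 1-2)."""
    c = np.concatenate(([0.0], np.cumsum(xi)))
    X = c[t:] - c[:-t]
    counts, edges = np.histogram(X, bins=bins, density=True)
    dx = np.diff(edges)
    mask = counts > 0
    return -(counts[mask] * np.log2(counts[mask]) * dx[mask]).sum()

rng = np.random.default_rng(1)

# Stand-in for an EEG record: a stationary Ornstein-Uhlenbeck series.
n, lam = 200_000, 0.05
noise = rng.standard_normal(n)
x = np.zeros(n)
for i in range(1, n):
    x[i] = (1.0 - lam) * x[i - 1] + noise[i]

xi = np.diff(x)                    # differenced data, as in Eq. (1)
xi_shuf = rng.permutation(xi)      # surrogate: long-time correlations gone

for t in (10, 100, 400):
    print(t, round(entropy_of_windows(xi, t), 2),
             round(entropy_of_windows(xi_shuf, t), 2))
```

The entropy of the original series levels off between t = 100 and t = 400, while the shuffled surrogate keeps climbing by 0.5 bits per doubling of t, mirroring the saturation test described above.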

## III. EEG MODEL

The simplest dynamic model, which includes fluctuations, modulation and dissipation, in short, all the properties displayed in Fig. 1, has the form of a Langevin equation. We assume a dissipative linear dynamic process X(t), i.e., an Ornstein-Uhlenbeck process, with a periodic driver having a random amplitude and frequency and an additive random force η(t) which is a delta-correlated Gaussian process of strength D:

$$\frac{dX(t)}{dt} = -\Lambda X(t) + \eta(t) + \sum_{j=0} A_j\, \chi[I_{j,s}](t)\, \sin\!\left(2\pi f_j t\right) \tag{4}$$

The coefficient Λ > 0 defines a negative feedback; χ[I_{j,s}](t) = 1 when its argument is in the time interval I_{j,s} = [j t_s, (j + 1) t_s] and is zero otherwise; and t_s is the 'stability' time after which a new frequency f_j and a new amplitude A_j are selected.
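A minimal numerical realization of Eq. (4) can be written with an Euler-Maruyama step. All parameter values below (Λ, D, t_s, and the amplitude and frequency ranges) are illustrative guesses, not the values used in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

fs = 250.0                 # sampling rate of the EEG records (Hz)
dt = 1.0 / fs
lam = 20.0                 # dissipation Lambda (1/s), illustrative
D = 1.0                    # strength of the delta-correlated force eta(t)
t_s = 2.0                  # 'stability' time between (A_j, f_j) redraws (s)
n = int(60.0 / dt)         # 60 s of simulated signal
steps_per_interval = int(t_s * fs)

x = np.zeros(n)
A_j = f_j = 0.0
for i in range(1, n):
    if (i - 1) % steps_per_interval == 0:
        # entering a new stability interval I_{j,s}: redraw A_j and f_j
        A_j = rng.uniform(0.5, 1.5)
        f_j = rng.uniform(8.0, 12.0)     # alpha-band frequencies
    drive = A_j * np.sin(2.0 * np.pi * f_j * i * dt)
    noise = np.sqrt(2.0 * D * dt) * rng.standard_normal()
    x[i] = x[i - 1] + (-lam * x[i - 1] + drive) * dt + noise
```

The negative feedback keeps X(t) bounded, which is what produces the entropy saturation of Fig. 1 when the increments of such a signal are fed to the diffusion-entropy estimator.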


## Citing and Sharing


Ignaccolo, Massimiliano; Latka, Miroslaw; Jernajczyk, Wojciech; Grigolini, Paolo & West, Bruce J. The Dynamics of EEG Entropy, article, March 5, 2009; [Berlin, Germany]. (digital.library.unt.edu/ark:/67531/metadc132967/m1/2/: accessed May 28, 2017), University of North Texas Libraries, Digital Library, digital.library.unt.edu; crediting UNT College of Arts and Sciences.