Entropy is a concept equally applicable to deterministic and stochastic processes. An entropy S is defined in natural time, which exhibits positivity, concavity and Lesche's (experimental) stability. The entropy S− deduced from analyzing in natural time the time series obtained upon time reversal is, in general, different from S; thus the entropy in natural time satisfies the condition to be "causal" (while the variance κ₁ = ⟨χ²⟩ − ⟨χ⟩² does not). The physical meaning of the change ΔS ≡ S − S− of the entropy in natural time under time reversal, which is of profound importance for the study of the dynamical evolution of a complex system, is discussed. For a fractional Brownian motion time series with self-similarity exponent H close to unity, as well as for an on–off intermittency model when the critical value is approached from below, both S and S− are smaller than the entropy Sᵤ ≈ 0.0966 of a "uniform" distribution. When a (natural) time window of length l slides through a time series, the entropy S exhibits fluctuations, a measure of which is the standard deviation δS. Complexity measures are introduced that quantify the variability of δS upon changing the length scale l, as well as the extent to which δS is affected when the consecutive events are shuffled randomly (for l = const.). In a similar fashion, complexity measures can be defined for the fluctuations of ΔS, whose standard deviation is designated σ[ΔS]. For the case that the Qₖ are independent and identically distributed positive random variables, as with randomly shuffled data, their σ/μ value is interrelated with δS and σ[ΔS].
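The quantities discussed above can be illustrated with a minimal numerical sketch. It assumes the standard natural-time definitions from the natural-time literature, which are not stated explicitly here: for N events with energies Qₖ, the natural time is χₖ = k/N, the weights are pₖ = Qₖ/ΣQₙ, and the entropy is S = ⟨χ ln χ⟩ − ⟨χ⟩ ln⟨χ⟩, where ⟨f(χ)⟩ = Σ pₖ f(χₖ); S− applies the same formula to the time-reversed series.

```python
import numpy as np

def natural_time_entropy(Q):
    """Entropy in natural time: S = <chi ln chi> - <chi> ln<chi>,
    with chi_k = k/N and averages weighted by p_k = Q_k / sum(Q).
    (Standard natural-time definition; an assumption of this sketch.)"""
    Q = np.asarray(Q, dtype=float)
    N = len(Q)
    chi = np.arange(1, N + 1) / N          # natural time chi_k = k/N
    p = Q / Q.sum()                        # normalized energies p_k
    chi_mean = np.sum(p * chi)             # <chi>
    return np.sum(p * chi * np.log(chi)) - chi_mean * np.log(chi_mean)

def delta_S(Q):
    """Change of the entropy under time reversal, Delta S = S - S_minus,
    where S_minus analyzes the time-reversed series in natural time."""
    Q = np.asarray(Q, dtype=float)
    return natural_time_entropy(Q) - natural_time_entropy(Q[::-1])

# For a "uniform" distribution (all Q_k equal), S tends to
# S_u = ln(2)/2 - 1/4, close to the value 0.0966 quoted above,
# and Delta S vanishes since the reversed series is identical.
S_uniform = natural_time_entropy(np.ones(10000))
```

For equal Qₖ the averages reduce to Riemann sums, so S converges to ∫₀¹ x ln x dx − (1/2) ln(1/2) = ln 2/2 − 1/4 ≈ 0.0966 as N grows, which is the Sᵤ value quoted in the text.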