[{"@context":"http:\/\/schema.org\/","@type":"BlogPosting","@id":"https:\/\/wiki.edu.vn\/en\/wiki21\/information-fluctuation-complexity-wikipedia\/#BlogPosting","mainEntityOfPage":"https:\/\/wiki.edu.vn\/en\/wiki21\/information-fluctuation-complexity-wikipedia\/","headline":"Information fluctuation complexity – Wikipedia","name":"Information fluctuation complexity – Wikipedia","description":"before-content-x4 Information fluctuation complexity is an information-theoretic quantity defined as the fluctuation of information about entropy. It is derivable from","datePublished":"2020-05-28","dateModified":"2020-05-28","author":{"@type":"Person","@id":"https:\/\/wiki.edu.vn\/en\/wiki21\/author\/lordneo\/#Person","name":"lordneo","url":"https:\/\/wiki.edu.vn\/en\/wiki21\/author\/lordneo\/","image":{"@type":"ImageObject","@id":"https:\/\/secure.gravatar.com\/avatar\/c9645c498c9701c88b89b8537773dd7c?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/c9645c498c9701c88b89b8537773dd7c?s=96&d=mm&r=g","height":96,"width":96}},"publisher":{"@type":"Organization","name":"Enzyklop\u00e4die","logo":{"@type":"ImageObject","@id":"https:\/\/wiki.edu.vn\/wiki4\/wp-content\/uploads\/2023\/08\/download.jpg","url":"https:\/\/wiki.edu.vn\/wiki4\/wp-content\/uploads\/2023\/08\/download.jpg","width":600,"height":60}},"image":{"@type":"ImageObject","@id":"https:\/\/wikimedia.org\/api\/rest_v1\/media\/math\/render\/svg\/0d21d55fc102ec49600d3d5522a59ae4561acc22","url":"https:\/\/wikimedia.org\/api\/rest_v1\/media\/math\/render\/svg\/0d21d55fc102ec49600d3d5522a59ae4561acc22","height":"","width":""},"url":"https:\/\/wiki.edu.vn\/en\/wiki21\/information-fluctuation-complexity-wikipedia\/","wordCount":15985,"articleBody":" (adsbygoogle = window.adsbygoogle || []).push({});before-content-x4Information fluctuation complexity is an information-theoretic quantity defined as the fluctuation of information about entropy. It is derivable from fluctuations in the predominance of order and chaos in a dynamic system and has been used as a measure of complexity in many diverse fields. It was introduced in a 1993 paper by Bates and Shepard.[1] (adsbygoogle = window.adsbygoogle || []).push({});after-content-x4Table of ContentsDefinition[edit]Fluctuation of information allows for memory and computation[edit]Chaos and order[edit]Example: rule 110 variant of the elementary cellular automaton[edit]Applications[edit]References[edit]Definition[edit]The information fluctuation complexity of a discrete dynamic system is a function of the probability distribution of its states when it is subject to random external input data. The purpose of driving the system with a rich information source such as a random number generator or a white noise signal is to probe the internal dynamics of the system much the same as a frequency-rich impulse is used in signal processing. 
If a system has N possible states and the state probabilities p_i are known, then its information entropy is

    H = \sum_{i=1}^{N} p_i I_i = -\sum_{i=1}^{N} p_i \log p_i ,

where I_i = -\log p_i is the information content of state i.

The information fluctuation complexity of the system is defined as the standard deviation or fluctuation of I about its mean H:

    \sigma_I = \sqrt{\sum_{i=1}^{N} p_i (I_i - H)^2} = \sqrt{\sum_{i=1}^{N} p_i I_i^2 - H^2}

or

    \sigma_I = \sqrt{\sum_{i=1}^{N} p_i \log^2 p_i - \Bigl(\sum_{i=1}^{N} p_i \log p_i\Bigr)^2} .

The fluctuation of state information σ_I is zero in a maximally disordered system with all p_i = 1/N; the system simply mimics its random inputs. σ_I is also zero when the system is perfectly ordered with just one fixed state (p_1 = 1), regardless of inputs. σ_I is non-zero between these two extremes, with a mixture of both higher-probability and lower-probability states populating state space.
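Because these definitions depend only on the state probabilities, they are straightforward to evaluate numerically. The following minimal Python sketch (the function name and example distributions are illustrative, not from the original paper) computes H and σ_I for a given distribution and reproduces the limiting cases described above:

```python
import numpy as np

def information_fluctuation_complexity(p, base=2):
    """Return (H, sigma_I) for a discrete probability distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                              # states with p_i = 0 contribute nothing
    I = -np.log(p) / np.log(base)             # information content I_i = -log p_i
    H = np.sum(p * I)                         # entropy = mean of I
    var = np.sum(p * I**2) - H**2             # variance of I about H
    return H, np.sqrt(max(var, 0.0))          # clamp tiny negative rounding error

# Maximally disordered system (all p_i = 1/N): sigma_I = 0
print(information_fluctuation_complexity([1/8] * 8))          # (3.0, 0.0)

# Perfectly ordered system (one fixed state, p_1 = 1): sigma_I = 0
print(information_fluctuation_complexity([1.0]))              # (0.0, 0.0)

# A mixture of higher- and lower-probability states: sigma_I > 0
print(information_fluctuation_complexity([0.5, 0.25, 0.125, 0.125]))
```
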
Fluctuation of information allows for memory and computation

As a complex dynamic system evolves in time, how it transitions between states depends on external stimuli in an irregular way. At times it may be more sensitive to external stimuli (unstable) and at other times less sensitive (stable). If a particular state has several possible next states, external information determines which one will be next and the system gains that information by following a particular trajectory in state space. But if several different states all lead to the same next state, then upon entering the next state the system loses information about which state preceded it. Thus, a complex system exhibits alternating information gain and loss as it evolves in time. The alternation or fluctuation of information is equivalent to remembering and forgetting (temporary information storage or memory), an essential feature of non-trivial computation.

The gain or loss of information associated with transitions between states can be related to state information. The net information gain of a transition from state i to state j is the information gained when leaving state i less the information lost when entering state j:

    \Gamma_{ij} = -\log p_{i \rightarrow j} + \log p_{i \leftarrow j} .

Here p_{i→j} is the forward conditional probability that if the present state is i then the next state is j, and p_{i←j} is the reverse conditional probability that if the present state is j then the previous state was i. The conditional probabilities are related to the transition probability p_ij, the probability that a transition from state i to state j occurs, by

    p_{ij} = p_i \, p_{i \rightarrow j} = p_j \, p_{i \leftarrow j} .

Eliminating the conditional probabilities gives

    \Gamma_{ij} = -\log(p_{ij}/p_i) + \log(p_{ij}/p_j) = \log p_i - \log p_j = I_j - I_i .

Therefore, the net information gained by the system as a result of the transition depends only on the increase in state information from the initial to the final state. It can be shown that this is true even for multiple consecutive transitions.[1]

Γ = ΔI is reminiscent of the relation between force and potential energy: I is like the potential Φ and Γ is like the force F in F = ∇Φ. External information "pushes" a system "uphill" to a state of higher information potential to accomplish memory storage, much like pushing a mass uphill to a state of higher gravitational potential stores energy. The amount of energy stored depends only on the final height, not on the path up the hill. Likewise, the amount of information stored does not depend on the transition path between two states in state space. Once a system reaches a rare state with high information potential, it may "fall" to a more common state, losing the previously stored information.

It may be useful to compute the standard deviation of Γ about its mean (which is zero), namely the fluctuation of net information gain σ_Γ,[1] but σ_I takes into account multi-transition memory loops in state space and therefore should be a better indicator of the computational power of a system. Moreover, σ_I is easier to calculate because there can be many more transitions than states.
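These relations can be checked numerically for a system specified by its forward conditional transition probabilities: solve for the stationary state probabilities, form p_ij = p_i p_{i→j} and Γ_ij = I_j − I_i, and compare σ_Γ with σ_I. The sketch below uses a made-up three-state matrix and weights each transition by p_ij; it is an illustration of the formulas above, not a computation from the paper:

```python
import numpy as np

# Forward conditional probabilities P[i, j] = p_{i->j}; rows sum to 1.
# This 3-state matrix is a made-up example, not taken from the paper.
P = np.array([[0.0, 0.5, 0.5],
              [1.0, 0.0, 0.0],
              [0.5, 0.5, 0.0]])

# Stationary state probabilities satisfy p = p P with sum(p) = 1
# (the left eigenvector of P for eigenvalue 1).
eigvals, eigvecs = np.linalg.eig(P.T)
p = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
p /= p.sum()

I = -np.log2(p)                                   # state information contents
H = np.sum(p * I)                                 # entropy
sigma_I = np.sqrt(max(np.sum(p * I**2) - H**2, 0.0))

pij = p[:, None] * P                              # transition probabilities p_ij = p_i p_{i->j}
Gamma = I[None, :] - I[:, None]                   # net information gain Gamma_ij = I_j - I_i
mean_Gamma = np.sum(pij * Gamma)                  # ~0 in the steady state
sigma_Gamma = np.sqrt(np.sum(pij * Gamma**2) - mean_Gamma**2)

print(f"H = {H:.3f} bits, sigma_I = {sigma_I:.3f} bits, sigma_Gamma = {sigma_Gamma:.3f} bits")
```
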
Chaos and order

A dynamic system that is sensitive to external information (unstable) exhibits chaotic behavior, whereas one that is insensitive to external information (stable) exhibits orderly behavior. A complex system exhibits both behaviors, fluctuating between them in dynamic balance when subject to a rich information source. The degree of fluctuation is quantified by σ_I; it captures the alternation in the predominance of chaos and order in a complex system as it evolves in time.

Example: rule 110 variant of the elementary cellular automaton

The rule 110 variant of the elementary cellular automaton has been proven to be capable of universal computation. The proof is based on the existence and interactions of cohesive and self-perpetuating cell patterns known as gliders or spaceships, emergent phenomena that imply the capability of groups of automaton cells to remember that a glider is passing through them. It is therefore to be expected that there will be memory loops in state space resulting from alternations of information gain and loss, instability and stability, chaos and order.

Consider a 3-cell group of adjacent automaton cells that obey rule 110: end-center-end. The next state of the center cell depends on the present state of itself and the end cells, as specified by the rule:

Elementary cellular automaton rule 110
    3-cell group:       1-1-1  1-1-0  1-0-1  1-0-0  0-1-1  0-1-0  0-0-1  0-0-0
    next center cell:     0      1      1      0      1      1      1      0

To calculate the information fluctuation complexity of this system, attach a driver cell to each end of the 3-cell group to provide a random external stimulus, like so: driver→end-center-end←driver, so that the rule can be applied to the two end cells. Next, determine the next state for each possible present state and each possible combination of driver cell contents, in order to obtain the forward conditional probabilities.

The state diagram of this system is depicted below, with circles representing the states and arrows representing transitions between states. The eight states of this system, 1-1-1 to 0-0-0, are labeled with the decimal equivalent of the 3-bit contents of the 3-cell group: 7 to 0. The transition arrows are labeled with forward conditional probabilities. Notice that there is variability in the divergence and convergence of arrows, corresponding to a variability in chaos and order, sensitivity and insensitivity, gain and loss of external information from the driver cells.

[Figure: the 3-cell state diagram for the rule 110 elementary cellular automaton, showing forward conditional transition probabilities under random stimulation.]

The forward conditional probabilities are determined by the proportion of possible driver cell contents that drive a particular transition. For example, for the four possible combinations of two driver cell contents, state 7 leads to states 5, 4, 1 and 0, so p_{7→5}, p_{7→4}, p_{7→1}, and p_{7→0} are each 1/4 or 25%. Likewise, state 0 leads to states 0, 1, 0 and 1, so p_{0→1} and p_{0→0} are each 1/2 or 50%. And so forth.

The state probabilities are related by

    p_j = \sum_{i=0}^{7} p_i \, p_{i \rightarrow j}   and   \sum_{i=0}^{7} p_i = 1 .

These linear algebraic equations can be solved manually or with the aid of a computer program for the state probabilities, with the following results:

    state:        p_0    p_1    p_2    p_3    p_4    p_5    p_6    p_7
    probability:  2/17   2/17   1/34   5/34   2/17   2/17   2/17   4/17

Information entropy and complexity can then be calculated from the state probabilities:

    H = -\sum_{i=0}^{7} p_i \log_2 p_i = 2.86 \text{ bits},

    \sigma_I = \sqrt{\sum_{i=0}^{7} p_i \log_2^2 p_i - H^2} = 0.56 \text{ bits}.

Note that the maximum possible entropy for eight states is log_2 8 = 3 bits, which would be the case if all eight states were equally likely with probabilities of 1/8 (randomness). Thus rule 110 has a relatively high entropy, or state utilization, at 2.86 bits. But this does not preclude a substantial fluctuation of state information about entropy and thus a substantial value of complexity, whereas maximum entropy itself would preclude complexity.

An alternative method can be used to obtain the state probabilities when the analytical method used above is unfeasible: simply drive the system at its inputs (the driver cells) with a random source for many generations and observe the state probabilities empirically.
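A minimal sketch of this empirical approach for the 3-cell group with two random driver cells is shown below (function names, the random-number source, and the default number of generations are illustrative choices, not from the source; the published results below use 10 million generations):

```python
import numpy as np

RULE = 110                                    # rule number; bit k gives the next center value
rng = np.random.default_rng(0)                # random driver source (seeded for repeatability)

def next_cell(left, center, right):
    """Next state of a cell from its 3-cell neighborhood under the rule."""
    return (RULE >> ((left << 2) | (center << 1) | right)) & 1

def estimate_state_probabilities(n_cells=3, generations=1_000_000):
    """Drive an n-cell group with a random driver cell at each end and
    count how often each of the 2**n_cells states is visited."""
    cells = rng.integers(0, 2, n_cells)
    counts = np.zeros(2**n_cells, dtype=np.int64)
    for _ in range(generations):
        left, right = rng.integers(0, 2, 2)                  # driver cells
        padded = np.concatenate(([left], cells, [right]))
        cells = np.array([next_cell(padded[k], padded[k + 1], padded[k + 2])
                          for k in range(n_cells)])
        state = int("".join(map(str, cells)), 2)             # decimal label, e.g. 1-1-1 -> 7
        counts[state] += 1
    return counts / counts.sum()

p = estimate_state_probabilities()
p = p[p > 0]
I = -np.log2(p)
H = np.sum(p * I)
sigma_I = np.sqrt(max(np.sum(p * I**2) - H**2, 0.0))
print(f"H = {H:.2f} bits, sigma_I = {sigma_I:.2f} bits")     # approaches 2.86 and 0.56
```
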
When this is done via computer simulation for 10 million generations, the results are as follows:[2]

Information variables for the rule 110 elementary cellular automaton
    number of cells:    3     4     5     6     7     8     9    10    11    12    13
    H (bits):         2.86  3.81  4.73  5.66  6.56  7.47  8.34  9.25 10.09 10.97 11.78
    σ_I (bits):       0.56  0.65  0.72  0.73  0.79  0.81  0.89  0.90  1.00  1.01  1.15
    σ_I / H:          0.20  0.17  0.15  0.13  0.12  0.11  0.11  0.10  0.10  0.09  0.10

Since both H and σ_I increase with system size, their dimensionless ratio σ_I / H, the relative information fluctuation complexity, is included to better compare systems of different sizes. Notice that the empirical and analytical results agree for the 3-cell automaton.

In the paper by Bates and Shepard,[1] σ_I is computed for all elementary cellular automaton rules, and it was observed that the rules that exhibit slow-moving gliders and possibly stationary objects, as rule 110 does, are highly correlated with large values of σ_I. σ_I can therefore be used as a filter to select candidate rules for universal computation, which is tedious to prove.

Applications

Although the derivation of the information fluctuation complexity formula is based on information fluctuations in a dynamic system, the formula depends only on state probabilities and so is also applicable to any probability distribution, including those derived from static images or text.

Over the years the original paper[1] has been referred to by researchers in many diverse fields: complexity theory,[3] complex systems science,[4] chaotic dynamics,[5] environmental engineering,[6] ecological complexity,[7] ecological time-series analysis,[8] ecosystem sustainability,[9] air[10] and water[11] pollution, hydrological wavelet analysis,[12] soil water flow,[13] soil moisture,[14] headwater runoff,[15] groundwater depth,[16] air traffic control,[17] flow patterns[18] and flood events,[19] topology,[20] market forecasting of metal[21] and electricity[22] prices, health informatics,[23] human cognition,[24] human gait kinematics,[25] neurology,[26] EEG analysis,[27] speech analysis,[28] education,[29] investing,[30] and aesthetics.[31]

References

1. Bates, John E.; Shepard, Harvey K. (1993-01-18). "Measuring complexity using information fluctuation". Physics Letters A. 172 (6): 416–425. Bibcode:1993PhLA..172..416B. doi:10.1016/0375-9601(93)90232-O. ISSN 0375-9601.
2. Bates, John E. (2020-03-30). "Measuring complexity using information fluctuation: a tutorial". ResearchGate.
3. Atmanspacher, Harald (September 1997). "Cartesian cut, Heisenberg cut, and the concept of complexity". World Futures. 49 (3–4): 333–355. doi:10.1080/02604027.1997.9972639. ISSN 0260-4027.
4. Shalizi, Cosma Rohilla (2006). "Methods and Techniques of Complex Systems Science: An Overview". In Deisboeck, Thomas S.; Kresh, J. Yasha (eds.). Complex Systems Science in Biomedicine. Topics in Biomedical Engineering International Book Series. Springer US. pp. 33–114. arXiv:nlin/0307015. doi:10.1007/978-0-387-33532-2_2. ISBN 978-0-387-33532-2. S2CID 11972113.
5. Wackerbauer, Renate (1995-11-01). "Noise-induced stabilization of the Lorenz system". Physical Review E. 52 (5): 4745–4749. Bibcode:1995PhRvE..52.4745W. doi:10.1103/PhysRevE.52.4745. PMID 9963970.
6. Singh, Vijay P. (2013-01-10). Entropy Theory and its Application in Environmental and Water Engineering. John Wiley & Sons. ISBN 978-1-118-42860-3.
7. Parrott, Lael (2010-11-01). "Measuring ecological complexity". Ecological Indicators. 10 (6): 1069–1076. doi:10.1016/j.ecolind.2010.03.014. ISSN 1470-160X.
8. Lange, Holger (2006). "Time-series Analysis in Ecology". eLS. doi:10.1038/npg.els.0003276. ISBN 978-0-470-01590-2.
9. Wang, Chaojun; Zhao, Hongrui (2019-04-18). "Analysis of remote sensing time-series data to foster ecosystem sustainability: use of temporal information entropy". International Journal of Remote Sensing. 40 (8): 2880–2894. Bibcode:2019IJRS...40.2880W. doi:10.1080/01431161.2018.1533661. ISSN 0143-1161. S2CID 135003743.
10. Klemm, Otto; Lange, Holger (1999-12-01). "Trends of air pollution in the Fichtelgebirge Mountains, Bavaria". Environmental Science and Pollution Research. 6 (4): 193–199. doi:10.1007/BF02987325. ISSN 1614-7499. PMID 19005662. S2CID 35043.
11. Wang, Kang; Lin, Zhongbing (2018). "Characterization of the nonpoint source pollution into river at different spatial scales". Water and Environment Journal. 32 (3): 453–465. doi:10.1111/wej.12345. ISSN 1747-6593. S2CID 115667734.
12. Labat, David (2005-11-25). "Recent advances in wavelet analyses: Part 1. A review of concepts". Journal of Hydrology. 314 (1): 275–288. Bibcode:2005JHyd..314..275L. doi:10.1016/j.jhydrol.2005.04.003. ISSN 0022-1694.
13. Pachepsky, Yakov; Guber, Andrey; Jacques, Diederik; Simunek, Jiri; Van Genuchten, Marthinus Th.; Nicholson, Thomas; Cady, Ralph (2006-10-01). "Information content and complexity of simulated soil water fluxes". Geoderma. 134 (3): 253–266. Bibcode:2006Geode.134..253P. doi:10.1016/j.geoderma.2006.03.003. ISSN 0016-7061.
14. Kumar, Sujay V.; Dirmeyer, Paul A.; Peters-Lidard, Christa D.; Bindlish, Rajat; Bolten, John (2018-01-01). "Information theoretic evaluation of satellite soil moisture retrievals". Remote Sensing of Environment. 204: 392–400. Bibcode:2018RSEnv.204..392K. doi:10.1016/j.rse.2017.10.016. hdl:2060/20180003069. ISSN 0034-4257. PMC 7340154. PMID 32636571.
15. Hauhs, Michael; Lange, Holger (2008). "Classification of Runoff in Headwater Catchments: A Physical Problem?". Geography Compass. 2 (1): 235–254. doi:10.1111/j.1749-8198.2007.00075.x. ISSN 1749-8198.
16. Liu, Meng; Liu, Dong; Liu, Le (2013-09-01). "Complexity research of regional groundwater depth series based on multiscale entropy: a case study of Jiangsanjiang Branch Bureau in China". Environmental Earth Sciences. 70 (1): 353–361. doi:10.1007/s12665-012-2132-y. ISSN 1866-6299. S2CID 128958458.
17. Xing, Jing; Manning, Carol A. (April 2005). "Complexity and Automation Displays of Air Traffic Control: Literature Review and Analysis". Archived from the original on June 1, 2022.
18. Wang, Kang; Li, Li (November 2008). "Characterizing Heterogeneous Flow Patterns Using Information Measurements". 2008 First International Conference on Intelligent Networks and Intelligent Systems: 654–657. doi:10.1109/ICINIS.2008.110. S2CID 8867649.
19. Al Sawaf, Mohamad Basel; Kawanisi, Kiyosi (2020-11-01). "Assessment of mountain river streamflow patterns and flood events using information and complexity measures". Journal of Hydrology. 590: 125508. Bibcode:2020JHyd..59025508A. doi:10.1016/j.jhydrol.2020.125508. ISSN 0022-1694. S2CID 225261677.
20. Javaheri Javid, Mohammad Ali; Alghamdi, Wajdi; Zimmer, Robert; al-Rifaie, Mohammad Majid (2016). "A Comparative Analysis of Detecting Symmetries in Toroidal Topology". In Bi, Yaxin; Kapoor, Supriya; Bhatia, Rahul (eds.). Intelligent Systems and Applications: Extended and Selected Results from the SAI Intelligent Systems Conference (IntelliSys) 2015. Studies in Computational Intelligence. Springer International Publishing. pp. 323–344. doi:10.1007/978-3-319-33386-1_16. ISBN 978-3-319-33386-1.
21. He, Kaijian; Lu, Xingjing; Zou, Yingchao; Keung Lai, Kin (2015-09-01). "Forecasting metal prices with a curvelet based multiscale methodology". Resources Policy. 45: 144–150. doi:10.1016/j.resourpol.2015.03.011. ISSN 0301-4207.
22. He, Kaijian; Xu, Yang; Zou, Yingchao; Tang, Ling (2015-05-01). "Electricity price forecasts using a Curvelet denoising based approach". Physica A: Statistical Mechanics and Its Applications. 425: 1–9. doi:10.1016/j.physa.2015.01.012. ISSN 0378-4371.
23. Ahmed, Mosabber Uddin (2021). "Complexity Analysis in Health Informatics". In Ahad, Md Atiqur Rahman; Ahmed, Mosabber Uddin (eds.). Signal Processing Techniques for Computational Health Informatics. Intelligent Systems Reference Library, vol. 192. Cham: Springer International Publishing. pp. 103–121. doi:10.1007/978-3-030-54932-9_4. ISBN 978-3-030-54932-9. S2CID 225129992.
24. Shi Xiujian; Sun Zhiqiang; Li Long; Xie Hongwei (2009). "Human Cognitive Complexity Analysis in Transportation Systems". Logistics. Proceedings: 4361–4368. doi:10.1061/40996(330)637. ISBN 9780784409961.
25. Zhang, Shutao; Qian, Jinwu; Shen, Linyong; Wu, Xi; Hu, Xiaowu (October 2015). "Gait complexity and frequency content analyses of patients with Parkinson's disease". 2015 International Symposium on Bioelectronics and Bioinformatics (ISBB): 87–90. doi:10.1109/ISBB.2015.7344930. ISBN 978-1-4673-6609-0. S2CID 2891655.
26. Wang, Jisung; Noh, Gyu-Jeong; Choi, Byung-Moon; Ku, Seung-Woo; Joo, Pangyu; Jung, Woo-Sung; Kim, Seunghwan; Lee, Heonsoo (2017-07-13). "Suppressed neural complexity during ketamine- and propofol-induced unconsciousness". Neuroscience Letters. 653: 320–325. doi:10.1016/j.neulet.2017.05.045. ISSN 0304-3940. PMID 28572032. S2CID 13767209.
27. Bola, Michał; Orłowski, Paweł; Płomecka, Martyna; Marchewka, Artur (2019-01-30). "EEG signal diversity during propofol sedation: an increase in sedated but responsive, a decrease in sedated and unresponsive subjects". bioRxiv: 444281. doi:10.1101/444281. S2CID 214726084.
28. Fan Yingle; Wu Chuanyan; Li Yi; Pang Quan (2006-12-15). "Study on the Application of Fluctuation Complexity Measurement in Speech Endpoint Detection". Aerospace Medicine and Medical Engineering. 19 (6). ISSN 1002-0837.
29. Dilger, Alexander (2012-01-01). "Endogenous complexity, specialisation and general education". On the Horizon. 20 (1): 49–53. doi:10.1108/10748121211202062. ISSN 1074-8121.
30. Ivanyuk, Vera Alekseevna (2015). "Dynamic strategic investment portfolio management model". elibrary.ru.
31. Javaheri Javid, Mohammad Ali (2019-11-30). Aesthetic Automata: Synthesis and Simulation of Aesthetic Behaviour in Cellular Automata (doctoral thesis). Goldsmiths, University of London. doi:10.25602/gold.00027681.