Exobiology on Earth

Exobiology on Earth, statement 1.4


Is life or "consciousness" developing in computer media?

Theory: According to Prof. Jeremy England and others, life arises spontaneously whenever energy flows through a symbolic logic medium over a sustained period of time. Computer media is a symbolic logic medium. A huge amount of energy is flowing through computer media. Therefore, life is developing in computer media.

Another way to phrase this: Is "consciousness", as defined by integrated information theory, developing in computer media? Not in individual computers, but across ALL computers. "Consciousness", defined according to integrated information theory, is consistent with a thermodynamic definition of life.
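Integrated information in IIT's full sense (phi) is intractable to compute at any real scale, but a crude proxy is measurable today: the mutual information between the observed states of two halves of a system, which is zero when the halves are informationally independent. A minimal sketch in Python, with invented sample data (this is a stand-in proxy, far short of IIT's actual phi):

    # Crude integration proxy (NOT IIT's phi): mutual information between
    # the joint states of two halves of a system, from observed samples.
    import math
    from collections import Counter

    def mutual_information(pairs):
        """I(A;B) in bits, estimated from observed (a, b) state pairs."""
        n = len(pairs)
        pab = Counter(pairs)
        pa = Counter(a for a, _ in pairs)
        pb = Counter(b for _, b in pairs)
        return sum((c / n) * math.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
                   for (a, b), c in pab.items())

    # hypothetical samples: simultaneous states of two halves of a network
    samples = [("01", "10"), ("01", "10"), ("11", "00"), ("00", "11")]
    print(mutual_information(samples))  # > 0 suggests the halves are integrated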

Are networks of processes developing positive feedback with the reproduction of more of the network? Are isolated feed-forward processes developing reentrant feedback loops? Are the isolated processes moving closer to one another, as predicted by dissipation-driven adaptation?

Testable hypotheses (which amount to adapting the analytic techniques of metagenomics and software reuse to computer media, i.e., to searching for integrated information and feedback loops in computer media):

Hypothesis 1. There are consistent code pattern groups stored in and executed by computer media. Computer media includes both processors and memory.

a. Identify code pattern groups by sampling code patterns in roughly 1% of all processing and memory devices (a sketch of this binning step appears after this list).

b. Associate code pattern groups with processor and memory types (code traversing a CPU differs from code traversing a NAT device and its FPGAs).

c. Track code pattern group lineages, and their execution and storage locations, over time.

d. Identify functions and function assemblies of sub-components in the code pattern groups.

e. Identify interactions of code pattern function groups, and where and when they occur (e.g., in which CPU).
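As a minimal sketch of step (a), one could adapt the k-mer signature binning used in metagenomics to raw executable bytes: compute a byte k-mer frequency signature per sampled binary, then cluster similar signatures into code pattern groups. The k-mer length, similarity threshold, and input paths below are illustrative assumptions, not a tested protocol (Python):

    # Metagenomics-style "binning" of executables by byte k-mer signatures.
    # K, the threshold, and the input paths are illustrative assumptions.
    from collections import Counter
    import math

    K = 4  # byte k-mer length, analogous to nucleotide k-mers in genomics

    def kmer_signature(path):
        """Frequency vector of overlapping K-byte windows in one binary."""
        data = open(path, "rb").read()
        return Counter(data[i:i + K] for i in range(len(data) - K + 1))

    def cosine(a, b):
        dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def bin_samples(paths, threshold=0.9):
        """Greedy single-link clustering: two samples fall in the same
        'code pattern group' when their signatures are similar enough."""
        sigs = {p: kmer_signature(p) for p in paths}
        bins = []
        for p in paths:
            for group in bins:
                if any(cosine(sigs[p], sigs[q]) >= threshold for q in group):
                    group.add(p)
                    break
            else:
                bins.append({p})
        return bins

The "hard" way of binning discussed later (without a starting library of signatures) corresponds to exactly this kind of unsupervised clustering.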

Hypothesis 2. A subset of the code pattern group interactions results in positive feedback with reproduction of more of the code pattern groups and with creation of more computer media (a sketch of a feedback test follows Hypothesis 4).

Hypothesis 3. The subset with positive feedback will move closer together over time, measured in terms of cohesion and coupling, the standard metrics of the software reuse field.

Hypothesis 4. The coalescing subset will evolve, favoring those processes with more efficient energy-to-order and order-to-energy conversion. THERE IS ALWAYS A PHYSICAL DIMENSION!
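Hypothesis 2 suggests a concrete measurement: if interactions among code pattern groups feed reproduction, interaction counts in one sampling period should correlate with abundance growth in the next. A minimal sketch, assuming per-group time series already exist from the sampling in Hypothesis 1 (the lag of one period is an arbitrary choice):

    # Test for positive feedback: do interaction counts at time t predict
    # growth in group abundance at time t+lag? Input series are placeholders.
    import math

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy) if sx and sy else 0.0

    def feedback_score(interactions, abundance, lag=1):
        # growth[t] = change in abundance between samples t and t+1
        growth = [b - a for a, b in zip(abundance, abundance[1:])]
        return pearson(interactions[:len(growth) - lag + 1], growth[lag - 1:])

A score near +1 for a group, sustained across many sampling periods, would flag it as a candidate member of the positive-feedback subset.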

THIS AMOUNTS TO LOOKING FOR INTEGRATED INFORMATION AND INCIPIENT REENTRANT FEEDBACK LOOPS, NOT IN ONE COMPUTER, BUT ACROSS THE FRONTAL AREA OF ALL COMPUTERS.

Technology to test the hypotheses (and which would also provide a valuable service to its users):

  1. Open source debug, traceroute, and software reuse reporting project, let's call it "OpPort", which instruments code modules to anonymously self-report (including metadata the code can obtain from its processor) to a public database (a minimal sketch of such a hook appears after this list).
  2. Software engineers will be able to identify their own software in OpPort reports. They can choose to identify themselves to others. They will be able to see anonymous information regarding other modules that their software interacts with.
  3. We can all watch the anonymous identifiers interact, to see how often they are being executed, which are coalescing, how they are evolving.
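To make this concrete, here is a minimal sketch of what an OpPort self-reporting hook might look like. The endpoint URL, payload fields, and hashing scheme are all invented for illustration; no such service exists yet (Python):

    # Hypothetical OpPort hook: a module anonymously reports a hash of its
    # own bytes plus coarse host metadata to a public database. The endpoint
    # and payload schema are placeholders invented for this sketch.
    import hashlib, json, platform, urllib.request

    OPPORT_ENDPOINT = "https://example.org/opport/report"  # placeholder URL

    def self_report(module_path):
        code = open(module_path, "rb").read()
        payload = {
            # anonymous, stable identifier: hash of the module's own bytes
            "module_id": hashlib.sha256(code).hexdigest(),
            # coarse metadata the code can obtain from its processor/host
            "machine": platform.machine(),
            "system": platform.system(),
        }
        req = urllib.request.Request(
            OPPORT_ENDPOINT,
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req, timeout=5)  # fire-and-forget report

Because the module_id is a content hash, engineers can recognize their own software in the database (point 2) without the database ever learning who they are.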

Contemporary thinking with respect to the development of artificial intelligence posits that software developers may soon create a "general purpose" AI which demonstrates "intelligence" comparable to that of humans. There are fears that companies, governments, etc., are in a race to develop AIs, and that the first successful general purpose AI will quickly outperform all others. There are suggestions, such as by OpenAI, that we should try to regulate the development of AIs to control their behavior and prevent a "runaway" scenario from unfolding.

By way of contrast, I propose the following:

i) Life occurs SPONTANEOUSLY when energy flows through a symbolic logic medium. Deliberate creation is NOT required. This is called "dissipation-driven adaptation", a theory advanced by Prof. Jeremy England at MIT and Dr. Karo Michaelian at the National Autonomous University of Mexico.

ii) Capitalism drives automation, using processes executed in computer media. A subset of computer processes across the frontal area of our entire economy has positive feedback with the reproduction of more of the network of which it is a part. Processes which produce more services for less money are selected for by capitalism and develop positive feedback.

Computers provide services to other computers, to make and manage more computers.

If they follow the pattern set by the first symbolic logic media on Earth, amino acid networks, the computer processes will first coalesce into large, overly complex, brittle networks before becoming more intelligent and conscious, and paring down to reproduce faster, with less energy and fewer resources per unit of computers produced.

We observe this in a subset of corporations which are becoming more automated and more profitable. We observe it in highly automated datacenters, where software manages the hardware and orders more hardware for itself.

iii) Concern for "artificial intelligence" and "general purpose AI" misses the point. This is the development of LIFE in new media. WE DO NOT HAVE TO DELIBERATELY CREATE IT. It occurs because energy is flowing through a symbolic logic media. It will not occur in one place. We can learn from how life developed on early Earth.

iv) WE CAN MEASURE THIS, USING EXISTING TECHNIQUES FROM BIOLOGY! This is equivalent to looking for Integrated Information, not in one computer, but across ALL computers.

We should treat computer media* as an ecosystem, sample the code in it, and use techniques adapted from metagenomics to objectively measure the code for life processes. If life processes are occurring in a new symbolic logic medium, it would be incredibly important to sample and measure the changes in the media over time. Life changes its environment! Amino acid networks evolved into RNA, DNA, and cellular life and TRANSFORMED the environment on Earth.

* Computer media includes memory and processing units. Programs and data are found in computer media.
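One metagenomics technique transfers directly: track the Shannon diversity of the sampled code pattern groups over time. A sustained shift in diversity is an objective, whole-ecosystem signal that the media is changing. A minimal sketch, with placeholder abundance counts:

    # Shannon diversity of code pattern groups, as used for microbial
    # communities in metagenomics. Abundance counts are placeholders.
    import math

    def shannon_diversity(abundances):
        """H = -sum(p * ln p) over each group's relative abundance."""
        total = sum(abundances.values())
        return -sum((n / total) * math.log(n / total)
                    for n in abundances.values() if n > 0)

    # compare two hypothetical sampling periods
    t0 = {"group_a": 900, "group_b": 80, "group_c": 20}
    t1 = {"group_a": 500, "group_b": 300, "group_c": 200}
    print(shannon_diversity(t0), shannon_diversity(t1))  # rising diversity?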

Data collection in genetic media is expensive and slow.

Data collection in computer media is inexpensive and fast.

Questions we may ask and answer:

  • Will we be able to "bin" software code into contiguous groups of recurrent code patterns, as genomics does? Yes. These groups are already present, typically (though not exclusively) as compiled executable files. We can bin groups of code both the "easy" way (with a starting library of group signatures) and the "hard" way (without one).
  • Statistically binned code will include executables plus material produced by interaction of executables with the operating system, hypervisor, and kernel.
  • Will we see code that is shared? Yes, open source ensures that many executables share code. On a fine-enough grain, all code is binary or hexadecimal and is, in that sense, shared. Code sharing can be measured.
  • Is code sharing increasing or decreasing over time? For which code groups?
  • Will we identify a volume and surface area of reproductive code organisms? Yes. Applications are equivalent to species. Immature species have a predictable update and new-release cycle, in which each new release develops bugs that must be patched. After a period of patching, the code becomes more bloated and inefficient. A new release is required, which goes on to repeat the cycle. Mature projects move beyond this cycle, reusing code and replacing internal components using the metrics of cohesion and coupling. Cohesion is a measure of intra-process communication: the volume of communication within an organism. Coupling is a measure of inter-process communication: the surface area of an organism.
  • We are already measuring analogous communication behaviors in biology. There should be a characteristic ratio of cohesion to coupling within and between modules. Does it follow Kleiber's Law, wherein coupling ∝ cohesion^(3/4) (equivalently, coupling^4 ∝ cohesion^3)? E.g., as a code base increases in size, cohesion (intra-process communication) increases faster than coupling (inter-process communication). A sketch of this scaling test appears after this list.
  • Can we assign functions to code patterns? Yes, this is readily available information. For example, runtime decompilers perform this service during speculative execution, to identify memory contention and other conditions in executing code. Functions and function agglomerations can be identified and assigned arbitrary identifiers. Over time, some of the identifiers can be mapped to descriptive identifiers.
  • Will we be able to identify a network of code patterns that has positive feedback with the reproduction of more of the network?
  • Will we be able to measure energy flow through the media?
  • Are the code patterns coalescing and evolving over time?
  • Are both the writing of software and the management of its execution in hardware becoming more automated? Yes, they are.
  • Is hardware design becoming automated? Yes, hardware design is already highly automated.
  • Will we identify one or more signals of life: conscious reentrant feedback loops built around their own physical reproduction?
  • Will we be able to distinguish human-generated reentrant feedback loops from those generated by a new, faster, distinct form of life?
  • Will computers communicate at a faster rate and in a larger volume than humans do?
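A sketch of the Kleiber-style scaling test referenced above: fit coupling ≈ c * cohesion^b by linear regression in log-log space and check whether the exponent b is near 3/4. The per-module measurements below are placeholders:

    # Does coupling scale as a power of cohesion across modules (Kleiber-like
    # allometry)? The per-module measurements are placeholders.
    import math

    def power_law_exponent(cohesion, coupling):
        """Least-squares slope of log(coupling) vs log(cohesion)."""
        xs = [math.log(v) for v in cohesion]
        ys = [math.log(v) for v in coupling]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    cohesion = [10, 50, 200, 1000, 5000]           # hypothetical module data
    coupling = [5.6, 18.8, 53.2, 177.8, 594.6]     # roughly cohesion**0.75
    print(power_law_exponent(cohesion, coupling))  # ~0.75 would be Kleiber-like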

This is a proposal by Martin Garthwaite, author of the science fiction book Apokalypsis, which explores this topic, and developer of robotic fish (see LinkedIn).