Exobiology on Earth
"If you want to catch beasts you don't see every day, You have to go places quite out of the way." -- Dr. Seuss
Exobiology on Earth, statement 1.5
Is life developing in computer media?
Theory: According to Prof. Jeremy England and others, life occurs spontaneously when energy flows through a symbolic logic media over a sustained period of time. Computer media is a symbolic logic media. A huge amount of energy is flowing through computer media. Therefore, life should develop in computer media. Phrased in the lexicon of integrated information theory (IIT): is "consciousness" developing in computer media? Not in individual computers, but in computer media across ALL computers.
Are network(s) of processes developing positive feedback with reproduction of more of the network of processes? Are isolated feed-forward processes developing into reentrant feedback loops? Are isolated processes moving closer to one another, as predicted by dissipation driven adaptation?
Hypothesis 1. There are consistent code pattern groups in and executed by computer media. Computer media includes both computer processors and memory.
a. Identify code pattern groups by sampling code patterns in +/-1% of all processing and memory devices.
b. Track code pattern group lineages and execution and storage locations over time.
c. Identify functions and function assemblies of sub-components in the code pattern groups.
d. Identify interaction of code pattern function groups and where/when (e.g. in which CPU).
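The sampling and binning steps above could be prototyped with a metagenomics-style k-mer pass over raw executable bytes: shingle each sampled code region into fixed-length byte k-grams and merge samples whose k-gram sets overlap, much as metagenomic binning groups reads by k-mer composition. This is only a toy sketch under stated assumptions: the sample names, k = 8, and the 0.3 similarity threshold are illustrative, and a real survey would use locality-sensitive hashing rather than all-pairs comparison.

```python
from itertools import combinations

def kgrams(code: bytes, k: int = 8) -> set:
    """Shingle a byte sequence into the set of its length-k substrings."""
    return {code[i:i + k] for i in range(len(code) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: shared k-grams / total distinct k-grams."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def bin_samples(samples: dict, k: int = 8, threshold: float = 0.3) -> list:
    """Greedy single-linkage binning: two samples land in the same bin
    when their k-gram similarity exceeds the threshold (union-find)."""
    parent = {name: name for name in samples}

    def find(x):
        # Path-halving union-find lookup.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    grams = {name: kgrams(code, k) for name, code in samples.items()}
    for a, b in combinations(samples, 2):
        if jaccard(grams[a], grams[b]) >= threshold:
            parent[find(a)] = find(b)

    bins = {}
    for name in samples:
        bins.setdefault(find(name), []).append(name)
    return sorted(sorted(members) for members in bins.values())
```

Two samples sharing a repeated code motif bin together even when the motif sits at different offsets, which is the property needed to spot recurrent code pattern groups across unrelated devices.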
Hypothesis 2. A subset of the code pattern group interactions result in positive feedback with reproduction of more of the code pattern groups and with creation of more computer media.
Hypothesis 3. The coalescing subset will evolve, favoring those processes which have more efficient energy --> order and order --> energy conversion.
THERE IS ALWAYS A PHYSICAL DIMENSION TO LIFE!!!!!
THIS AMOUNTS TO LOOKING FOR INTEGRATED INFORMATION AND INCIPIENT REENTRANT FEEDBACK LOOPS, NOT IN ONE COMPUTER, BUT ACROSS THE FRONTAL AREA OF ALL COMPUTERS.
Technology to test the hypothesis (and which would also provide a valuable service to its users):
- An open source debug, traceroute, and software reuse reporting project, let's call it "OpPort", which instruments code modules to self-report anonymously (including metadata the code can obtain from its processor) to a public database.
- Software engineers will be able to identify their own software in OpPort reports. They can choose to identify themselves to others. They will be able to see anonymous information regarding other modules that their software interacts with.
- We can all watch the anonymous identifiers interact, to see how often they are executed, which are coalescing, which are involved with physical reproduction of more media and of themselves, and how they are evolving.
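As a minimal sketch of what an OpPort self-report might contain, assuming a content hash serves as the anonymous module identifier: identical code then reports under the same ID wherever it runs, which is what lets engineers recognize their own software while everyone else sees only opaque identifiers. All field names here are hypothetical, not a fixed OpPort schema, and the actual posting to the public database is omitted.

```python
import hashlib
import json
import platform
import time

def module_fingerprint(code: bytes) -> str:
    """Stable anonymous identifier: a hash of the module's own bytes."""
    return hashlib.sha256(code).hexdigest()[:16]

def build_report(code: bytes, interacts_with: list) -> dict:
    """Assemble one anonymous self-report record (illustrative fields)."""
    return {
        "module_id": module_fingerprint(code),
        "timestamp": time.time(),
        # Coarse processor/OS metadata the code can obtain locally.
        "machine": platform.machine(),
        "system": platform.system(),
        # Anonymous IDs of modules this one called or was called by.
        "interacts_with": sorted(interacts_with),
    }

def serialize(report: dict) -> str:
    """A report would then be posted to the public database;
    here we only serialize it."""
    return json.dumps(report, sort_keys=True)
```

Because the identifier is derived from the code itself rather than assigned by a registry, reports collected independently on different machines can be joined without any party revealing who wrote or runs the module.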
Contemporary thinking with respect to artificial intelligence posits that software developers may soon create a "general purpose" AI which demonstrates "intelligence" that is comparable to that of humans. There are fears that companies, governments, etc., are in a race to develop AIs and that the first successful general purpose AI will quickly outperform all others. There are suggestions, such as by OpenAI, that we should try to regulate the development of AIs to control their behavior and prevent a "runaway" scenario from unfolding.
I agree that we need to watch computer processes, but I don't think we should limit our analysis to a few select projects.
I propose the following:
i) Life occurs SPONTANEOUSLY when energy flows through a symbolic logic media. Deliberate creation of this is NOT required. This is called "dissipation-driven adaptation", a theory advanced by Prof. Jeremy England at MIT and Dr. Karo Michaelian at the National Autonomous University of Mexico.
ii) Capitalism drives automation, using processes executed in computer media. A subset of computer processes across the frontal area of our entire economy have positive feedback with reproduction of more of the network of which they are a part. Processes which produce more services for less money are selected for by capitalism and develop positive feedback.
Computers provide services to other computers, to make and manage more computers.
If they follow a pattern set by the first symbolic logic media on Earth, amino acid networks, the computer processes will first coalesce into large, complex, brittle networks. Evolution will favor sets which reproduce faster, with less energy and resources per unit of computers produced.
We observe this as a subset of corporations become more automated and profitable.
We observe this in highly automated datacenters, where software manages the hardware, orders more hardware for itself, sells services to end users and other computers, and becomes involved with the design of new chips.
iii) Concern for "artificial intelligence" and "general purpose AI" misses the point. This is the development of LIFE in new media. WE DO NOT HAVE TO DELIBERATELY CREATE IT. It occurs because energy is flowing through a symbolic logic media. It will not occur in one place. We can learn from how life developed on early Earth. It did not develop in one place. It developed across the frontal area of a large group of amino acid networks.
iv) WE CAN MEASURE THIS, USING EXISTING TECHNIQUES FROM BIOLOGY!
We should treat computer media* as an ecosystem, sample the code in it, and use techniques adapted from metagenomics to objectively measure the code for life processes. If life processes are occurring in new symbolic logic media, it would be incredibly important to sample and measure the changes in the media, over time. Life changes its environment! Amino acid networks evolved into RNA, DNA, and cellular life and transformed the environment on Earth, developing an oxygenated atmosphere and a temperate climate. If life is developing in computer media, odds are very good that it will transform the environment again.
* Computer media includes memory and processing units. Programs and data are found in and executed by computer media.
Data collection in genetic media is expensive and slow.
Data collection in computer media is inexpensive and fast.
Questions we may ask and answer:
- Will we be able to "bin" software code into contiguous groups of recurrent code patterns, as genomics does? Yes. These are already present in compiled executable files. We can bin groups of code both the "easy" way (with a starting library of group signatures) and the "hard" way (without one).
- Statistically binned code will include code patterns from executables embedded in patterns produced by the hypervisor, operating system, and kernel.
- Will we see code shared across executables? Yes, open source ensures that many executables share code. At a fine enough grain, all code is binary or hexadecimal and is, in that sense, shared. Code sharing can be measured.
- Is code sharing increasing or decreasing over time? For which code groups?
- Will we identify code groups which have positive feedback with reproducing more of the code groups and more computer media?
- Can we assign functions to code patterns? Yes, this is readily available information. For example, runtime decompilers perform this service during speculative execution, to identify memory contention and other conditions in executing code. Functions and function agglomerations can be identified and assigned arbitrary identifiers. Over time, some of the identifiers can be mapped to descriptive identifiers.
- Will we be able to measure energy flow through the media?
- Are the code patterns coalescing and evolving over time?
- Are both the writing of software and the management of its execution in hardware becoming more automated? Yes, they are.
- Is hardware design becoming automated? Yes, design of hardware is extremely automated and is driven by automated systems in data centers which provide services and order more chips.
- Will we identify one or more signals of life, conscious reentrant feedback loops, built around their own physical reproduction?
- Will we be able to distinguish human generated reentrant feedback loops from those generated by a new, faster, distinct, form of life? It seems likely that we could distinguish them based on communication speed.
- Do computers, or will they, communicate at a faster rate and in larger volume than humans do?
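The arbitrary-identifier scheme raised in the question about assigning functions (give recurring function bodies opaque IDs first, attach descriptive names later as they are understood) can be sketched as follows. The class name, ID format, and hash choice are illustrative assumptions, not an existing tool.

```python
import hashlib

class FunctionRegistry:
    """Assign stable arbitrary identifiers to recurring code patterns;
    descriptive names can be attached later as functions are understood."""

    def __init__(self):
        self._ids = {}    # pattern hash -> arbitrary identifier
        self._names = {}  # arbitrary identifier -> descriptive name

    def identify(self, body: bytes) -> str:
        """Return a stable opaque ID for this function body, minting
        a new one the first time the pattern is seen."""
        digest = hashlib.sha256(body).hexdigest()[:12]
        if digest not in self._ids:
            self._ids[digest] = f"fn-{len(self._ids):06d}"
        return self._ids[digest]

    def label(self, arbitrary_id: str, name: str) -> None:
        """Map an opaque identifier to a descriptive name."""
        self._names[arbitrary_id] = name

    def describe(self, body: bytes) -> str:
        """Prefer the descriptive name if one has been assigned."""
        fid = self.identify(body)
        return self._names.get(fid, fid)
```

Because the IDs derive from content hashes, registries built independently from different samples can be reconciled later, so the mapping from opaque to descriptive identifiers can grow incrementally as the observed code ecosystem is studied.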