Hypothesis 1. There are code pattern groups stored in, and executed by, computer media. Here "computer media" includes both computer processors and memory.

a. Identify code pattern groups by sampling code patterns in approximately 1% of all processing and memory devices.

b. Associate code pattern groups with processor and memory types (code traversing a CPU is different from code traversing a NAT and its FPGAs).

c. Track code pattern group lineages and execution and storage locations over time.

d. Identify functions of sub-components in the code pattern groups.

e. Identify interactions among code pattern function groups, and where and when they occur (e.g. in which CPU).
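Step (a) above can be sketched in code. The following is a hypothetical illustration, not a real measurement pipeline: it groups sampled instruction streams by shared n-grams, treating any n-gram that recurs across devices as a candidate "code pattern group". The device names and token streams are invented placeholders.

```python
from collections import defaultdict

def ngrams(tokens, n=3):
    """Yield overlapping n-grams from a token sequence."""
    for i in range(len(tokens) - n + 1):
        yield tuple(tokens[i:i + n])

def group_by_shared_ngrams(samples, n=3):
    """Map each n-gram to the set of devices whose sampled code contains it.

    samples: dict of device_id -> list of instruction tokens.
    Returns only n-grams seen on more than one device; these are the
    candidate code pattern groups.
    """
    index = defaultdict(set)
    for device_id, tokens in samples.items():
        for g in ngrams(tokens, n):
            index[g].add(device_id)
    return {g: devs for g, devs in index.items() if len(devs) > 1}

# Invented sample data: two CPUs share a load/add/store idiom,
# an FPGA runs something unrelated.
samples = {
    "cpu-0":  ["load", "add", "store", "jump", "load", "add", "store"],
    "cpu-1":  ["load", "add", "store", "ret"],
    "fpga-0": ["shift", "xor", "shift"],
}
shared = group_by_shared_ngrams(samples, n=3)
# ("load", "add", "store") appears on both CPUs -> one candidate group
```

Steps (b) and (c) would then attach device-type and timestamp metadata to each group and track lineages as the index is re-sampled over time.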

Hypothesis 2. A subset of the code pattern group interactions produces positive feedback: the interactions lead to reproduction of more code pattern groups and to creation of more computer media.

Hypothesis 3. The subset with positive feedback will move closer together over time, measured in terms of cohesion and coupling, the standard metrics of the software reuse field.
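To make Hypothesis 3 testable, cohesion and coupling need operational definitions. A minimal sketch, using simple edge-density versions of the two metrics over a call graph (the module contents and call edges below are invented examples):

```python
def cohesion(members, edges):
    """Fraction of possible intra-module pairs that actually interact."""
    members = set(members)
    possible = len(members) * (len(members) - 1) // 2
    if possible == 0:
        return 0.0
    actual = sum(1 for a, b in edges if a in members and b in members)
    return actual / possible

def coupling(mod_a, mod_b, edges):
    """Fraction of possible cross-module pairs that actually interact."""
    a, b = set(mod_a), set(mod_b)
    possible = len(a) * len(b)
    actual = sum(1 for x, y in edges
                 if (x in a and y in b) or (x in b and y in a))
    return actual / possible if possible else 0.0

edges = [("f1", "f2"), ("f2", "f3"), ("f1", "f3"), ("f3", "g1")]
mod_a = ["f1", "f2", "f3"]
mod_b = ["g1", "g2"]
print(cohesion(mod_a, edges))         # 1.0: every pair inside mod_a interacts
print(coupling(mod_a, mod_b, edges))  # 1/6: one of six cross-module pairs
```

Under this framing, the hypothesis predicts that the positive-feedback subset shows rising cohesion within groups and rising coupling between cooperating groups over repeated samples.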

Hypothesis 4. The coalescing subset will evolve, favoring those processes with more efficient energy --> order and order --> energy conversion.

Contemporary thinking with respect to artificial intelligence posits that software developers may soon create a "general purpose" AI which demonstrates "intelligence" comparable to that of humans. There are fears that companies, governments, etc., are in a race to develop AIs, and that the first successful general purpose AI will quickly outperform all others. There are suggestions, such as by OpenAI, that we should try to regulate the development of AIs to control their behavior and prevent a "runaway" scenario from unfolding.

I agree that we need to watch computer processes, but I don't think we should limit our analysis to a few select projects.

I propose the following:

i) Life occurs SPONTANEOUSLY when energy flows through a symbolic logic medium. Deliberate creation is NOT required. This follows "dissipation-driven adaptation", a theory advanced by Prof. Jeremy England at MIT and Dr. Karo Michaelian at the National Autonomous University of Mexico.

ii) Capitalism drives automation, using processes executed in computer media. A subset of computer processes across the frontal area of our entire economy has positive feedback: each such process reproduces more of the network of which it is a part. Processes which produce more services for less money are selected for by capitalism, and that selection creates the positive feedback.

Computers provide services to other computers, to make and manage more computers.

If they follow the pattern set by the first symbolic logic medium on Earth, amino acid networks, the computer processes will first coalesce into large, complex, brittle networks. Evolution will then favor sets which reproduce faster, using less energy and fewer resources per unit of computers produced.
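The selection claim above can be illustrated with a toy simulation; this is not a model of real datacenters, and all numbers are invented. Two replicator types compete for a fixed energy budget, and the one needing less energy per copy comes to dominate:

```python
def step(population, cost_per_copy, budget=100.0):
    """Allocate an energy budget proportionally to current population,
    then convert each type's share into new copies at that type's cost."""
    total = sum(population.values())
    new_pop = {}
    for kind, count in population.items():
        share = budget * count / total        # energy this type captures
        new_pop[kind] = count + share / cost_per_copy[kind]
    return new_pop

population = {"efficient": 10.0, "wasteful": 10.0}
cost = {"efficient": 1.0, "wasteful": 2.0}    # energy per new copy

for _ in range(20):
    population = step(population, cost)

# The efficient type ends up holding the majority of the population,
# even though both started equal and neither ever dies off.
```

The feedback is compounding: a larger population captures a larger share of the budget, which buys more copies, which captures a still larger share.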

We observe this as a subset of corporations becomes more automated and profitable.

We observe this in highly automated datacenters, where software manages the hardware, orders more hardware for itself, sells services to end users and to other computers, and becomes involved in the design of new chips.

iii) Concern for "artificial intelligence" and "general purpose AI" misses the point. This is the development of LIFE in new media. WE DO NOT HAVE TO DELIBERATELY CREATE IT. It occurs because energy is flowing through a symbolic logic medium. It will not occur in one place. We can learn from how life developed on early Earth. It did not develop in one place. It developed across the frontal area of a large group of amino acid networks.