Quantum Space Complexity Manifesto


Version 1.0, 24 May 2018
Mirnova Institute for Creative and Innovative Science
Dedicated in honor of Louise Rita Reszel, a contemporary of Alan Turing


This manifesto addresses the issues, challenges, problems, and future courses for

Emerging Sciences and Technologies

which have interdisciplinary and co-developmental interdependencies, and which require specialization, generalization, and innovative collaboration for the successful development of practical solutions to problems comprising extreme complex systems, including those of quantum technologies, space robotics and resource management, and other tasks in fields and industries including agriculture, education, finance, medicine, and telecommunications.


We declare the following thirteen points as observables for which there are ample supporting arguments from theory, experiment, and practice in the disciplines of science, technology, engineering, and mathematics. These points are presented here as the basis of argument and discourse for establishing a new course of thinking and action within certain critical areas of research and development. The rationale derives both from fundamental scientific principles of accuracy and consistency and from the evident and emerging needs and requirements of the socioeconomic development and progress of our civilization. The implications and applications of what is declared herein provide a sound and sustainable basis for a research program that will support the kind of outcomes our society needs from these disciplines.

§1 Extreme complex systems (XCS), which involve uncertain and undefined state-spaces, stochastic and random conditions, non-linear and catastrophic behaviors, and NP-hard computational problems, abound and are increasing in presence, variety, scale, volatility, and overall complexity of relationships, such that classical system models and numerically centric algorithms are increasingly unsuitable for them. A new methodology is required to adequately address these types of XCS problems, which often emerge without notice and without the time or other means for classical study and solution-building.
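As a small, purely illustrative example of why numerically centric forecasting breaks down for such systems (this is a textbook toy, not a construction from the manifesto), consider the logistic map in its chaotic regime: two initial conditions differing by one part in a billion diverge completely within a few dozen steps, so any finite-precision numerical model loses predictive power.

    # Illustrative only: the chaotic logistic map x -> r*x*(1-x) shows why
    # numerically centric forecasting fails under sensitive dependence on
    # initial conditions. Two starts differing by 1e-9 diverge within ~50 steps.
    def logistic(x, r=4.0, steps=50):
        for _ in range(steps):
            x = r * x * (1 - x)
        return x

    a, b = 0.3, 0.3 + 1e-9
    print(logistic(a), logistic(b))  # wildly different after 50 iterations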

§2 XCS-type problems cannot be solved sufficiently by conventional Turing-machine computers and systems built from them, including quantum computers (QTC) based upon Turing-machine principles and upon arrays of qubits and similar individuated elements, which presently characterize the overwhelmingly dominant type of quantum computing research underway.

§3 There are indisputable limits to the growth and application of conventional QTC quantum computers, including both physical (machine architecture) and informational (algorithm and code) limits, and these are supported by both theory and practice. Massive superstructures of meta-machinery to compensate for qubit decoherence and other noise will not provide a practical path for QTC, much less for using such qubit-based machines for other tasks, including most types of XCS.
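To give a rough sense of the scale of such meta-machinery, the sketch below estimates the physical-qubit overhead of quantum error correction using the widely cited surface-code figure of roughly 2*d^2 physical qubits per logical qubit at code distance d. The particular machine size and code distance are illustrative assumptions, not figures from this manifesto.

    # Rough, illustrative estimate of the error-correction "superstructure"
    # needed to keep logical qubits coherent. Assumes the commonly cited
    # surface-code overhead of about 2*d**2 physical qubits per logical
    # qubit at code distance d; all specific numbers here are assumptions.
    def physical_qubits(logical_qubits: int, code_distance: int) -> int:
        per_logical = 2 * code_distance**2  # data + ancilla qubits, approximate
        return logical_qubits * per_logical

    # A hypothetical 1,000-logical-qubit machine at code distance 25:
    print(physical_qubits(1_000, 25))  # -> 1,250,000 physical qubits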

§4 Space robotics and resource development and operations management, including interplanetary and exoplanetary exploration, lunar and planetary colonization, asteroid mining and asteroid defense, are fundamentally dominated by XCS which require new control, optimization and other computational methods, architectures, and machines. These tasks require truly generalized and heterogeneous computers (GCM) as well as new classes of materials and machines (including robots) that in turn require such GCM for their design and operations.

§5 Randomness, stochastics, turbulence, noise, decoherence and coherence, resonance, mistakes, wrong turns, trial-and-error, and wild guesses are all essential to intelligence in biology, and to technologies built by intelligent biological organisms such as humans to perform intelligently. A new understanding must be cultivated with respect to the meaning, use, and value of phenomena such as noise, turbulence, randomness, and stochasticity. Omitting these elements from the design and development of synthetic intelligence (SI) machines is an error that must be addressed so that such SI systems can accommodate the challenges of XCS. These elements are being addressed in the design of systems such as the GCM.
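One long-established illustration of deliberately injected randomness doing useful computational work, offered here as a minimal sketch rather than as the manifesto's own method, is simulated annealing: random "wrong turns" are occasionally accepted, which is exactly what lets the search escape local minima that a purely greedy, noise-free procedure would be stuck in.

    import math
    import random

    def simulated_annealing(f, x0, steps=10_000, temp0=1.0):
        """Minimize f by occasionally accepting *worse* moves with a
        temperature-dependent probability; the injected randomness is
        what lets the search escape local minima."""
        x, fx = x0, f(x0)
        for step in range(1, steps + 1):
            temp = temp0 / step                    # simple cooling schedule
            candidate = x + random.gauss(0, 1.0)   # random perturbation
            fc = f(candidate)
            # Accept improvements always; accept regressions sometimes.
            if fc < fx or random.random() < math.exp((fx - fc) / temp):
                x, fx = candidate, fc
        return x, fx

    # A multi-modal test function: greedy descent from x0=5 would stall in
    # a local minimum, but the stochastic search can cross the barriers.
    f = lambda x: x**2 + 10 * math.sin(3 * x)
    print(simulated_annealing(f, x0=5.0))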

§6 Fundamental physics, including the framework of space-time geometry and topology evolution, quantum mechanics and the phenomena interpreted as superposition and entanglement, special and general relativity, the fundamental particle taxonomy (as described in the “Standard Model”), and quantum biology, is taken to comprise critical, essential topics that require re-examination, reformulation, and resolution. Only then can effective and useful solutions be developed to overcome the technological challenges in quantum computing, nuclear fusion, novel sources of energy and power for long-distance and high-velocity space travel, and the synthetic intelligence required for all of the aforementioned tasks to have realistically viable engineering solutions.

§7 A new perception and understanding is required about the very nature of concepts and terms in physics such as uncertainty, superposition, entanglement, gravity, and, more broadly, randomness, stochastics, dimensionality, and coherence, in order to proceed effectively in developing a new physics that will enable the new advances for quantum technologies, space propulsion and robotics, and other XCS challenges.

§8 There is a different path to building intelligent machines than the one that has dominated the technological landscape for the past seventy-plus years: a path based upon Nature, and in particular upon Biology. A biological approach to computation will lead to a different form of computation that is more readily applicable to XCS and to many NP-hard problems.
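As one familiar, long-standing example of biologically inspired computation, offered as a minimal sketch and not as a description of the GCM itself, a Hopfield-style associative memory stores patterns with a Hebbian rule ("neurons that fire together wire together") and recalls a stored pattern from a corrupted probe, computing by settling rather than by executing a stored program.

    import numpy as np

    def train_hebbian(patterns):
        """Hebbian weight matrix: strengthen connections between units
        that are co-active across the stored +/-1 patterns."""
        n = patterns.shape[1]
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0)  # no self-connections
        return W

    def recall(W, probe, steps=10):
        """Iteratively settle a corrupted probe toward a stored pattern."""
        state = probe.copy()
        for _ in range(steps):
            state = np.sign(W @ state)
            state[state == 0] = 1
        return state

    # Store two 8-unit patterns, then recall the first from a noisy copy.
    patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                         [1, 1, 1, 1, -1, -1, -1, -1]])
    W = train_hebbian(patterns)
    noisy = patterns[0].copy()
    noisy[:2] *= -1  # flip two units
    print(recall(W, noisy))  # settles back to the clean first pattern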

§9 GCM is based upon geometry and topology principles that extend from quantum physics and relativity up to the macro-scale, and it draws strongly upon biological models, including the neurophysiology of the brain. This is the best way to represent both knowns and unknowns in XCS and to operate with them in a functionally computable manner that humans and synthetic intelligence systems can use realistically and reliably.

§10 Realistic Synthetic Intelligence (SI), like biological intelligence, is fundamentally an XCS and requires approaches different from those of conventional/classical artificial intelligence (AI) and machine learning (ML).

§11 Real SI needs GCM within its computational landscape, and conversely, GCM requires Real SI in order to achieve its complex design goals. GCM algorithms will be so different that a combination of human and synthetic intelligence is likely to be necessary in order to achieve the optimal results with such technology.

§12 Space development, particularly for complex, remote, unmanned operations such as asteroid manipulations (mining, trajectory alteration and deflection) and complex multi-robot, multi-component construction and assembly, requires Real SI. This is particularly and emphatically the case for multi-robotic interactions such as with asteroids and other space objects moving in free space.

§13 Knowledge constructor networks that can be used effectively by both human and synthetic intelligent agents are necessary for solving XCS problems, including those of space and even the design of future GCM. Network implementations such as CUBIT (Constructors for Understanding and Building Intelligent Technology) offer the basis for such tools. In the process, they provide resources that can be used, in highly distributed parallel network computing, for pre-GCM solving of certain XCS, for GCM simulation in “real-world” contexts, and for applications that may not be extremely complex systems but are sufficiently complex and complicated that conventional non-network approaches to problem solving and knowledge construction are inadequate.
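Since the manifesto does not specify CUBIT's internals, the following is a purely hypothetical sketch of what a minimal knowledge constructor network might look like as a data structure: named "constructors" that each transform a knowledge state, composed into a directed graph that either a human or a synthetic agent could traverse. Every name and field here is invented for illustration.

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    # Hypothetical sketch only: the manifesto does not define CUBIT's
    # actual structure. A "constructor" is modeled as a named transform
    # over a knowledge state, and a network as a directed graph of them.
    @dataclass
    class Constructor:
        name: str
        transform: Callable[[dict], dict]   # knowledge-state -> knowledge-state
        successors: List[str] = field(default_factory=list)

    @dataclass
    class ConstructorNetwork:
        nodes: Dict[str, Constructor] = field(default_factory=dict)

        def add(self, node: Constructor) -> None:
            self.nodes[node.name] = node

        def run(self, start: str, state: dict) -> dict:
            """Walk the network from `start`, applying each constructor;
            an agent (human or synthetic) could instead choose among
            successors interactively at every step."""
            current = start
            while current:
                node = self.nodes[current]
                state = node.transform(state)
                current = node.successors[0] if node.successors else None
            return state

    # Toy usage: two chained constructors refining a problem description.
    net = ConstructorNetwork()
    net.add(Constructor("observe", lambda s: {**s, "observed": True}, ["model"]))
    net.add(Constructor("model", lambda s: {**s, "model": "draft"}))
    print(net.run("observe", {"problem": "asteroid trajectory"}))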


MIRNOVA
http://mirnova.org
contact@mirnova.org