
8 THE MACHINE

ONCE MR. GREGORY UNDERSTOOD THE WAY THE Network was organized and how it operated, he still needed to develop a way to examine its nodes and their information content. Moreover, he needed a means of delivering energy to selected nodes so that he could modify the amount of information they contained and make the Network’s outputs generate the effects he wanted in four dimensions. Given the Network’s vast size, scale, and complexity, this was no easy task.

Fortunately, he knew that everything that happened in the Network needed to conform to the laws of physics. One consequence was that all of its nodes had spontaneous energy emissions that depended on their energy density; the greater it was, the farther the frequency distribution of what the nodes radiated was shifted toward higher frequencies. By looking across the electromagnetic spectrum for the relative differences in the emissions from the Network’s different regions, Mr. Gregory could monitor it for hot spots that indicated where its energy density was the highest. Once he identified those, the next step was to focus on specific regions of nodes at higher resolution to assess what types of functions they were involved in and to learn what they were about to precipitate.

There was a catch, however. Because the Network was arranged in layers that were convoluted and overlapping in a multitude of dimensions, a correction always needed to be made for how close the observed hot spots were to what was of interest—not only in space and in time but also in dimensionality and scale. Mr. Gregory called this a node’s “Network distance” from whatever the specific items of interest were in four dimensions. If the nodes were very distant from where their effects would occur, their influences would be delayed and could undergo significant changes before they actually happened. However, if the nodes were closer, their influences would transit the intervening distance more quickly and with a lesser degree of change. That was when they exerted their most predictable influences in terms of location, timing, and magnitude.

Based on the available technology, Mr. Gregory settled on a variant of an interferometer as the base device from which to design his machine to visualize the Network. The instrument could compare the differences in the frequencies and amplitudes of electromagnetic signals, and he modified it so that it could be coupled to the lens he had invented. That allowed him to focus it on the emissions that originated from different regions of the Network to determine which nodes were the “hottest” compared to others in terms of their energy density. Moreover, he provided it with an exceptionally wide-angle aperture that could be adjusted to varying focal lengths and tuned to different scales in multiple dimensions.

In addition to measuring the energy density of the Network’s nodes, the device could also evaluate the nodes’ energy content relative to their capacity, which was a crucial parameter that depended on two things: how much information the node contained and how much it could accommodate before becoming saturated. When a node reached its saturation point, it would become unstable so that its structure could no longer be maintained, and it would break apart into its constituent parts. The result was that it disappeared from the Network’s structure, as did all of the connections it had to other nodes. This process had the effect of remodeling the Network, altering both its topology and the dynamics of its information transfer. Just as importantly, the free energy within each node that flashed out of existence was liberated so that it could be recycled and used to power the rest of the Network’s activities. Mr. Gregory likened the process to using the energy from the braking of a rolling car to charge its battery for future use.

When he was finished modifying his machine, Mr. Gregory was able to survey the Network’s nodes and classify them according to the spectrum of their emissions. He then went on to establish thresholds for identifying which ones contained enough information to be approaching their saturation points. That was the key, because when that happened, they would soon need to either disgorge some of their information to the other nodes they were connected to or break up into their more basic parts.

Next, he coupled the device to a power source so that he could focus beams of photons with varying energies on particular collections of nodes. He knew that he didn’t need to deliver much energy to change the information distribution of the Network; the trick was to get it to just the right nodes at the right times—that was what it took for the proper messages to filter through the Network to where they needed to be and produce the effects he wanted. That permitted him to do one of two things: either react preemptively to what was about to happen in four dimensions or provide inputs to the Network to change the course of what would happen before it ever did.

Still, in order for him to use the machine properly, he needed to address the Network’s enormous number of nodes and connections. Because they were so tiny, trillions of different elements fit into every cubic millimeter of space—and that was just in the four dimensions everyone knew about. Moreover, all of the nodes and their connectors were layered both on and through what existed in four dimensions so that additional strata within the higher-dimensional Network were intertwined with them in complex ways. That resulted in the Network having a highly convoluted topology that spanned a dizzying range of scales and dimensions. As a consequence, there was no such thing as a one-to-one correspondence anywhere in the Network other than at a local level—that of so-called nearest neighbors. Beyond that, the correspondences and relationships became progressively more complex and difficult to comprehend.

Of course, there were too many nodes in the Network for Mr. Gregory to scan them all the time, so the approach he took to monitoring its activity was pragmatic: he focused on areas of particular interest based on what was happening in four dimensions. For that, he concentrated on nodal regions that contributed to near-term effects involving local manifestations that were either especially important or problematic. There wasn’t much reason to do more, because if everything else that occurred in four dimensions was acceptable, there was no need to exert any influence over the portions of the Network responsible for generating those features. Accordingly, Mr. Gregory took a problem-oriented approach to his interrogations, the trigger being that something in the layers of the Network close to what was happening in four dimensions either wasn’t developing along the proper lines or needed to be known in advance. It was in those particular instances that regions of the underlying Network needed to be monitored and, at least potentially, modified.

Up to that point, the portions of the machine that Mr. Gregory had created represented the front and back ends of what he needed. Like many scientific thinkers before him, he kept a notebook in which he recorded his successes and failures. That day, he wrote,

The machine is finally coming together. Its front end, the sensor portion, is now complete. That part monitors the Network to get information about the states of its nodes. Its back end, the effector arm, is now also operational. It’s what allows photons to be injected back into the Network and absorbed by nodes in a targeted fashion, modifying the trajectory of where the Network is heading and what the results of its projection in four dimensions will be. But still missing is the intervening portion between the two, the guts of the machine: the processing layer between its front- and back-end elements that analyzes the incoming information about the Network and directs the machine’s outputs to influence its states in the ways I want.
