Images Stored "within" a Single Photon, Slowing Light, and a Novel Method and Computational Device for Calculating Prime Factors May Prove P=NP
Ultra-Dense Optical Storage -- on One Photon from PhysOrg.com:
Researchers at the University of Rochester have made an optics breakthrough that allows them to encode an entire image's worth of data into a photon, slow the image down for storage, and then retrieve the image intact.
This and other recent breakthroughs in our understanding of the nature of light, along with the experimental techniques developed to slow down and trap light for optical storage and for investigating the quantum nature of light and its relationship to information theory and optical computing, have shown that individual photons are capable of carrying much more information than previously thought possible. Through interference patterns produced by stencil meshes, a photon can store a "shadow" containing the information it has been "imprinted" with by the stencil. The only way to capture that information is to slow down and trap the photon, so that the interference pattern or "shadow" is stored and trapped along with it; more precisely, the wave packet in a particular interference pattern is frozen like a snapshot. The information is retrieved by reversing the process.
The fact that the capacity of the smallest practical carrier of information has now been magnified (the smallest unit of information would be a quantum state, of which a photon has many) implies that the amount of information in the universe is much greater, exponentially so, than ever imagined before.
This physical result suggests a few things about the nature of reality, which I conjecture here.
- That information shadows are the cause of dark energy and dark matter, and make up most of the mass in the universe.
- That information shadows are the cause of vacuum energy. If a collection of interacting photons interfere and together produce a shadow of the quantum states that match a particular virtual particle, then that particle is the result of the "image" or "shadow" of the interaction. If trapped and slowed down, the virtual particle would be found to arise from a single photon, or from the interaction of groups of photons.
- Shadows of interference from small groups of photons can be used to create virtual particles even without the energy ordinarily required to create them. This doesn't violate thermodynamics, because the information is energy and the potential energy always existed; it simply requires the correct interaction of the photons to produce a virtual particle. A virtual particle slowed down in cesium gas along with the photons used to create the quantum state that produces it would be observable as if it were a real particle, and would interact within the container as real particles do, but perhaps at an extremely slow pace; we could observe these interactions and capture the energy output from them. (Perhaps, however, the interaction would not be slowed down: because of relativity, the virtual particles would appear to interact at the speeds observed in nature when the corresponding "real" or non-virtual particles interact.)
- An optical computing device can be created which uses a two- or three-dimensional mesh (depending on the space being computed) to quickly locate two prime factors of any number, if they exist. The American Journal of Physics, I believe (must find reference), published a proof that such a method could work. My own research and design of a device to do this were considered impractical only because, for large and significant sizes (1024-bit and beyond), the device would need a representational index for every bit in a grid; even built from individual atoms, it would be larger than the sun, and then only for relatively small numbers. If individual photons have an upper limit on the information their "shadow" can represent, then an upper limit eventually exists on what a practical computing device could solve; but right now there is no evidence of ANY loss in the signal-to-noise ratio of the information, so this is hugely scalable, perhaps without measure. We know that quantum computing means encryption using prime factors for keys is broken; however, this shows that CLASSICAL COMPUTING DEVICES CAN BE MADE TO SOLVE QUANTUM PROBLEMS such as this, by leveraging the effects of quantum behaviour while performing the actual calculation on a classical device using novel optical computing algorithms. If there is no theoretical limit on the amount of information that a photon shadow may represent, then P=NP, because the non-polynomial component becomes polynomial and vice versa. The function that transforms P into NP is the mesh: if, say, the "shadow image" of the interference is polynomial, the photon is non-polynomial.
For instance, take a 2D graph with the x and y axes each labelled with the whole numbers up to half of the number being factored, arranged so that a rectangle of light projected onto the grid has an area equal to the number being factored. Decrease x while increasing y so that the area of the rectangle stays constantly equal to the number to factor. As the dimensions change, x and y will pass through integer values, and the interference pattern changes at each integer value. Whenever both x and y are "lit" by the resulting interaction, that pair is a factorization of the number. A classical primality test is then applied to the x and y values at that point, and if both are prime, the keys are found. This kind of physical computing operation is now possible because the integer-marked grid may be made with "shadows" of interference from a mesh: say units 1 through 1,000,000,000 are marked. Since we are able to make meshes at the nanoscale, the size of the grid would be limited only by whatever physical means we use to produce it; and since many photons, one after another, can "build" the grid, a billion or more integers can be represented by one "projection", slowed down and constructed together into a virtual apparatus that performs the calculation and emits a virtual particle at each point of interaction, telling the measuring device that an integer has been hit. A sensor would detect the interaction, which would carry back the value of the unit in the virtual grid.
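Stripped of the optics, the sweep described above can be simulated classically. Here is a minimal Python sketch under my own naming (`factor_sweep` and `is_prime` are illustrative, not from any published design): x decreases while y = n/x increases, keeping the "rectangle area" fixed at n, and a hit where both axes land on integers is a factor pair.

```python
from math import isqrt

def is_prime(n: int) -> bool:
    """Trial-division primality test, standing in for the classical
    test applied to the x and y values at a lit grid point."""
    if n < 2:
        return False
    for d in range(2, isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

def factor_sweep(n: int):
    """Sweep x downward from n/2 while y = n/x grows, keeping the
    rectangle area x * y constantly equal to n.  Whenever y is also
    an integer, both axes are 'lit' and (x, y) is a factor pair."""
    hits = []
    for x in range(n // 2, 1, -1):   # decrease x ...
        if n % x == 0:               # ... y = n // x is an integer: both axes lit
            hits.append((x, n // x))
    return hits

# Example: factoring 15 finds the pair 5 * 3, and both pass the primality test.
pairs = factor_sweep(15)
print(pairs)                          # [(5, 3), (3, 5)]
print([(x, y) for x, y in pairs if is_prime(x) and is_prime(y)])
```

Note that this classical loop takes on the order of n steps; the speed-up conjectured in the post would come from the optical mesh evaluating every x value in a single projection rather than one at a time.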