
Kafka and data handling processes. (Grey persons in grey offices)




The offices are always grey. Everybody who works there is grey, and their mission is to control the data.

When we watch the film "Kafkaesque" above this text, we might realize that data handling is similar to bureaucracy. God is the thing that controls the data process and creates the protocol for handling data. For the system, the superuser, who has the right to exterminate the system, is god, if we think of the computer as Kafka's bureau.

We are facing the problem of row-type computing. If the event handler is the man who walks in the rows, the event handler must always deliver its data, regardless of whether the data contains enough information or not. When the event handler has taken the data, there is no way to stop the operation until the process ends.
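A minimal sketch of that run-to-completion idea in Python (the function and event names are hypothetical, not from any real framework): once a handler takes an event, the loop cannot interrupt it until the call returns.

```python
from collections import deque

def run_to_completion(handlers, events):
    """Dispatch events one at a time; each handler runs to completion.

    Once a handler has taken an event, nothing preempts it: the loop
    only looks at the next event after the current call returns.
    (Hypothetical sketch; all names are illustrative.)
    """
    queue = deque(events)
    while queue:
        event = queue.popleft()
        handler = handlers.get(event["type"])
        if handler is not None:
            handler(event)  # no way to stop this until the process ends

# Example usage
handlers = {"paper": lambda e: print("delivering", e["payload"])}
run_to_completion(handlers, [{"type": "paper", "payload": "form A"}])
```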

Order is the key element in computing. If there is chaos in the system, there is no way to make anything. The answer to the chaos problem is simple: somewhere in the system sits the master controller. The superuser uses the "master's voice", shouts "everybody halt", and orders the data handlers to read their data at the same time. Or the order can be given in a form where the data handlers read the things that the data handling form requires. A sketch of that "everybody halt" moment follows.
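One way to picture the "everybody halt" order is a synchronization barrier. In this hypothetical Python sketch, no handler reads its data until every handler has arrived at the barrier.

```python
import threading

NUM_HANDLERS = 4
barrier = threading.Barrier(NUM_HANDLERS)

def data_handler(worker_id, data):
    # Each handler waits here until the "master's voice" releases
    # everyone, so all of them read their data at the same time.
    barrier.wait()  # "everybody halt" until all have arrived
    print(f"handler {worker_id} reads: {data}")

threads = [
    threading.Thread(target=data_handler, args=(i, f"paper-{i}"))
    for i in range(NUM_HANDLERS)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
```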

That means everybody should check whether they have all the necessary papers with them. If every single paper contains a serial number, that helps to give out new papers when some of those files are missing. And if somebody delivers data out of that bureau, the controller knows which paper was delivered to the street. And because every data handler handles an individual data type, that helps to track the person who delivered unauthorized data.
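Serial numbers make the missing papers easy to name. A minimal sketch (the names are illustrative):

```python
def find_missing(serials, expected_range):
    """Return the serial numbers that never arrived.

    'serials' are the numbers stamped on the papers that did arrive;
    any gap names exactly which paper to reissue, so the whole stack
    never has to be remade. (Hypothetical sketch.)
    """
    return sorted(set(expected_range) - set(serials))

received = [1, 2, 4, 5, 7]
print(find_missing(received, range(1, 8)))  # -> [3, 6]
```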

If the system uses two lines, it is easy to see the differences in the data. In the two-line data handling process every data handler has a pair, and if the data that those handlers carry contains differences, there is an error. The two-line data handling works so that when the router brings data into the system, it also doubles the data.

That makes it possible to see whether there are errors by benefiting from the differences in the data structures. In that system everything is doubled, including the rowing rooms. In that room the data is delivered to the desk, and then the superuser compares the papers or data tables and finds out whether they are identical. If they are identical, there are no errors in the data lines. The data travels in rows, and each paper is combined into the same entirety, which is like a film.
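The two-line comparison might look like this hypothetical sketch, where channel_a and channel_b stand for the two data lines:

```python
def duplicated_transfer(payload, channel_a, channel_b):
    """Send the same payload down two lines and compare what arrives.

    Minimal sketch of the two-line idea: channel_a and channel_b are
    hypothetical functions that each carry the data and return what
    reached the desk. A mismatch means one line corrupted the data.
    """
    copy_a = channel_a(payload)
    copy_b = channel_b(payload)
    if copy_a != copy_b:
        raise ValueError("two-line mismatch: one data line has an error")
    return copy_a

# Example usage: two healthy lines deliver identical copies
ok = duplicated_transfer("form-42", lambda d: d, lambda d: d)
print(ok)  # form-42
```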

If the data travels in a row with many curves, the other data handlers can check whether the data that a data handler carries is right and has the right form. But that requires that every data handler has the same data; in that case the system can check the traveling data. The check can be made so that every single data handler yells at the same time what kind of data should be in their papers.
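If every handler is supposed to carry the same data, that simultaneous yell is effectively a majority vote. A hypothetical sketch:

```python
from collections import Counter

def majority_vote(copies):
    """All handlers 'yell' their copy at once; the most common answer wins.

    Hypothetical sketch: when every handler should carry the same data,
    a lone deviant copy stands out against the majority.
    """
    value, count = Counter(copies).most_common(1)[0]
    if count <= len(copies) // 2:
        raise ValueError("no majority: the row is in chaos")
    return value

print(majority_vote(["form-A", "form-A", "form-B", "form-A"]))  # form-A
```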

Or the persons can send the number of the papers that were delivered back to the beginning, and that thing is called a checksum. The checksum tells whether some papers were dropped on the floor. And if every paper is numbered, people know immediately which paper is missing. So if something is missing, the data handler can ask for the missing paper from the colleague who stands at the door. That means there is no need to remake the entire paper stack; only the missing papers need to be made again.
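The bureau's checksum is just a paper count; real systems usually hash the contents as well. A minimal sketch (the function names are my own) that combines a checksum with re-requesting only the missing numbered papers:

```python
import zlib

def checksum(papers):
    """A checksum over the whole stack (CRC-32 here).

    The bureau version is just the paper count; a CRC also catches
    altered contents, not only dropped papers. (Hypothetical sketch.)
    """
    data = "|".join(f"{n}:{text}" for n, text in sorted(papers.items()))
    return zlib.crc32(data.encode())

def request_missing(received, expected_numbers):
    """Ask only for the numbered papers that never arrived."""
    return [n for n in expected_numbers if n not in received]

sent = {1: "a", 2: "b", 3: "c"}
arrived = {1: "a", 3: "c"}
if checksum(arrived) != checksum(sent):
    print("resend:", request_missing(arrived, sent))  # resend: [2]
```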

If all data handlers talk at different times, the result is chaos where nobody can check the data. The man who keeps himself hungry is the data handler that is not used: the useless data is removed, and the mark of that data turns weaker and weaker the longer the data stands useless. Data security might seem very well done if there are many doors between the observer and the data. But if the doors are not locked, they are useless.

We can think of the data traveling in the system as black and white bugs. The system is like a giant labyrinth that must know which way it should route those bits of data. At every corner in that labyrinth stands a router, the person who knows where to guide each bit of data. When data travels in a mixed form, it looks like a mess or chaos. But when the routers sort it, the data turns into an understandable form.
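A router's corner of the labyrinth can be sketched as a lookup table (everything here is hypothetical):

```python
# Each router keeps a table that tells where each kind of data
# bit should go next; unknown kinds go to a holding desk.
ROUTING_TABLE = {
    "black": "archive-room",
    "white": "reading-room",
}

def route(packet):
    """Guide one bit of data to its next corner, or hold it if unknown."""
    return ROUTING_TABLE.get(packet["kind"], "dead-letter-desk")

mixed_stream = [{"kind": "black"}, {"kind": "white"}, {"kind": "grey"}]
sorted_stream = [(p["kind"], route(p)) for p in mixed_stream]
print(sorted_stream)
```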
