
QUANTUM KEY DISTRIBUTION

TABLE OF CONTENTS

1.0 INTRODUCTION
  1.1 Objectives of Quantum Key Distribution
  1.2 Quantum Key Distribution Protocols
    1.2.1 BB84 Protocol
    1.2.2 Decoy State Protocol
    1.2.3 Security Proof Protocols
    1.2.4 E91 Protocol: Artur Ekert (1991)
    1.2.5 Quantum Bit Error Protocol

2.0 SCOPE
  2.1 Synchronization
  2.2 Clock Synchronization
  2.3 Data Start Synchronization

3.0 Characterization And Analysis Of Time Synchronization for Quantum Key Distribution

4.0 Characterization And Analysis Of Error Correction Methodologies for Quantum Key Distribution

5.0 RESEARCH AND IMPROVEMENT OF QKD SYSTEMS

6.0 CONCLUSION

QUANTUM KEY DISTRIBUTION (QKD)

1.0 INTRODUCTION

            Did you know that quantum key distribution was first proposed back in the 1970s? Despite being an incredible idea, it only came to light in the 1980s, a whole decade later, and it took another decade still, until the 1990s, for the first connection to be made to entanglement, at which point physicists began to take greater interest. Since then, its progress has been outstanding and remarkable. It has now been in existence for over 15 years, and it is perhaps ranked as one of the most mature quantum technologies (Bennett et al., 2012).

Quantum key distribution (QKD) is a method of secure communication that uses a cryptographic protocol involving components of quantum mechanics. QKD can also be described as the generation of a private key, shared by two people, using an authenticated classical channel together with a quantum channel. It is often incorrectly referred to as quantum cryptography, because it is the best-known example of a quantum cryptographic task. QKD has an essential and very unique property: the two communicating users can sense the presence of a third party who might be trying to access the key. This is made possible by a fundamental aspect of quantum mechanics (Bienfang et al., 2014): measuring a quantum system generally disturbs it. A third party trying to listen in on the key must measure it, and in doing so introduces detectable anomalies. By using quantum entanglement and sending information in quantum states, the parties implement a communication channel that can detect an eavesdropper. A key guaranteed to be secure can be produced as long as the level of eavesdropping stays below a certain threshold; above that threshold no secure key is possible, and the communication is aborted (Suchat et al., 2017).

According to Bonato et al. (2019), it is good to note the main shortcoming of quantum key distribution: it relies heavily on an authenticated classical communication channel. In practice, this requirement means the two parties must already have exchanged either a symmetric key of sufficient length or public keys of an adequate security level. With such information already available, it is possible to achieve authenticated, secure communication without QKD at all. According to Bruce Schneier, QKD is also very expensive, making it less useful. In addition, QKD produces and distributes a key, but cannot itself be used to transmit message data. Instead, the key is used with an encryption algorithm to encode and decode messages, which are then transmitted over any standard communication channel. Notably, such algorithms can be provably secure when used with a secret random key (Schneier, 2015).

According to Schneier (2015), the cryptographic algorithms traditionally used in network communication environments relied heavily on computational assumptions and mathematical models, which left them vulnerable to attackers. Today, by contrast, quantum key distribution promises key agreement that is secure because it rests on reliable quantum mechanical systems. In our modern society, with the era of telecommunications and the internet, information has become very valuable. In this regard, we need to be cautious and understand the need to protect information from theft, for example, the loss of secret information to an eavesdropper (Cho, 2005).

Today, most transactions are encrypted for protection, though this is not proven to provide security against a computational attack, and the encryption algorithms in common use have frequently proven vulnerable. The primary purpose of quantum cryptography, therefore, is to use single-photon transmission to supply random key material, thereby removing the threat of an undetected eavesdropper. Additionally, it can be used together with symmetric-key encryption algorithms such as the Advanced Encryption Standard (Bouquet et al., 2018).

1.1 Objectives of Quantum Key Distribution

            QKD aims at tackling the following objectives (Merolla et al., 2017):

1. Establishing the first QKD-enabled experimentation platform, focusing on its evaluation in various industrial sectors, providing a vertical supply chain from the physical layer to the application layer, opening accessible software standards, and driving innovation for future European cryptographic solutions.
2. Standardizing interfaces, for instance by ensuring interoperability in the QKD ecosystem and providing a horizontal key-management layer that links QKD equipment from different suppliers at the same node.
3. Focusing QKD on the operation of use cases arising from the needs of secure societies.
4. Covering a range of use cases, for instance healthcare, safe and digital societies (including inter- and intra-datacenter communication), e-government, financial services, authentication, and space applications, among others.
5. Building a robust, open, modular, fully monitored, and reliability-tested facility.
6. Standardizing quantum cryptography and supporting efforts in security certification.
7. Laying the foundation for a pan-European quantum network.
8. Kick-starting a highly competitive European QKD industry.

1.2 Quantum Key Distribution Protocols

            Critically, communication over quantum channels revolves around encoding messages in indivisible quantum states, and it differs from communication over an established classical connection. Quantum key distribution ensures its security by exploiting specific properties of quantum states. In this regard, we describe the two families of QKD methods. The first is prepare-and-measure protocols, which rest on the act of measurement, an essential part of quantum mechanics. The inevitable disturbance caused by measurement is used to detect eavesdroppers during communication and, more usefully, to calculate the amount of information intercepted. The second is entanglement-based protocols, in which two or more separate objects are linked together so that they form a joint quantum state rather than remaining independent objects (Shor & Preskill, 2015).

 

In an entanglement-based QKD protocol, performing a measurement on one object always alters the other. When linked pairs of objects are used by two parties and someone else intercepts them, the whole system is changed, thus revealing the presence of a third party. Furthermore, these methods divide into three families of protocols: discrete-variable, continuous-variable, and distributed-phase-reference coding. Discrete-variable protocols were invented first and are the most widely used. In such a scheme, Alice wants to send a private key to Bob. Alice starts with two strings of bits, and then encodes the two strings as a tensor product of qubits (Shor & Preskill, 2015).

1.2.1 BB84 Protocol

            BB84 is a quantum key distribution scheme developed by Charles Bennett and Gilles Brassard in 1984. Critically, BB84 was the first quantum cryptography protocol. It is provably secure, relying on the quantum property that gaining information about a signal is only possible at the cost of disturbing it. It is usually described as a method of securely communicating a private key between two parties for use in one-time-pad encryption (Shor & Preskill, 2015).
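To make the flow concrete, here is a minimal Python sketch (our own illustration, not code from any QKD library) of BB84's bit and basis choices and the sifting step, assuming an ideal noiseless channel with no eavesdropper:

```python
import secrets

def bb84_sift(n):
    """Toy BB84 sketch (no real photons): random bits and bases,
    basis matching, and sifting, on an ideal noiseless channel."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n)]
    alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n)]
    # When Bob's basis matches Alice's, he reads her bit exactly;
    # otherwise his outcome is random and will be discarded anyway.
    bob_bits = [a if ab == bb else secrets.randbelow(2)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Sifting: over the public channel they compare bases (never bits)
    # and keep only the positions where the bases agree.
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

key_a, key_b = bb84_sift(1000)
assert key_a == key_b  # ideal channel: the sifted keys agree
```

On average, half the positions survive sifting, since two independent random bases agree with probability 1/2.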

1.2.2 Decoy state protocol

            The decoy state protocol underlies some of the most widely used quantum key distribution systems. Practical QKD systems regularly use multi-photon sources, in contrast to the single-photon source assumed by the standard BB84 protocol, which makes them vulnerable to photon-number-splitting (PNS) attacks. Notably, this limits the secure transmission rate or the maximum channel length of practical QKD systems. The decoy state method addresses this fundamental weakness by using multiple intensity levels at the transmitter's source (Shor & Preskill, 2015). That is, Alice sends qubits using randomly chosen intensity levels, which produce different photon-number statistics in the channel. At the end of the transmission, Alice publicly announces which intensity level was used for each qubit. A successful photon-number-splitting attack would have to preserve the bit error rate at the receiver's side across all of these photon-number statistics. By monitoring the bit error rate associated with each intensity level, the two parties can therefore detect a PNS attack, which greatly increases the secure transmission rate or maximum channel length and makes QKD systems very favorable for practical application (Lo et al., 2015).
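The intensity bookkeeping described above can be sketched as follows; the intensity values, function names, and detection model are illustrative assumptions of ours, not parameters of any real system:

```python
import random

# Hypothetical mean photon numbers, chosen only for illustration.
INTENSITIES = {"signal": 0.5, "decoy": 0.1, "vacuum": 0.0}

def choose_intensities(n, seed=0):
    """Alice tags each pulse with a randomly chosen intensity level."""
    rng = random.Random(seed)
    return [rng.choice(sorted(INTENSITIES)) for _ in range(n)]

def yields_by_level(levels, detections):
    """After Alice announces each pulse's level, both parties compute
    the detection yield per level. A photon-number-splitting attack
    skews these yields relative to honest channel loss, revealing it."""
    stats = {}
    for lvl, det in zip(levels, detections):
        hit, tot = stats.get(lvl, (0, 0))
        stats[lvl] = (hit + bool(det), tot + 1)
    return {lvl: hit / tot for lvl, (hit, tot) in stats.items()}

levels = choose_intensities(1000)
# In this idealized run every pulse is detected, so each level's yield is 1.0;
# a real channel (or a PNS attacker) would depress the yields unevenly.
y = yields_by_level(levels, [1] * 1000)
assert all(v == 1.0 for v in y.values())
```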

1.2.3 Security Proof Protocols

            Security proof protocols mostly aim at securing communications over public networks. Typically, they are designed for uses such as bank transfers over the internet, providing private channels or authenticating remote parties. Their job is to ensure confidentiality, authentication, and privacy even when the network is controlled by malicious actors who may intercept or forge information or inject new messages. The specification of this kind of protocol is usually short and natural; yet designing a protocol that is actually secure is very hard, and flaws may surface only after several years. One famous example is the 'man-in-the-middle' attack found by G. Lowe against the Needham-Schroeder public-key protocol (Shor & Preskill, 2015).

According to Lo et al. (2015), when proving that a key exchange protocol is secure, it is important to clearly state a security property that is appropriate, accurate, and sufficient to guarantee the usability of the resulting key. Various approaches have been proposed in the cryptographic literature, including the concept of key indistinguishability, which holds that a key produced in a key exchange protocol should not be distinguishable from one chosen at random from the same distribution. This is both an essential requirement and a desirable goal for key exchange protocols. As such, it is necessary to develop a compositional method capable of demonstrating that key exchange protocols are cryptographically sound, to produce an appropriate specification of what it means to generate acceptable keys, and to apply the approach to a demonstrative sample protocol.

1.2.4 E91 Protocol: Artur Ekert (1991)

            Artur Ekert's scheme uses entangled pairs of photons, which can be created by Alice, by Bob, or by some separate source, not excluding the eavesdropper Eve. The photons are distributed so that Alice and Bob each end up with one photon of every pair. The scheme relies on two properties of entanglement (Lo et al., 2015). One, the entangled states are perfectly correlated, in the sense that if Alice and Bob both test whether their particles have horizontal or vertical polarization, they always get the same answer, with 100% probability. The same holds if they both measure any other pair of complementary polarizations. This requires Alice and Bob to have perfectly aligned measurement directions. However, the individual results are completely random: Alice cannot predict whether her polarization (and thus Bob's) will be horizontal or vertical. Two, any attempt by Eve to eavesdrop alters these correlations in a way that Alice and Bob can detect (Shor & Preskill, 2015).
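As a rough classical caricature of property one (it models only the correlation bookkeeping, not real quantum statistics), one might write:

```python
import secrets

def measure_pair(basis_a, basis_b):
    """Idealized model of measuring one entangled photon pair:
    matching bases yield perfectly correlated but individually random
    outcomes; mismatched bases yield independent random outcomes.
    A toy illustration, not a simulation of quantum mechanics."""
    a = secrets.randbelow(2)
    b = a if basis_a == basis_b else secrets.randbelow(2)
    return a, b

# Property one: with aligned measurement directions the answers always agree,
# even though each individual outcome is random.
outcomes = [measure_pair(0, 0) for _ in range(100)]
assert all(a == b for a, b in outcomes)
```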

1.2.5 Quantum Bit Error Protocol

            In practice, a nonzero bit error rate is not necessarily caused by malicious eavesdropping. Such false alarms usually arise from inefficient detection, transmission loss, imperfect entanglement sources, and various other factors. Error correction is therefore an effective post-processing method, since it tolerates implementation limits while still securing accurate key transmission. Notably, determining how much contamination could be attributed to a third party allows a suitable bit error rate threshold to be set, which decides whether prior communications must be abandoned. In this protocol, two approaches are used (Shor & Preskill, 2015).
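The quantity being thresholded, the quantum bit error rate (QBER), is easy to state in code; the 11% figure below is the BB84 security bound proved by Shor and Preskill, used here purely as an example threshold:

```python
def qber(key_a, key_b):
    """Quantum bit error rate: the fraction of sifted-key positions
    where Alice's and Bob's bits disagree."""
    if not key_a or len(key_a) != len(key_b):
        raise ValueError("keys must be non-empty and of equal length")
    return sum(a != b for a, b in zip(key_a, key_b)) / len(key_a)

# Abort threshold: 11% is the bound from the Shor-Preskill proof for BB84.
THRESHOLD = 0.11

sample_a = [0, 1, 1, 0, 1, 0, 0, 1]
sample_b = [0, 1, 0, 0, 1, 0, 0, 1]   # one flipped bit out of eight
assert qber(sample_a, sample_b) == 0.125
assert qber(sample_a, sample_b) > THRESHOLD  # this run would be aborted
```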

 

2.0 SCOPE

2.1 Synchronization

            Notably, in computer science, synchronization refers to one of two distinct though related concepts: synchronization of processes and synchronization of data. Process synchronization is the idea that multiple processes join up at a certain point in order to commit to, or agree on, a definite sequence of actions. Data synchronization, on the other hand, is the idea of keeping multiple copies of a dataset coherent with one another in order to maintain data integrity. Process synchronization primitives are commonly used to implement data synchronization (Pjonkin et al., 2017).

Synchronization is needed in the following main areas. During forks and joins, when a job arrives at a fork point, it is split into N sub-jobs, which are then serviced by N tasks. After being serviced, each sub-job waits for all the other sub-jobs to finish processing; they are then joined again and leave the system. Evidently, parallel programming requires synchronization, since all parallel processes must wait for one another to complete. In the producer-consumer relationship, the consumer process depends directly on the producer process until the necessary data has been produced. For exclusive use of resources, when multiple processes depend on a shared resource that they all need to access, the operating system must ensure that only one processor accesses it at a given time, which avoids conflicts arising from concurrency (Force, 2017).

Bienfang et al. (2014) assert that thread or process synchronization is a mechanism ensuring that two or more concurrent threads do not simultaneously execute particular segments of the program known as critical sections. Synchronization techniques control the processes' access to the critical section: while one thread is executing its critical section, the next thread waits until the first completes. Failure to apply proper synchronization may lead to a race condition, making the values of variables unpredictable and dependent on the timing of context switches between the threads or processes. Synchronization thus helps avoid conflicts when accessing a shared resource. In addition, it is essential to consider the order in which specific processes or threads are executed (Pjonkin et al., 2017).
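A minimal Python illustration of a critical section, using the standard `threading` module:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    """Each thread bumps the shared counter n times inside the
    critical section guarded by `lock`."""
    global counter
    for _ in range(n):
        # Without the lock, the read-modify-write of `counter` could
        # interleave between threads and updates would be lost.
        with lock:
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter == 40_000  # every update survived
```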

2.2 Clock Synchronization

In a computer system, every machine has a real-time clock driven by the oscillations of a crystal. The computer's software clock uses this hardware clock to report the current time. However, the hardware clock drifts: its frequency varies, giving incorrect time, so any two clocks will most likely report slightly different times at any given point. The difference between two clocks' times is referred to as skew. Clock synchronization basically revolves around a temporal understanding of concurrent events produced by processes. Notably, clock synchronization helps in synchronizing the sender and receiver of a message, controlling joint activity, and serializing concurrent access to shared objects. The fundamental goal of clock synchronization is to ensure that multiple unrelated processes can agree, and make consistent decisions, about the ordering of events in a system (Ma et al., 2017).

Basically, there are several approaches to synchronizing physical clocks. One, external synchronization means that every computer in the system is synchronized with an external source of time, e.g., a UTC signal. Internal synchronization, on the other hand, means that the computers in the system are synchronized with one another, but their time is not necessarily accurate with respect to UTC. In a system that guarantees bounds on message transmission time, synchronization is basically straightforward, because the upper and lower limits of the transmission time are known. In addition, the Network Time Protocol is another method of clock synchronization that applies a hierarchical architecture: at the top level of the hierarchy are servers connected to a UTC time source such as a GPS unit (Force, 2017).
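As a concrete example of external synchronization, here is a sketch of Cristian's algorithm, a classic scheme not named in the text and offered only as an illustration of the offset calculation:

```python
def cristian_offset(t_request, t_server, t_response):
    """Cristian's algorithm, one simple external-synchronization
    scheme: estimate the local clock's offset from a time server,
    assuming the network delay is symmetric. All times are in
    seconds; t_request/t_response are local clock readings and
    t_server is the time stamped into the server's reply."""
    rtt = t_response - t_request
    # The server's reported time is assumed to correspond to the
    # midpoint of the round trip.
    server_now = t_server + rtt / 2
    return server_now - t_response

# Local clock 2.0 s behind the server, 0.2 s round trip:
assert abs(cristian_offset(100.0, 102.1, 100.2) - 2.0) < 1e-9
```

The asymmetry of real network delay bounds the accuracy: the true offset can deviate from this estimate by up to half the round-trip time.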

 

2.3 Data Start Synchronization

            Ma et al. (2017) say that data synchronization refers to the continuous process of synchronizing information among devices, keeping changes updated between those devices in order to maintain consistency within systems. Data synchronization ensures accurate, compliant, and secure data, and successful team and customer experiences. It also helps ensure congruency between the different data endpoints and the data source. As data comes in, it is cleaned, errors are corrected, duplicates are removed, and consistency is enforced before the data is applied. The two main types of data synchronization are local and remote. Local synchronization involves computers and devices located next to one another, while remote synchronization takes place over a mobile network. Data synchronization aims to ensure that data records are always consistent. Timing, in this regard, can be useful in encryption for synchronizing vital public servers (Bienfang et al., 2014).

3.0 Characterization And Analysis Of Time Synchronization for Quantum Key Distribution

Driven by the need to cut the installation and maintenance costs of structural health monitoring (SHM) systems, wireless sensor networks (WSNs) are becoming increasingly popular. Accurate time synchronization among wireless sensors is a critical factor in enabling lower-cost, lower-power SHM applications based on output-only modal analysis of structures. This section presents a theoretical framework for analyzing the impact of time delays in the measured system responses on the reconstruction of mode shapes using the popular frequency domain decomposition (FDD) technique. The approach estimates the change in mode shape values directly as a function of sensor synchronicity (Kabeya, 2014).

Advances in micro-electro-mechanical systems technology, wireless communications, and digital electronics have led to the development of low-cost, low-power, multifunctional sensor nodes that are small in size and communicate freely over short distances. These sensor nodes can be applied in military environments as well as for commercial purposes. Notably, each such application may require a different level of timing precision between sensor nodes; consider, for instance, the time accuracy required for secure target tracking. Besides, it is possible to apply time division multiple access (TDMA) for channel access and hence to switch the radio off in order to conserve power. Therefore, time synchronization is essential across the many activities of a sensor network that need precision and proper coordination (Pjonkin et al., 2017).

4.0 Characterization And Analysis Of Error Correction Methodologies for Quantum Key Distribution

            Error correction is an important consideration in QKD. To make sure that Alice and Bob do not unknowingly end up with different keys, several measures must be taken. Before the two run the error correction protocol, they usually have to sacrifice some of their bits in order to roughly estimate the error rate. To reduce the chance of still ending up with different keys beyond the agreed level, several additional bits have to be given away as a sacrifice so that the corrected keys can be verified (Nakassis & Mink, 2012). This verification can be performed using the properties of low-density parity check codes. A comparison of methods shows that it is possible to sacrifice only a small number of bits without violating the security protocol rules. The improvement depends on the error rate and the block length, particularly for the Id Quantique Clavis systems. In addition, for systems with large fluctuations in their error rate, it is most suitable and favorable to combine the two methods (Wang et al., 2018).

In quantum key distribution, two distinct parties communicate via a quantum channel. This makes it possible for Alice and Bob to establish a key while, after delivery, any information an eavesdropper holds about it is removed. They recover matching keys by using a bit error correction protocol, and it is evident that the laws of quantum mechanics can assist in proving the security of the entire protocol. Because of imperfect equipment and possible actions taken by Eve on the communication channel, the raw keys held by Alice and Bob will contain discrepancies; hence error correction must be carried out in order to obtain identical keys. In many cases, classical communication over an authenticated channel is used for this, because the discussion reveals some of the information concerning the key (Nakassis & Mink, 2012).

Wang et al. (2018) state that QKD enables two parties to develop a shared secret key with no limits placed on an adversary's computational power. Notably, error reconciliation protocols have been established that preserve security while allowing the sender and receiver to correct and reconcile the errors between their respective keys. The Cascade protocol is the most popular and efficient, though its high communication complexity makes it impractical in some settings. The Winnow protocol, on the other hand, reduces communication complexity relative to Cascade, though it has the disadvantage of introducing errors of its own. Additionally, LDPC codes are proven capable of reconciling errors at rates higher than those of Cascade and Winnow, though with greater computational complexity. The effects of inaccurate error estimation, non-uniform error distribution, and varying key lengths for identical input key strings are evaluated using these three protocols, and the results are analyzed to characterize the strengths and weaknesses of each (Nakassis & Mink, 2012).
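The bisection at the heart of Cascade can be sketched in a few lines; this shows a single pass on one block with a single error, not the full multi-pass protocol:

```python
def parity(bits, lo, hi):
    """Parity of the slice bits[lo:hi]."""
    return sum(bits[lo:hi]) % 2

def locate_error(alice, bob):
    """Binary parity search, the core step of the Cascade protocol:
    given a block whose overall parities differ, bisect by comparing
    sub-block parities until the single differing position is found.
    Each parity exchanged over the public channel leaks one bit of
    information, which privacy amplification must later remove."""
    lo, hi = 0, len(alice)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if parity(alice, lo, mid) != parity(bob, lo, mid):
            hi = mid            # the error is in the left half
        else:
            lo = mid            # the error is in the right half
    return lo

alice_key = [1, 0, 1, 1, 0, 0, 1, 0]
bob_key = alice_key.copy()
bob_key[5] ^= 1                 # the channel flipped one bit
i = locate_error(alice_key, bob_key)
bob_key[i] ^= 1                 # Bob corrects the located bit
assert bob_key == alice_key
```

For a block of length n, this costs about log2(n) parity exchanges; the full Cascade protocol repeats such searches over shuffled blocks across several passes to catch even numbers of errors.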

5.0 RESEARCH AND IMPROVEMENT OF QKD SYSTEMS

                In reality, for many decades, people have used codes to keep their data confidential. However, with the growth of the internet and the recent rise of connected devices, our health and financial information, as well as commercial and national secrets, are increasingly transmitted over the web. Therefore, security in communication is paramount. In conventional algorithms, for instance, the security of communications relies heavily on the secrecy of the encryption key. If two parties, Alice and Bob, share a long secret string of random bits, then they are assured of complete safety by encoding their information using the one-time-pad encryption scheme (Merolla et al., 2017).
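The one-time pad itself is a one-line XOR; the sketch below is a generic illustration of how a shared random key, such as one agreed via QKD, might be consumed:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """One-time pad via XOR. Information-theoretically secure when
    the key is truly random, at least as long as the message, kept
    secret, and never reused."""
    if len(key) < len(data):
        raise ValueError("one-time pad key must cover the whole message")
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
pad = secrets.token_bytes(len(message))   # stands in for a QKD-agreed key
ciphertext = otp_xor(message, pad)
assert otp_xor(ciphertext, pad) == message  # XOR is its own inverse
```

Because encryption and decryption are the same XOR, the key consumed per message equals the message length, which is exactly why a key distribution mechanism such as QKD matters.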

The central question, then, is how Alice and Bob can come to share such a key in the first place. This challenge is referred to as the key distribution problem. Within classical physics, all approaches to delivering a secret key are fundamentally insecure, because classical physics cannot prevent an eavesdropper, Eve, from copying the key during transmission between Alice and Bob. Standard asymmetric, or public-key, cryptography solves the key supply problem by relying on computational assumptions such as the hardness of factoring. However, such schemes do not provide information-theoretic security, owing to their vulnerability to future advances in algorithms, not to mention the advent of a large-scale quantum computer (Suchat et al., 2017).

6.0 CONCLUSION

In conclusion, the impacts and benefits of quantum key distribution are evident in our modern society. It is a beneficial technology positioned to replace traditional cryptography, though its full adoption remains an open debate. Its main setback is its dependence on a conventional authenticated communications channel. Having a classical authenticated communication channel means the parties have already exchanged either a symmetric key of sufficient length or public keys of an adequate security level (Lo et al., 2015). With such information already available, it is possible to have authenticated and secure communication channels without the use of quantum key distribution, for example by using the Counter Mode of the Advanced Encryption Standard.

Gravimetry for mineral exploration is seen as another potential growth area for quantum sensors, which can also be placed in space for fundamental physics experiments. Quantum satellite communications, too, are already paving the way for the development of quantum technologies in this area. At the current pace of growth worldwide, the privacy of correspondence can be maintained even once powerful quantum computers exist, while at the same time the concept of a truly global quantum internet draws closer to a long-sought fulfillment (Galbraith, 2012).

KEY REFERENCES

Pjonkin, A., Rumyantsev, K., & Singh, P. (2017). Synchronization in quantum key distribution systems. Cryptography, 1(3), 18.

Wang, X., et al. (2018). High-speed error correction for continuous-variable quantum key distribution with multi-edge type LDPC code. Scientific Reports, 8(1).

Schneier, B. (2015). Unique algorithms for protocols. In Applied Cryptography, Second Edition: Protocols, Algorithms, and Source Code in C (pp. 527-557).

Bouquet, P., & Kunz-Jacques, S. (2018). High-performance error correction for quantum key distribution using polar codes. arXiv. Available at: <https://arxiv.org/abs/1204.5882>

Force, N. T. S. T. (2017). Time synchronization in the electric power system. Technical report, North American Synchrophasor Initiative.

Nakassis, A., & Mink, A. (2012, May). LDPC error correction in the context of quantum key distribution. In Quantum Information and Computation X (Vol. 8400, p. 840009). International Society for Optics and Photonics.

Bienfang, J. C., Gross, A. J., Mink, A., Hershman, B. J., Nakassis, A., Tang, & Hagley, E. W. (2014). Quantum key distribution with 1.25 Gbps clock synchronization. Optics Express, 12(9), 2011-2016.

Bennett, C. H., Bessette, F., Brassard, G., Salvail, L., & Smolin, J. (2012). Experimental quantum cryptography. Journal of Cryptology, 5(1), 3-28.

Ma, L., Mink, A., Xu, H., Slattery, O., & Tang, X. (2017). Experimental demonstration of an active quantum key distribution network with over Gbps clock synchronization. IEEE Communications Letters, 11(12), 1019-1021.

Kabeya, M. (2014). Experimental realization of quantum key distribution (Doctoral dissertation).

Lo, H. K., Ma, X., & Chen, K. (2015). Decoy state quantum key distribution. Physical Review Letters, 94(23), 230504.

Cho, S. (2005). A distributed time-driven simulation method for enabling real-time manufacturing shop floor control. Computers & Industrial Engineering, 49(4), 572-590.

Galbraith, S. D. (2012). Mathematics of Public-Key Cryptography. Cambridge University Press.

Bonato, C., Tomaello, A., Da Deppo, V., Naletto, G., & Villoresi, P. (2019). Feasibility of satellite quantum key distribution. New Journal of Physics, 11(4), 045017.

Suchat, S., Khunnam, W., & Yupapin, P. P. (2017). Quantum key distribution via an optical wireless communication link for telephone networks. Optical Engineering, 46(10), 100502.

Shor, P. W., & Preskill, J. (2015). Simple proof of security of the BB84 quantum key distribution protocol. Physical Review Letters, 85(2), 441.

Merolla, J. M., Duraffourg, L., Goedgebuer, J. P., Soujaeff, A., Patois, F., & Rhodes, W. T. (2017). Integrated quantum key distribution system using single sideband detection. The European Physical Journal D, 18(2), 141-146.
