
Generalized relative entropy

Generalized relative entropy ($\epsilon$-relative entropy) is a measure of dissimilarity between two quantum states. It is a "one-shot" analogue of quantum relative entropy and shares many properties of the latter quantity.

In the study of quantum information theory, we typically assume that information processing tasks are repeated multiple times, independently. The corresponding information-theoretic notions are therefore defined in the asymptotic limit. The quintessential entropy measure, von Neumann entropy, is one such notion. In contrast, the study of one-shot quantum information theory is concerned with information processing when a task is conducted only once. New entropic measures emerge in this scenario, as traditional notions cease to give a precise characterization of resource requirements. $\epsilon$-relative entropy is one such particularly interesting measure.

In the asymptotic scenario, relative entropy acts as a parent quantity for other measures besides being an important measure itself. Similarly, $\epsilon$-relative entropy functions as a parent quantity for other measures in the one-shot scenario.

Definition

To motivate the definition of the $\epsilon$-relative entropy $D^\epsilon(\rho\|\sigma)$, consider the information processing task of hypothesis testing. In hypothesis testing, we wish to devise a strategy to distinguish between two density operators $\rho$ and $\sigma$. A strategy is a POVM with elements $Q$ and $I-Q$. The probability that the strategy produces a correct guess on input $\rho$ is given by $\operatorname{Tr}(\rho Q)$ and the probability that it produces a wrong guess is given by $\operatorname{Tr}(\sigma Q)$. $\epsilon$-relative entropy captures the minimum probability of error when the state is $\sigma$, given that the success probability for $\rho$ is at least $\epsilon$.

For $\epsilon\in(0,1)$, the $\epsilon$-relative entropy between two quantum states $\rho$ and $\sigma$ is defined as

$$D^\epsilon(\rho\|\sigma) = -\log\frac{1}{\epsilon}\min\bigl\{\langle Q,\sigma\rangle : 0\le Q\le I \text{ and } \langle Q,\rho\rangle\ge\epsilon\bigr\},$$

where $\langle Q,\sigma\rangle$ denotes $\operatorname{Tr}(Q\sigma)$ and the logarithm is taken to base 2, so that $2^{-D^\epsilon(\rho\|\sigma)} = \tfrac{1}{\epsilon}\min\langle Q,\sigma\rangle$.

From the definition, it is clear that $D^\epsilon(\rho\|\sigma)\ge 0$. This inequality is saturated if and only if $\rho=\sigma$, as shown below.
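When $\rho$ and $\sigma$ commute (are diagonal in a common basis), the minimization in the definition can be restricted to diagonal $Q$ and becomes a small linear program over the eigenvalues, solvable exactly by a greedy (fractional-knapsack) rule; for general states it is a semidefinite program. The following is a minimal sketch in Python/NumPy of the commuting case; the helper name d_eps is a choice made for this illustration, and logarithms are taken to base 2 as above.

import numpy as np

def d_eps(p, q, eps):
    # D^eps(p||q) for commuting states given by eigenvalue vectors p and q.
    # Solves  min <Q, q>  s.t.  0 <= Q <= I, <Q, p> >= eps,  which for
    # diagonal states is a fractional-knapsack linear program.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    idx = np.where(p > 0)[0]
    idx = idx[np.argsort(q[idx] / p[idx])]   # cheapest sigma-mass per unit rho-mass first
    need, cost = eps, 0.0                    # remaining <Q,p>-mass to collect, and <Q,q> so far
    for i in idx:
        take = min(1.0, need / p[i])         # fraction of outcome i put into the test operator
        need -= take * p[i]
        cost += take * q[i]
        if need <= 1e-12:
            break
    return -np.log2(cost / eps)

# Example: two commuting qubit states, eps = 0.8.
rho, sigma = np.array([0.9, 0.1]), np.array([0.5, 0.5])
print(d_eps(rho, sigma, 0.8))   # about 0.848; equals 0 (up to rounding) when rho == sigma

The greedy rule is optimal here because the problem is a one-constraint linear program: outcomes are included in order of increasing "error cost per unit success probability" $\sigma_i/\rho_i$ until the success constraint is met.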

Relationship to the trace distance

Suppose the trace distance between two density operators $\rho$ and $\sigma$ is

$$\|\rho-\sigma\|_1 = \delta\,.$$

For $0<\epsilon<1$, it holds that

a) $\quad\log\dfrac{\epsilon}{\epsilon-(1-\epsilon)\delta} \;\le\; D^\epsilon(\rho\|\sigma) \;\le\; \log\dfrac{\epsilon}{\epsilon-\delta}\,.$

In particular, this implies the following analogue of the Pinsker inequality[1]

b) $\quad\dfrac{1-\epsilon}{\epsilon}\,\|\rho-\sigma\|_1 \;\le\; D^\epsilon(\rho\|\sigma)\,.$

Furthermore, the proposition implies that for any $\epsilon\in(0,1)$, $D^\epsilon(\rho\|\sigma)=0$ if and only if $\rho=\sigma$, inheriting this property from the trace distance. This result and its proof can be found in Dupuis et al.[2]
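These bounds can be spot-checked numerically in the commuting case, using the d_eps helper sketched in the Definition section (an assumption of this illustration); in that case the trace distance $\delta$, in the normalization used in this article, is the total mass of the positive part of $\rho-\sigma$.

import numpy as np

rho, sigma, eps = np.array([0.9, 0.1]), np.array([0.5, 0.5]), 0.8
delta = np.sum(np.clip(rho - sigma, 0.0, None))    # trace distance in this article's normalization

d = d_eps(rho, sigma, eps)                         # helper sketched in the Definition section
lower = np.log2(eps / (eps - (1 - eps) * delta))   # left-hand side of a)
upper = np.log2(eps / (eps - delta))               # right-hand side of a)
pinsker = (1 - eps) / eps * delta                  # left-hand side of b)

assert lower <= d <= upper and pinsker <= d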

Proof of inequality a)

Upper bound: Trace distance can be written as

$$\|\rho-\sigma\|_1 = \max_{0\le Q\le I}\operatorname{Tr}\bigl(Q(\rho-\sigma)\bigr)\,.$$

This maximum is achieved when $Q$ is the orthogonal projector onto the positive eigenspace of $\rho-\sigma$. For any POVM element $Q$ we have

$$\operatorname{Tr}\bigl(Q(\rho-\sigma)\bigr)\le\delta\,,$$

so that if $\operatorname{Tr}(Q\rho)\ge\epsilon$, we have

$$\operatorname{Tr}(Q\sigma)\ge\operatorname{Tr}(Q\rho)-\delta\ge\epsilon-\delta\,.$$

From the definition of the $\epsilon$-relative entropy, we get

$$2^{-D^\epsilon(\rho\|\sigma)}\ge\frac{\epsilon-\delta}{\epsilon}\,,$$

that is, $D^\epsilon(\rho\|\sigma)\le\log\frac{\epsilon}{\epsilon-\delta}$, which is the claimed upper bound.

Lower bound: Let $Q$ be the orthogonal projection onto the positive eigenspace of $\rho-\sigma$, and let $\bar{Q}$ be the following convex combination of $I$ and $Q$:

$$\bar{Q}=\epsilon\mu I+(1-\epsilon\mu)Q\,,$$

where

$$\mu=\frac{1-\epsilon^{-1}\operatorname{Tr}(Q\rho)}{1-\operatorname{Tr}(Q\rho)}\,,\qquad\text{equivalently}\qquad \epsilon\mu=\frac{\epsilon-\operatorname{Tr}(Q\rho)}{1-\operatorname{Tr}(Q\rho)}\,.$$

This means

$$\epsilon\mu=\epsilon-(1-\epsilon\mu)\operatorname{Tr}(Q\rho)\,,$$

and thus

$$\operatorname{Tr}(\bar{Q}\rho)=\epsilon\mu+(1-\epsilon\mu)\operatorname{Tr}(Q\rho)=\epsilon\,.$$

Moreover,

$$\operatorname{Tr}(\bar{Q}\sigma)=\epsilon\mu+(1-\epsilon\mu)\operatorname{Tr}(Q\sigma)\,.$$

Using $\epsilon\mu=\epsilon-(1-\epsilon\mu)\operatorname{Tr}(Q\rho)$, our choice of $Q$ (for which $\operatorname{Tr}\bigl(Q(\rho-\sigma)\bigr)=\delta$), and finally the definition of $\mu$, we can re-write this as

$$\operatorname{Tr}(\bar{Q}\sigma)=\epsilon-(1-\epsilon\mu)\operatorname{Tr}(Q\rho)+(1-\epsilon\mu)\operatorname{Tr}(Q\sigma)
=\epsilon-\frac{(1-\epsilon)\delta}{1-\operatorname{Tr}(Q\rho)}
\;\le\;\epsilon-(1-\epsilon)\delta\,.$$

Hence

$$D^\epsilon(\rho\|\sigma)\ge\log\frac{\epsilon}{\epsilon-(1-\epsilon)\delta}\,.$$

Proof of inequality b)

To derive this Pinsker-like inequality, observe that

$$\log\frac{\epsilon}{\epsilon-(1-\epsilon)\delta}=-\log\left(1-\frac{(1-\epsilon)\delta}{\epsilon}\right)\ge\frac{(1-\epsilon)\delta}{\epsilon}\,,$$

where the last step uses $-\log_2(1-x)\ge x$ for $x\in[0,1)$; combining this with the lower bound in a) and $\delta=\|\rho-\sigma\|_1$ gives b).

Alternative proof of the Data Processing inequality

A fundamental property of von Neumann entropy is strong subadditivity. Let $S(\sigma)$ denote the von Neumann entropy of the quantum state $\sigma$, and let $\rho_{ABC}$ be a quantum state on the tensor product Hilbert space $\mathcal{H}_A\otimes\mathcal{H}_B\otimes\mathcal{H}_C$. Strong subadditivity states that

$$S(\rho_{ABC})+S(\rho_B)\le S(\rho_{AB})+S(\rho_{BC})\,,$$

where $\rho_{AB}$, $\rho_{BC}$, $\rho_B$ refer to the reduced density matrices on the spaces indicated by the subscripts. When re-written in terms of mutual information (made explicit below), this inequality has an intuitive interpretation; it states that the information content in a system cannot increase by the action of a local quantum operation on that system. In this form, it is better known as the data processing inequality, and is equivalent to the monotonicity of relative entropy under quantum operations:[3]

$$S(\rho\|\sigma)-S\bigl(\mathcal{E}(\rho)\,\big\|\,\mathcal{E}(\sigma)\bigr)\ge 0$$

for every CPTP map $\mathcal{E}$, where $S(\omega\|\tau)$ denotes the relative entropy of the quantum states $\omega$ and $\tau$.
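The mutual-information rewrite mentioned above can be made explicit. Writing $I(A:B)=S(\rho_A)+S(\rho_B)-S(\rho_{AB})$ for the quantum mutual information, strong subadditivity is equivalent to

$$I(A:BC)-I(A:B)=S(\rho_{AB})+S(\rho_{BC})-S(\rho_{ABC})-S(\rho_B)\;\ge\;0\,,$$

i.e., discarding the subsystem $C$ (a local operation on $BC$) cannot increase the correlations with $A$.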

It is readily seen that the $\epsilon$-relative entropy also obeys monotonicity under quantum operations:[4]

$$D^\epsilon(\rho\|\sigma)\ge D^\epsilon\bigl(\mathcal{E}(\rho)\,\big\|\,\mathcal{E}(\sigma)\bigr)\,,$$

for any CPTP map $\mathcal{E}$. To see this, suppose we have a POVM $\{R, I-R\}$ to distinguish between $\mathcal{E}(\rho)$ and $\mathcal{E}(\sigma)$ such that $\langle R,\mathcal{E}(\rho)\rangle=\langle\mathcal{E}^\dagger(R),\rho\rangle\ge\epsilon$. We construct a new POVM $\{\mathcal{E}^\dagger(R), I-\mathcal{E}^\dagger(R)\}$ to distinguish between $\rho$ and $\sigma$. Since the adjoint of any CPTP map is also positive and unital, this is a valid POVM. Note that $\langle R,\mathcal{E}(\sigma)\rangle=\langle\mathcal{E}^\dagger(R),\sigma\rangle\ge\langle Q,\sigma\rangle$, where $\{Q, I-Q\}$ is the POVM that achieves $D^\epsilon(\rho\|\sigma)$; hence no test for the output states can do better than the optimal test for the input states, which is precisely the monotonicity stated above. Not only is this interesting in itself, but it also gives us the following alternative method to prove the data processing inequality.[2]
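Before turning to that argument, the monotonicity statement itself can be checked numerically in the commuting case, where a channel that preserves the common eigenbasis acts on the eigenvalue vectors as a column-stochastic matrix. A minimal sketch, again assuming the d_eps helper from the Definition section:

import numpy as np

rho, sigma, eps = np.array([0.9, 0.1]), np.array([0.5, 0.5]), 0.8

# A classical (basis-preserving) channel: each column is a probability distribution.
E = np.array([[0.8, 0.3],
              [0.2, 0.7]])

# D^eps(E(rho)||E(sigma)) <= D^eps(rho||sigma)
assert d_eps(E @ rho, E @ sigma, eps) <= d_eps(rho, sigma, eps)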

By the quantum analogue of the Stein lemma,[5]

$$\lim_{n\rightarrow\infty}\frac{1}{n}D^\epsilon\bigl(\rho^{\otimes n}\,\big\|\,\sigma^{\otimes n}\bigr)
=\lim_{n\rightarrow\infty}\frac{-1}{n}\log\left(\frac{1}{\epsilon}\min\operatorname{Tr}(\sigma^{\otimes n}Q)\right)
=D(\rho\|\sigma)-\lim_{n\rightarrow\infty}\frac{1}{n}\log\frac{1}{\epsilon}
=D(\rho\|\sigma)\,,$$

where the minimum is taken over $0\le Q\le I$ such that $\operatorname{Tr}\bigl(Q\rho^{\otimes n}\bigr)\ge\epsilon$.

Applying the monotonicity of the $\epsilon$-relative entropy to the states $\rho^{\otimes n}$ and $\sigma^{\otimes n}$ with the CPTP map $\mathcal{E}^{\otimes n}$, we get

$$D^\epsilon\bigl(\rho^{\otimes n}\,\big\|\,\sigma^{\otimes n}\bigr)\ge D^\epsilon\bigl(\mathcal{E}(\rho)^{\otimes n}\,\big\|\,\mathcal{E}(\sigma)^{\otimes n}\bigr)\,.$$

Dividing both sides by $n$ and taking the limit as $n\rightarrow\infty$, we get the desired result, $D(\rho\|\sigma)\ge D\bigl(\mathcal{E}(\rho)\,\big\|\,\mathcal{E}(\sigma)\bigr)$.
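In the commuting case the Stein-lemma limit can also be observed directly: the eigenvalue vector of $\rho^{\otimes n}$ is the $n$-fold Kronecker product of the eigenvalue vector of $\rho$, and $\tfrac{1}{n}D^\epsilon(\rho^{\otimes n}\|\sigma^{\otimes n})$ approaches $D(\rho\|\sigma)=\sum_i\rho_i\log(\rho_i/\sigma_i)$. A rough numerical sketch, again assuming the d_eps helper from the Definition section (small $n$ only, since the vectors have $2^n$ entries):

import numpy as np
from functools import reduce

rho, sigma, eps = np.array([0.9, 0.1]), np.array([0.5, 0.5]), 0.8

def tensor_power(p, n):
    # Eigenvalues of the n-fold tensor power of a diagonal state.
    return reduce(np.kron, [p] * n)

D = float(np.sum(rho * np.log2(rho / sigma)))     # asymptotic relative entropy D(rho||sigma)
for n in (1, 4, 8, 12):
    rate = d_eps(tensor_power(rho, n), tensor_power(sigma, n), eps) / n
    print(n, round(rate, 3), round(D, 3))         # rate approaches D as n grows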

See also

Entropic value at risk
Quantum relative entropy
Strong subadditivity
Classical information theory
Min-entropy

References

  1. Watrous, J. Theory of Quantum Information, Fall 2013. Ch. 5, p. 194. https://cs.uwaterloo.ca/~watrous/CS766/DraftChapters/5.QuantumEntropy.pdf
  2. Dupuis, F.; Krämer, L.; Faist, P.; Renes, J. M.; Renner, R. (2013). "Generalized Entropies". XVIIth International Congress on Mathematical Physics. World Scientific. pp. 134–153. arXiv:1211.3141. doi:10.1142/9789814449243_0008. ISBN 978-981-4449-23-6. S2CID 118576547.
  3. Ruskai, Mary Beth (2002). "Inequalities for quantum entropy: A review with conditions for equality". Journal of Mathematical Physics. 43 (9): 4358–4375. arXiv:quant-ph/0205064. Bibcode:2002JMP....43.4358R. doi:10.1063/1.1497701. ISSN 0022-2488. S2CID 3051292.
  4. Wang, Ligong; Renner, Renato (15 May 2012). "One-Shot Classical-Quantum Capacity and Hypothesis Testing". Physical Review Letters. 108 (20): 200501. arXiv:1007.5456. Bibcode:2012PhRvL.108t0501W. doi:10.1103/PhysRevLett.108.200501. ISSN 0031-9007. PMID 23003132. S2CID 3190155.
  5. Petz, Dénes (2008). Quantum Information Theory and Quantum Statistics. Theoretical and Mathematical Physics. Ch. 8. Berlin, Heidelberg: Springer. Bibcode:2008qitq.book.....P. doi:10.1007/978-3-540-74636-2. ISBN 978-3-540-74634-8.
