Tschuprow's T

In statistics, Tschuprow's T is a measure of association between two nominal variables, giving a value between 0 and 1 (inclusive). It is closely related to Cramér's V, coinciding with it for square contingency tables. It was published by Alexander Tschuprow (alternative spelling: Chuprov) in 1939.[1]

Definition

For an r × c contingency table with r rows and c columns, let $\pi_{ij}$ be the proportion of the population in cell $(i,j)$ and let

$\pi_{i+} = \sum_{j=1}^{c} \pi_{ij}$ and $\pi_{+j} = \sum_{i=1}^{r} \pi_{ij}.$

Then the mean square contingency is given as

$\phi^2 = \sum_{i=1}^{r} \sum_{j=1}^{c} \frac{(\pi_{ij} - \pi_{i+}\pi_{+j})^2}{\pi_{i+}\pi_{+j}},$

and Tschuprow's T as

$T = \sqrt{\frac{\phi^2}{\sqrt{(r-1)(c-1)}}}.$
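As a minimal sketch of this definition (our own illustration, assuming NumPy is available; the function name tschuprows_t is hypothetical), $\phi^2$ and T can be computed from a table of joint probabilities as follows.

    import numpy as np

    def tschuprows_t(pi):
        """Tschuprow's T for an r x c table of joint probabilities pi_ij."""
        pi = np.asarray(pi, dtype=float)
        r, c = pi.shape
        row = pi.sum(axis=1, keepdims=True)   # marginals pi_{i+}
        col = pi.sum(axis=0, keepdims=True)   # marginals pi_{+j}
        expected = row * col                  # products pi_{i+} * pi_{+j}
        phi2 = ((pi - expected) ** 2 / expected).sum()  # mean square contingency
        return np.sqrt(phi2 / np.sqrt((r - 1) * (c - 1)))

    # Under independence every pi_ij equals pi_{i+} * pi_{+j}, so T = 0.
    print(tschuprows_t([[0.25, 0.25], [0.25, 0.25]]))  # 0.0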

Properties

T equals zero if and only if independence holds in the table, i.e., if and only if $\pi_{ij} = \pi_{i+}\pi_{+j}$ for all i and j. T equals one if and only if there is perfect dependence in the table, i.e., if and only if for each i there is only one j such that $\pi_{ij} > 0$ and vice versa. Hence, it can only equal 1 for square tables. In this it differs from Cramér's V, which can be equal to 1 for any rectangular table.
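As a quick check of the perfect-dependence property (an illustration, not from the source), take the 2 × 2 table with $\pi_{11} = \pi_{22} = 1/2$ and $\pi_{12} = \pi_{21} = 0$, so every row and column margin equals $1/2$ and every product $\pi_{i+}\pi_{+j}$ equals $1/4$. Each of the four cells then contributes $(1/4)^2 / (1/4) = 1/4$ to the mean square contingency, hence

$\phi^2 = 4 \cdot \tfrac{1}{4} = 1, \qquad T = \sqrt{\frac{1}{\sqrt{(2-1)(2-1)}}} = 1,$

as expected for perfect dependence in a square table.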

Estimation

If we have a multinomial sample of size n, the usual way to estimate T from the data is via the formula

$\hat{T} = \sqrt{\frac{\sum_{i=1}^{r} \sum_{j=1}^{c} \frac{(p_{ij} - p_{i+}p_{+j})^2}{p_{i+}p_{+j}}}{\sqrt{(r-1)(c-1)}}},$

where $p_{ij} = n_{ij}/n$ is the proportion of the sample in cell $(i,j)$, and $p_{i+}$ and $p_{+j}$ are the corresponding sample marginal proportions. This is the empirical value of T. With $\chi^2$ the Pearson chi-square statistic, this formula can also be written as

$\hat{T} = \sqrt{\frac{\chi^2 / n}{\sqrt{(r-1)(c-1)}}}.$
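As a minimal sketch (our own illustration, assuming NumPy and SciPy are available; the helper name tschuprows_t_hat is hypothetical), the empirical T can be computed from a table of observed counts via the second formula; correction=False disables Yates' continuity correction in scipy.stats.chi2_contingency so that the statistic is the plain Pearson chi-square used above.

    import numpy as np
    from scipy.stats import chi2_contingency

    def tschuprows_t_hat(counts):
        """Empirical Tschuprow's T from an r x c table of observed counts."""
        counts = np.asarray(counts, dtype=float)
        r, c = counts.shape
        n = counts.sum()
        # Pearson chi-square statistic (no continuity correction)
        chi2 = chi2_contingency(counts, correction=False)[0]
        return np.sqrt((chi2 / n) / np.sqrt((r - 1) * (c - 1)))

    observed = [[30, 10], [10, 30]]
    print(tschuprows_t_hat(observed))  # 0.5, since chi2 = 20 and n = 80 here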

See also

Other measures of correlation for nominal data:

  • Cramér's V
  • Phi coefficient
  • Uncertainty coefficient
  • Lambda coefficient

Other related articles:

  • Effect size

References

  1. ^ Tschuprow, A. A. (1939). Principles of the Mathematical Theory of Correlation; translated by M. Kantorowitsch. W. Hodge & Co.
  • Liebetrau, A. (1983). Measures of Association (Quantitative Applications in the Social Sciences). Sage Publications.
