Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

**Non-iterative heteroscedastic linear dimension reduction for two-class data; from Fisher to Chernoff.** / Loog, M; Duin, RPW.

Loog, M & Duin, RPW 2002, Non-iterative heteroscedastic linear dimension reduction for two-class data; from Fisher to Chernoff. in T Caelli, A Amin, RPW Duin, M Kamel & D de Ridder (eds), *Structural, Syntactic, and Statistical Pattern Recognition, Proceedings.* Lecture Notes in Computer Science, vol. 2396, Springer, Berlin, pp. 488-496, Joint IAPR International Workshops SSPR'02 and SPR'02 (Windsor, Canada), 6/08/02.

Loog, M., & Duin, RPW. (2002). Non-iterative heteroscedastic linear dimension reduction for two-class data; from Fisher to Chernoff. In T. Caelli, A. Amin, RPW. Duin, M. Kamel, & D. de Ridder (Eds.), *Structural, Syntactic, and Statistical Pattern Recognition, Proceedings* (pp. 488-496). (Lecture Notes in Computer Science; Vol. 2396). Springer.

Loog M, Duin RPW. Non-iterative heteroscedastic linear dimension reduction for two-class data; from Fisher to Chernoff. In Caelli T, Amin A, Duin RPW, Kamel M, de Ridder D, editors, Structural, Syntactic, and Statistical Pattern Recognition, Proceedings. Berlin: Springer. 2002. p. 488-496. (Lecture Notes in Computer Science; vol. 2396).

@inproceedings{c1c5d9cc7e9a4e1f978b941acaa7f0e5,

title = "Non-iterative heteroscedastic linear dimension reduction for two-class data; from Fisher to Chernoff",

abstract = "Linear discriminant analysis (LDA) is a traditional solution to the linear dimension reduction (LDR) problem, which is based on the maximization of the between-class scatter over the within-class scatter. This solution is incapable of dealing with heteroscedastic data in a proper way, because of the implicit assumption that the covariance matrices for all the classes are equal. Hence, discriminatory information in the difference between the covariance matrices is not used and, as a consequence, we can only reduce the data to a single dimension in the two-class case. We propose a fast non-iterative eigenvector-based LDR technique for heteroscedastic two-class data, which generalizes, and improves upon, LDA by dealing with the aforementioned problem. For this purpose, we use the concept of directed distance matrices, which generalizes the between-class covariance matrix such that it captures the differences in (co)variances.",

keywords = "conference contrib. refereed, ZX CWTS JFIS < 1.00",

author = "M Loog and RPW Duin",

note = "ISSN 0302-9743, phpub 29; Conference date: 06-08-2002 Through 09-08-2002",

year = "2002",

language = "English",

isbn = "3-540-44011-9",

publisher = "Springer",

address = "Berlin",

pages = "488--496",

editor = "T Caelli and A Amin and RPW Duin and M Kamel and {de Ridder}, D",

booktitle = "Structural, Syntactic, and Statistical Pattern Recognition, Proceedings",

series = "Lecture Notes in Computer Science",

volume = "2396",

}

TY - GEN

T1 - Non-iterative heteroscedastic linear dimension reduction for two-class data; from Fisher to Chernoff

AU - Loog, M

AU - Duin, RPW

N1 - ISSN 0302-9743, phpub 29

PY - 2002

Y1 - 2002

N2 - Linear discriminant analysis (LDA) is a traditional solution to the linear dimension reduction (LDR) problem, which is based on the maximization of the between-class scatter over the within-class scatter. This solution is incapable of dealing with heteroscedastic data in a proper way, because of the implicit assumption that the covariance matrices for all the classes are equal. Hence, discriminatory information in the difference between the covariance matrices is not used and, as a consequence, we can only reduce the data to a single dimension in the two-class case. We propose a fast non-iterative eigenvector-based LDR technique for heteroscedastic two-class data, which generalizes, and improves upon, LDA by dealing with the aforementioned problem. For this purpose, we use the concept of directed distance matrices, which generalizes the between-class covariance matrix such that it captures the differences in (co)variances.

AB - Linear discriminant analysis (LDA) is a traditional solution to the linear dimension reduction (LDR) problem, which is based on the maximization of the between-class scatter over the within-class scatter. This solution is incapable of dealing with heteroscedastic data in a proper way, because of the implicit assumption that the covariance matrices for all the classes are equal. Hence, discriminatory information in the difference between the covariance matrices is not used and, as a consequence, we can only reduce the data to a single dimension in the two-class case. We propose a fast non-iterative eigenvector-based LDR technique for heteroscedastic two-class data, which generalizes, and improves upon, LDA by dealing with the aforementioned problem. For this purpose, we use the concept of directed distance matrices, which generalizes the between-class covariance matrix such that it captures the differences in (co)variances.

KW - conference contrib. refereed

KW - ZX CWTS JFIS < 1.00

UR - http://link.springer.de/link/service/series/0558/bibs/2396/23960508.htm

M3 - Conference contribution

SN - 3-540-44011-9

SP - 488

EP - 496

BT - Structural, Syntactic, and Statistical Pattern Recognition, Proceedings

T3 - Lecture Notes in Computer Science

A2 - Caelli, T

A2 - Amin, A

A2 - Duin, RPW

A2 - Kamel, M

A2 - de Ridder, D

PB - Springer

CY - Berlin

Y2 - 6 August 2002 through 9 August 2002

ER -

ID: 3506765
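
The abstract describes a non-iterative, eigenvector-based LDR built on a directed distance matrix that augments the between-class mean difference with covariance-difference information. The sketch below is a minimal illustration of a Chernoff-style two-class reduction in that spirit, not the paper's exact estimator: the function name `chernoff_ldr`, the specific log-covariance form of the directed distance matrix, and all variable names are assumptions made for this example.

```python
# Illustrative sketch of a Chernoff-style heteroscedastic LDR for two classes.
# Assumption: in within-class-whitened coordinates, the directed distance
# matrix combines the outer product of the mean difference with a weighted
# log-covariance term that vanishes exactly when the class covariances agree
# (i.e. it reduces to Fisher's LDA direction in the homoscedastic case).
import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def chernoff_ldr(X1, X2, d=1):
    """Return a (d x n) projection matrix for two-class data X1, X2."""
    p1 = len(X1) / (len(X1) + len(X2))          # class priors from counts
    p2 = 1.0 - p1
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    Sw = p1 * S1 + p2 * S2                      # within-class scatter
    W = fractional_matrix_power(Sw, -0.5).real  # whitening transform
    dm = W @ (m1 - m2)                          # whitened mean difference
    S1w, S2w = W @ S1 @ W, W @ S2 @ W           # whitened class covariances
    # Directed distance matrix: mean term plus heteroscedastic log term.
    # Since p1*S1w + p2*S2w = I, log of the pooled matrix drops out.
    L = (p1 * logm(S1w) + p2 * logm(S2w)).real
    M = np.outer(dm, dm) - L / (p1 * p2)
    M = (M + M.T) / 2                           # guard numerical symmetry
    evals, evecs = np.linalg.eigh(M)
    V = evecs[:, np.argsort(evals)[::-1][:d]]   # d leading eigenvectors
    return V.T @ W                              # map back from whitened space

rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(200, 3))
X2 = rng.normal(1.0, 2.0, size=(200, 3))        # different scale: heteroscedastic
A = chernoff_ldr(X1, X2, d=2)
print(A.shape)  # prints (2, 3)
```

Note the contrast with plain LDA that the abstract emphasizes: for two classes, the LDA between-class scatter has rank one, so LDA can yield only a single discriminant direction, whereas the log-covariance term above can have full rank, allowing a useful reduction to more than one dimension.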