Contents

9.2   Feature selection                                        307
      9.2.1   Feature selection criteria                       308
      9.2.2   Search algorithms for feature selection          311
      9.2.3   Suboptimal search algorithms                     314
      9.2.4   Example application study                        317
      9.2.5   Further developments                             317
      9.2.6   Summary                                          318
9.3   Linear feature extraction                                318
      9.3.1   Principal components analysis                    319
      9.3.2   Karhunen-Loève transformation                    329
      9.3.3   Factor analysis                                  335
      9.3.4   Example application study                        342
      9.3.5   Further developments                             343
      9.3.6   Summary                                          344
9.4   Multidimensional scaling                                 344
      9.4.1   Classical scaling                                345
      9.4.2   Metric multidimensional scaling                  346
      9.4.3   Ordinal scaling                                  347
      9.4.4   Algorithms                                       350
      9.4.5   Multidimensional scaling for feature extraction  351
      9.4.6   Example application study                        352
      9.4.7   Further developments                             353
      9.4.8   Summary                                          353
9.5   Application studies                                      354
9.6   Summary and discussion                                   355
9.7   Recommendations                                          355
9.8   Notes and references                                     356
      Exercises                                                357

10  Clustering                                                 361
10.1  Introduction                                             361
10.2  Hierarchical methods                                     362
      10.2.1  Single-link method                               364
      10.2.2  Complete-link method                             367
      10.2.3  Sum-of-squares method                            368
      10.2.4  General agglomerative algorithm                  368
      10.2.5  Properties of a hierarchical classification      369
      10.2.6  Example application study                        370
      10.2.7  Summary                                          370
10.3  Quick partitions                                         371
10.4  Mixture models                                           372
      10.4.1  Model description                                372
      10.4.2  Example application study                        374
10.5  Sum-of-squares methods                                   374
      10.5.1  Clustering criteria                              375
      10.5.2  Clustering algorithms                            376
      10.5.3  Vector quantisation                              382
      10.5.4  Example application study                        394
      10.5.5  Further developments                             395
      10.5.6  Summary                                          395
10.6  Cluster validity                                         396
      10.6.1  Introduction                                     396
      10.6.2  Distortion measures                              397
      10.6.3  Choosing the number of clusters                  397
      10.6.4  Identifying genuine clusters                     399
10.7  Application studies                                      400
10.8  Summary and discussion                                   402
10.9  Recommendations                                          404
      Notes and references                                     405
      Exercises                                                406

11  Additional topics                                          409
11.1  Model selection                                          409
      11.1.1  Separate training and test sets                  410
      11.1.2  Cross-validation                                 410
      11.1.3  The Bayesian viewpoint                           411
      11.1.4  Akaike's information criterion                   411
11.2  Learning with unreliable classification                  412
11.3  Missing data                                             413
11.4  Outlier detection and robust procedures                  414
11.5  Mixed continuous and discrete variables                  415
11.6  Structural risk minimisation and the
      Vapnik-Chervonenkis dimension                            416
      11.6.1  Bounds on the expected risk                      416
      11.6.2  The Vapnik-Chervonenkis dimension                417

A  Measures of dissimilarity                                   419
A.1   Measures of dissimilarity                                419
      A.1.1   Numeric variables                                419
      A.1.2   Nominal and ordinal variables                    423
      A.1.3   Binary variables                                 423
      A.1.4   Summary                                          424
A.2   Distances between distributions                          425
      A.2.1   Methods based on prototype vectors               425
      A.2.2   Methods based on probabilistic distance          425
      A.2.3   Probabilistic dependence                         428
A.3   Discussion                                               429

B  Parameter estimation                                        431
B.1   Parameter estimation                                     431
      B.1.1   Properties of estimators                         431
      B.1.2   Maximum likelihood                               433
      B.1.3   Problems with maximum likelihood                 434
      B.1.4   Bayesian estimates                               434