
Contents

9.2   Feature selection                                            307
      9.2.1  Feature selection criteria                            308
      9.2.2  Search algorithms for feature selection               311
      9.2.3  Suboptimal search algorithms                          314
      9.2.4  Example application study                             317
      9.2.5  Further developments                                  317
      9.2.6  Summary                                               318
9.3   Linear feature extraction                                    318
      9.3.1  Principal components analysis                         319
      9.3.2  Karhunen–Loève transformation                         329
      9.3.3  Factor analysis                                       335
      9.3.4  Example application study                             342
      9.3.5  Further developments                                  343
      9.3.6  Summary                                               344
9.4   Multidimensional scaling                                     344
      9.4.1  Classical scaling                                     345
      9.4.2  Metric multidimensional scaling                       346
      9.4.3  Ordinal scaling                                       347
      9.4.4  Algorithms                                            350
      9.4.5  Multidimensional scaling for feature extraction       351
      9.4.6  Example application study                             352
      9.4.7  Further developments                                  353
      9.4.8  Summary                                               353
9.5   Application studies                                          354
9.6   Summary and discussion                                       355
9.7   Recommendations                                              355
9.8   Notes and references                                         356
      Exercises                                                    357

10    Clustering                                                   361
10.1  Introduction                                                 361
10.2  Hierarchical methods                                         362
      10.2.1 Single-link method                                    364
      10.2.2 Complete-link method                                  367
      10.2.3 Sum-of-squares method                                 368
      10.2.4 General agglomerative algorithm                       368
      10.2.5 Properties of a hierarchical classification           369
      10.2.6 Example application study                             370
      10.2.7 Summary                                               370
10.3  Quick partitions                                             371
10.4  Mixture models                                               372
      10.4.1 Model description                                     372
      10.4.2 Example application study                             374
10.5  Sum-of-squares methods                                       374
      10.5.1 Clustering criteria                                   375
      10.5.2 Clustering algorithms                                 376
      10.5.3 Vector quantisation                                   382
      10.5.4 Example application study                             394
      10.5.5 Further developments                                  395
      10.5.6 Summary                                               395
10.6  Cluster validity                                             396
      10.6.1 Introduction                                          396
      10.6.2 Distortion measures                                   397
      10.6.3 Choosing the number of clusters                       397
      10.6.4 Identifying genuine clusters                          399
10.7  Application studies                                          400
10.8  Summary and discussion                                       402
10.9  Recommendations                                              404
10.10 Notes and references                                         405
      Exercises                                                    406

11    Additional topics                                            409
11.1  Model selection                                              409
      11.1.1 Separate training and test sets                       410
      11.1.2 Cross-validation                                      410
      11.1.3 The Bayesian viewpoint                                411
      11.1.4 Akaike's information criterion                        411
11.2  Learning with unreliable classification                      412
11.3  Missing data                                                 413
11.4  Outlier detection and robust procedures                      414
11.5  Mixed continuous and discrete variables                      415
11.6  Structural risk minimisation and the
      Vapnik–Chervonenkis dimension                                416
      11.6.1 Bounds on the expected risk                           416
      11.6.2 The Vapnik–Chervonenkis dimension                     417

A     Measures of dissimilarity                                    419
A.1   Measures of dissimilarity                                    419
      A.1.1  Numeric variables                                     419
      A.1.2  Nominal and ordinal variables                         423
      A.1.3  Binary variables                                      423
      A.1.4  Summary                                               424
A.2   Distances between distributions                              425
      A.2.1  Methods based on prototype vectors                    425
      A.2.2  Methods based on probabilistic distance               425
      A.2.3  Probabilistic dependence                              428
A.3   Discussion                                                   429

B     Parameter estimation                                         431
B.1   Parameter estimation                                         431
      B.1.1  Properties of estimators                              431
      B.1.2  Maximum likelihood                                    433
      B.1.3  Problems with maximum likelihood                      434
      B.1.4  Bayesian estimates                                    434
