Machine Learning in Computer Vision [Tntvillage.Scambioetico]
Uploaded by sunart
Title: Machine Learning in Computer Vision
Authors: N. Sebe, Ira Cohen, Ashutosh Garg, Thomas S. Huang
Publisher: Springer
Year: 2005
Language: English
File format: PDF
File size: 6.51 MB
======> Description <======
Foreword xi
Preface xiii
1. INTRODUCTION 1
1 Research Issues on Learning in Computer Vision 2
2 Overview of the Book 6
3 Contributions 12
2. THEORY: PROBABILISTIC CLASSIFIERS 15
1 Introduction 15
2 Preliminaries and Notations 18
2.1 Maximum Likelihood Classification 18
2.2 Information Theory 19
2.3 Inequalities 20
3 Bayes Optimal Error and Entropy 20
4 Analysis of Classification Error of Estimated (Mismatched) Distribution 27
4.1 Hypothesis Testing Framework 28
4.2 Classification Framework 30
5 Density of Distributions 31
5.1 Distributional Density 33
5.2 Relating to Classification Error 37
6 Complex Probabilistic Models and Small Sample Effects 40
7 Summary 41
3. THEORY: GENERALIZATION BOUNDS 45
1 Introduction 45
2 Preliminaries 47
3 A Margin Distribution Based Bound 49
3.1 Proving the Margin Distribution Bound 49
4 Analysis 57
4.1 Comparison with Existing Bounds 59
5 Summary 64
4. THEORY: SEMI-SUPERVISED LEARNING 65
1 Introduction 65
2 Properties of Classification 67
3 Existing Literature 68
4 Semi-supervised Learning Using Maximum Likelihood Estimation 70
5 Asymptotic Properties of Maximum Likelihood Estimation with Labeled and Unlabeled Data 73
5.1 Model Is Correct 76
5.2 Model Is Incorrect 77
5.3 Examples: Unlabeled Data Degrading Performance with Discrete and Continuous Variables 80
5.4 Generating Examples: Performance Degradation with Univariate Distributions 83
5.5 Distribution of Asymptotic Classification Error Bias 86
5.6 Short Summary 88
6 Learning with Finite Data 90
6.1 Experiments with Artificial Data 91
6.2 Can Unlabeled Data Help with Incorrect Models? Bias vs. Variance Effects and the Labeled-unlabeled Graphs 92
6.3 Detecting When Unlabeled Data Do Not Change the Estimates 97
6.4 Using Unlabeled Data to Detect Incorrect Modeling Assumptions 99
7 Concluding Remarks 100
5. ALGORITHM: MAXIMUM LIKELIHOOD MINIMUM ENTROPY HMM 103
1 Previous Work 103
2 Mutual Information, Bayes Optimal Error, Entropy, and Conditional Probability 105
3 Maximum Mutual Information HMMs 107
3.1 Discrete Maximum Mutual Information HMMs 108
3.2 Continuous Maximum Mutual Information HMMs 110
3.3 Unsupervised Case 111
4 Discussion 111
4.1 Convexity 111
4.2 Convergence 112
4.3 Maximum A-posteriori View of Maximum Mutual Information HMMs 112
5 Experimental Results 115
5.1 Synthetic Discrete Supervised Data 115
5.2 Speaker Detection 115
5.3 Protein Data 117
5.4 Real-time Emotion Data 117
6 Summary 117
6. ALGORITHM: MARGIN DISTRIBUTION OPTIMIZATION 119
1 Introduction 119
2 A Margin Distribution Based Bound 120
3 Existing Learning Algorithms 121
4 The Margin Distribution Optimization (MDO) Algorithm 125
4.1 Comparison with SVM and Boosting 126
4.2 Computational Issues 126
5 Experimental Evaluation 127
6 Conclusions 128
7. ALGORITHM: LEARNING THE STRUCTURE OF BAYESIAN NETWORK CLASSIFIERS 129
1 Introduction 129
2 Bayesian Network Classifiers 130
2.1 Naive Bayes Classifiers 132
2.2 Tree-Augmented Naive Bayes Classifiers 133
3 Switching between Models: Naive Bayes and TAN Classifiers 138
4 Learning the Structure of Bayesian Network Classifiers: Existing Approaches 140
4.1 Independence-based Methods 140
4.2 Likelihood and Bayesian Score-based Methods 142
5 Classification Driven Stochastic Structure Search 143
5.1 Stochastic Structure Search Algorithm 143
5.2 Adding VC Bound Factor to the Empirical Error Measure 145
6 Experiments 146
6.1 Results with Labeled Data 146
6.2 Results with Labeled and Unlabeled Data 147
7 Should Unlabeled Data Be Weighed Differently? 150
8 Active Learning 151
9 Concluding Remarks 153
8. APPLICATION: OFFICE ACTIVITY RECOGNITION 157
1 Context-Sensitive Systems 157
2 Towards Tractable and Robust Context Sensing 159
3 Layered Hidden Markov Models (LHMMs) 160
3.1 Approaches 161
3.2 Decomposition per Temporal Granularity 162
4 Implementation of SEER 164
4.1 Feature Extraction and Selection in SEER 164
4.2 Architecture of SEER 165
4.3 Learning in SEER 166
4.4 Classification in SEER 166
5 Experiments 166
5.1 Discussion 169
6 Related Representations 170
7 Summary 172
9. APPLICATION: MULTIMODAL EVENT DETECTION 175
1 Fusion Models: A Review 176
2 A Hierarchical Fusion Model 177
2.1 Working of the Model 178
2.2 The Duration Dependent Input Output Markov Model 179
3 Experimental Setup, Features, and Results 182
4 Summary 183
10. APPLICATION: FACIAL EXPRESSION RECOGNITION 187
1 Introduction 187
2 Human Emotion Research 189
2.1 Affective Human-computer Interaction 189
2.2 Theories of Emotion 190
2.3 Facial Expression Recognition Studies 192
3 Facial Expression Recognition System 197
3.1 Face Tracking and Feature Extraction 197
3.2 Bayesian Network Classifiers: Learning the “Structure” of the Facial Features 200
4 Experimental Analysis 201
4.1 Experimental Results with Labeled Data 204
4.1.1 Person-dependent Tests 205
4.1.2 Person-independent Tests 206
4.2 Experiments with Labeled and Unlabeled Data 207
5 Discussion 208
11. APPLICATION: BAYESIAN NETWORK CLASSIFIERS FOR FACE DETECTION 211
1 Introduction 211
2 Related Work 213
3 Applying Bayesian Network Classifiers to Face Detection 217
4 Experiments 218
5 Discussion 222
References 225
Index 237
======> Notes <======
Seeding hours: every afternoon except Saturday and Sunday
Bandwidth: 15 KB/s
Visit http://www.tntvillage.scambioetico.org/