# ECE471-571 Lecture 3 - Discriminant Function and Normal Density

ECE471-571 Pattern Recognition
Lecture 3: Discriminant Function and Normal Density

Hairong Qi, Gonzalez Family Professor
Electrical Engineering and Computer Science
University of Tennessee, Knoxville
http://www.eecs.utk.edu/faculty/qi
Email: [email protected]

## Pattern Classification Roadmap

**Statistical approach, supervised**
- Basic concepts: Bayesian decision rule (MPP, LR, discriminant functions)
- Parameter estimation (ML, BL)
- Non-parametric learning (kNN)
- LDF (Perceptron)
- NN (BP)
- Support Vector Machine
- Deep Learning (DL)

**Statistical approach, unsupervised**
- Basic concepts: distance
- Agglomerative method
- k-means
- Winner-takes-all
- Kohonen maps
- Mean-shift

**Non-statistical approach**
- Decision tree
- Syntactic approach

**Supporting topics**
- Dimensionality reduction: FLD, PCA
- Performance evaluation: ROC curve (TP, TN, FN, FP), cross validation
- Stochastic methods: local optimization (GD), global optimization (SA, GA)
- Classifier fusion: majority voting, NB, BKS

## Bayes Decision Rule

$$P(\omega_j \mid \mathbf{x}) = \frac{p(\mathbf{x} \mid \omega_j)\, P(\omega_j)}{p(\mathbf{x})}$$

**Maximum posterior probability (MPP):** for a given $\mathbf{x}$, if $P(\omega_1 \mid \mathbf{x}) > P(\omega_2 \mid \mathbf{x})$, then $\mathbf{x}$ belongs to class 1; otherwise, class 2.

**Likelihood ratio:** equivalently, if $\dfrac{p(\mathbf{x} \mid \omega_1)}{p(\mathbf{x} \mid \omega_2)} > \dfrac{P(\omega_2)}{P(\omega_1)}$, then $\mathbf{x}$ belongs to class 1; otherwise, class 2.

## Discriminant Function

One way to represent a pattern classifier is with discriminant functions $g_i(\mathbf{x})$. The classifier assigns a feature vector $\mathbf{x}$ to class $\omega_i$ if

$$g_i(\mathbf{x}) > g_j(\mathbf{x}) \quad \text{for all } j \neq i.$$

For the two-class case, a single discriminant suffices:

$$g(\mathbf{x}) = g_1(\mathbf{x}) - g_2(\mathbf{x}) = P(\omega_1 \mid \mathbf{x}) - P(\omega_2 \mid \mathbf{x})$$
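The MPP and likelihood-ratio rules above always agree, since one is an algebraic rearrangement of the other. A minimal sketch, assuming 1-D Gaussian class-conditional densities with made-up means, variances, and priors (none of these values come from the lecture):

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Univariate normal density p(x | class) with mean mu, std sigma."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)

def mpp_decide(x, priors, params):
    """MPP rule: pick argmax_j p(x|w_j) P(w_j).

    The evidence p(x) is identical for every class, so it can be dropped.
    """
    scores = [gaussian_pdf(x, mu, s) * P for (mu, s), P in zip(params, priors)]
    return int(np.argmax(scores)) + 1  # class label 1 or 2

def lr_decide(x, priors, params):
    """Likelihood-ratio form: class 1 iff p(x|w1)/p(x|w2) > P(w2)/P(w1)."""
    lr = gaussian_pdf(x, *params[0]) / gaussian_pdf(x, *params[1])
    return 1 if lr > priors[1] / priors[0] else 2

priors = [0.5, 0.5]                   # equal priors (illustrative)
params = [(0.0, 1.0), (3.0, 1.0)]     # (mean, std) for classes 1 and 2
assert mpp_decide(1.0, priors, params) == lr_decide(1.0, priors, params)
```

Both rules return the same label for any input, which is a quick numerical check of the equivalence claimed above.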

## Multivariate Normal Density

$$p(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}\, |\boldsymbol{\Sigma}|^{1/2}} \exp\left[-\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu})\right]$$

where
- $\mathbf{x}$: $d$-component column vector
- $\boldsymbol{\mu}$: $d$-component mean vector
- $\boldsymbol{\Sigma}$: $d \times d$ covariance matrix
- $|\boldsymbol{\Sigma}|$: its determinant
- $\boldsymbol{\Sigma}^{-1}$: its inverse

$$\mathbf{x} = \begin{bmatrix} x_1 \\ \vdots \\ x_d \end{bmatrix}, \quad
\boldsymbol{\mu} = \begin{bmatrix} \mu_1 \\ \vdots \\ \mu_d \end{bmatrix}, \quad
\boldsymbol{\Sigma} = \begin{bmatrix} \sigma_{11} & \cdots & \sigma_{1d} \\ \vdots & \ddots & \vdots \\ \sigma_{d1} & \cdots & \sigma_{dd} \end{bmatrix}$$

When $d = 1$, this reduces to the familiar univariate normal density:

$$p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[-\frac{1}{2}\left(\frac{x - \mu}{\sigma}\right)^2\right]$$
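The density formula above translates almost line-for-line into code. A minimal sketch; the mean and covariance used at the end are illustrative (a 2-D standard normal), not values from the lecture:

```python
import numpy as np

def mvn_pdf(x, mu, Sigma):
    """Multivariate normal density, evaluated directly from the formula."""
    d = len(mu)
    diff = x - mu
    Sigma_inv = np.linalg.inv(Sigma)          # Sigma^{-1}
    det = np.linalg.det(Sigma)                # |Sigma|
    norm_const = 1.0 / ((2 * np.pi) ** (d / 2) * np.sqrt(det))
    return norm_const * np.exp(-0.5 * diff @ Sigma_inv @ diff)

mu = np.array([0.0, 0.0])
Sigma = np.eye(2)
# At the mean, the exponent is 0, so a 2-D standard normal gives 1/(2*pi).
print(mvn_pdf(mu, mu, Sigma))  # 1/(2*pi) ≈ 0.1592
```

With $d = 1$ and a $1 \times 1$ covariance matrix $[\sigma^2]$, the same function reproduces the univariate case.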

## Discriminant Function for Normal Density

For a normally distributed class, $p(\mathbf{x} \mid \omega_i) = \dfrac{1}{(2\pi)^{d/2}\, |\boldsymbol{\Sigma}_i|^{1/2}} \exp\left[-\dfrac{1}{2}(\mathbf{x} - \boldsymbol{\mu}_i)^T \boldsymbol{\Sigma}_i^{-1} (\mathbf{x} - \boldsymbol{\mu}_i)\right]$, so taking logarithms gives

$$\begin{aligned}
g_i(\mathbf{x}) &= \ln p(\mathbf{x} \mid \omega_i) + \ln P(\omega_i) \\
&= -\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu}_i)^T \boldsymbol{\Sigma}_i^{-1} (\mathbf{x} - \boldsymbol{\mu}_i) - \frac{d}{2}\ln(2\pi) - \frac{1}{2}\ln|\boldsymbol{\Sigma}_i| + \ln P(\omega_i) \\
&= -\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu}_i)^T \boldsymbol{\Sigma}_i^{-1} (\mathbf{x} - \boldsymbol{\mu}_i) - \frac{1}{2}\ln|\boldsymbol{\Sigma}_i| + \ln P(\omega_i)
\end{aligned}$$

where the term $\frac{d}{2}\ln(2\pi)$ is dropped in the last line because it is the same for every class.

## Case 1: $\boldsymbol{\Sigma}_i = \sigma^2 \mathbf{I}$

The features are statistically independent and have the same variance. Geometrically, the samples fall in equal-size hyperspherical clusters.
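Before specializing to the cases below, the general discriminant $g_i(\mathbf{x})$ can be evaluated directly. A minimal sketch of a two-class decision; all means, covariances, and priors here are made up for illustration:

```python
import numpy as np

def g(x, mu, Sigma, prior):
    """General Gaussian discriminant g_i(x), with the class-independent
    (d/2) ln(2*pi) term already dropped."""
    diff = x - mu
    return (-0.5 * diff @ np.linalg.inv(Sigma) @ diff
            - 0.5 * np.log(np.linalg.det(Sigma))
            + np.log(prior))

x = np.array([1.0, 1.0])
g1 = g(x, np.array([0.0, 0.0]), np.eye(2), 0.5)  # class 1: mean at the origin
g2 = g(x, np.array([4.0, 4.0]), np.eye(2), 0.5)  # class 2: mean at (4, 4)
label = 1 if g1 > g2 else 2  # x lies closer to class 1's mean
print(label)
```

With equal priors and equal covariances, the comparison reduces to which mean is nearer, which previews the minimum-distance classifier below.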

Decision boundary: a hyperplane of $d - 1$ dimensions.

$$\boldsymbol{\Sigma}_i = \begin{bmatrix} \sigma^2 & & 0 \\ & \ddots & \\ 0 & & \sigma^2 \end{bmatrix}, \quad
|\boldsymbol{\Sigma}_i| = \sigma^{2d}, \quad
\boldsymbol{\Sigma}_i^{-1} = \frac{1}{\sigma^2}\mathbf{I}$$

## Linear Discriminant Function and Linear Machine

$$g_i(\mathbf{x}) = -\frac{\|\mathbf{x} - \boldsymbol{\mu}_i\|^2}{2\sigma^2} + \ln P(\omega_i),
\quad \text{where } \|\mathbf{x} - \boldsymbol{\mu}_i\|^2 = (\mathbf{x} - \boldsymbol{\mu}_i)^T(\mathbf{x} - \boldsymbol{\mu}_i)$$

is the squared Euclidean norm (distance). Expanding the quadratic,

$$g_i(\mathbf{x}) = -\frac{\mathbf{x}^T\mathbf{x} - 2\boldsymbol{\mu}_i^T\mathbf{x} + \boldsymbol{\mu}_i^T\boldsymbol{\mu}_i}{2\sigma^2} + \ln P(\omega_i)$$

Since $\mathbf{x}^T\mathbf{x}$ is the same for every class, it can be dropped, leaving a discriminant that is linear in $\mathbf{x}$:

$$g_i(\mathbf{x}) = \frac{\boldsymbol{\mu}_i^T}{\sigma^2}\mathbf{x} - \frac{\boldsymbol{\mu}_i^T\boldsymbol{\mu}_i}{2\sigma^2} + \ln P(\omega_i)$$
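The algebra above can be checked numerically: the linear form and the distance form differ only by the dropped $\mathbf{x}^T\mathbf{x}/(2\sigma^2)$ term, so they rank the classes identically. A small sketch with made-up means, equal priors, and $\sigma^2 = 1$:

```python
import numpy as np

sigma2 = 1.0  # common variance (illustrative)

def g_linear(x, mu, prior, sigma2):
    """Linear form: g_i(x) = (mu_i/sigma^2)^T x - mu_i^T mu_i/(2 sigma^2) + ln P(w_i)."""
    w = mu / sigma2
    w0 = -mu @ mu / (2 * sigma2) + np.log(prior)
    return w @ x + w0

def g_distance(x, mu, prior, sigma2):
    """Distance form, before dropping the class-independent x^T x term."""
    return -((x - mu) @ (x - mu)) / (2 * sigma2) + np.log(prior)

x = np.array([1.0, 1.0])
mus = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
lin = [g_linear(x, m, 0.5, sigma2) for m in mus]
dist = [g_distance(x, m, 0.5, sigma2) for m in mus]

# Same winner under both forms; the scores differ by exactly x^T x / (2 sigma^2).
assert np.argmax(lin) == np.argmax(dist)
```

The constant offset between the two score lists is the dropped term, which confirms that discarding it cannot change the decision.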

## Minimum-Distance Classifier

When the priors $P(\omega_i)$ are the same for all $c$ classes, the discriminant function effectively measures the distance from $\mathbf{x}$ to each of the $c$ mean vectors, assigning $\mathbf{x}$ to the class with the nearest mean:

$$g_i(\mathbf{x}) = -\frac{\|\mathbf{x} - \boldsymbol{\mu}_i\|^2}{2\sigma^2}$$

## Case 2: $\boldsymbol{\Sigma}_i = \boldsymbol{\Sigma}$

The covariance matrices for all the classes are identical but not a scalar multiple of the identity matrix. Geometrically, the samples fall in hyperellipsoidal clusters. Decision boundary: a hyperplane of $d - 1$ dimensions.

$$g_i(\mathbf{x}) = \ln p(\mathbf{x} \mid \omega_i) + \ln P(\omega_i) = -\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu}_i)^T \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu}_i) + \ln P(\omega_i)$$

The quadratic form $(\mathbf{x} - \boldsymbol{\mu}_i)^T \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu}_i)$ is the squared Mahalanobis distance. Expanding it and again dropping the class-independent term $\mathbf{x}^T \boldsymbol{\Sigma}^{-1} \mathbf{x}$ yields a linear discriminant:

$$g_i(\mathbf{x}) = \boldsymbol{\mu}_i^T \boldsymbol{\Sigma}^{-1} \mathbf{x} - \frac{1}{2}\boldsymbol{\mu}_i^T \boldsymbol{\Sigma}^{-1} \boldsymbol{\mu}_i + \ln P(\omega_i)$$

## Case 3: $\boldsymbol{\Sigma}_i$ arbitrary

The covariance matrices differ from category to category, giving a quadratic classifier. Decision boundary: hyperquadratic for 2-D Gaussians.

$$\begin{aligned}
g_i(\mathbf{x}) &= \ln p(\mathbf{x} \mid \omega_i) + \ln P(\omega_i) \\
&= -\frac{1}{2}(\mathbf{x} - \boldsymbol{\mu}_i)^T \boldsymbol{\Sigma}_i^{-1} (\mathbf{x} - \boldsymbol{\mu}_i) - \frac{1}{2}\ln|\boldsymbol{\Sigma}_i| + \ln P(\omega_i) \\
&= -\frac{1}{2}\mathbf{x}^T \boldsymbol{\Sigma}_i^{-1} \mathbf{x} + \boldsymbol{\mu}_i^T \boldsymbol{\Sigma}_i^{-1} \mathbf{x} - \frac{1}{2}\boldsymbol{\mu}_i^T \boldsymbol{\Sigma}_i^{-1} \boldsymbol{\mu}_i - \frac{1}{2}\ln|\boldsymbol{\Sigma}_i| + \ln P(\omega_i)
\end{aligned}$$

## Case Study

Training samples for classes a, b, and c (two features each), and unknown samples u1-u6 with their assigned labels ("o" marks samples assigned to neither class):

| Sample | x1 | x2 | Class | Sample | x1 | x2 | Class | Sample | x1 | x2 | Class | Sample | x1 | x2 | Label |
|--------|------|-------|-------|--------|------|------|-------|--------|------|------|-------|--------|------|-----|-------|
| a1 | 3.1 | 11.70 | a | b1 | 8.3 | 1.00 | b | c1 | 10.2 | 6.40 | c | u1 | 5.1 | 0.4 | b |
| a2 | 3.0 | 1.30 | a | b2 | 3.8 | 0.20 | b | c2 | 9.2 | 7.90 | c | u2 | 12.9 | 5.0 | c |
| a3 | 1.9 | 0.10 | a | b3 | 3.9 | 0.60 | b | c3 | 9.6 | 3.10 | c | u3 | 13.0 | 0.8 | b |
| a4 | 3.8 | 0.04 | a | b4 | 7.8 | 1.20 | b | c4 | 53.8 | 2.50 | c | u4 | 2.6 | 0.1 | a |
| a5 | 4.1 | 1.10 | a | b5 | 9.1 | 0.60 | b | c5 | 15.8 | 7.60 | c | u5 | 30.0 | 0.1 | o |
| a6 | 1.9 | 0.40 | a | b6 | 15.4 | 3.60 | b | | | | | u6 | 20.5 | 0.8 | o |
| | | | | b7 | 7.7 | 1.60 | b | | | | | | | | |
| | | | | b8 | 6.5 | 0.40 | b | | | | | | | | |
| | | | | b9 | 5.7 | 0.40 | b | | | | | | | | |
| | | | | b10 | 13.6 | 1.60 | b | | | | | | | | |

Tasks:
1. Calculate $\boldsymbol{\mu}$ for each class.
2. Calculate $\boldsymbol{\Sigma}$ for each class.
3. Derive the discriminant function $g_i(\mathbf{x})$.
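The case-study tasks can be sketched end to end: estimate a sample mean and sample covariance per class, then score the unknowns with the Case 3 (arbitrary $\boldsymbol{\Sigma}_i$) discriminant. This sketch assumes equal priors; note that u5 and u6 are labeled "o" in the table, i.e. outliers that a plain three-way classifier like this cannot produce:

```python
import numpy as np

# Training samples from the case-study table, grouped by class.
data = {
    "a": [(3.1, 11.70), (3.0, 1.30), (1.9, 0.10), (3.8, 0.04), (4.1, 1.10), (1.9, 0.40)],
    "b": [(8.3, 1.00), (3.8, 0.20), (3.9, 0.60), (7.8, 1.20), (9.1, 0.60),
          (15.4, 3.60), (7.7, 1.60), (6.5, 0.40), (5.7, 0.40), (13.6, 1.60)],
    "c": [(10.2, 6.40), (9.2, 7.90), (9.6, 3.10), (53.8, 2.50), (15.8, 7.60)],
}
unknowns = {"u1": (5.1, 0.4), "u2": (12.9, 5.0), "u3": (13.0, 0.8),
            "u4": (2.6, 0.1), "u5": (30.0, 0.1), "u6": (20.5, 0.8)}

# Tasks 1 and 2: sample mean and sample covariance per class.
stats = {}
for label, pts in data.items():
    X = np.array(pts)
    stats[label] = (X.mean(axis=0), np.cov(X.T))

# Task 3: Case 3 discriminant (quadratic in x).
def g(x, mu, Sigma, prior):
    diff = x - mu
    return (-0.5 * diff @ np.linalg.inv(Sigma) @ diff
            - 0.5 * np.log(np.linalg.det(Sigma)) + np.log(prior))

# Classify each unknown, assuming equal priors P(w_i) = 1/3.
predictions = {}
for name, x in unknowns.items():
    x = np.array(x)
    scores = {lab: g(x, mu, S, 1 / 3) for lab, (mu, S) in stats.items()}
    predictions[name] = max(scores, key=scores.get)
print(predictions)
```

Because every unknown is forced into one of a, b, or c, handling the "o" cases would require an extra rejection rule (for example, a threshold on the best score), which is beyond this sketch.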
