Will Bhattacharyya distance always increase when increasing the number of features used to describe two populations of images?


I have 3 classes of images of human cells. I have extracted 600+ features per image, and I can separate the classes quite well using several features selected with a random forest machine learning algorithm. When I measure the Bhattacharyya distance between each pair of classes, I notice that if I add more features the Bhattacharyya distance gets larger (as long as I don't add a duplicate feature). I will surely have new classes of cells in the future, so it is tempting to add in as many features as possible, both to increase the separation of the present classes and to enable separation of future unknown classes. Am I fooling myself in thinking that I will be better able to delineate one cell type from another by adding more features and getting a larger Bhattacharyya distance between them?

I don't think this issue is specific to the distance measure; it points more towards feature selection. In general, the more independent features you add, the more separation you get. Adding features does not decrease the within-group variance, though, and past the point of cleanly separating groups of similar objects you are creating an over-constrained problem.
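To see why adding independent features inflates the distance, here is a minimal sketch using the closed-form Bhattacharyya distance between two Gaussians with diagonal covariance (a common simplifying assumption; the feature means and variances below are made up for illustration, not taken from the question's data). Each independent feature contributes a non-negative term, so the total distance can only grow or stay flat as features are added:

```python
import math

def bhattacharyya_gaussian(mu1, mu2, var1, var2):
    """Bhattacharyya distance between two Gaussians with diagonal
    covariance: each feature adds a mean-separation term plus a
    variance-mismatch term, both >= 0."""
    d = 0.0
    for m1, m2, v1, v2 in zip(mu1, mu2, var1, var2):
        v = (v1 + v2) / 2.0
        d += (m1 - m2) ** 2 / (8.0 * v)          # mean-separation term
        d += 0.5 * math.log(v / math.sqrt(v1 * v2))  # variance-mismatch term
    return d

# Hypothetical class means over 4 features; unit variances for simplicity.
mu_a = [0.0, 0.2, 1.5, 0.05]
mu_b = [1.0, 0.3, 0.0, 0.05]
var  = [1.0, 1.0, 1.0, 1.0]

# Distance as features are added one at a time: monotonically non-decreasing,
# even for the last feature, which carries no class information at all.
dists = [bhattacharyya_gaussian(mu_a[:k], mu_b[:k], var[:k], var[:k])
         for k in range(1, 5)]
print(dists)
```

Note that the distance never goes down, even when the added feature is useless (identical means), which is exactly why a growing Bhattacharyya distance by itself is not evidence of a better feature set.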

In classification there is the possibility of creating an overly constrained feature set. The classifier may work on the training data but fail in the real world, because by adding more features (constraints) you have made the classifier too selective. Think about how humans determine if two things are different. Consider this image:

http://tx.english-ch.com/teacher/ackie/spot-the-difference-christmas-1.jpg

Using color alone, a classifier would find 2 differences (the star vs. the red ribbon, and the orange vs. red ribbon on the presents). If we further constrain our problem and use size, we find the red ball. Adding a shape constraint finds the red diamond vs. the crescent, and the 2 different snowflakes.

You can see that any one of these classifiers would have adequately separated the two images, and combined they draw a strong line between the two images, but the result isn't robust or flexible. What if we clone the image on the left and make the yellow ball red? With our highly constrained groups, we'd have to assume it belongs to a new group. If we classified using our shape classifier alone, we'd still have 2 groups: group 1 has bows on top of the trees and group 2 has stars. In that case, did we get what we wanted? It is application specific. Fewer features may mean the classifier isn't selective enough (large variance within a group, under-constrained), and as you add more features the classifier may become so selective that, in the worst case, it treats every example as its own class (over-constrained).

