Motivation
Indian classical dances are a combination of abhinaya (facial expressions), mudra (hand gestures), and nritta (rhythmic dance movements).
Every dancer has a unique style, and the subtle differences often escape the human eye. Motion capture makes it possible to recreate complex movements
easily, so it should be possible to capture those elusive traits and identify a dancer from their dance moves.
Method
This is a classification problem and will require a supervised learning algorithm, such as support vector machines (SVM). It is not a binary classification problem,
since the goal is not simply to determine whether the captured activity involves dance movements or not. In this particular problem, each
subject needs to be treated as a separate class. Suppose we have 5 dancers (subjects); each will be assigned a class label from the set C = {0, 1, 2, 3, 4}. The training data will be grouped using these
class labels, and the learning algorithm will then classify the test data based on the previously labeled training items. Since the procedure involves
multiple classes, a multiclass SVM training algorithm is better suited for building a model to group the test data. If the data is too
complex for a hyperplane to separate, the kernel trick may have to be incorporated as well.
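To make the classification step concrete, below is a minimal sketch of the multiclass SVM training and evaluation using scikit-learn. The feature vectors and sample counts here are placeholders (random stand-in data, one cluster per dancer); the real features would be extracted from the VICON recordings, and the kernel and regularization settings would be tuned on the actual data.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects = 5                # class labels C = {0, 1, 2, 3, 4}
n_segments_per_subject = 40   # placeholder: motion segments per dancer
n_features = 60               # placeholder: length of a joint-angle feature vector

# Stand-in data: one Gaussian cluster per dancer.
# Replace with feature vectors computed from the mocap recordings.
X = np.vstack([
    rng.normal(loc=c, scale=1.0, size=(n_segments_per_subject, n_features))
    for c in range(n_subjects)
])
y = np.repeat(np.arange(n_subjects), n_segments_per_subject)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# SVC handles multiple classes internally (one-vs-one); the RBF kernel
# covers the case where a single hyperplane cannot separate the classes.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))

The same pipeline would be reused once the real dataset is collected, with the stand-in arrays replaced by features from the training and testing capture sessions.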
Datasets
I plan to capture the dance movements of 4 or 5 subjects using the VICON motion capture camera setup. Two sets of data will be obtained, one of
which will be used for training and the other for testing. The training data will be labeled by subject. The CMU motion capture database
contains sample files capturing dance moves from ballet, lambada, lindy hop, and so on, but I wish to work with datasets of Indian classical dance moves and hence
will do my own data collection.
Timeline
Motion capture can be tricky; hence, by the milestone due date of May 11th, I hope to have obtained a good, clean dataset and to have identified the features for
classifying the data.