Action recognition is crucial for object manipulation in robotics. In recent years, Programming by Demonstration has been proposed as an approach in which a robot learns a task from human demonstration. In this work, a model-free approach to action recognition based on low-cost visual data is evaluated. In particular, the approach classifies actions by observing how object interactions change during actions, which are recorded as videos. Image processing techniques extract these object interaction changes from the image sequences, and each action is represented by an Event Table, a matrix encoding the changes. Action classification can then be achieved by measuring the similarity between Event Tables.
In this work, we simulate the noisy output of image segmentation and use it to evaluate existing methods, and to propose new ones, for comparing Event Tables. Two similarity measures are evaluated: the Substring Match (SSM) method and the Bhattacharyya Distance (B-Distance) method. The results show that the B-Distance method can classify actions at higher noise levels than the SSM method. In addition, a string kernel representation is proposed, which enables classification with a Support Vector Machine (SVM).
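To make the B-Distance comparison concrete, the following is a minimal sketch, not the paper's implementation: it assumes each Event Table has been flattened into a non-negative vector of event counts, normalises the vectors into discrete distributions, and classifies a query table by its nearest template under the Bhattacharyya distance. The function names and example tables are illustrative assumptions.

```python
import math

def bhattacharyya_distance(p, q):
    """Bhattacharyya distance D = -ln(sum_i sqrt(p_i * q_i)) between two
    discrete distributions, given as non-negative count vectors of equal
    length (e.g. flattened Event Tables); each is normalised internally."""
    sp, sq = sum(p), sum(q)
    # Bhattacharyya coefficient: overlap of the two normalised distributions
    bc = sum(math.sqrt((a / sp) * (b / sq)) for a, b in zip(p, q))
    return -math.log(bc)

def classify(query, templates):
    """Return the name of the template Event Table closest to the query.
    `templates` maps action names to flattened count vectors (illustrative)."""
    return min(templates, key=lambda name: bhattacharyya_distance(query, templates[name]))

# Hypothetical flattened Event Tables for two actions
templates = {"pick": [4, 1, 0, 1], "push": [0, 1, 4, 1]}
print(classify([3, 1, 0, 2], templates))  # closest to the "pick" template
```

Identical distributions yield a distance of zero, and the distance grows as the distributions overlap less, which is what makes the measure usable as a similarity score for noisy Event Tables.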