With the increasing demand for automatic security systems capable of recognizing people from a distance and with as little cooperation as possible, gait recognition has emerged as a popular behavioral biometric because it is remotely observable and unobtrusive. However, the complexity and high variability of gait patterns limit the power of gait recognition algorithms and adversely affect their recognition rates. Aiming to improve the performance of gait recognition systems without sacrificing the main advantages of gait, in this thesis I introduce a novel multimodal gait recognition system that combines the gait patterns of subjects with context data describing their behavioral and social patterns. To the best of my knowledge, this is one of the first examples in which the social patterns of subjects are used as a source of information in a multimodal biometric system. The thesis introduces a well-defined framework for defining, modeling, learning, storing, and matching context data in a gait recognition system. The proposed behavioral modeling and matching framework is flexible and can easily be adapted to other applications and multimodal biometric systems. The conducted experiments show that the proposed system achieves significant performance improvements at a very low computational cost, and a comparison with existing methods in the same area demonstrates that the approach is both applicable and effective.