By harnessing human intelligence, crowdsourcing can solve problems that are difficult for computers. A fundamental problem in crowdsourcing is truth inference, i.e., how to infer each task's true answer from the noisy answers submitted by workers. We propose MFICrowd, a novel truth inference framework that takes multiple factors into account to profile workers accurately and improve answer accuracy. Based on the diversity of task domains and the semantic similarity of candidate answers, we quantify task difficulty so that tasks and workers can be modeled objectively and precisely. By integrating task domains, task difficulty, and answer similarity into truth inference, MFICrowd effectively aggregates the answers of a group of workers. Comprehensive experimental results on both simulated and real datasets show that our multi-factor truth inference framework is effective and outperforms existing state-of-the-art approaches in both answer accuracy and time efficiency.
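As one illustrative reading of the two difficulty signals named above (not the paper's actual formulation), task difficulty could be scored by combining the entropy of a task's domain distribution with the average pairwise semantic similarity of its candidate answers. The sketch below is a minimal, hypothetical Python rendering of that idea; the function names, the convex-combination weighting, and the choice of normalized entropy are assumptions, not MFICrowd's published model.

```python
import math
from itertools import combinations

def domain_diversity(domain_weights):
    """Normalized Shannon entropy of a task's domain distribution.

    domain_weights: dict mapping domain name -> weight (e.g. relevance share).
    Returns a value in [0, 1]; higher means the task spans domains more evenly.
    """
    total = sum(domain_weights.values())
    probs = [w / total for w in domain_weights.values() if w > 0]
    if len(probs) <= 1:
        return 0.0
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(len(probs))

def answer_similarity(candidates, sim):
    """Average pairwise semantic similarity of candidate answers.

    sim(a, b) is any similarity in [0, 1], e.g. cosine similarity of
    answer embeddings. Higher similarity means the candidates are
    harder to tell apart, so the task is harder.
    """
    pairs = list(combinations(candidates, 2))
    if not pairs:
        return 0.0
    return sum(sim(a, b) for a, b in pairs) / len(pairs)

def task_difficulty(domain_weights, candidates, sim, alpha=0.5):
    """Hypothetical difficulty score: convex combination of both signals."""
    return (alpha * domain_diversity(domain_weights)
            + (1 - alpha) * answer_similarity(candidates, sim))
```

Under this reading, a score of this kind could be used to discount answers on hard tasks when estimating worker reliability, so that a wrong answer on a difficult task penalizes a worker less than a wrong answer on an easy one.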