There are many challenges to overcome in achieving this task and making machines better learners. One frequently mentioned problem is the scarcity of labeled data. Machine learning systems are built up from several trainable modules, yet little attention has been paid to how these modules should be assembled to address a particular task.
Reasoning can be defined in terms of two components: an algebra and composition rules. Every model comes with a set of algebraic operations that can act on it, and the composition rules establish a homomorphic correspondence between the space of models and the space of questions of interest. The set of algebraic operations defined on the representation space is more important than the representation itself: algebraically manipulating previously acquired knowledge in order to answer a new question stands out as the essential task, and the algebraic structures are more significant than the objects on which they operate. In practice, composition rules can be described at very different levels of sophistication.
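The idea above can be sketched in code. This is a minimal illustration, not anything from the paper: the names `Linear` and `compose` are my own, and each trainable module is reduced to a scalar linear map so that composition, the algebraic operation, is easy to see.

```python
class Linear:
    """A stand-in for a trainable module: a scalar linear map x -> w * x.
    (Illustrative only; real modules would be learned networks.)"""
    def __init__(self, w):
        self.w = w

    def __call__(self, x):
        return self.w * x

def compose(f, g):
    """Composition rule: composing two modules yields another module-like
    function, so the space of models is closed under this operation."""
    return lambda x: g(f(x))

# Reassemble previously acquired modules to address a new question.
features = Linear(2.0)   # e.g. a module trained on an auxiliary task
head = Linear(0.5)       # a task-specific predictor
model = compose(features, head)
print(model(3.0))        # (3.0 * 2.0) * 0.5 = 3.0
```

The point of the sketch is that what matters is the closure property of `compose`, not the particular modules being composed.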
Algebraic structures can also be constructed for probabilistic models. Doing so requires discussing how the probability distributions are parameterized. The probabilistic inference rules then induce an algebraic structure on the space of conditional probability distribution models describing relations between arbitrary subsets of random variables. Probabilistic relational models can also be used in this way.
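As a hedged illustration of inference rules acting as algebraic operations, the sketch below treats discrete distributions as dictionaries and implements marginalization and conditioning; the variable names and probability values are made up for the example, not taken from the paper.

```python
def marginalize(joint, axis):
    """Sum out one variable of a joint P(x, y) given as {(x, y): p}."""
    out = {}
    for (x, y), p in joint.items():
        key = y if axis == 0 else x
        out[key] = out.get(key, 0.0) + p
    return out

def condition(joint, y):
    """Conditioning: P(x | y) = P(x, y) / P(y)."""
    p_y = sum(p for (_, yy), p in joint.items() if yy == y)
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

# An assumed joint distribution over (weather, grass state).
joint = {("rain", "wet"): 0.3, ("rain", "dry"): 0.1,
         ("sun", "wet"): 0.1, ("sun", "dry"): 0.5}

print(marginalize(joint, axis=1))  # P(rain) = 0.4, P(sun) = 0.6
print(condition(joint, "wet"))     # P(rain | wet) = 0.75
```

Both operations take distribution models to distribution models, which is the sense in which inference rules induce an algebraic structure on that space.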
Other kinds of reasoning that can be used include first-order logic reasoning, probabilistic reasoning, causal reasoning, Newtonian mechanics, spatial reasoning, and non-falsifiable reasoning.
Summary post of the paper: Léon Bottou. 2014. From machine learning to machine reasoning. Mach. Learn. 94, 2 (February 2014), 133–149.