Decision trees are models that apply a divide-and-conquer strategy, splitting the data into branches according to attribute values at each node. To construct a decision tree, the attribute with the maximum information gain is selected to label the root node, and the process is then repeated recursively on each branch. However, information gain tends to favor highly branching attributes (an identifier column, for instance, splits the data into one branch per record and so appears maximally informative). To correct this bias, the gain ratio is used: it divides the information gain by the split information, i.e. the entropy of the partition itself, which penalizes attributes that produce many branches. The example demonstrates calculating information gain and gain ratio to select the best attribute to split on at each node.
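The calculation described above can be sketched in Python. This is a minimal illustration, not the implementation from the example: the toy dataset, function names, and attribute encoding (rows as tuples of categorical values) are all assumptions introduced here for clarity.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Reduction in entropy from splitting on the attribute at attr_index."""
    n = len(labels)
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(label)
    # Weighted entropy of the branches after the split.
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

def split_information(rows, attr_index):
    """Entropy of the split itself; large for highly branching attributes."""
    n = len(rows)
    counts = Counter(row[attr_index] for row in rows)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def gain_ratio(rows, labels, attr_index):
    """Information gain normalized by split information."""
    si = split_information(rows, attr_index)
    return information_gain(rows, labels, attr_index) / si if si > 0 else 0.0

# Hypothetical toy dataset: each row is (outlook, temperature).
rows = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
labels = ["no", "no", "yes", "yes"]

# Attribute 0 (outlook) separates the classes perfectly: gain 1.0.
# Attribute 1 (temperature) branches more but gains less per branch.
print(information_gain(rows, labels, 0))  # 1.0
print(gain_ratio(rows, labels, 0))        # 1.0
print(information_gain(rows, labels, 1))  # 0.5
```

At each node, the attribute with the highest gain ratio (among attributes with at least average information gain, in C4.5's refinement) would be chosen, and the procedure recursed on each resulting branch.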