Social networks play a vital role in the spread of information through a population, and individuals in networks make important life decisions on the basis of the information to which they have access. In many cases, it is important to evaluate whether information spreads fairly to all groups in a network. For instance, are male and female students equally likely to hear about a new scholarship? In this paper, we present the information unfairness criterion, which measures whether information spreads fairly to all groups in a network. We perform a thorough case study on the DBLP computer science co-authorship network with respect to gender. We then propose MaxFair, an algorithm that adds edges to a network to decrease information unfairness, and evaluate it on several real-world network datasets.