Memcache is a distributed in-memory data store that reduces database load for web applications by caching frequently used data across multiple machines. In a distributed web-serving environment, applications rely on many network services to complete each request. While faster processors have lowered computation time and available network bandwidth has increased, signal propagation delay is constant and will account for a growing proportion of request latency. We explore how data locality with Memcache can be exploited to reduce latency and minimize core network traffic. We develop a model that predicts how alternative Memcache configurations would perform for specific applications, then evaluate it using the MediaWiki open-source web application in a miniature web farm. Our results validate the model: under certain network conditions we observed a 66% reduction in core network traffic and a 23% reduction in Memcache response time.
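To make the caching pattern concrete, the sketch below illustrates the common cache-aside lookup that Memcache deployments rely on: check the cache first, and only fall back to the database on a miss. This is an illustration only; a plain dict stands in for a Memcache client, and `db_fetch` is a hypothetical database query, not part of any system described in this paper.

```python
# Minimal cache-aside sketch. The dict below stands in for a Memcache
# client; db_fetch is a hypothetical (slow) database lookup.
cache = {}

def db_fetch(user_id):
    # Placeholder for a database query.
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    key = f"user:{user_id}"
    value = cache.get(key)           # try the cache first
    if value is None:
        value = db_fetch(user_id)    # cache miss: query the database
        cache[key] = value           # populate the cache for next time
    return value
```

Every request served from the cache avoids both the database query and, when the cache entry is on a nearby machine, a round trip across the core network.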