Thanks for the Memcached…

How do you handle lots and lots of database reads and writes? That’s a question some of the biggest websites on the planet, such as Wikipedia and Facebook, face every day. (What? You thought all those up-to-the-minute updates happened by magic?) Their answer, and the solution used by many other companies, is Memcached, pronounced mem-cash-dee.

Memcached is an open-source, high-performance, distributed, object-caching system. It’s commonly used as an in-memory key-value store for small chunks of arbitrary data: strings and objects from database calls, API calls, or page rendering. Whatever the data type, Memcached stores each value as an opaque string of bytes; it’s up to your client library to serialize objects before they go in.
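
For a feel of how simple the interface is, here’s a minimal sketch using Python and the pymemcache client library (any Memcached client in your language of choice works the same way; the key name and HTML snippet are just placeholders):

    from pymemcache.client.base import Client

    # Connect to a Memcached daemon listening on its default port, 11211.
    client = Client(("localhost", 11211))

    # Store a rendered page fragment under a text key for 60 seconds.
    client.set("homepage:rendered", "<html>...</html>", expire=60)

    # Reads come back as bytes until the item expires or is evicted.
    print(client.get("homepage:rendered"))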

The net result? You can use it to speed up almost any storage-backed interaction. Above all, though, Memcached is meant to speed up dynamic web applications by reducing your database load.

Memcached is best known for being used in social networks. Indeed, Memcached was created by LiveJournal’s programmers to deal with the massive data I/O requirements of dynamic social-network applications. Now, you may not be running a social network on your servers (although you could, with programs such as Elgg), but you can still use Memcached to give your virtual private servers a speed boost by caching your web-server session data.

Fundamentally, Memcached works by having your application check the cache before it queries the database. Like any cache, it fills up with recently retrieved data: when a lookup misses, the application reads from the database and stores the result in Memcached. Thus, the next time a user asks for that data, Memcached serves it from fast memory rather than the slower, hard-drive-based database.
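
In code, that pattern looks something like the Python sketch below. It assumes a hypothetical db.fetch_user() call standing in for your real database query, and the five-minute expiry is just an example:

    import json
    from pymemcache.client.base import Client

    cache = Client(("localhost", 11211))

    def get_user(user_id, db):
        # Check the cache first.
        key = f"user:{user_id}"
        cached = cache.get(key)
        if cached is not None:
            return json.loads(cached)                 # cache hit: served from RAM
        user = db.fetch_user(user_id)                 # cache miss: query the database
        cache.set(key, json.dumps(user), expire=300)  # keep it warm for the next request
        return user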

Memcached doesn’t grab RAM dynamically. You set the Memcached daemon up at startup with the amount of memory it may use for caching. Because of this, once you’ve dedicated memory to Memcached, it’s reserved expressly for its use.
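
For example, on a typical Linux server you might start the daemon with the memory cap spelled out on the command line; the 1 GB figure and the memcache user are placeholders for your own setup:

    # Run as a daemon, reserve 1024 MB of RAM, listen on the default port.
    memcached -d -m 1024 -p 11211 -u memcache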

Within the cache itself, Memcached keeps its data “fresh” by using a Least Recently Used (LRU) algorithm. With this technique, Memcached automatically deletes the item that hasn’t been used for the longest time when it starts to run out of space.
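
If you’ve never worked with an LRU cache before, the toy Python class below shows the idea. It is only an illustration of the eviction rule; Memcached’s real implementation is built on its own slab allocator, not a Python dictionary:

    from collections import OrderedDict

    class TinyLRU:
        """Toy LRU cache: evicts the least recently used item when full."""

        def __init__(self, capacity):
            self.capacity = capacity
            self.items = OrderedDict()

        def get(self, key):
            if key not in self.items:
                return None
            self.items.move_to_end(key)         # mark as most recently used
            return self.items[key]

        def set(self, key, value):
            self.items[key] = value
            self.items.move_to_end(key)
            if len(self.items) > self.capacity:
                self.items.popitem(last=False)  # drop the least recently used item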

Set up properly, Memcached can go a long while before it runs out of room. Unlike the caches you may already be using on servers or PCs, Memcached is a distributed cache: it spreads its data across multiple servers. That, in turn, means you can grow it in size and transactional capacity until it can handle the largest interactive applications, such as Facebook.
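
Adding servers is a client-side affair: the client hashes each key to one of the servers in its pool. With pymemcache, for instance, that might look like the sketch below (the host names are made up):

    from pymemcache.client.hash import HashClient

    # Each key is hashed to one node in the pool.
    client = HashClient([
        ("cache1.example.com", 11211),
        ("cache2.example.com", 11211),
        ("cache3.example.com", 11211),
    ])

    client.set("session:42", "logged-in")  # lands on whichever node this key hashes to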

Memcached is also multi-threaded, which makes it easy to scale up its performance by giving it more CPU cores when necessary.
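
The worker-thread count is another startup knob; for instance, you might allow eight threads on a box with cores to spare (the numbers here are just examples):

    memcached -d -m 1024 -t 8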

Now, you might ask yourself, “Why do I need Memcached for this when I can use, say, Alternative PHP Cache (APC) to cache both code and data?” The answer is, you don’t. For a small site, or one containing mostly static data, you don’t need Memcached. But if your site has many users running dynamic web programs that constantly pull data from and push data to a DBMS, then Memcached is your friend. Embrace it.

While Memcached is typically used with PHP, you can deploy it with almost any language; client libraries are available for most popular ones.

This is an important point. Memcached is not something you can simply install and expect to get better performance from your WordPress site. As the Memcached GitHub site explains, it’s “a developer tool, not a ‘code accelerator,’ nor is it database middleware. If you’re trying to set up an application you have downloaded or purchased to use Memcached, read your app’s documentation.”

If that sounds like a lot of work, well, it can be, but Memcached can really boost your website’s performance once you get your head around it. Some people will tell you that other caching programs, usually Redis, are better. Redis is fine, but in my experience, for a simple, easy way to get complex web applications working efficiently, it’s hard to beat Memcached.

So, thanks… for reducing database bottlenecks and speeding up dynamic web apps. And thanks… for freed-up RAM with an LRU, streamlined deployment, and simplified dev design. Thanks for the Memcached.

This story first appeared in Linode-Cube.
