From: Steve Grimm <sgrimm <at> facebook.com>
Subject: Re: Largest production memcached install?
Newsgroups: gmane.comp.web.cache.memcached
Date: Thursday 3rd May 2007 18:32:50 UTC
No clue if we're the largest installation, but Facebook has roughly 200
dedicated memcached servers in its production environment, plus a small
number of others for development and so on. A few of those 200 are hot
spares. They are all 16GB 4-core AMD64 boxes, just because that's where the
price/performance sweet spot is for us right now (though it looks like 32GB
boxes are getting more economical lately, so I suspect we'll roll out some
of those this year.)

We have a home-built management and monitoring system that keeps track of
all our servers, both memcached and other custom backend stuff. Some of our
other backend services are written memcached-style with fully
interchangeable instances; for such services, the monitoring system knows
how to take a hot spare and swap it into place when a live server has a
failure. When one of our memcached servers dies, a replacement is always up
and running in under a minute.
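
(Purely for illustration of what such a swap might look like, here is a
minimal sketch in Python. Facebook's system is home-built and not shown
here; the server names, the 10-second check interval, and the use of
memcached's plain-text "version" command as a health probe are all
assumptions made for the example.)

import socket
import time

# Hypothetical server lists; in a real setup these would come from the
# shared configuration database rather than being hard-coded.
ACTIVE = ["cache01:11211", "cache02:11211", "cache03:11211"]
SPARES = ["cache-spare01:11211", "cache-spare02:11211"]

def is_alive(server, timeout=2.0):
    """Treat a memcached instance as healthy if it answers 'version'."""
    host, port = server.split(":")
    try:
        with socket.create_connection((host, int(port)), timeout=timeout) as sock:
            sock.sendall(b"version\r\n")
            return sock.recv(64).startswith(b"VERSION")
    except OSError:
        return False

def monitor_once():
    """One pass: promote a spare in place of any dead active server."""
    for i, server in enumerate(ACTIVE):
        if not is_alive(server) and SPARES:
            replacement = SPARES.pop(0)
            ACTIVE[i] = replacement
            # A real system would also push the updated server list out to
            # every client, e.g. via the configuration database.
            print(f"{server} is down; promoted spare {replacement}")

while True:
    monitor_once()
    time.sleep(10)  # checking every 10s keeps the swap well under a minute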

All our services use a unified database-backed configuration scheme which
has a Web front-end we use for manual operations like adding servers to
handle increased load. Unfortunately that management and configuration
system is highly tailored to our particular environment, but I expect you
could accomplish something similar on the monitoring side using Nagios or
another such app.
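
(Again only as an illustration of the database-backed configuration idea,
not Facebook's actual schema: a sketch using Python and SQLite in which
clients read the active server list from one table, and adding capacity is
a single row insert that a Web front-end could perform on an operator's
behalf. The table and column names are assumptions.)

import sqlite3

conn = sqlite3.connect("cluster_config.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS memcached_servers (
        host TEXT NOT NULL,
        port INTEGER NOT NULL DEFAULT 11211,
        role TEXT NOT NULL DEFAULT 'active',  -- 'active' or 'spare'
        PRIMARY KEY (host, port)
    )
""")
conn.commit()

def active_servers(conn):
    """What a client (or the monitoring system) would read at startup or
    whenever it is told the pool has changed."""
    rows = conn.execute(
        "SELECT host, port FROM memcached_servers WHERE role = 'active'"
    )
    return [f"{host}:{port}" for host, port in rows]

# Adding a server to handle increased load is then just an insert.
conn.execute(
    "INSERT OR IGNORE INTO memcached_servers (host, port) VALUES (?, ?)",
    ("cache04", 11211),
)
conn.commit()
print(active_servers(conn))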

All that said, I agree with the earlier comment on this list: start small
to get some experience running memcached in a production environment. It's
easy enough to expand later once you have appropriate expertise and code in
place to make things run smoothly.

-Steve


On 5/3/07 8:06 AM, "Sam Lavery" wrote:

> Does anyone know what the largest installation of memcached currently is?
> I'm considering putting it on 100+ machines (solaris/mod_perl), and would
> love to hear any tips people have for managing a group of that size (and
> larger). Additionally, are there any particular patches I should try out
> for this specific platform?
>
> Thanks in advance,
> Sam
 