Here is the problem I'm toying with in my head.

I am serving many (thousands of) large files (20 MB–1 GB, typically ~50 MB) over the internet via Apache (on Linux). I have multiple Apache servers distributed across the US, connected over the internet; connection speeds between servers are approximately 10 Mbps. All the servers serve exactly the same files. The files originate on a single system and are distributed from there to each Apache server.

I am investigating mechanisms to automate the distribution of these files to the Apache boxes. A steady rate of file arrival is required (at least fast enough to saturate the upstream link from the origin box to the Apache servers).
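To make the requirement concrete, here is a rough sketch of the diff-then-push logic I'd expect any such tool to implement: build a checksum manifest on the origin and on each mirror, and transfer only what is missing or stale. (This is illustrative only; the function names are mine, and the actual transfer step is whatever tool ends up doing the copying.)

```python
import hashlib
import os


def build_manifest(root):
    """Map each relative file path under `root` to its SHA-256 digest."""
    manifest = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                # Hash in 1 MB chunks so large files don't load into memory.
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            manifest[rel] = h.hexdigest()
    return manifest


def files_to_push(origin_manifest, mirror_manifest):
    """Return relative paths that are missing or stale on the mirror."""
    return sorted(
        rel for rel, digest in origin_manifest.items()
        if mirror_manifest.get(rel) != digest
    )
```

In practice the mirror's manifest would be fetched over the network rather than computed locally, and the push itself would be whatever transport the distribution tool provides.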

Ideally I would prefer some form of distributed file system that gave me standard file-system access on the origin box and transparently distributed files to all the Apache servers. It would also be nice to have some kind of event-driven notification when a particular file becomes available on a particular server.
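By "event-driven notification" I mean something like the following behaviour, sketched here as a crude polling stand-in (a real solution would presumably use inotify or a callback from the distribution tool; the function and its parameters are made up for illustration): a file counts as "available" once it exists and its size has stopped changing, so we don't fire on a partially transferred file.

```python
import os
import time


def wait_until_available(path, timeout=30.0, settle=1.0, poll=0.25):
    """Return True once `path` exists and its size has been stable
    for `settle` seconds; return False if `timeout` expires first."""
    deadline = time.monotonic() + timeout
    last_size = None
    stable_since = None
    while time.monotonic() < deadline:
        try:
            size = os.stat(path).st_size
        except FileNotFoundError:
            # File vanished or hasn't arrived yet; reset the stability clock.
            last_size = None
            stable_since = None
        else:
            now = time.monotonic()
            if size == last_size:
                if stable_since is not None and now - stable_since >= settle:
                    return True
            else:
                # Size changed: transfer still in progress, restart the clock.
                last_size = size
                stable_since = now
        time.sleep(poll)
    return False
```

The point is the contract, not the polling: whatever mechanism does the distribution should be able to tell me "file X is now complete on server Y" without my having to guess from file sizes.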

Anyone have any thoughts on this problem?