Are you using Amazon S3 as a content delivery network? Here are two handy tools to help you optimize your static content:
First, a command line tool called "s3up" that sets all the appropriate headers, gzips your data when possible, and even runs your images through Yahoo!'s Smush.it service.
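To make the idea concrete, here is a minimal sketch (not s3up's actual code) of the kind of preparation such a tool does before an S3 PUT: guessing the Content-Type, gzipping text-like payloads, and setting a far-future Cache-Control header. The function name and the set of compressible types are illustrative assumptions.

```python
import gzip
import mimetypes

# Content types worth gzipping before upload (illustrative list;
# images are already compressed, so they are left alone).
COMPRESSIBLE = {"text/css", "text/html", "text/plain",
                "application/javascript", "application/json"}

def prepare_upload(filename, data):
    """Return (body, headers) ready to hand to an S3 PUT request.

    Hypothetical helper, sketching what a tool like s3up does:
    set Content-Type, gzip when it helps, add a long cache lifetime.
    """
    content_type = mimetypes.guess_type(filename)[0] or "application/octet-stream"
    headers = {
        "Content-Type": content_type,
        # One year, in seconds -- a typical far-future expiry for CDN assets.
        "Cache-Control": "public, max-age=31536000",
    }
    if content_type in COMPRESSIBLE:
        data = gzip.compress(data)
        headers["Content-Encoding"] = "gzip"
    return data, headers

body, headers = prepare_upload("site.css", b"body { color: #333; }")
```

A stylesheet comes back gzipped with `Content-Encoding: gzip` set, while a JPEG passes through untouched.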
Second, "Autosmush", which scans your S3 bucket every night looking for new, un-smushed images, runs each one through Smush.it, and replaces it with the compressed version.
Autosmush can also compress the huge backlog of images that were already sitting in your Amazon account before you started using Smush.it.
Autosmush adds an x-amz-smushed HTTP header to every image it compresses (and to images that can't be compressed further). This lets the script scan your files extremely quickly, sending only new images to Smush.it and skipping ones it has already processed.
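That skip logic can be sketched as follows. S3 stores custom headers as user-defined object metadata, so checking for the marker is a cheap metadata lookup rather than a re-download. This is an illustrative reconstruction under that assumption, not Autosmush's actual code; the function name and the metadata-dict shape are invented for the example.

```python
# Marker header Autosmush uses to tag already-processed images.
SMUSHED_MARKER = "x-amz-smushed"

def keys_to_smush(objects):
    """objects: dict mapping each S3 key to its user-metadata dict.

    Return the image keys that still need a trip to Smush.it:
    image files whose metadata lacks the smushed marker.
    """
    image_exts = (".png", ".jpg", ".jpeg", ".gif")
    return [key for key, meta in objects.items()
            if key.lower().endswith(image_exts)
            and SMUSHED_MARKER not in meta]

bucket = {
    "logo.png": {"x-amz-smushed": "true"},  # processed earlier, skipped
    "new-photo.jpg": {},                    # fresh upload, needs smushing
    "styles.css": {},                       # not an image, ignored
}
pending = keys_to_smush(bucket)
```

Only `new-photo.jpg` ends up in the work queue; the nightly scan never re-sends `logo.png`.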