Detaching Attachments in Postfix

Here you will find the updated scripts for detaching attachments ("detachments"). This should work for any mail server that can run filters or milters: the detach script takes mail on STDIN and writes it to STDOUT. My examples are for Postfix. You can read a similar writeup at Aaron C. de Bruyn's site.

After looking at a few different detachment options, I modified Ryan Hamilton's detach scripts.

Downloads - 13K - 4.1K - 22K
index.cgi - 3.4K
delete.cgi - 458B


Detailed patch information

index.cgi and delete.cgi


For a simple Postfix setup these directions should be enough.
Aaron C. de Bruyn (from the link above) says:

In master.cf, I add '-o content_filter=detach' to the SMTP
service, and add the detach service further down in master.cf:

detach	unix	-	n	n	-	-	pipe
  flags=Rq user=list argv=/usr/local/bin/detachit
  $(sender) $(recipient)


#!/bin/sh
# detachit: Pipe postfix messages through detach
sender="$1"; shift
recip="$*"                         # may hold several addresses
[ -n "$recip" ] || recip="[email protected]"
/usr/local/bin/detach -d /var/www/webmail/detach --aggressive -w |
  /usr/sbin/sendmail -i -f "$sender" -- $recip
exit $?

I had a more complex setup and spent some time reading the Postfix filter docs. Those docs tell you to edit main.cf. Do not do this: you will create a loop and your mail will be rejected with 'Too many hops'. This post by mouss explains why and how to fix it. In short, do not set a global content_filter in main.cf for this.
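To see why, consider where content_filter applies. A sketch (the detach transport name is from this setup; everything else is illustrative):

```
# main.cf -- DO NOT do this. A global content_filter applies to every
# smtpd instance, including the one the filter uses to re-inject mail,
# so each message loops through the filter until 'Too many hops':
#
#   content_filter = detach:dummy
#
# Instead, set the filter per-service in master.cf with
# '-o content_filter=...', and leave it empty ('-o content_filter=')
# on the re-injection and pickup services.
```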

The requirements I was given shaped what my detachit script does.

My script will not work for you as-is. You will need to edit it, possibly heavily. At the top you will need to change:

# $weburi = URI prefix, ex:
# $webdir = file path, ex: /var/www/files
# @domains = list of local domains (header check)
# @attachlist = list of recipients to never detach (command line args check - match from /etc/aliases)
# @detachlist = list of recipients to always detach (command line args check - match from /etc/aliases)
# $attach_size = minimum attachment size to detach (in bytes)

An example config:

my $weburi = '';
my $webdir = '/home/example/files';
my @domains = qw();
my @attachlist = qw();
my @detachlist = qw();
my $attach_size = 3145728;
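That $attach_size works out to 3 MiB; a quick check of the arithmetic:

```shell
# $attach_size above, expressed as bytes: 3 MiB
echo $((3 * 1024 * 1024))   # 3145728
```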

You also need to search for this check and change it or comment it out. All incoming email goes through our Barracuda, so flagging incoming mail was easy.

# Check for incoming mail (from barracuda)
if ($header =~ /\[10\.0\.0\.25\]/) {
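That test just looks for the Barracuda's IP (10.0.0.25 in this example) in the received headers. The same check can be tried standalone with grep on a sample header (the hostname here is made up):

```shell
# Sample Received header as the Barracuda would add it (hostname invented)
header='Received: from spam.filter.example ([10.0.0.25]) by mail.example.com'

# Same pattern as the Perl check above: escaped dots, literal brackets
if printf '%s\n' "$header" | grep -q '\[10\.0\.0\.25\]'; then
  echo "incoming"     # mail arrived via the Barracuda
else
  echo "outgoing"
fi
```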

Now that I had the scripts ready, it was time to make Postfix use them. All email goes through ClamAV, and I definitely want to scan files before detaching them, so I needed to insert detachit after the virus scanner. To do this I added a section for detachit at the bottom of master.cf:

detach    unix  -       n       n       -       -       pipe
  flags=Rq user=list null_sender=
  argv=/usr/local/bin/detachit -f ${sender} -- ${recipient}

The ClamAV filter was already defined in the smtp service.

smtp	inet	n	-	-	-	-	smtpd
-o content_filter=scan:

I left that alone. There was another section defined for the mail coming back.

# For injecting mail back into postfix from the filter
	inet	n	-	n	-	16	smtpd
	-o content_filter=

I had to make two changes to insert the filter without causing a loop. For the re-injection service I changed the content filter to:

-o content_filter=detach:dummy

For the pickup service I changed:

pickup	fifo	n	-	-	60	1	pickup

to:

pickup	fifo	n	-	-	60	1	pickup
	-o content_filter=
	-o receive_override_options=no_header_body_checks
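Putting the pieces together, mail now takes roughly this path (the scan filter's listener details are not shown in the excerpts above, so those are assumed from the stock Postfix FILTER_README layout):

```
1. inbound smtpd           -o content_filter=scan:...       # ClamAV first
2. the scan filter re-injects into a second smtpd, which has
                           -o content_filter=detach:dummy
3. the detach service pipes the message through detachit, which
   re-submits it with /usr/sbin/sendmail
4. sendmail hands it to pickup, whose content_filter is empty,
   so the message is finally delivered -- no loop
```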

This setup means that mail sent through the local 'sendmail' binary will not have the attachments detached. There are no shell users on the mail server so that was not a problem. If webmail is used, make sure it connects via smtp and does not run a local sendmail binary.


There are two additional things that I do to help with the attachment space. I remove files more than 30 days old and I hardlink duplicate files. I use my modified hardlink script so that new duplicate files are not prematurely deleted. The original script came from here.



# Delete old files
COUNT=$(find /home/example/files -mindepth 2 -mtime +30 -print -delete | wc -l)
echo "find deleted $COUNT files"

# Hardlink duplicate files
python /usr/local/bin/ -v 0 -t /home/example/files/
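Before trusting the cleanup, the age-based delete can be tried on throwaway data first (a sketch; GNU coreutils/findutils assumed for `touch -d` and `-delete`):

```shell
# Rehearse `find -mtime +30 -delete` in a scratch directory
dir=$(mktemp -d)
mkdir "$dir/sub"
touch -d '45 days ago' "$dir/sub/old.txt"   # pretend: 45 days old
touch "$dir/sub/new.txt"                    # fresh, should survive
COUNT=$(find "$dir" -mindepth 2 -mtime +30 -print -delete | wc -l)
echo "find deleted $COUNT files"            # find deleted 1 files
ls "$dir/sub"                               # only new.txt remains
```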


I run that script once per night and pipe the output to a file. If you use index.cgi and delete.cgi you will want to set their permissions such that this script does not delete them after 30 days.
April 7 2012 output:

find: cannot delete /home/example/files/index/index.cgi: Permission denied
find: cannot delete /home/example/files/index/.htaccess: Permission denied
find: cannot delete /home/example/files/index/delete.cgi: Permission denied
find deleted 109 files

Hard linking Statistics:
Files Hardlinked this run:
Directories           : 2780
Regular files         : 1023
Comparisons           : 1619
Hardlinked this run   : 6
Total hardlinks       : 68
Bytes saved this run  : 31371571 (29.918 mebibytes)
Total bytes saved     : 420746864 (401.255 mebibytes)
Total run time        : 8.47399711609 seconds

It does not take much time and does its job.