It seems to be a common feature of naive proposals for handling spam that, when working successfully, they increase the load on other parts of the email system.
I believe that the only parts of the system which should experience an increase are those directly involved in identifying and removing the messages, whether these are filters, white/black lists, DNS (for SPF and the like), or so on.
Increased resource consumption should be contained within those components alone, and ideally there should be a direct correlation between resource requirements and the volume of unwanted messages, so that capacity can be estimated and provisioned accurately.
In fact I'm so surprised that this isn't recognised as a universal truth that I've written Danny's principle, in three parts:
- "I/ In any messaging system, any components involved in identifying and removing unwanted messages should, when operating successfully, create conditions in which no other parts of the system experience an increase in resource consumption as a direct consequence, and should tend to reduce consumption in some components as a consequence of removing unwanted messages."
- "II/ Any components identifying and removing unwanted messages from the system should reduce their resource consumption as the number of unwanted messages they are challenged with reduces."
- "III/ Resource requirements for components identifying and removing unwanted messages from the system should be designed to be predictable and based upon measurable attributes of the traffic."
This is an important practical consideration for the uptake of any technology.
Ideally, resource consumption will be influenced only by the most basic attributes, such as message volume; where it is influenced by more sophisticated attributes, such as message content or routing, it must be possible to predict requirements from analysis of current or comparable traffic.
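To illustrate the kind of predictability Principle III asks for, here is a minimal sketch of a capacity estimate driven purely by measurable traffic attributes. The function name and the cost constants are hypothetical placeholders; real figures would come from profiling current or comparable traffic, as the text suggests.

```python
# Sketch: predicting filter resource needs from basic traffic measurements.
# All cost constants are illustrative assumptions, not measured values.

def estimate_filter_cpu_seconds(total_messages: int,
                                spam_fraction: float,
                                cost_per_scan: float = 0.002,
                                cost_per_rejection: float = 0.0005) -> float:
    """Estimate CPU seconds a spam filter consumes for a traffic sample.

    Every message is scanned once; each message identified as spam incurs
    a small additional rejection cost. Because both terms scale linearly
    with message volume, capacity can be provisioned from volume and spam
    rate alone, and consumption falls as the spam rate falls (Principle II).
    """
    spam_messages = total_messages * spam_fraction
    return total_messages * cost_per_scan + spam_messages * cost_per_rejection

# Example: 1,000,000 messages per day, of which 60% are spam.
daily_cpu = estimate_filter_cpu_seconds(1_000_000, 0.6)
```

The point is not the particular constants but the shape of the model: every term is tied to an attribute you can measure on live traffic, so the estimate remains predictable as conditions change.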