On Wednesday, September 29, 2004, 5:13:00 PM, Rob McEwen wrote:
Regarding the top X most frequently queried blacklisted URIs: I was hoping these could be listed in a simple text file, built once or twice a day by some kind of automated process. Because I don't use SpamAssassin, I don't know whether the other ideas for implementing this would help me, but I could build a program that periodically (once a day, or every few hours) downloads this "most requested blocked" list and keeps those entries blocked on my server as described. Would this be possible or practical?
Rob McEwen
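
A rough sketch of the kind of downloader described above might look like the following. The feed URL, the one-domain-per-line file format, and the refresh interval are all assumptions for illustration; no such flat file is actually published in this thread.

#!/usr/bin/env python
"""Sketch: periodically fetch a flat-file list of frequently queried
blacklisted domains and check URI domains against it locally.
The feed URL and its one-domain-per-line format are assumptions."""
import time
import urllib.request

FEED_URL = "http://example.org/top-blocked-domains.txt"  # hypothetical URL
REFRESH_SECONDS = 6 * 3600  # "once a day, or every few hours"

_blocked = set()
_last_fetch = 0.0

def refresh_blocklist():
    """Download the feed and rebuild the in-memory set of blocked domains."""
    global _blocked, _last_fetch
    with urllib.request.urlopen(FEED_URL, timeout=30) as resp:
        lines = resp.read().decode("utf-8", "replace").splitlines()
    _blocked = {ln.strip().lower() for ln in lines
                if ln.strip() and not ln.startswith("#")}
    _last_fetch = time.time()

def is_blocked(domain):
    """Return True if the domain appears in the most recently downloaded list."""
    if time.time() - _last_fetch > REFRESH_SECONDS:
        refresh_blocklist()
    return domain.lower().rstrip(".") in _blocked

if __name__ == "__main__":
    refresh_blocklist()
    print(is_blocked("spam-example.test"))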
I'm sure it could be done, but DNS is easier. Note also that repeated DNS queries on the same domain should be cached by the local resolver, at least within the positive caching TTL.
If the zone file's expire time applies, then queries to the authoritative name servers would be minimal. (If the record TTL applies, then the queries could be fairly frequent for a domain that appears in a lot of spam URIs.)
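
For comparison, a single lookup against a SURBL-style DNS list might look like the sketch below. dnspython (2.x) and multi.surbl.org as the example zone are assumptions; the TTL carried on the answer is the value a caching resolver honors, which is why repeated queries for the same domain should mostly be absorbed locally.

"""Minimal sketch of a SURBL-style lookup, assuming dnspython >= 2.0 and
multi.surbl.org as the example zone (adjust for the list you actually use)."""
import dns.resolver

def lookup(domain, zone="multi.surbl.org"):
    """Query <domain>.<zone>; a positive A answer means the domain is listed."""
    qname = "%s.%s" % (domain.rstrip("."), zone)
    try:
        answer = dns.resolver.resolve(qname, "A")
    except dns.resolver.NXDOMAIN:
        return None  # not listed
    # The TTL on the answer is what the local caching resolver honors, so
    # repeated queries within that window should not hit the authoritative
    # name servers again.
    return [r.to_text() for r in answer], answer.rrset.ttl

if __name__ == "__main__":
    # test.surbl.org as the documented test point is an assumption here.
    print(lookup("test.surbl.org"))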
Perhaps this is an experiment you could try for us, to check the positive caching behavior?
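
One way to run that experiment, as a rough sketch under the same assumptions (dnspython, an assumed test name and zone): resolve the same listed name twice through the local resolver and compare TTLs and timings. If positive caching is working, the second answer should come back faster and with a lower TTL.

"""Sketch of the positive-caching check: two back-to-back lookups of the
same listed name through the local resolver.  Assumes dnspython >= 2.0;
the test name and zone are illustrative."""
import time
import dns.resolver

def timed_lookup(qname):
    """Return (ttl, elapsed_ms) for an A lookup of qname."""
    start = time.monotonic()
    answer = dns.resolver.resolve(qname, "A")
    return answer.rrset.ttl, (time.monotonic() - start) * 1000.0

if __name__ == "__main__":
    qname = "test.surbl.org.multi.surbl.org"  # assumed test point and zone
    ttl1, ms1 = timed_lookup(qname)
    time.sleep(2)
    ttl2, ms2 = timed_lookup(qname)
    print("first : ttl=%5d  %6.1f ms" % (ttl1, ms1))
    print("second: ttl=%5d  %6.1f ms" % (ttl2, ms2))
    # If the resolver caches positively, the second TTL should be lower
    # (it counts down in the cache) and the second query noticeably faster.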
Jeff C. -- "If it appears in hams, then don't list it."