Hey all,
I've started phase two of GetURI's development, in the form of a CGI interface. So far, the interface supports lists of ham and spam domains, which will be especially handy for discussions on this list, and for those of you hand-checking lists of submitted domains.
I'll probably also add at least a message checker, and maybe even an mbox upload feature, based on feedback and testing.
The CGI version produces the same output as the command-line version. (In fact, it uses some clever tricks to communicate with the command-line version and report progress to the browser.)
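Ryan doesn't say what the tricks are, but a common way to get incremental progress out of a CGI script is to run the command-line tool as a subprocess and flush each output line to the browser as it arrives. A minimal sketch of that pattern, assuming a Unix-like environment (the command name is just a placeholder, not GetURI's actual invocation):

```python
import subprocess

def stream_command(cmd):
    """Run a command and yield its output line by line as it arrives,
    so a CGI handler can flush partial results to the browser."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True, bufsize=1)
    try:
        for line in proc.stdout:
            yield line.rstrip("\n")
    finally:
        proc.stdout.close()
        proc.wait()

# In a CGI handler you would emit each line as it appears, e.g.:
#   for line in stream_command(["geturi", domain]):
#       print(line, flush=True)
```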
http://ry.ca/cgi-bin/geturi.cgi
This will be *especially* useful for all of these "FP" discussions we've been having lately. Now, f'rinstance, if I proclaim topical7074rneds .com should be whitelisted (it shouldn't!!), I could just feed you this:
http://ry.ca/cgi-bin/geturi.cgi?domains=topical7074rneds.com&surbl=ws.su...
...which is ugly, and gets uglier in a hurry with more domains. So I implemented caching and unique IDs: do the lookup yourself, then cut and paste the resulting (potentially much shorter) link that shows up in your address bar, like so:
http://ry.ca/cgi-bin/geturi.cgi?id=spam-X52p6q8FMYYEINMCsepAG1
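If you're scripting lookups, both URL styles shown above are easy to build with standard query encoding. A sketch (the `domains=` and `id=` parameter names come from the URLs above; how multiple domains are delimited is my guess, since the post only shows one):

```python
from urllib.parse import urlencode

BASE = "http://ry.ca/cgi-bin/geturi.cgi"

def fresh_lookup_url(domains):
    """Long form: an explicit domains= query.
    (Space-separating multiple domains is an assumption.)"""
    return BASE + "?" + urlencode({"domains": " ".join(domains)})

def cached_result_url(result_id):
    """Short form: reference a cached result by its unique id."""
    return BASE + "?" + urlencode({"id": result_id})

print(cached_result_url("spam-X52p6q8FMYYEINMCsepAG1"))
# http://ry.ca/cgi-bin/geturi.cgi?id=spam-X52p6q8FMYYEINMCsepAG1
```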
These cached results (if referenced by id, as in the id-style URL above) will be available for at least 30 days from their last access time. The page itself describes the caching algorithm in a bit more detail, for those who are interested.
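The "30 days from last access" rule means the expiry clock resets every time someone follows a cached link. A toy model of that behavior (purely illustrative; GetURI's actual algorithm is the one described on the page):

```python
import time

TTL = 30 * 24 * 3600  # at least 30 days, measured from last access

class ResultCache:
    """Id-keyed cache where each read refreshes the entry's expiry clock."""
    def __init__(self, clock=time.time):
        self.clock = clock
        self.store = {}  # id -> (result, last_access_time)

    def put(self, rid, result):
        self.store[rid] = (result, self.clock())

    def get(self, rid):
        entry = self.store.get(rid)
        if entry is None:
            return None
        result, last = entry
        if self.clock() - last > TTL:
            del self.store[rid]                    # 30+ days idle: expired
            return None
        self.store[rid] = (result, self.clock())   # touch: reset expiry clock
        return result
```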
Try it out, please, and make some noise about it. It's experimental (read: I started with an idea in my head and a blank file in my editor about 4.5 hours ago), so there are bound to be some interesting bugs. ;-)
- Ryan