We're all set to add a dumb-intelligent-agent capability to our site, where
people can tell us on a section-by-section basis that they want to be
notified when new content is posted. The notifications would go out via
email, in batches twice a week, so people aren't flooded with lots of
small bits of information. However, as an implementor I find this an ugly
way to go about it - the mail traffic will be enormous, and the processing
power needed to run each person's unique filter really adds up. Instead,
I'd like to provide a URL that people would
access on a regular basis which would tell them the same information - that
way we don't have to deal with bouncing mail or huge mail queues, etc. In
fact we already have a URL like that, called "What's New," accessible from the
home page (this is customized to the user, since we know where a given user
has visited, but not personalized by section preferences yet).
Anyway, it seemed to me that other sites have this kind of information-flow
dilemma - the content exists at a common URL as a regular stream, and
"add me to your hotlist and visit regularly" has to be said explicitly.
Watching my own browsing habits and those of others, regularly visiting
the same URLs is just not something people remember to do.
It occurred to me that a more general solution to all this would be if
browsers implemented cron-style auto-fetch functionality - where I could
say "fetch this URL every day at 3pm and let me know if it changes". The
browser would present the fetched pages in a menu the same way a mail reader
presents mail messages. I'd set my auto-fetch function to grab stock
quotes, the front page of the SF Chronicle, and the Sherilyn Fenn
fan club home page every night at 3am, and when I come into work that
morning I'd see a menu of that information, the same way I sit down to
the 50 messages on the WWW mailing lists. Furthermore, I'd only get the
Sherilyn Fenn home page *when* it's *changed*, and if it gets a 301
(Moved Permanently) redirect the browser would update the URL without
even telling me. For those who don't leave their Netscapes running at
night, the browser could keep a record of the last job it performed and,
at launch, catch up on everything scheduled since then.
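To make the fetch step concrete, here's a rough sketch (in modern Python,
purely illustrative - the function name and details are mine, not anything
any browser actually implements) of a conditional GET: the browser sends
If-Modified-Since so an unchanged page costs almost nothing, and it notices
when a redirect has moved the page:

    # Sketch of one auto-fetch job: a conditional GET that pulls the page
    # down only when it has changed since the previous fetch.
    import urllib.error
    import urllib.request
    from email.utils import formatdate

    def fetch_if_changed(url, last_fetch_time=None):
        """last_fetch_time is a Unix timestamp (or None on the first run).
        Returns (final_url, body); body is None if the page is unchanged."""
        req = urllib.request.Request(url)
        if last_fetch_time is not None:
            # Tell the server when we last looked, so it can answer
            # "304 Not Modified" instead of resending the whole page.
            req.add_header("If-Modified-Since",
                           formatdate(last_fetch_time, usegmt=True))
        try:
            with urllib.request.urlopen(req) as resp:
                # urlopen follows redirects; resp.url is the final location,
                # so a permanently moved page quietly updates the stored URL.
                return resp.url, resp.read()
        except urllib.error.HTTPError as err:
            if err.code == 304:   # unchanged - nothing to show the user
                return url, None
            raise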
So in this case, instead of storing people's section preferences in a
huge database on the server side, as we would be doing, the person would
define their own preferences directly in their browser, and this
functionality could be used on lots and lots of different sites. (Yeah,
I know our server doesn't send Last-Modified headers correctly; we'll fix
that.)
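As a straw man, the browser-side preference store could be as simple as a
crontab-style text file; the syntax and URLs here are entirely made up:

    # min hour dom mon dow  URL                                  action
    0   3    *   *   *      http://www.hotwired.com/whatsnew/    if-changed
    0   3    *   *   *      http://quotes.example.com/portfolio  always

Each entry is just a schedule, a URL, and whether to surface the page every
time or only when it has changed.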
Furthermore, this could represent a model for mailing list distribution -
instead of a monolithic mail server that executes 20 separate sendmail
procs to deliver 20 list emails to the same user, the user just hits the
server when they actually read their mail. The browser would send an
If-Modified-Since header giving the last time list messages were fetched,
and the server would send just the mail posted since then.
Say goodbye to mailing list headaches like bounced email, bogged down
mail servers, etc etc!
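For illustration, the list-server end of that exchange could look something
like this sketch (Python again; the archive layout and names are my own
invention, not anything our server does today):

    # Given the client's If-Modified-Since, hand back only the messages
    # posted after that point in time.
    from email.utils import parsedate_to_datetime

    def messages_since(archive, if_modified_since=None):
        """archive: list of (posted_at, message_text) pairs, oldest first,
        where posted_at is a timezone-aware datetime."""
        if if_modified_since is None:
            return [text for _, text in archive]  # first visit: send it all
        cutoff = parsedate_to_datetime(if_modified_since)
        # An empty result here would be answered with 304 Not Modified.
        return [text for posted_at, text in archive if posted_at > cutoff]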
I think functionality like this is what will really fulfill the promise
of "intelligent agents", not autonomous programs flinging their way
around the net. But that's a different religious war altogether.
Brian
--=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=--
brian@hotwired.com brian@hyperreal.com http://www.hotwired.com/Staff/brian/