
AHA!!!



Okay, so CL-HTTP has "taken over" <http://lynch.lscorp.com/> and wants
to compute everything...

But *THAT'S* where all my existing web pages are.  :-(

Now I *CAN* go through each of my subdirectories and call export-url on
them with :directory and :pathname arguments...
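(For the record, here's roughly what I mean -- a sketch only, with
made-up URL and pathname, and keywords taken from the export-url
examples I've seen, so don't hold me to the exact argument list:)

;; Hedged sketch: export one subdirectory of existing pages.
;; "/papers/" and "www:papers;" are invented examples.
(http:export-url #u"/papers/"
                 :directory               ; serve a directory of static files
                 :recursive-p t           ; descend into subdirectories
                 :pathname "www:papers;") ; logical pathname of the file tree

Multiply that by every subdirectory on the site and you can see why I'd
rather not.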


*BUUUUUT*... it would be really nice if I could convince CL-HTTP to fall
back to a directory search after it fails to find a given page in its,
er, magical construction facility.  Then I could ease into this whole
CL-HTTP build-pages-on-the-fly concept.

Before I start digging into the code and trying to do this myself...

1)  Is this an existing feature I've missed? or

2)  Is this just a dumb idea, because of ___________________? or

3)  Is this a great idea that I should let somebody else implement,
because it's way too hard for a beginner?

Thanks.

Sorry if I'm being too verbose getting this set up...

