Re: RAPI Spec Comments
On Mon, 13 Dec 1999, Hugo Rabson wrote:
> On Monday, December 13, 1999 4:14 PM, Aaron Turner <aturner@linuxkb.org>
> wrote:-
> [...]
> > I'd like the RAPI to support ALL the features of the
> > web interface. People who write apps to use the RAPI don't have to use
> > them, but it would be nice to offer them.
>
> I agree: we should offer as large a subset as possible of the search
> engine's interactive capabilities.
>
> Is the search engine's web page and/or cgi script near completion?
Both are very near completion and currently work pretty well. I need to
fix a few things here and there and add some more small features, but it's
basically done. Once you send me that username info I'll create your
account and you can see it for yourself.
> If the search engine's "Search LinuxKB" web page is finished, I could
> document the form's POST parameters (you'll use a form, right?) & write a
> tcl wrapper so that a programmer can execute a search by calling
> "search-linuxkb.tcl" (or write their own code based on the API).
Sure.
> If the API specs are based on the cgi script, that'll kill two birds with
> one stone: programmers will have access to all the facilities afforded to
> 'human' web-searchers, and I'll be able to get to work on a shell script to
> call the search engine.
That would be the easiest thing to do, especially since someone will need
to develop an ht://dig wrapper for the RAPI to format the output according
to the RAPI spec. (That's what Search.pm is for the web interface, btw.)
I'd prefer it NOT be a CGI, but rather either PHP or mod_perl. CGIs cost
an additional fork() and I'm trying to keep everything very optimized for
the search engine.
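Just to make the "no extra fork" point concrete, here's a very rough sketch
of what a mod_perl wrapper for the RAPI could look like. The package name,
param names, and output are placeholders, not the agreed spec:

package LinuxKB::RAPI;
use strict;
use Apache::Constants qw(OK);
use CGI ();

sub handler {
    my $r = shift;
    my $q = new CGI;

    # Sanity-check the incoming params before handing anything to htsearch
    my $words = $q->param('words') || '';
    return OK unless length($words);

    # ... run htsearch and reformat its output per the RAPI spec ...
    $r->send_http_header('text/plain');
    $r->print("RAPI-formatted results would go here\n");
    return OK;
}
1;

Since the handler lives inside Apache, there's no exec of an external CGI
on every request.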
> So, which comes first? :) Do we write the cgi script & then document it for
> programmers, or do we come up with a good API & then write the cgi script to
> suit?
Depends on your personality, I suppose. I think the easiest thing to do is
use the existing Search.pm CGI params, or some variation on them, as the
RAPI. That way all you have to do is sanity-check them and pass them to
htdig, and you can concentrate on the RAPI output. Actually, this might be
a good time to explain to everyone how Search.pm works.
Basically, Search.pm is a mod_perl module/script called via the URL
/modperl/search from simple_form.php3 or advanced_form.php3. Either of
these forms passes a variety of information via CGI params (see the sketch
after the list), such as:
- words to be searched for
- form type (advanced/simple)
- current location (for search-from-here/drill-down searching)
- type of content to search (internal/external docs)
- type of search (logical and/or, boolean)
- etc.
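Inside Search.pm, pulling those in looks roughly like this (the param names
here are approximate; the real ones are whatever the forms actually send):

use CGI ();
my $query = new CGI;
my %params;
# Copy every CGI param the form sent into a plain hash
foreach my $p ($query->param) {
    $params{$p} = $query->param($p);
}
my $words    = $params{'words'};       # words to search for
my $formtype = $params{'formtype'};    # simple vs. advanced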
Search.pm then sets some shell environment variables which htsearch (the
htdig CGI search app) uses (I do this to save a fork() rather than using a
shell script to do it) and calls htsearch, specifying one of two
configuration files. These configuration files are part of htdig and
control various attributes of the output which relate back to the
simple/advanced search pages.
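Roughly, that step looks like this (the config file names and the htsearch
path are illustrative, not necessarily what's on the box):

# htsearch reads QUERY_STRING/REQUEST_METHOD just as Apache would set them
$ENV{'REQUEST_METHOD'} = 'GET';
$ENV{'QUERY_STRING'}   = 'words=' . CGI::escape($params{'words'});

# Pick the config that matches the form the user came from
my $config = ($params{'formtype'} eq 'advanced') ? 'advanced.conf'
                                                 : 'simple.conf';

# Backticks in list context give one output line per array element
my @htout = `/usr/local/bin/htsearch -c $config`;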
The output then gets sucked into an array, which I then loop through one
line at a time in Search.pm. htsearch uses some template files of its
own (wrapper.html and wrapper_adv.html) which I've modified with some HTML
comments <!-- xxx --> that Search.pm keys on and does substitution in
the output stream.
Some example code which illustrates this:
# Create the matches-per-page dialog when htsearch's template
# emits the <!-- matches_per_page --> marker
if ($htout[$htout_cnt] =~ /<!-- matches_per_page -->/) {
    print $htout[$htout_cnt];
    my $query = new CGI;
    print "Matches per page: ";
    # Build the pop-up menu, defaulting to whatever the user picked last
    print $query->popup_menu(-name    => 'matchesperpage',
                             -values  => ['10', '25', '50'],
                             -default => $params{'matchesperpage'});
    next;
}
I do some other things too. For example, I hack all the URLs that are hits
so that the words the user searched for get passed to render.php3. That
way render.php3 can highlight the words the user searched on when they
click on the link.
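In rough terms the URL hack does something like this (the param name
render.php3 actually expects may well be different):

# Tack the search words onto any hit URL that points at render.php3
if ($htout[$htout_cnt] =~ /href="([^"]*render\.php3[^"]*)"/i) {
    my $url  = $1;
    my $glue = ($url =~ /\?/) ? '&' : '?';
    my $new  = $url . $glue . 'searchwords=' . CGI::escape($params{'words'});
    $htout[$htout_cnt] =~ s/\Q$url\E/$new/;
}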
Search.pm also pulls in sidebar.html, header.php3, and categories.php3.
I've even updated the last two so that when you click on a new category
(drill down) or move back (drill up), it remembers your last search words.
Later on, I'll modify it so it remembers all your search settings. One of
these days I'd like people to be able to save their search settings in the
DB so they can have their own defaults.
Anyways, that's the .02 tour.
--
Aaron Turner, Core Developer http://vodka.linuxkb.org/~aturner/
Linux Knowledge Base Organization http://linuxkb.org/
Because world domination requires quality open documentation.
aka: aturner@vicinity.com, aturner@pobox.com, ion_beam_head@ashtech.net