NAME


AutoSearch -- a web-search tracking application

SYNOPSIS


AutoSearch [--stats] [--verbose] -n "Query Name" -s "query string" --engine engine
[--mail user@address] [--options "opt=val"]... [--filter "filter"] [--host host]
[--port port] [--userid bbunny --password c4rr0t5] [--ignore_channels KABC,KCBS,KNBC] qid

AutoSearch qid

AutoSearch --VERSION
AutoSearch --help
AutoSearch --man

DESCRIPTION


AutoSearch performs a web-based search and puts the result set in qid/index.html.
On subsequent runs (i.e., the second form above), AutoSearch determines what changes
(if any) occurred to the result set since the last run. These incremental changes are
recorded in qid/YYYYMMDD.html.

AutoSearch is well suited to running as a cron job, because all the input parameters
are saved in the web pages. AutoSearch can act as an automated query agent for a
particular search. The output files are designed as a set of web pages that make it
easy to display the result set with a web browser.

Example:

AutoSearch -n 'LSAM Replication'
-s '"lsam replication"'
-e AltaVista
replication_query

This query (which should be all on one line) creates a directory replication_query and
fills it with the fascinating output of the AltaVista query on "lsam replication", with
pages titled ``LSAM Replication''. (Note the quoting: the single quotes in '"lsam
replication"' are for the shell, the double quotes are for AltaVista to search for the
phrase rather than the separate words.)

A more complicated example:

AutoSearch -n 'External Links to LSAM'
-s '(link:www.isi.edu/lsam or link:www.isi.edu/~lsam) -url:isi.edu'
-e AltaVista::AdvancedWeb
-o coolness=hot

This query does an advanced AltaVista search and specifies the (hypothetical) ``coolness''
option to the search engine.

OPTIONS


"qid"
The query identifier specifies the directory in which all the files that relate to this
query and its search results will live. It can be an absolute path, or a relative path
from the current working directory. If the directory does not exist, it will be created
and a new search started.
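
For example, a first search with an absolute-path qid (the query name and string here
are illustrative; the command should be all on one line):

    AutoSearch -n 'Load Balancing'
        -s '"load balancing"'
        -e AltaVista
        /usr/local/htdocs/lsam/autosearch/load_balancing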

"--stats"
Show search statistics: the query string, number of hits, number of filtered hits,
filter string, number of suspended (deleted) hits, previous set size, current set
size, etc.

"-v" or "--verbose"
Verbose: output additional messages and warnings.

"-n" or "--qn" or "--queryname"
Specify the query name. The query name is used as a heading in the web pages, so it
should be a 'nice'-looking version of the query string.

"-s" or "--qs" or "--querystring"
Specify the query string. The query string is the character string which will be
submitted to the search engine. You may include special characters to group or to
qualify the search.

"-e" or "--engine"
Specify the search engine. The query string will be submitted to the user specified
search engine.

In many cases there are specialized versions of search engines. For example,
AltaVista::AdvancedWeb and AltaVista::News allow more powerful web searches and Usenet
news searches. See AltaVista or the man page for your search engine for details about
specialized variations.

"--listnewurls"
In addition to all the normal file maintenance, print all new URLs to STDOUT, one per
line.
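
This makes it easy to feed new hits to other tools. A minimal sketch (the output file
name is illustrative):

    AutoSearch --listnewurls replication_query >> new_urls.txt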

"-o" or "--options"
Specify the query options. The query options will be submitted to the user-specified
search engine along with the query string. This feature permits modification of the
query string for a specific search engine or option. More than one query option may
be specified.

Example: "-o what=news" causes AltaVista to search Usenet. Although this works, the
preferred mechanism in this case would be "-e AltaVista::News" or "-e
AltaVista::AdvancedNews". Options are intended for internal or expert use.

"-f" or "--uf" or "--urlfilter"
This option specifies a regular expression which will be compared against the URLs of
any results; if they match the case-insensitive regular expression, they will be
removed from the hit set.

Example: "-f '.*\.isi\.edu'" avoids all of ISI's web pages.

"--cleanup i"
Delete all traces of query results from more than i days ago. If --cleanup is given,
all other options other than the qid will be ignored.
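
For example, to delete results older than 30 days (the number of days is illustrative):

    AutoSearch --cleanup 30 replication_query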

"--cmdline"
Reconstruct the complete command line (AutoSearch and all its arguments) that was used
to create the query results. The command line will be shown on STDERR. If --cmdline is
given, all other options other than the qid will be ignored.
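
For example:

    AutoSearch --cmdline replication_query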

"--mail user@address" or "-m user@address"
After the search is complete, send email to that user listing the NEW results. The
email is in HTML format. Requires the Email::Send and related modules. If you send
email through an SMTP server, you must set the environment variable SMTPSERVER to your
server name or IP address. If your SMTP server requires a password, you must also set
the environment variables SMTPUSERNAME and SMTPPASSWORD. If you send email via
sendmail, you should set the environment variable SENDMAIL if the sendmail executable
is not in the path.
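
A sketch of an SMTP setup in a Bourne-style shell (the server name and addresses are
illustrative):

    SMTPSERVER=smtp.example.com; export SMTPSERVER
    SMTPUSERNAME=bbunny; export SMTPUSERNAME
    SMTPPASSWORD=c4rr0t5; export SMTPPASSWORD
    AutoSearch --mail user@example.com replication_query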

"--emailfrom user@address"
If your outgoing mail server rejects email from certain users, you can use this
argument to set the From: header.

"--userid bbunny"
If the search engine requires a login/password (e.g. Ebay::Completed), use this.

"--password Carr0t5"
If the search engine requires a login/password (e.g. Ebay::Mature), use this.
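
For example, a search against an engine that requires credentials might look like this
(the qid and credentials are illustrative; the command should be all on one line):

    AutoSearch --userid bbunny --password c4rr0t5
        -e Ebay::Completed
        ebay_query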

DESCRIPTION


AutoSearch submits a query to a search engine, produces HTML pages that reflect the set of
'hits' (filtered search results) returned by the search engine, and tracks these results
over time. The URL and title are displayed in qid/index.html; the URL, title, and
description are displayed in the 'weekly' files.

To organize these results, each search's results are placed in a query information
directory (qid). The directory becomes the search results' 'handle', an easy way to track
a set of results. Thus a qid of "/usr/local/htdocs/lsam/autosearch/load_balancing" might
locate the results on your web server at "http://www.isi.edu/lsam/autosearch/load_balancing".

Inside the qid directory you will find files relating to this query. The primary file is
index.html, which reflects the latest search results. Every unfiltered hit for every
search is stored in index.html. When a hit is no longer found by the search engine, it is
removed from index.html. As new results for a search are returned from the search engine,
they are placed in index.html.
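
A sketch of the resulting layout (the dates are illustrative):

    load_balancing/        the qid directory
        index.html         cumulative results; also stores the saved query parameters
        20240101.html      per-run 'weekly' change files, named YYYYMMDD.html
        20240108.html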

At the bottom of index.html, there is a heading "Weekly Search Results", which is updated
each time the search is submitted (see "AUTOMATED SEARCHING"). The list of search runs is
stored in reverse chronological order. Runs which provide no new information are
identified with

No Unique Results found for search on <date>

Runs which contain changes are identified by

Web search results for search on <date>

which is linked to a page detailing the changes from that run.

Detailed search results are noted in weekly files. These files are named YYYYMMDD.html
and are stored in the qid directory. The weekly files include the URL, title, and
description (if available). The title is a link to the original web page.

AUTOMATED SEARCHING


On UNIX-like systems, cron(1) may be used to establish periodic searches and the web pages
will be maintained by AutoSearch. To establish the first search, use the first example
under SYNOPSIS. You must specify the qid, query name and query string. If any of the
items are missing, you will be interactively prompted for the missing item(s).

Once the first search is complete you can re-run the search with the second form under
SYNOPSIS.
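
For example, once the first search has created replication_query, each later run needs
only the qid:

    AutoSearch replication_query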

A cron entry like:

0 3 * * 1 /nfs/u1/wls/AutoSearch.pl /www/div7/lsam/autosearch/caching

might be used to run the search each Monday at 3:00 AM. The query name and query string
may be repeated, but they will not be used. This means that with a cron line like:

0 3 * * 1 /nfs/u1/wls/AutoSearch.pl /www/div7/lsam/autosearch/caching -n caching -s caching

a whole new search series can be originated by

rm -r /www/div7/lsam/autosearch/caching

However, the only reason to start a new search series would be to throw away the old
weekly files.

We don't recommend running searches more than once per day, but if you do, the per-run
files will be updated in place. Any changes are added to the page with the comment
"Recently Added:", and deletions are indicated with "Recently Suspended:".

CHANGING THE LOOK OF THE PAGES


The basic format of these two pages is simple and customizable. One requirement is that
the basic structure remain unchanged. HTML comments are used to identify sections of the
document. Almost everything can be changed except for the strings which identify the
section starts and ends.

Noteworthy tags and their meaning:

<!--Top-->.*<!--/Top-->
The text contained within this tag is placed at the top of the output
page. If the text contains AutoSearch WEB Searching, then the query name
will replace it. If the text does not contain this magic string and it is
the first ever search, the user will be asked for a query name.

<!--Query{.*}/Query-->
The text contained between the braces is the query string. This is how
AutoSearch maintains the query string. You may edit this string to change
the query string, but only in qid/index.html (see the example after this
list). The text "ask user" is special and will force AutoSearch to
request the query string from the user.

<!--SearchEngine{.*}/SearchEngine-->
The text contained between the braces is the search engine. Other engines
supported include HotBot and Lycos. You may edit this string to change the
engine used, but only in qid/index.html. The text "ask user" is special
and will force AutoSearch to request the search engine from the user.

<!--QueryOptions{.*}/QueryOptions-->
The text contained between the braces specifies a query option. Multiple
occurrences of this tag are allowed, to specify multiple options.

<!--URLFilter{.*}/URLFilter-->
The text contained between the braces is the URL filter. This is how
AutoSearch maintains the filter. Again, you may edit this string to change
the filter, but only in qid/index.html. The text "ask user" is special
and will force AutoSearch to ask the user (STDIN) for the filter. When
setting up the first search, you must edit first_index.html, not
qid/index.html. The URL filter is a standard perl5 regular expression.
URLs which match are removed from the hit set; URLs which do not match
will be kept.

<!--Bottom-->.*<!--/Bottom-->
The text contained within this tag is placed at the bottom of the output
page. This is a good place to put navigation, page owner information,
etc.
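
As an example, here is roughly how these tags might appear in a qid/index.html created
for the 'LSAM Replication' query above (the HTML inside the Top and Bottom tags is
illustrative; AutoSearch's exact markup may differ):

    <!--Top--><h1>LSAM Replication</h1><!--/Top-->
    <!--Query{"lsam replication"}/Query-->
    <!--SearchEngine{AltaVista}/SearchEngine-->
    <!--URLFilter{.*\.isi\.edu}/URLFilter-->
    <!--Bottom--><address>page owner information</address><!--/Bottom-->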

The remainder of the tags fall into triplets of ~Heading, ~Template, and ~, where ~ is
Summary, Weekly, Appended, or Suspended. The sub-sections appear in the order given. To
produce a section, AutoSearch outputs the heading, the template, the section begin tag,
n copies of the formatted data, and the section end tag. The tags and their functions
are:

~Heading The heading tag identifies the heading for a section of the output file.
The SummaryHeading is for the summary portion, etc. The section may be
empty (e.g., Suspended) and thus no heading is output.

~Template The template tag identifies how each item is to be formatted. Simple text
replacement is used to change the template into the actual output text.
The text to be replaced is noted in ALLCAPS.

~ This tag is used to locate the section (Summary, Weekly, etc.). This
section represents the actual n-items of data.
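
As a purely illustrative sketch (the triplet structure follows the pattern above, but
the exact markup AutoSearch emits may differ), an Appended triplet could look like:

    <!--AppendedHeading--><h2>Recently Added:</h2><!--/AppendedHeading-->
    <!--AppendedTemplate--><a href="URL">TITLE</a> DESCRIPTION<!--/AppendedTemplate-->
    <!--Appended-->
    ... n formatted items ...
    <!--/Appended-->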

You can edit these values in the qid/index.html page of an existing search. The file
first_index.html (in the directory above qid) will be used as a default template for new
queries.

Examples of these files can be seen in the pages under
"http://www.isi.edu/lsam/tools/autosearch/", or in the output generated by a new
AutoSearch.
