Announcement: "SurfCompanion web-agent" in NetRexx


Dear NetRexxers,

I would like to announce an early version of my SurfCompanion to this group. The FREE project is written in NetRexx, with
about 300 kB (roughly 8000 lines) of source code.

The SurfCompanion is:

           a surfing web-agent embedded into a WWW server, with searching capability!
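
To give a feel for what such an agent does at its core, here is a minimal NetRexx sketch that fetches a single page over HTTP, using the Java I/O and networking classes that NetRexx makes available without explicit import statements. It is NOT taken from the SurfCompanion source; the class name PageFetcher is made up for illustration.

    /* PageFetcher.nrx -- illustrative sketch only, not SurfCompanion code */
    class PageFetcher
      method main(args=String[]) static
        do
          u = URL('http://SurfCompanion.wwz.de')                  -- page to fetch
          rdr = BufferedReader(InputStreamReader(u.openStream()))
          loop forever                                            -- read until end of stream
            line = rdr.readLine()
            if line = null then leave
            say line                          -- a real agent would store and post-process the page
          end
          rdr.close()
        catch e=IOException
          say 'Fetch failed:' e.getMessage()
        end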

Please download it from "http://SurfCompanion.wwz.de" and check it out. The distribution file is SUCO093.ZIP (300 kB,
including documentation). It is feedbackware (mailto:[hidden email]).

Have fun

Kai Schmidt


P.S. I have included a feature list below:

Features of the Java-SurfCompanion

Buzzwords: web agent, personal WWW server, robot, surf assistant, off-line reader, Internet tool,
local search engine, active intranet proxy

Features:
Supports the HTTP, NNTP (news) and FTP protocols (and sends mail via SMTP)
Prepares downloaded documents for local, off-line usage
Runs at given times and for given durations
Upload and download, including subdirectories
Control and configuration through your browser
Logging of all actions taken
Simple installation by copying; needs Java JDK 1.1 installed
Client/server application; both parts can run on one computer, or the WWW server can run remotely
Includes a personalized WWW server (local or remote) to display results or give instructions
Includes searching capabilities and marks found results
Multithreaded, one thread per remote URL
Authentication
Highly configurable, with defaults
Filters pages by server, content, MIME type, size, date, depth and path
Greps and marks within a page when the criteria of the search-word list are fulfilled (text search; see the sketch after this list)
International, local date/time formats
Written in NetRexx (a Java-compatible dialect)
Shareware; beta versions are FREE
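
To give an idea of the text-search step, here is a small sketch of the grep part using the built-in Rexx string methods. Again, this is NOT the SurfCompanion source (which also marks the matches inside the page); the class name KeywordFilter and the sample strings are made up for illustration.

    /* KeywordFilter.nrx -- illustrative sketch only, not SurfCompanion code */
    class KeywordFilter

      /* count how many words of a blank-delimited search-word list occur in a page */
      method hits(page=Rexx, searchwords=Rexx) static returns Rexx
        count = 0
        loop i = 1 to searchwords.words()
          if page.upper().pos(searchwords.word(i).upper()) > 0 then count = count + 1
        end i
        return count

      method main(args=String[]) static
        page = 'The SurfCompanion is a surfing web-agent written in NetRexx.'
        say 'search words found:' hits(page, 'NetRexx robot agent')   -- says 2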

Use it for:

Off-line reading of WWW pages and newsgroups (it recursively follows wanted links)
Copying parts of a server to read quietly on your laptop later on, or to share with others
Indexing and searching your internal documents
Periodically mirroring your favourite sites
Normal web-server operation (intranet)
Downloading huge files from a remote site automatically at some convenient time (a small sketch of this follows the list)
Backing up or archiving sites periodically
Uploading files or directories to a server
Finding dead links or unused documents
Checking your remote WWW server for responsiveness (pages and services, statistics)
Copying and studying the setup of interesting HTML pages from the web
Storing only pages that contain given keywords, or searching them later on
Using it as a company proxy, intelligently fed with self-selected items
and more.
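
For the scheduled downloads, the idea is simply to wait until the convenient time and then start the transfer. Here is a minimal sketch of that idea; the class name NightFetch and the fixed two-hour delay are made up (the real SurfCompanion is configured through your browser), and the actual transfer would reuse a fetcher like the first sketch above.

    /* NightFetch.nrx -- illustrative sketch only, not SurfCompanion code */
    class NightFetch
      method main(args=String[]) static
        minutes = 120                               -- example delay before the transfer
        say Date().toString()': waiting' minutes 'minutes before downloading...'
        do
          Thread.sleep(minutes * 60 * 1000)         -- Thread.sleep expects milliseconds
        catch e=InterruptedException
          say 'Wait interrupted:' e.getMessage()
        end
        say Date().toString()': starting the transfer now'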

Do you have additional wishes, requests or proposals? Please don't hesitate to tell me; just write an email to
[hidden email]!
