[Gambas-user] Offline documentation is out-of-date

ron ronstk at ...239...
Sun Jun 4 15:19:42 CEST 2006


On Sunday 04 June 2006 06:31, Rob Kudla wrote:
> On Sat June 3 2006 21:39, Christopher Brian Jack wrote:
> > How hard would it be to produce an application that updated
> > the offline documentation to match updates to the online
> > documentation?  Digests like MD5 or SHA1 would be sufficient
> > for detecting file changes and the plus would be that
> 
> While the old wiki stored its data in files, the new one is 
> MySQL-based.  Further, as with all wikis, the offline version 
> needs to contain HTML markup, while the data stored on the 
> server side contains wiki code... so there needs to be a program 
> that generates the HTML out of the wiki code.
> 
> That program may be as simple as wget --mirror, but ideally you 
> want something a little faster and nicer to the web server.  The 
> "publish" program provided that for TWiki; whether the current 
> wiki Benoit wrote has such a utility, I don't know, but we can 
> always use the wget method once in a while in a pinch (and if I 
> were to catch people doing that a lot during the day, I'd have 
> to put IP bandwidth restrictions in place.)
> 
> Rob


Hi Rob

Well, I did use HTTrack to get the English version from the site.
Because the pages link to the other-language versions, it followed
those links as well. The result was over 100 MB of transfer and,
after cleanup, 40 MB of data.
I stopped myself for the reason you mention about bandwidth.
This could be done at most once every two weeks per person,
but I'm not alone.

The script for reading the MySQL database, together with an SQL dump
of the database, is more interesting. The only bad part for some is
that they need a local webserver to present it. This could be a Perl
script or even a Gambas program. (doc.cgi in the source tarball?)

This program/script already exists, since we can read the doc online now.
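For the presentation side, the job is what Rob described: generate HTML out of the stored wiki code. A minimal sketch of such a converter (the two markup rules here are placeholders of my own; the real syntax is whatever doc.cgi implements):

```python
import re

def wiki_to_html(wiki_text):
    """Render a page of (hypothetical) wiki markup as HTML.

    The rules below -- '''bold''' and ''italic'' -- are stand-ins to
    show the shape of such a converter; the actual Gambas wiki syntax
    would have to be taken from doc.cgi.
    """
    # Handle the three-quote form first so it is not eaten
    # by the two-quote pattern.
    html = re.sub(r"'''(.+?)'''", r"<b>\1</b>", wiki_text)
    html = re.sub(r"''(.+?)''", r"<i>\1</i>", html)
    return "<html><body>%s</body></html>" % html
```

Pages rendered this way (or dumped once to static HTML) could then be served by any small local webserver, which is the part people may find inconvenient.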
 

