• Way to download contents of entire directory...

    From FlipChip(tm)@21:1/5 to All on Sun Mar 3 02:14:31 2019
    I'm trying to get my tired head around how to download every file from
    some gopherhole dir.

lynx -source gopher://something.something/file > /path/to/local/file

    this downloads a file but isn't the best option.

    Before starting to tinker I thought to ask if there is something for this
    kind of thing.

  • From met@ph.or@21:1/5 to All on Sun Mar 3 04:35:10 2019
FlipChip(tm) wrote:
> I'm trying to get my tired head around how to download every file from
> some gopherhole dir.

The lynx browser's '-traversal' option is HTTP-only; maybe curl supports crawling gopher?
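
As an aside, curl can fetch individual gopher:// URLs, though as far as I
know it has no crawl/recursion mode. A quick illustration, reusing the
made-up host from the script below:

--
curl "gopher://frog.bog/1/somedir" > menu.txt
--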

    Could do it with a shell script (snarf is a URL fetcher):

--
#!/bin/sh
# Fetch every file listed in a gopher menu, one item type at a time.
DLDIR="$HOME/foo"            # local download directory
TYPES="0 9 I s g"            # item types: text, binary, image, sound, gif
GHOST="gopher://frog.bog"
GDIR="somedir"

cd "$DLDIR" || exit 1
for T in $TYPES; do
    # Fetch the menu (type 1 selector), keep lines of type $T, swap tabs
    # for colons, and cut out the selector (second field).
    for F in $(snarf "$GHOST/1/$GDIR" - | grep "^$T" | tr '\t' ':' | cut -d ':' -f2); do
        snarf "$GHOST/$T/$F"
    done
done
--

    Not elegant but it works.

  • From FlipChip(tm)@21:1/5 to met on Sun Mar 3 12:06:32 2019
    On Sun, 03 Mar 2019 04:35:10 +0000, met wrote:

> The lynx browser's '-traversal' option is HTTP-only; maybe curl
> supports crawling gopher?
>
> Could do it with a shell script (snarf is a URL fetcher): [...]
>
> Not elegant but it works.

Thanks for the reply!

lynx -source does support gopher://; I have tested it and it does what
it's supposed to do. The example I mentioned dumps a gopherhole file to a
local file. Of course -crawl doesn't work with it, methinks.

I'll take a look at your script later on; at first glance it looks like
something that can be used.

It's just too much klikety-klik to download hundreds of files from a
gopherhole without this kind of thingie.

  • From met@ph.or@21:1/5 to All on Sun Mar 3 17:26:38 2019
On Sun, 03 Mar 2019 12:06:32 +0000, FlipChip(tm) wrote:

> lynx -source does support gopher://; I have tested it and it does what
> it's supposed to do. [...]
>
> It's just too much klikety-klik to download hundreds of files from a
> gopherhole without this kind of thingie.

No problem. Regarding using lynx, you might want to test -source on
various servers; e.g. on gopher://sdf.org (running gophernicus), -source
returns HTML source, as lynx renders gopher pages into web pages for
display purposes. Using -dump and -listonly would at least give you a
nice list. lynx requires explicit use of type 0 on directories in order
to return raw gopher code: gopher://sdf.org/0/ rather than
gopher://sdf.org/ . Also, with lynx you have to use '%7e' for
sites/directories that use the '~' character; see gopher://devio.us/ for
example. For these reasons I tend to use snarf for simple gopher code
retrieval.
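
To illustrate the above (the '%7esomeuser' path is just a hypothetical
example):

--
# Force a type 0 selector so lynx returns the raw gopher menu:
lynx -source "gopher://sdf.org/0/"

# Dump a plain list of the URLs a menu links to:
lynx -dump -listonly "gopher://sdf.org/1/"

# '~' must be percent-encoded as %7e for lynx:
lynx -dump -listonly "gopher://devio.us/1/%7esomeuser"
--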

You might consider re-posting this on the gopher project mailing list, or
inquire on #gopherproject or #bitreich-en at irc.freenode.net (IRC).
Certainly floodgap.com is using some sort of crawler/indexer for the
Veronica search engine.

    Cheers,
    meta4

  • From Mateusz Viste@21:1/5 to All on Mon Mar 4 21:59:05 2019
On Sun, 03 Mar 2019 02:14:31 +0000, FlipChip(tm) wrote:
> I'm trying to get my tired head around how to download every file from
> some gopherhole dir.

    Gopherus trunk version supports downloading the content of an entire
    directory (F10 key).

    homepage: http://gopherus.sourceforge.net

The trunk version can be downloaded here ("download snapshot"):
https://sourceforge.net/p/gopherus/code/HEAD/tree/trunk/

    HTH,

    Mateusz
    --
    gopher://gopher.viste.fr

  • From FlipChip(tm)@21:1/5 to Mateusz Viste on Tue Mar 5 11:50:11 2019
On Mon, 04 Mar 2019 21:59:05 +0000, Mateusz Viste wrote:

> Gopherus trunk version supports downloading the content of an entire
> directory (F10 key).

Oh, that's good to know!

I mostly use Gopherus, in fact. Thanks for the info!

  • From FlipChip(tm)@21:1/5 to met on Tue Mar 5 11:46:51 2019
On Sun, 03 Mar 2019 17:26:38 +0000, met wrote:

> You might consider re-posting this on the gopher project mailing list, or
> inquire on #gopherproject or #bitreich-en at irc.freenode.net (IRC).
> Certainly floodgap.com is using some sort of crawler/indexer for the
> Veronica search engine.

Was going to take a look at this ... but ALAS!

    http://www.pavuk.org/man.html

  • From met@ph.or@21:1/5 to All on Tue Mar 5 06:56:41 2019
On 3/5/19 4:50 AM, FlipChip(tm) wrote:

> Oh, that's good to know!
>
> I mostly use Gopherus, in fact. Thanks for the info!


Nice; between this and pavuk it sounds like recursive indexing/mirroring
for Gopher is covered.

  • From FlipChip(tm)@21:1/5 to Mateusz Viste on Fri Mar 8 11:27:42 2019
On Mon, 04 Mar 2019 21:59:05 +0000, Mateusz Viste wrote:

> Gopherus trunk version supports downloading the content of an entire
> directory (F10 key).

Hi Mateusz!

I just downloaded and tried the trunk version of Gopherus. But alas! It
core dumps when trying to download a directory with F10.

Tried it on your FAQs, in fact :)

I'm running Pop_OS! 18.10 without much tweaking.

Just wanted to let you know.

  • From Mateusz Viste@21:1/5 to All on Fri Mar 8 12:19:38 2019
On Fri, 08 Mar 2019 11:27:42 +0000, FlipChip(tm) wrote:
> I just downloaded and tried the trunk version of Gopherus. But alas! It
> core dumps when trying to download a directory with F10.
>
> Tried it on your FAQs, in fact :)
>
> I'm running Pop_OS! 18.10 without much tweaking.

That's a bummer.

I tested it just now on the same FAQ page and it works for me... Would
you be willing to provide me with the two things below so I could look
into it?
- your compiled debug-enabled binary
- the resulting core file

    To compile a debug version, you will have to edit the Makefile.lin file
    and replace this:

    CFLAGS = -std=gnu89 -O3 -Wall -Wextra -pedantic

    by this:

    CFLAGS = -std=gnu89 -O0 -g -Wall -Wextra -pedantic


    then recompile it:
    make -f Makefile.lin clean
    make -f Makefile.lin

    set your core limit to no limit:
    ulimit -c unlimited

    and set the core pattern if needed:
    sysctl kernel.core_pattern=core


    Once all the above is done, you should see a 'core' file appearing in
    gopherus directory as soon as it crashes.
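
If it helps, this is roughly how I would then inspect the core file
(assuming the binary produced by Makefile.lin is named 'gopherus'):

--
gdb ./gopherus core     # load the binary together with the core file
bt                      # at the (gdb) prompt: print the crash backtrace
--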

    BTW, what interface do you use? curses or SDL? Is the core dump happening
    with both?

    Mateusz
    --
    gopher://gopher.viste.fr

  • From FlipChip(tm)@21:1/5 to Mateusz Viste on Fri Mar 8 18:19:56 2019
    On Fri, 08 Mar 2019 12:19:38 +0000, Mateusz Viste wrote:

> Would you be willing to provide me with the two things below so I could
> look into it?
> - your compiled debug-enabled binary
> - the resulting core file

Halt the debugging. Let's debug the user first :)

I tried again INSIDE the directory, rather than while hovering over the
directory in the menu.

Somehow I expected it to work like that; individual files act this way,
so it seemed logical to me.

It works, that's the point.

Would it be a big thing to change it so one can hit recursive download
while over a directory in the menu? It would also be nice if a path could
be entered, just like with a single file.

Right now stuff is poured into ~/ it seems.

  • From Mateusz Viste@21:1/5 to All on Fri Mar 8 20:45:20 2019
On Fri, 08 Mar 2019 18:19:56 +0000, FlipChip(tm) wrote:
> I tried again INSIDE the directory, rather than while hovering over the
> directory in the menu.
>
> It works, that's the point.

Cool :)
But still, Gopherus was crashing, and that's never good. Now I understand
why, and I have fixed it.

> Would it be a big thing to change it so one can hit recursive download
> while over a directory in the menu?

Too complicated; won't do, sorry.
Note that Gopherus does not do a "recursive" download (i.e. it does not
recurse into subdirectories): F10 only downloads the files that are
present in the current gopher path.

> It would also be nice if a path could be entered, just like with a
> single file.

Agreed, and actually that's planned. I also plan to handle gracefully the
situation where Gopherus cannot download a file because said file already
exists locally.

> Right now stuff is poured into ~/ it seems.

It goes to your 'current directory', i.e. the directory you were in when
you launched Gopherus.

    Thanks for your feedback!

    Mateusz
    --
    gopher://gopher.viste.fr
