I'm trying to get my tired head around how to download every file from
some gopherhole dir.

lynx -source gopher://something.something/file > /path/to/local/file

This downloads a single file, but it isn't the best option.
Before starting to tinker I thought I'd ask if there is already something
for this kind of thing.
The lynx browser's '-traversal' option is HTTP-only; maybe curl
supports crawling gopher?
Could do it with a shell script (snarf is a URL fetcher):
--
#!/bin/sh
DLDIR="$HOME/foo"            # where to put the downloads
TYPES="0 9 I s g"            # gopher item types to fetch
GHOST="gopher://frog.bog"
GDIR="somedir"
cd "$DLDIR" || exit 1
for T in $TYPES ; do
  # fetch the menu, keep lines of type $T, and pull out the selector field
  for F in $(snarf "$GHOST/1/$GDIR" - | grep "^$T" | tr '\t' ':' | cut -d ':' -f2) ; do
    snarf "$GHOST/$T/$F"
  done
done
--
Not elegant but it works.
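For reference, the grep/cut pipeline above works because every line of a
gopher menu has the form '<type char><display string> TAB <selector> TAB
<host> TAB <port>'. A self-contained sketch on made-up menu data (no server
needed; the selectors and host are invented for illustration):

```shell
#!/bin/sh
# Fake gopher menu output, as a directory fetch would return it;
# the fields are separated by real tab characters.
menu=$(printf '0About this hole\t/about.txt\tfrog.bog\t70\n9funny.zip\t/files/funny.zip\tfrog.bog\t70\n1subdir\t/subdir\tfrog.bog\t70\n')

# Keep only type-0 (plain text) entries and print their selectors.
printf '%s\n' "$menu" | grep '^0' | cut -f2
# prints: /about.txt
```

cut defaults to tab as its field separator, so cutting on the raw menu
works without the tr '\t' ':' step; the script above only needs tr because
it splits on ':' instead.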
On Sun, 03 Mar 2019 04:35:10 +0000, met wrote:
Could do it with a shell script (snarf is a URL fetcher):
[script snipped]
Not elegant but it works.
Thanks for the reply!
lynx -source does support gopher:// - I have tested it and it does what
it's supposed to do; the example I mentioned dumps a gopherhole file to a
local file.
Of course -crawl doesn't work with it, methinks.
I'll take a look at your script later on; at first glance it looks like
something that can be used.
It's just too kliketiklik to download hundreds of files from a gopherhole
without this kind of thingie.
On Sun, 03 Mar 2019 02:14:31 +0000, FlipChip(tm) wrote:
I'm trying to get my tired head around how to download every file from
some gopherhole dir.
Gopherus trunk version supports downloading the content of an entire directory (F10 key).
homepage: http://gopherus.sourceforge.net
Trunk version can be downloaded here ("download snapshot"): https://sourceforge.net/p/gopherus/code/HEAD/tree/trunk/
HTH,
Mateusz
On Sun, 03 Mar 2019 04:35:10 +0000, met wrote:
[earlier quotes snipped]
Thanks for the reply!
Lynx -source does support gopher:// I have tested it and it does what
it supposed to do, example I mentioned dumps gopherhole file to local
file.
Of course -crawl doesn't work with it, me thinks.
I take a look at you script later on, from first glance it looks like
it's something that can be used.
It's just too kliketiklik to download hundreds of files from gopherhole
without this kind of thingie.
No problem. Regarding lynx, you might want to test -source on various
servers; e.g. on gopher://sdf.org (running gophernicus), -source returns
HTML source, since lynx renders gopher menus into web pages for display
purposes. Using -dump with -listonly would at least give you a nice list.
lynx requires explicit use of type 0 on directories in order to return raw
gopher code: gopher://sdf.org/0/ rather than gopher://sdf.org/ .
Also, with lynx you have to use '%7e' for sites/directories that use the
'~' character; see gopher://devio.us/ for example. For these reasons I
tend to use snarf for simple gopher code retrieval.
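The '~' caveat is plain percent-encoding; a minimal sketch of rewriting
such a URL before handing it to lynx (the path here is made up for
illustration):

```shell
# Percent-encode '~' as %7e so lynx accepts the gopher URL.
url='gopher://devio.us/1/~user/phlog'
enc=$(printf '%s' "$url" | sed 's/~/%7e/g')
echo "$enc"
# prints: gopher://devio.us/1/%7euser/phlog
```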
You might consider re-posting this on the gopher project mailing list, or
inquiring on #gopherproject or #bitreich-en at irc.freenode.net (IRC).
Certainly floodgap.com is using some sort of crawler/indexer for the
Veronica search engine.
Cheers,
meta4
On Mon, 04 Mar 2019 21:59:05 +0000, Mateusz Viste wrote:
On Sun, 03 Mar 2019 02:14:31 +0000, FlipChip(tm) wrote:
I'm trying to get my tired head around how to download every file from
some gopherhole dir.
Gopherus trunk version supports downloading the content of an entire
directory (F10 key).
homepage: http://gopherus.sourceforge.net
Trunk version can be downloaded here ("download snapshot"):
https://sourceforge.net/p/gopherus/code/HEAD/tree/trunk/
HTH,
Mateusz
Oh. That's good to know!
I use mostly Gopherus in fact. Thanks for the info!
On Sun, 03 Mar 2019 02:14:31 +0000, FlipChip(tm) wrote:
I'm trying to get my tired head around how to download every file from
some gopherhole dir.
Gopherus trunk version supports downloading the content of an entire directory (F10 key).
homepage: http://gopherus.sourceforge.net
Trunk version can be downloaded here ("download snapshot"): https://sourceforge.net/p/gopherus/code/HEAD/tree/trunk/
HTH,
Mateusz
Just downloaded and tried the trunk version of Gopherus. But alas! It core
dumps when trying to download a directory with F10.
Tried your FAQs in fact :)
I'm running Pop_OS! 18.10 without much tweaking.
On Fri, 08 Mar 2019 11:27:42 +0000, FlipChip(tm) wrote:
Just downloaded and tried trunk version of Gopherus. But alas! It core
dumps if trying to download a directory with F10.
Tried your FAQ's in fact :)
I'm running Pop_OS! 18.10 without much tweaking.
That's a bummer.
I tested it right now on the same FAQ page - works for me... Would you
be willing to provide me with the two things below so I could look into it?
- your compiled debug-enabled binary
- the resulting core file
To compile a debug version, you will have to edit the Makefile.lin file
and replace this:
CFLAGS = -std=gnu89 -O3 -Wall -Wextra -pedantic
with this:
CFLAGS = -std=gnu89 -O0 -g -Wall -Wextra -pedantic
then recompile it:
make -f Makefile.lin clean
make -f Makefile.lin
set your core limit to no limit:
ulimit -c unlimited
and set the core pattern if needed:
sysctl kernel.core_pattern=core
Once all the above is done, you should see a 'core' file appearing in the
gopherus directory as soon as it crashes.
BTW, what interface do you use? curses or SDL? Is the core dump
happening with both?
Mateusz
I tried again INSIDE the directory rather than while over the directory
in the menu.
It works - that's the point.
Would it be a big thing to change it so that one can hit recursive download
while over the directory in the menu?
Also it would be nice if the path could be entered,
just like with a single file.
Now stuff is poured into ~/ it seems.