We still run it (since the 90s), but all our attempts in the past couple
of years to move the spool into INN, which is IPv6 and 64-bit capable,
have thus far failed: it bogs down, and trying to move nearly a TB of data
would take months, as there is no apparent way to convert DNews's spool
into a format INN understands, so we have shelved the idea. Not too
worried, since IPv4, despite exhaustion, will be around for a very long
time yet anyway.
Hi,
noel wrote on 04/08/2022 at 11:08:
We still run it (since the 90s), but all our attempts in the past couple
of years to move the spool into INN, which is IPv6 and 64-bit capable,
have thus far failed: it bogs down, and trying to move nearly a TB of data
would take months, as there is no apparent way to convert DNews's spool
into a format INN understands, so we have shelved the idea. Not too
worried, since IPv4, despite exhaustion, will be around for a very long
time yet anyway.
The only simple solution would be to find a way to feed all articles from
the DNews server to the INN 2 server.
Or with suck on the new server?
<https://netwinsite.com/dnews/faq3.htm#10>
I had never used suck, but there may be some users here.
I opted to use multiple instances of pullnews to initially seed the spool. It crashed a lot and had difficulty with very large groups.
After doing as much
as I could with pullnews I started using suck since it uses the history database and deduplicates before downloading every article.
Hi Jesse,
I opted to use multiple instances of pullnews to initially seed the spool. It
crashed a lot and had difficulty with very large groups.
Do you remember the errors you got?
They are maybe worthwhile fixing in pullnews.
After doing as much
as I could with pullnews I started using suck since it uses the history
database and deduplicates before downloading every article.
"pullnews -O" does that too.
It could be related to the server on the other end (pulling from a commercial entity). The one situation that happened a lot, especially with large groups, was at some point in the session pullnews would just start spewing x's to the terminal in a loop and not stop until killed. It never gave me an error. Killing and resuming always seemed to work.
I did a few comparisons with pullnews and suck; even when using
the -O flag with pullnews, suck is a lot faster in the overall runtime of
pulling an entire group (tested with the same group on empty servers).
Internally I don't know how the NNTP operations compare, but pullnews
wasn't as efficient: suck's method of building the list of articles,
deduping, grabbing all articles, and posting as serial operations got
the job done faster.
I've just had a glance at suck source code. It directly runs the native
C function provided by INN (dbzexists) on the history file. No wonder
it is more efficient than pullnews which just runs an NNTP command
(STAT) on the local news server...
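For hand comparisons, the local history lookup that suck performs in C can be approximated from the shell with INN's grephistory tool. A sketch, assuming a standard INN installation; the message-ID is a placeholder:

```shell
# Check whether an article is already in INN's history database, as suck
# does internally via dbzexists, instead of a round-trip NNTP STAT.
# grephistory exits non-zero when the message-ID is unknown.
if grephistory '<placeholder-id@example.invalid>' >/dev/null 2>&1; then
    echo "already have it, skip download"
else
    echo "unknown article, fetch it"
fi
```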
Thanks again for your valuable bug report!
Any other issue you remember with pullnews?
On 04/08/2022 at 11:08, noel wrote:
format INN understands, so we have shelved the idea. Not too worried,
since IPv4, despite exhaustion, will be around for a very long time yet
anyway.
Not so long. My Linux User Group's ISP (RED by SFR) just switched our box
to full IPv6 - no more usable IPv4 address for servers :-( - and no
warning :-(
jdd
We tried suck; it's the only way they might get into INN. That's where
it's bogging down.
Hi Noel,
We tried suck; it's the only way they might get into INN. That's where
it's bogging down.
What problem do you encounter with suck? (except that it may take a long
time to run)
Did you eventually try to run suck in its multifile mode "suck -m"
directly on your DNews news server to extract the articles?
It can also generate batch files suitable for innxmit or rnews with respectively the "-bi" and "-br" flags.
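For the record, a sketch of that batch route, assuming suck can reach the DNews server over NNTP; host names and paths are placeholders, not taken from the thread:

```shell
# Fetch every article suck can see from the old DNews box and write an
# innxmit-compatible batch file (-bi); -c cleans up suck's work files.
suck dnews.old.example -bi /var/spool/suck/batchfile -c

# Offer the fetched articles to the new INN server via IHAVE.
innxmit inn.new.example /var/spool/suck/batchfile
```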
Hi Noel,
We tried suck; it's the only way they might get into INN. That's where
it's bogging down.
I've found in the DNews documentation <https://netwinsite.com/dnews/tellnews.htm> the following command:
"""
tellnews refeed feed.name
Description: This command will re-send all current articles to a
particular news feed. This command is fairly cpu intensive as the
entire history file (maybe 80MB) must be parsed. Also, it does not
apply all the rules associated with the feed you have specified, it only checks that the group name matches. This command is particularly useful
when setting up the FTS (Full Text Searching) system on a running
server. It allows you to 're-index' all existing items. It should not
be used for other purposes.
"""
Did you try it? Maybe by setting the feed.name entry in newsfeeds.conf
and testing it with a single newsgroup to check whether it works? (and
if it is, then "group *")
Also, DNews seems to store all the articles in files named "db*.itm".
Did you try to have a look at the contents of these files? Maybe the articles could be extracted from them with an appropriate (simple)
script?
I've found a message from DNews support:
https://groups.google.com/g/netwin.dnews/c/ISgH6dc5baM/m/CrLz-bU4hDMJ
"""
Create a feed entry like this:
site xxx groups *
And then write a program to read 'xxx.feed' or 'xxx.send' files which
list each new item, and gives the offset into the bucket and item length
so you can read direct from teh .itm files, this method is not
gauranteed to work with future versiosn of dnews :-)
"""
Well, that does not seem easy... I'm very surprised by such a response
from the official DNews support. They could at least have natively
provided a tool to export their proprietary data spool. Seems like they
wanted their customers to be captive :-/
On 8/6/22 06:51, Julien ÉLIE wrote:
Hi Noel,
We tried suck; it's the only way they might get into INN. That's where
it's bogging down.
What problem do you encounter with suck? (except that it may take a
long time to run)
Did you eventually try to run suck in its multifile mode "suck -m"
directly on your DNews news server to extract the articles?
It can also generate batch files suitable for innxmit or rnews with
respectively the "-bi" and "-br" flags.
I would like to try DNews running Arch Linux. How much does it cost, and
how would you go about installing it? I was using INN running one group
for an NNTP-only access BBS. Could use WendzelNNTPd, but that does not
allow for multiple servers or linking. I was using Mystic BBS to link
the news server to FidoNet.
Did you try it? Maybe by setting the feed.name entry in newsfeeds.conf
and testing it with a single newsgroup to check whether it works? (and
if it is, then "group *")
I'm very aware of that command and have used it before over the 25 years
of running DNews. It was the first thing I tried and it was also a failure
in this case; perhaps it was the INN storage method I selected, can't
remember what it was.
What problem do you encounter with suck? (except that it may take a long
time to run)
It's the time to run, strangely, and this is likely memory related: it
starts off fast enough, but over time it just gets slower.
And I'm not sure I can pause new articles for the 6 months it would take
to be done; I'd have no one left using it after that long with no new
posts, lol. I say that because I don't want to upset counters: I don't
want the sucking or pulling to be going through 2010 articles, then
import this hour's new stuff, and go back to 2010 :)
Also, DNews seems to store all the articles in files named "db*.itm".
Did you try to have a look at the contents of these files? Maybe the
articles could be extracted from them with an appropriate (simple)
script?
Many of these bucket files are up to 195 MB in size; the majority though
are around 60-90 MB. I've found one from 2018 that's about 10 MB (gzipped
to 2 MB) if you're curious to see what one looks like:
https://members.ausics.net/noelb/db_1615_2.itm.gz
Hi Noel,
What problem do you encounter with suck? (except that it may take a long
time to run)
It's the time to run, strangely, and this is likely memory related: it
starts off fast enough, but over time it just gets slower.
Like Jesse found out too (high memory consumption).
And I'm not sure I can pause new articles for the 6 months it would take
to be done; I'd have no one left using it after that long with no new
posts, lol. I say that because I don't want to upset counters: I don't
want the sucking or pulling to be going through 2010 articles, then
import this hour's new stuff, and go back to 2010 :)
Also exactly what Jesse wishes for article numbers being "time-ordered" :)
What problem do you encounter with suck? (except that it may take a long
time to run)
It's the time to run, strangely, and this is likely memory related: it
starts off fast enough, but over time it just gets slower.
Like Jesse found out too (high memory consumption).
Most suck processes I've watched use 2+GB of memory, even with a small list of
groups. I found you can decrease its consumption by disabling the killfiles (-K).
[...] The one situation that happened a lot, especially with large
groups, was at some point in the session pullnews would just start
spewing x's to the terminal in a loop and not stop until killed.
It would be great if an option could be used to start a new
connection should one be considered dead or unusable.
I've just had a look at a fix. Here is the new behaviour:
% ./pullnews -t2
Article retrieval failed ([Net::NNTP] Connection closed)
Let's attempt again.
Connecting to upstream server news.trigofacile.com... done.
I've noted that pullnews does not currently support TLS connections.
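Until that is implemented, one possible workaround is a local TLS tunnel; this is only a sketch, assuming socat is installed, and the host name and local port are placeholders:

```shell
# Terminate TLS locally: forward plaintext 127.0.0.1:1190 to the remote
# NNTPS port (563), so pullnews can speak plain NNTP to the local end.
socat TCP-LISTEN:1190,bind=127.0.0.1,fork,reuseaddr \
      OPENSSL:news.upstream.example:563
```

pullnews would then be pointed at 127.0.0.1 (local port 1190) as its upstream.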