• Quantity and addresses of antique Usenet peers that carry full text feeds

    From SugarBug@21:1/5 to All on Thu Mar 28 14:11:43 2024
    I wonder how many peers are carrying and preserving fulltext historical and current Usenet feeds.

    Will anyone give a ballpark number?

    Which peers have articles remotest in antiquity?

    Even guesstimates might be useful.

    --
    3883@sugar.bug | sybershock.com | sci.crypt

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ivo Gandolfo@21:1/5 to SugarBug on Thu Mar 28 21:52:15 2024
    On 28/03/2024 20:11, SugarBug wrote:
    > I wonder how many peers are carrying and preserving fulltext historical and current Usenet feeds.
    >
    > Will anyone give a ballpark number?
    >
    > Which peers have articles remotest in antiquity?
    >
    > Even guesstimates might be useful.


    At the moment my server's full archive goes back to 2000, and I'm working to
    add older material from archive.org.
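
    For anyone wanting to do the same, a minimal sketch of pulling Usenet mbox
    dumps from archive.org with the third-party Python "internetarchive" package
    (pip install internetarchive) might look like the following. The collection
    identifier 'usenethistorical' and the destination directory are assumptions;
    check archive.org for the collections you actually want.

        # Sketch: download mbox-style Usenet dumps from an archive.org collection.
        from internetarchive import search_items, download

        # The collection name below is an assumption; adjust to the real one.
        for result in search_items('collection:usenethistorical'):
            identifier = result['identifier']
            # Fetch only mbox-ish files of each item into ./usenet-backfill/<identifier>/
            download(identifier,
                     destdir='usenet-backfill',
                     glob_pattern='*mbox*',
                     verbose=True)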


    Sincerely

    --
    Ivo Gandolfo

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Grant Taylor@21:1/5 to SugarBug on Thu Mar 28 21:00:18 2024
    On 3/28/24 14:11, SugarBug wrote:
    > I wonder how many peers are carrying and preserving fulltext historical
    > and current Usenet feeds.

    Considering that all of the news servers that I've looked at have a
    /default/ configuration to expire articles, you're actually asking for a special configuration.
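
    For instance, with INN that policy lives in expire.ctl; a minimal never-expire
    sketch (assuming INN 2.x syntax) would be something like:

        ## expire.ctl -- keep everything instead of the usual expiry
        ## how long history remembers Message-IDs it has already seen
        /remember/:11
        ## pattern:modflag:keep:default:purge -- "never" disables expiry for all groups
        *:A:never:never:never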

    After Google killed Dejanews (years ago), I have wondered whether some of the
    big binary-friendly news service providers might happen to have a VERY long
    text newsgroup archive, simply because text is tiny compared to binaries.

    Beyond that, I'd start inquiring whether anyone like the Library of Congress
    has an archive. I'd be somewhat surprised if they did, and even if they did,
    I suspect that getting access to it in any capacity would be onerous.



    --
    Grant. . . .

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From John Levine@21:1/5 to All on Fri Mar 29 02:15:29 2024
    According to Grant Taylor <gtaylor@tnetconsulting.net>:
    > After Google killed Dejanews (years ago), I have wondered whether some of the
    > big binary-friendly news service providers might happen to have a VERY long
    > text newsgroup archive, simply because text is tiny compared to binaries.

    My impression is that the commercial providers like Giganews stopped
    expiring stuff a long time ago, so their archives go way back.


    --
    Regards,
    John Levine, johnl@taugh.com, Primary Perpetrator of "The Internet for Dummies",
    Please consider the environment before reading this e-mail. https://jl.ly

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From candycanearter07@21:1/5 to Kyonshi on Mon Apr 1 15:16:36 2024
    Kyonshi <gmkeros@gmail.com> wrote at 23:03 this Sunday (GMT):
    > On 3/28/2024 8:11 PM, SugarBug wrote:
    >> I wonder how many peers are carrying and preserving fulltext historical and current Usenet feeds.
    >>
    >> Will anyone give a ballpark number?
    >>
    >> Which peers have articles remotest in antiquity?
    >>
    >> Even guesstimates might be useful.

    > The question is how much of that really is useful, considering the flood
    > of spam over the last few decades.


    There was still plenty of legitimate conversation, and
    filtering/skipping the spam should be easier in retrospect. You could
    also apply the existing spam reports to a portion of it.
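
    As one concrete (hypothetical) example of retroactive filtering, here is a
    Python sketch that drops archived articles cross-posted to an unusually large
    number of groups, an old EMP-style spam indicator. The threshold and file
    names are made up, and real filtering would also use the cancel/NoCeM-style
    reports mentioned above:

        # Sketch: skip heavily cross-posted articles when re-reading an mbox archive.
        import mailbox

        MAX_CROSSPOST = 10  # assumed cutoff, tune to taste

        kept = mailbox.mbox('archive.filtered.mbox')   # output mbox (created if missing)
        for msg in mailbox.mbox('archive.mbox'):       # input mbox dump
            groups = (msg.get('Newsgroups') or '').split(',')
            if len([g for g in groups if g.strip()]) > MAX_CROSSPOST:
                continue                               # looks like excessive multi-posting
            kept.add(msg)
        kept.flush()
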
    --
    user <candycane> is generated from /dev/urandom

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Billy G. (go-while)@21:1/5 to SugarBug on Sun May 12 13:41:45 2024
    On 28.03.24 20:11, SugarBug wrote:
    > I wonder how many peers are carrying and preserving fulltext historical and current Usenet feeds.
    >
    > Will anyone give a ballpark number?
    >
    > Which peers have articles remotest in antiquity?
    >
    > Even guesstimates might be useful.



    I've imported and deduped all available Usenet backups/mbox files from
    archive.org (several TB compressed) without any filtering, and accepted
    every group that showed up, which results in 471k groups so far at the
    Full-Node.
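
    The dedup key is not spelled out above; a minimal sketch of the usual
    approach, deduplicating by Message-ID across several mbox dumps (paths are
    hypothetical), could look like:

        # Sketch: merge mbox dumps, keeping the first copy of each Message-ID.
        import glob
        import mailbox

        seen = set()
        merged = mailbox.mbox('merged.deduped.mbox')
        for path in sorted(glob.glob('backups/*.mbox')):
            for msg in mailbox.mbox(path):
                msg_id = (msg.get('Message-ID') or '').strip()
                if not msg_id or msg_id in seen:
                    continue                  # skip unidentifiable or duplicate articles
                seen.add(msg_id)
                merged.add(msg)
        merged.flush()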

    'news.software.nntp', for example, dates back to 1987.

    Everything should be readable via NNTP; a connection sketch follows the
    access details below.

    Part-Node (111k groups)
    file: http://104.244.74.85/usenet/active/part.active.txt
    host: 104.244.74.85:11119
    user: freefree
    pass: freefree


    Full-Node (471k groups)
    file: http://104.244.74.85/usenet/active/full.active.txt
    host: 104.244.74.85:11120
    user: freefree
    pass: freefree
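
    A minimal connection sketch using Python's nntplib (in the standard library
    up to 3.12; removed in 3.13), with the host, port and credentials of the
    Full-Node above:

        # Sketch: read the oldest articles of news.software.nntp from the Full-Node.
        import nntplib

        with nntplib.NNTP('104.244.74.85', port=11120,
                          user='freefree', password='freefree') as srv:
            resp, count, first, last, name = srv.group('news.software.nntp')
            print(f'{name}: {count} articles ({first}-{last})')
            # Overview of the five oldest articles still on the server
            resp, overviews = srv.over((first, min(first + 4, last)))
            for artnum, fields in overviews:
                print(artnum, fields.get('date'), fields.get('subject'))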

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)