• Go-http-client

    From Martin@21:1/5 to All on Wed Jan 31 10:25:35 2024
    In the last couple of days my website has had an increase in traffic,
    from about 30 different IP addresses, all with a User-Agent of "Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various User-Agents, including Windows, Linux & MacOS. If that works (as it will) it then
issues GETs for about 30 varied files, then stops.

    It seems that Go-http-client is a package which "provides HTTP client
    and server implementations" but it is suddenly being used by lots of
    IPs in a suspicious way.

    Anyone else seen this?

They obviously do not abide by robots.txt (or even read it), so the
only way I know to block them is to add them to .htaccess as "Deny
from" lines - some have the same top two numbers.

    Are there any better ways?
    One way is just to ignore them, I know, but I would not want a trickle
    to turn into a flood.

    Martin

    --
    Martin Avison
    Note that unfortunately this email address will become invalid
    without notice if (when) any spam is received.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Hughes@21:1/5 to Martin on Wed Jan 31 11:47:00 2024
    In message <5b2b4df63fNews03@avisoft.f9.co.uk>
    Martin <News03@avisoft.f9.co.uk> wrote:

    In the last couple of days my website has had an increase in traffic,
    from about 30 different IP addresses, all with a User-Agent of "Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various User-Agents, including Windows, Linux & MacOS. If that works (as it will) it then
issues GETs for about 30 varied files, then stops.

    It seems that Go-http-client is a package which "provides HTTP client
    and server implementations" but it is suddenly being used by lots of
    IPs in a suspicious way.

    Anyone else seen this?

They obviously do not abide by robots.txt (or even read it), so the
only way I know to block them is to add them to .htaccess as "Deny
from" lines - some have the same top two numbers.

    Are there any better ways?
    One way is just to ignore them, I know, but I would not want a trickle
    to turn into a flood.

    Is your web space provided via PlusNet ?

    If so you could report possible suspicious activity.


    --
    Chris Hughes

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin@21:1/5 to Chris Hughes on Wed Jan 31 12:27:30 2024
    In article <a56a552b5b.chris@mytardis>,
    Chris Hughes <news13@noonehere.co.uk> wrote:
    In message <5b2b4df63fNews03@avisoft.f9.co.uk>
    Martin <News03@avisoft.f9.co.uk> wrote:

    In the last couple of days my website has had an increase in
    traffic, from about 30 different IP addresses, all with a
    User-Agent of "Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various
User-Agents, including Windows, Linux & MacOS. If that works (as
it will) it then issues GETs for about 30 varied files, then
stops.

    It seems that Go-http-client is a package which "provides HTTP
    client and server implementations" but it is suddenly being used
    by lots of IPs in a suspicious way.

    Anyone else seen this?

They obviously do not abide by robots.txt (or even read it), so
the only way I know to block them is to add them to .htaccess as
"Deny from" lines - some have the same top two numbers.

    Are there any better ways? One way is just to ignore them, I
    know, but I would not want a trickle to turn into a flood.

    Is your web space provided via PlusNet ?
    If so you could report possible suspicious activity.

    Yes ... but I doubt they would be interested at the current level.

    Martin

    --
    Martin Avison
    Note that unfortunately this email address will become invalid
    without notice if (when) any spam is received.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chris Hughes@21:1/5 to Martin on Wed Jan 31 12:59:20 2024
    In message <5b2b591f8fNews03@avisoft.f9.co.uk>
    Martin <News03@avisoft.f9.co.uk> wrote:

    In article <a56a552b5b.chris@mytardis>,
    Chris Hughes <news13@noonehere.co.uk> wrote:
    In message <5b2b4df63fNews03@avisoft.f9.co.uk>
    Martin <News03@avisoft.f9.co.uk> wrote:

    In the last couple of days my website has had an increase in
    traffic, from about 30 different IP addresses, all with a
    User-Agent of "Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various
User-Agents, including Windows, Linux & MacOS. If that works (as
it will) it then issues GETs for about 30 varied files, then
stops.

    It seems that Go-http-client is a package which "provides HTTP
    client and server implementations" but it is suddenly being used
    by lots of IPs in a suspicious way.

    Anyone else seen this?

They obviously do not abide by robots.txt (or even read it), so
the only way I know to block them is to add them to .htaccess as
"Deny from" lines - some have the same top two numbers.

    Are there any better ways? One way is just to ignore them, I
    know, but I would not want a trickle to turn into a flood.

    Is your web space provided via PlusNet ?
    If so you could report possible suspicious activity.

    Yes ... but I doubt they would be interested at the current level.

I meant to say via PlusNet's Community Forum, which often gets a faster
response than ringing the normal customer support - they frequently
don't seem to know some users have web space, as you use a legacy
system (i.e. Force9).



    --
    Chris Hughes

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin@21:1/5 to Chris Hughes on Wed Jan 31 13:11:58 2024
    In article <e9095c2b5b.chris@mytardis>,
    Chris Hughes <news13@noonehere.co.uk> wrote:
    In message <5b2b591f8fNews03@avisoft.f9.co.uk>
    Martin <News03@avisoft.f9.co.uk> wrote:

    In article <a56a552b5b.chris@mytardis>,
    Chris Hughes <news13@noonehere.co.uk> wrote:
    In message <5b2b4df63fNews03@avisoft.f9.co.uk>
    Martin <News03@avisoft.f9.co.uk> wrote:

    In the last couple of days my website has had an increase in
    traffic, from about 30 different IP addresses, all with a
    User-Agent of "Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various
User-Agents, including Windows, Linux & MacOS. If that works (as
it will) it then issues GETs for about 30 varied files, then
stops.

    It seems that Go-http-client is a package which "provides HTTP
    client and server implementations" but it is suddenly being used
    by lots of IPs in a suspicious way.

    Anyone else seen this?

They obviously do not abide by robots.txt (or even read it), so
the only way I know to block them is to add them to .htaccess as
"Deny from" lines - some have the same top two numbers.

    Are there any better ways? One way is just to ignore them, I
    know, but I would not want a trickle to turn into a flood.

    Is your web space provided via PlusNet ?
    If so you could report possible suspicious activity.

    Yes ... but I doubt they would be interested at the current level.

I meant to say via PlusNet's Community Forum, which often gets a
faster response than ringing the normal customer support - they
frequently don't seem to know some users have web space, as you
use a legacy system (i.e. Force9).

    Aaah yes - that is a good idea. Thanks.

    --
    Martin Avison
    Note that unfortunately this email address will become invalid
    without notice if (when) any spam is received.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Theo@21:1/5 to Martin on Wed Jan 31 15:18:26 2024
    Martin <News03@avisoft.f9.co.uk> wrote:
    In the last couple of days my website has had an increase in traffic,
    from about 30 different IP addresses, all with a User-Agent of "Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various User-Agents, including Windows, Linux & MacOS. If that works (as it will) it then
issues GETs for about 30 varied files, then stops.

    It seems that Go-http-client is a package which "provides HTTP client
    and server implementations" but it is suddenly being used by lots of
    IPs in a suspicious way.

    Anyone else seen this?

    Looking at the riscos.info logs, there's a variety of entries matching that. Since the start of December there have been 1632 requests.
    Some examples (I have redacted part of the IPs, but they're all with
    completely different prefixes):

    Testing if the site will proxy for another:

    106.2.x.x - - [19/Jan/2024:11:14:23 +0000] "CONNECT www.whitehouse.gov:443 HTTP/1.1" 302 292 "-" "Go-http-client/1.1"
    80.91.x.x - - [20/Jan/2024:11:30:17 +0000] "CONNECT google.com:443 HTTP/1.1" 302 284 "-" "Go-http-client/1.1"

    Testing for vulnerable pages:

    91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //alfa.php HTTP/1.1" 404 287 "-" "Go-http-client/1.1"
    91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //doc.php HTTP/1.1" 404 286 "-" "Go-http-client/1.1"
    91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //marijuana.php HTTP/1.1" 404 292 "-" "Go-http-client/1.1"
    91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //mini.php HTTP/1.1" 404 287 "-" "Go-http-client/1.1"
    91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //shell.php HTTP/1.1" 404 288 "-" "Go-http-client/1.1"
    91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //small.php HTTP/1.1" 404 288 "-" "Go-http-client/1.1"
    91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //wso.php HTTP/1.1" 404 286 "-" "Go-http-client/1.1"
    91.92.x.x - - [14/Dec/2023:08:14:25 +0000] "GET //wp-info.php HTTP/1.1" 404 290 "-" "Go-http-client/1.1"

    A legit access followed by some probing:

    195.20.x.x - - [06/Dec/2023:05:14:05 +0000] "GET / HTTP/1.1" 302 287 "-" "Go-http-client/1.1"
    195.20.x.x - - [06/Dec/2023:05:14:16 +0000] "GET / HTTP/1.1" 301 26 "-" "Go-http-client/1.1"
    195.20.x.x - - [06/Dec/2023:05:14:17 +0000] "GET /index.php/RISC_OS HTTP/1.1" 200 7210 "http://www.riscos.info/" "Go-http-client/1.1"
    195.20.x.x - - [06/Dec/2023:05:14:19 +0000] "GET /+CSCOE+/logon.html HTTP/1.1" 302 305 "-" "Go-http-client/1.1"
    195.20.x.x - - [06/Dec/2023:05:14:50 +0000] "GET /global-protect/login.esp HTTP/1.1" 302 311 "-" "Go-http-client/1.1"
    195.20.x.x - - [06/Dec/2023:05:14:50 +0000] "GET /global-protect/login.esp HTTP/1.1" 404 303 "-" "Go-http-client/1.1"

    The ownership of some of those prefixes is:

    netname: Netease-Network
    descr: Guangzhou NetEase Computer System Co.,Ltd
    country: CN

    organisation: ORG-FZTA3-RIPE
    org-name: Ferdinand Zink trading as Tube-Hosting
    country: DE

    organisation: ORG-LA1853-RIPE
    org-name: Limenet
    org-type: OTHER
    address: 84 W Broadway, Ste 200
    address: 03038 Derry
    address: United States of America

    organisation: ORG-GL496-RIPE
    org-name: Shelter LLC
    country: RU

    so not a geographic pattern.

They obviously do not abide by robots.txt (or even read it), so the
only way I know to block them is to add them to .htaccess as "Deny
from" lines - some have the same top two numbers.

    Are there any better ways?
    One way is just to ignore them, I know, but I would not want a trickle
    to turn into a flood.

They appear to just be probing for vulnerable sites. I don't think anything
you do will affect the rate; they are just picking targets at random. I'd
guess it's just coming from a malware toolkit of some kind that happens to
be programmed in Go, possibly running through a botnet.

I doubt any kind of IP filtering is going to work. So it boils down to
how they're bothering you - filling up the log (something that's been
happening to riscos.info a few times of late), eating your bandwidth or CPU.

    There are too many IPs to block in firewall rules. You could block accesses from Go-http-client, but I think it would still log as blocked. Mostly from the above they aren't actually interacting with real content on the site so
    the CPU is not doing much serving real pages, and the 302/404 traffic is minimal (~300 bytes per request). Maybe some kind of adaptive
    firewalling/rate limiting, but that would probably block genuine traffic.

    Unless you have scripts on your site that are actually vulnerable (in which case you should fix them) I'm not sure there's much to be done. If you
    provide a site on the internet, people (or bots) on the internet connect to
    it. That's the deal.

    Theo

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin@21:1/5 to Theo on Wed Jan 31 16:56:24 2024
    In article <kGr*WKPBz@news.chiark.greenend.org.uk>,
    Theo <theom+news@chiark.greenend.org.uk> wrote:
    Martin <News03@avisoft.f9.co.uk> wrote:
    In the last couple of days my website has had an increase in
    traffic, from about 30 different IP addresses, all with a
    User-Agent of "Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various
User-Agents, including Windows, Linux & MacOS. If that works (as
it will) it then issues GETs for about 30 varied files, then
stops.

    It seems that Go-http-client is a package which "provides HTTP
    client and server implementations" but it is suddenly being used
    by lots of IPs in a suspicious way.

    Anyone else seen this?

    Looking at the riscos.info logs, there's a variety of entries
    matching that. Since the start of December there have been 1632
    requests.

    I have had over 800 in the previous 2 days.

    Some examples (I have redacted part of the IPs, but
    they're all with completely different prefixes):

    Testing if the site will proxy for another:

    Not seen any like that.

    Testing for vulnerable pages:

    Or that!

    A legit access followed by some probing:

    All mine have been to existing pages or files - all returned with
    status 200.

    The ownership of some of those prefixes is:

    Mine seemed to be allocated to Asia Pacific (APNIC).
    Difficult these days to get more precise information.

    They appear to just be probing for vulnerable sites. I don't think
    anything you do will affect the rate, they are just picking targets
    at random. I'd guess it's just coming from a malware toolkit of
    some kind that happens to be programmed in Go, possibly running
    through a botnet.

    Probably - Googling 'botnet using go-http-client' gives lots of hits!

I doubt any kind of IP filtering is going to work. So it boils
down to how they're bothering you - filling up the log (something
that's been happening to riscos.info a few times of late), eating
your bandwidth or CPU.

    They are certainly vastly increasing my bandwidth usage, though I have
    not quantified it.

    There are too many IPs to block in firewall rules. You could block
    accesses from Go-http-client, but I think it would still log as
    blocked. Mostly from the above they aren't actually interacting
    with real content on the site so the CPU is not doing much serving
    real pages, and the 302/404 traffic is minimal (~300 bytes per
    request). Maybe some kind of adaptive firewalling/rate limiting,
    but that would probably block genuine traffic.

Mine are downloading real files (including zips) with status 200.

    Unless you have scripts on your site that are actually vulnerable
    (in which case you should fix them) I'm not sure there's much to be
    done.

    No scripts here. Just plain HTML.

    If you provide a site on the internet, people (or bots) on
    the internet connect to it. That's the deal.

    Yes, indeed. I will just keep an eye open for the moment.

    Thanks
    Martin

    --
    Martin Avison
    Note that unfortunately this email address will become invalid
    without notice if (when) any spam is received.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Theo@21:1/5 to Martin on Wed Jan 31 18:32:17 2024
    Martin <News03@avisoft.f9.co.uk> wrote:
    In article <kGr*WKPBz@news.chiark.greenend.org.uk>,
    Theo <theom+news@chiark.greenend.org.uk> wrote:

    I have had over 800 in the previous 2 days.

    All mine have been to existing pages or files - all returned with
    status 200.

    Mine seemed to be allocated to Asia Pacific (APNIC).
    Difficult these days to get more precise information.

Try a 'whois' on the IP; it should tell you the Autonomous System (AS) which
owns the IP range. That is usually an ISP but can sometimes be a company.
Of course you'd need to talk to them to go any further.

    MIne are downloading real files (including zips) with status 200.

    No scripts here. Just plain HTML.

    I would guess somebody's using a tool to crawl your site, for what purpose
    we don't know. It happens to be written using a popular Go HTTP library and they didn't change the User-Agent. It doesn't sound like the same kind of probing I'm seeing.

    I've been seeing a lot of crawls from AI companies (Bytedance, Facebook) who are sucking data for training AI models. Perhaps they are doing something similar.

    Theo

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin@21:1/5 to Martin on Thu Feb 22 09:53:09 2024
    In article <5b2b4df63fNews03@avisoft.f9.co.uk>,
    Martin <News03@avisoft.f9.co.uk> wrote:
    In the last couple of days my website has had an increase in
    traffic, from about 30 different IP addresses, all with a
    User-Agent of "Go-http-client/1.1".

Each starts with a "GET / HTTP/1.1" request, with various
User-Agents, including Windows, Linux & MacOS. If that works (as it
will) it then issues GETs for about 30 varied files, then stops.

    It seems that Go-http-client is a package which "provides HTTP
    client and server implementations" but it is suddenly being used by
    lots of IPs in a suspicious way.

    Anyone else seen this?

They obviously do not abide by robots.txt (or even read it), so the
only way I know to block them is to add them to .htaccess as "Deny
from" lines - some have the same top two numbers.

    Are there any better ways? One way is just to ignore them, I know,
    but I would not want a trickle to turn into a flood.

    The trickle continued, some days far outnumbering other requests.

But I have found a way to stop them! I added to my .htaccess file...

    RewriteCond %{HTTP_USER_AGENT} "=Go-http-client/1.1"
    RewriteRule .* - [F,L]

    ... now returns 403 Forbidden. Stopped 260 in 12 hours yesterday.

    This certainly works on PlusNet - may or may not on other ISPs.
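For anyone wanting to try the same thing, a slightly fuller sketch of that .htaccess idea follows. Assumptions worth flagging: it needs mod_rewrite enabled by the host, and `RewriteEngine On` if it isn't already set elsewhere in the file; the leading "=" in the CondPattern makes it an exact string comparison rather than a regex, so only the literal "Go-http-client/1.1" is matched. The commented alternative uses mod_setenvif with Apache 2.4 `Require` directives, and whether your host supports it will vary.

```apache
# .htaccess - return 403 Forbidden to the Go default User-Agent.
# Requires mod_rewrite; RewriteEngine On may already be set elsewhere.
RewriteEngine On

# "=" makes this an exact lexical comparison, not a regex, so only
# the literal string "Go-http-client/1.1" matches.
RewriteCond %{HTTP_USER_AGENT} "=Go-http-client/1.1"
RewriteRule .* - [F,L]

# Alternative for hosts with mod_setenvif and Apache 2.4 authz,
# catching any Go-http-client version (1.1, 2.0, ...):
# SetEnvIfNoCase User-Agent "^Go-http-client" badbot
# <RequireAll>
#     Require all granted
#     Require not env badbot
# </RequireAll>
```

Note the 403 responses still appear in the access log; this saves bandwidth on real files rather than silencing the log.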

    Martin

    --
    Martin Avison
    Note that unfortunately this email address will become invalid
    without notice if (when) any spam is received.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Richard Torrens (News)@21:1/5 to Martin on Fri Feb 23 10:15:10 2024
    In article <5b369f6248News03@avisoft.f9.co.uk>,
    Martin <News03@avisoft.f9.co.uk> wrote:
    The trickle continued, some days far outnumbering other requests.

But I have found a way to stop them! I added to my .htaccess file...

    RewriteCond %{HTTP_USER_AGENT} "=Go-http-client/1.1"
    RewriteRule .* - [F,L]

    ... now returns 403 Forbidden. Stopped 260 in 12 hours yesterday.

    This certainly works on PlusNet - may or may not on other ISPs.

    Martin


    https://user-agents.net/string/go-http-client-1-1

    gives info on this. But these requests may not be evil. I would guess
    mostly neutral.

There are so many bots and crawlers these days that log files are of
little practical use!

    --
    ------------------------------------------------------------------
    Richard Torrens. News email address is valid - for a limited time only.
You must use the full News+number@Torrens.org as in the From address.
http://www.Torrens.org for genealogy, natural history, wild food, walks, cats and more!

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Martin@21:1/5 to News+19772@Torrens.org on Fri Feb 23 11:56:13 2024
    In article <5b37253c5anews*@Torrens.org>,
    Richard Torrens (News) <News+19772@Torrens.org> wrote:
    In article <5b369f6248News03@avisoft.f9.co.uk>,
    Martin <News03@avisoft.f9.co.uk> wrote:
    The trickle continued, some days far outnumbering other requests.

But I have found a way to stop them! I added to my .htaccess
file...

    RewriteCond %{HTTP_USER_AGENT} "=Go-http-client/1.1"
    RewriteRule .* - [F,L]

    ... now returns 403 Forbidden. Stopped 260 in 12 hours yesterday.
    This certainly works on PlusNet - may or may not on other ISPs.

    https://user-agents.net/string/go-http-client-1-1
    gives info on this. But these requests may not be evil. I would
    guess mostly neutral.

Of 1209 requests yesterday, 1117 were this user-agent - over 92%.

    They were from a wide variety of IP addresses, with anything from 2 to
    30 requests each, to a similar subset of pages.

None of them looked at robots.txt, so I would say that in my
experience, they were all spurious, probably malicious.

There are so many bots and crawlers these days that log files are of
little practical use!

I have rarely looked at mine for ages ... until there was a massive
increase in their daily sizes!

    Anyway, they all get Forbidden from me now!

    Martin

    --
    Martin Avison
    Note that unfortunately this email address will become invalid
    without notice if (when) any spam is received.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)