In the last couple of days my website has had an increase in traffic,
from about 30 different IP addresses, all with a User-Agent of "Go-http-client/1.1".
Each starts with a "GET / HTTP/1.1" request, with various User-Agents, including Windows, Linux & MaxOS. If that works (as it will) it then
issues GETs for about 30 varied files, then stops.
It seems that Go-http-client is a package which "provides HTTP client
and server implementations" but it is suddenly being used by lots of
IPs in a suspicious way.
Anyone else seen this?
They obviously do not abide by robots.txt (or even read it), so the
only way I know to block them is to add them to .htaccess as "deny
from" lines - some have the same top two numbers.
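Roughly what I mean (a sketch only - the addresses below are made-up placeholders, not the real ones):

# Block the offending addresses in .htaccess (Apache 2.2 syntax)
Order allow,deny
Allow from all
Deny from 203.0.113.45
# a partial address blocks everything starting with those octets
Deny from 198.51.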
Are there any better ways?
One way is just to ignore them, I know, but I would not want a trickle
to turn into a flood.
In message <5b2b4df63fNews03@avisoft.f9.co.uk>
Martin <News03@avisoft.f9.co.uk> wrote:
[snip]
Is your web space provided via PlusNet?
If so you could report possible suspicious activity.
In article <a56a552b5b.chris@mytardis>,
Chris Hughes <news13@noonehere.co.uk> wrote:
In message <5b2b4df63fNews03@avisoft.f9.co.uk>
Martin <News03@avisoft.f9.co.uk> wrote:
[snip]
Is your web space provided via PlusNet ?
If so you could report possible suspicious activity.
Yes ... but I doubt they would be interested at the current level.
In message <5b2b591f8fNews03@avisoft.f9.co.uk>
Martin <News03@avisoft.f9.co.uk> wrote:
In article <a56a552b5b.chris@mytardis>,
Chris Hughes <news13@noonehere.co.uk> wrote:
In message <5b2b4df63fNews03@avisoft.f9.co.uk>
Martin <News03@avisoft.f9.co.uk> wrote:
[snip]
Is your web space provided via PlusNet ?
If so you could report possible suspicious activity.
Yes ... but I doubt they would be interested at the current level.
I meant to say via PlusNet's Community Forum, which often gets a
faster response than ringing normal customer support - they
frequently don't seem to know some users have web space, as you are
on a legacy system, i.e. Force9.
Martin <News03@avisoft.f9.co.uk> wrote:
In the last couple of days my website has had an increase in
traffic, from about 30 different IP addresses, all with a
User-Agent of "Go-http-client/1.1".
[snip]
Looking at the riscos.info logs, there's a variety of entries
matching that. Since the start of December there have been 1632
requests.
Some examples (I have redacted part of the IPs, but they all have
completely different prefixes):
Testing if the site will proxy for another:
Testing for vulnerable pages:
A legit access followed by some probing:
The ownership of some of those prefixes is:
They appear to just be probing for vulnerable sites. I don't think
anything you do will affect the rate; they are just picking targets
at random. I'd guess it's coming from a malware toolkit of some kind
that happens to be programmed in Go, possibly running through a
botnet.
I doubt any kind of IP filtering is going to work. So it boils
down to how they're bothering you - filling up the log (something
that's been happening to riscos.info a few times of late), eating
your bandwidth or CPU.
There are too many IPs to block in firewall rules. You could block
accesses from Go-http-client, but I think it would still log them as
blocked. Mostly, from the above, they aren't actually interacting
with real content on the site, so the CPU is not doing much serving
real pages, and the 302/404 traffic is minimal (~300 bytes per
request). Maybe some kind of adaptive firewalling/rate limiting would
help, but that would probably block genuine traffic.
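For what it's worth, blocking on the User-Agent at the Apache level
would look something like this (a sketch assuming mod_setenvif is
available; the 403s would still show up in the log):

# Refuse anything identifying itself as Go-http-client
SetEnvIfNoCase User-Agent "Go-http-client" bad_bot
Order allow,deny
Allow from all
Deny from env=bad_bot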
Unless you have scripts on your site that are actually vulnerable
(in which case you should fix them) I'm not sure there's much to be
done.
If you provide a site on the internet, people (or bots) on
the internet connect to it. That's the deal.
In article <kGr*WKPBz@news.chiark.greenend.org.uk>,
Theo <theom+news@chiark.greenend.org.uk> wrote:
I have had over 800 in the previous 2 days.
All mine have been to existing pages or files - all returned with
status 200.
Mine seemed to be allocated to Asia Pacific (APNIC).
Difficult these days to get more precise information.
Mine are downloading real files (including zips) with status 200.
No scripts here. Just plain HTML.
The trickle continued, some days far outnumbering other requests.
But I have found a way to stop them! I added to my .htaccess file...
RewriteCond %{HTTP_USER_AGENT} "=Go-http-client/1.1"
RewriteRule .* - [F,L]
... and that User-Agent now gets 403 Forbidden. Stopped 260 requests in 12 hours yesterday.
This certainly works on PlusNet - may or may not on other ISPs.
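For completeness, if the rewrite engine isn't already switched on
earlier in the file, the whole fragment would be something like this
(only tested here on PlusNet):

# .htaccess - refuse anything claiming to be Go-http-client/1.1
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} "=Go-http-client/1.1"
RewriteRule .* - [F,L]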
Martin
In article <5b369f6248News03@avisoft.f9.co.uk>,
Martin <News03@avisoft.f9.co.uk> wrote:
[snip]
https://user-agents.net/string/go-http-client-1-1
gives some info on this User-Agent. But these requests may not be
evil - I would guess mostly neutral.
There are so many bots and crawlers these days that log files are of
little practical use!