I have a gawk program that takes an events stream as input:
curl -s https://domain.com/path-to-stream | prog.awk
prog.awk
----
BEGIN {
    while ((getline line < "/dev/stdin") > 0) {
        <does a lot of stuff>
    }
}
So far it is working great (yay awk), but I wonder: since "does a lot of stuff" might take some seconds, I don't want to drop any of the incoming stream, which can be a lot of data. I'm not even sure what is doing the buffering - the OS via the pipe? curl? awk? Can data be lost without my knowing? Thanks for any insight.
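(My tentative understanding, which I'd like confirmed, is that the kernel's pipe buffer provides backpressure: when the reader is slow, the writer's write() eventually blocks until the reader catches up, so nothing is silently dropped as long as the pipe stays open. A small demo with a deliberately slow reader:)

```shell
# A fast producer feeding a deliberately slow reader: the kernel pipe
# buffer fills, the producer's write() blocks, and every line still
# arrives - delayed, not dropped.
count=$(seq 1 200 | while read -r line; do
    sleep 0.005            # simulate "does a lot of stuff"
    echo "$line"
done | wc -l)
echo "$count"              # 200: every line arrived
```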