Web considered harmful
======================
Over the past decade, the internet has seen a transition from
single-task protocols to the web, to the extent that new functionality
is often exposed only as a web API with a proprietary protocol.
While the base protocol (HTTP) and the serialization formats (HTML,
XML, JSON) are standardized, the method for extracting information
from the received data varies from website to website.
The solution in the 1990s was to define a standardized protocol,
e.g. IMAP or NNTP, through which any conforming client could access
email or news.
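To illustrate what those standards buy you: any client can parse any
provider's mail, because the message format itself (RFC 822 and its
successors) is published. A minimal sketch in Python, standard library
only; the addresses and subject are made up:

```python
# Parse an RFC 822-style message with Python's standard "email" module.
# Mail from any provider can be handled this way, because the format is
# a published standard rather than a per-vendor API.
from email import message_from_string
from email.utils import parseaddr

raw = """\
From: Alice <alice@example.org>
To: Bob <bob@example.net>
Subject: standards still work

Any client can parse this, no matter which provider sent it.
"""

msg = message_from_string(raw)
name, addr = parseaddr(msg["From"])   # standard address-splitting rules
print(addr)                           # alice@example.org
```

The same few lines work against every mail server in existence, which
is the whole point being made above.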
For interfacing with, say, Google Mail, however, a client application
has to speak the Google Mail API, which is incompatible with the mail
API of every other provider. This transition is turning the internet
into a collection of walled gardens, with the obvious drawback that
most websites -- if an API is present at all -- offer only the
official client implementations of said API. Typically the vendor
provides a few closed-source clients, most commonly a combination of
the following:
* a website (often with mandatory JavaScript)
* a mobile website (possibly without JavaScript, but optimized for
small screens and thus impractical in a desktop browser, and often
not exposing all available features)
* an Android or iPhone app (sometimes not exposing all available
features, and restricted to a single platform)
leaving users little choice if they use a different platform or want
to collect their data in a unified format.
Even worse is retrieving information from websites where no API exists
at all. There is no standard for logging into websites that have a
mandatory username/password prompt: implementations have to handle
cookies, Referer headers (ridiculously many websites mandate one for
XSRF protection even though the standard makes them optional), and
site-specific form locations to which POST and GET requests must be
made in a site-specific order.
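As a concrete sketch of that site-specific dance, here is roughly what
a scraper has to assemble by hand for a single login POST. Everything
here is hypothetical -- the host, the login path, and the form field
names all differ from site to site, which is exactly the problem.
Python standard library only; the cookie jar and follow-up requests
are left out:

```python
# Build the kind of site-specific login POST a scraper needs:
# form-encoded credentials plus a Referer header, since many sites
# reject the request without one. The URL and the field names
# ("user", "pass") are invented; every site picks its own.
from urllib.parse import urlencode
from urllib.request import Request

def build_login_request(base_url, username, password):
    data = urlencode({"user": username, "pass": password}).encode()
    req = Request(base_url + "/login", data=data, method="POST")
    # Pretend we arrived from the login page, as the site insists.
    req.add_header("Referer", base_url + "/login")
    req.add_header("Content-Type", "application/x-www-form-urlencoded")
    return req

req = build_login_request("https://example.com", "alice", "hunter2")
# urllib.request.urlopen(req) would then be issued through an opener
# that also carries an http.cookiejar.CookieJar for session cookies.
```

None of this is discoverable from a standard; all of it has to be
reverse-engineered per site.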
For the most part, there has been no effort to change any aspect of
this problem, which has existed for more than 10 years. On the
contrary, companies have successively discontinued support for open
web standards such as RSS/Atom.
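RSS is a good example of how little the consumer side needs when a
format is open: a feed is plain XML that any standard parser can walk
with one generic rule. A minimal sketch in Python, standard library
only; the feed content is invented:

```python
# Walk an RSS 2.0 feed with the standard XML parser -- no vendor API,
# no site-specific scraping. The feed below is a made-up example.
import xml.etree.ElementTree as ET

feed = """<rss version="2.0"><channel>
<title>Example feed</title>
<item><title>First post</title><link>https://example.org/1</link></item>
<item><title>Second post</title><link>https://example.org/2</link></item>
</channel></rss>"""

root = ET.fromstring(feed)
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)   # ['First post', 'Second post']
```

The same loop works on any site's feed, which is precisely what is
lost when a site drops RSS in favor of a proprietary API.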
Conclusion: The web as it is now is harmful to the open standard
culture of the internet.
Related readings (please expand): https://www.gnu.org/philosophy/javascript-trap.html
Comments and discussion would be appreciated.
On 3/22/14 12:43 PM, mw wrote:
I have noticed that after all these years too - I fucking hate the
modern Internet. I fucking hate how social media has taken over us, I
fucking hate how hard it is to do anything on the modern Web.
I will take the good ol' times of internetworking on the Unix command
line in the '80s over this modern crap every day.
On 2021-12-17, Scientific (she/her) wrote:
Quite a necro, but I approve! :-)
I have noticed that after all these years too - I fucking hate modern
Internet. I fucking hate how social media has taken over us, I fucking
hate how hard it is to do anything in modern Web.
I’m right there with you. One of my projects for 2022 is going to be
to move away from the web as a primary means of sending or receiving
information. I’m looking at things like Jekyll to get away from
having a heavy stack for my site(s), but even that might be too
closely tied to the way the modern web works.
I will take the good ol' times of internetworking on Unix command line
in 80s over this modern crap every day.
Well, it’s not like everything was perfectly executed back then,
either. For example, no standardization on configuration files has
been a constant annoyance for decades. But there is a lot to be said
for text file formats of increasing complexity based on need. I mean,
web browsers do *so* much these days, yet if you hand them a bit of
Markdown they’re left clueless?
On 2021-12-18, Doc O'Leary wrote:
Well, it’s not like everything was perfectly executed back then,
either. For example, no standardization on configuration files has
been a constant annoyance for decades. But there is a lot to be said
for text file formats of increasing complexity based on need. I mean,
web browsers do *so* much these days, yet if you hand them a bit of
Markdown they’re left clueless?
At least there's reader mode, but that's like using uBlock Origin
instead of serving only what's needed.
I'm surprised that Gemini managed to get quite popular within like one
or two years and Firefox still cannot render Markdown natively.
On 2021-12-18, Doc O'Leary wrote:
Well, it's not like everything was perfectly executed back then,
either. For example, no standardization on configuration files has
been a constant annoyance for decades. But there is a lot to be said
for text file formats of increasing complexity based on need. I mean,
web browsers do *so* much these days, yet if you hand them a bit of
Markdown they're left clueless?
At least there's reader mode, but that's like using uBlock Origin
instead of serving only what's needed.
There are a few reasons why they would not implement Markdown, one of
which is that there are a few different variants, which are not always
compatible with each other.
It is common that they implement the bad stuff while some of the good
features are not implemented, and some good features are even being
removed, too.
It is unfortunate that fixing it involves adding more things like that
instead of just making it simpler, but it seems necessary, to me.
There are a few reasons why they would not implement Markdown, one of which is
that there are a few different variants, so they aren't always compatible.
Neither are all the variants of HTML compatible, but you presumably
wouldn’t argue that as a reason browsers shouldn’t handle *any* HTML,
right? My point is that many document formats have a more or less
direct mapping to features HTML already supports, yet feeding one to
a “modern” browser that has kitchen-sink support for just about
everything else under the sun leaves it dumbfounded. I mean, a basic
CSV file should be trivially easy to display as any other table would
be, but is there any major browser that does that?
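The CSV case really is a few lines of standard-library code. A sketch
of the kind of trivial rendering a browser could offer; the helper
name and the sample data are invented:

```python
# Render CSV as an HTML table -- the kind of direct format-to-format
# mapping a browser could trivially provide.
import csv
import io
from html import escape

def csv_to_table(text):
    rows = csv.reader(io.StringIO(text))
    body = "".join(
        "<tr>" + "".join(f"<td>{escape(cell)}</td>" for cell in row) + "</tr>"
        for row in rows)
    return "<table>" + body + "</table>"

html = csv_to_table("name,score\nalice,10\nbob,7")
print(html)
```

If a one-screen sketch can do it, a browser with a full layout engine
certainly could.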
What is bad or good is in the eye of the beholder, of course. I
personally dislike the focus on publisher-controlled presentation.
CSS was supposed to move us away from that, but most browsers don’t
make it easy to override sites so that the visitor can define their
own unique view of a usable web.
It is unfortunate that fixing it involves more things like that instead of just making it in a simpler way, but it seems necessary, to me.
Well, I’d say it’s only “necessary” in the sense that some people
can’t see beyond bloating one app until it does everything they need.
I can easily see a tool developed with the Unix Philosophy in mind,
but I can also see that most users wouldn’t actually use it, because
they are quite happy living in an online world where the presentation
is controlled by someone else whose aim is continued engagement.
One problem in general is that software is not designed for advanced users. Computer software should be designed for advanced users.
If a new browser must be written, another alternative is just not to
implement CSS at all, maybe. Some things will not work without CSS,
but if you have HTML and ARIA, and the possibility of user
customizations (even if it is its own simplified variant of CSS that
only the end user can use), then it might be suitable for most uses,
maybe.
Another feature I would want is to remove many animations.
It makes sense to have different things in different programs, but it
is sometimes suitable to have multiple protocols/formats available in
one interface, even if it calls external programs to do so.
For example, IRC can be a separate program, but it can make sense to
support HTTP, HTML, Gemini, Gopher, etc. together in one program,
although I think it might be better to have the core program support
none of these directly, only an interface that calls extensions to
implement them. This way, you can still have links between them,
bookmarks, etc.
Largely I think this thread is about technology people lamenting a
past where the net was only for other technology people. But the net
is infinitely wide. There's space for everyone on here. There doesn't
need to be gatekeeping on the net. We're not running out of internet
any time soon.

The Internet is as it always was. It's up to us to _educate_ folks
about what's lacking. You can't achieve that by calling names; in
fact, people are even less likely to listen to you if you call them
names. I don't find it gatekeeping as much as complaining. My father
loves to lament times gone by, but his memories conveniently edit
away all the downsides. Again, I find this behavior unproductive and
closed-minded. You'll never get people to care about freedom if you
start out by insulting them or complaining about them. My father
remains unpopular at dinner parties. Most importantly, I may be
_wrong_ and the others may be right.

Nobody can systematically silence you. Others may killfile you, but
nobody has power over your voice on Usenet the way Reddit can just
ban people and entire communities. The same goes for other net
technologies like email.

With HTTP/2 and HTTP/3 this doesn't necessarily need to be true.
HTTP/2 and HTTP/3 are good enough at this point to give you a duplex
channel. The trouble was middleboxes. Middleboxes would throw away
anything that wasn't on a small set of explicitly allowed ports
(HTTP, HTTPS, SMTP) or wasn't plain TCP traffic. I'm hoping that if
HTTP/3 can actually become a net standard that middleboxes respect,
we can _finally_ start sending UDP packets,

I would argue somewhat the opposite. We *are* definitely running out
of Internet that is free and open for people. That especially applies
to the web, where large corporations have exercised vast power to
manipulate people to act against their own best interest. Complaints
of “gatekeeping” on Usenet ring hollow; if the “space” provided by
Facebook and Twitter is more to your liking, go there and try to have
this kind of discussion.

HTTP/3 is so different from HTTP/2 that they shouldn’t even be
discussed as related protocols. It leaves me stepping back even
further from the request semantics and questioning what people are
even looking to accomplish. Too many things (e.g., microservice APIs)
are jammed through HTTP simply because web stacks are so common, not
because they’re a good way to get the job done.

So, if anything, I’m lamenting the past where the web was *just* the
web. It was a particular kind of information system, exchanging
mainly HTML documents, that people could easily read and link to.
Then it lost sight of the Unix Philosophy and tried to become
everything to everybody. So (again, in full acknowledgement of the
irony of discussing this on Usenet when so many people have had their
attention absorbed by web forums controlled by social media
companies) I ask you: what do you think the WWW *shouldn’t* do?
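The UDP traffic discussed above is trivial to produce; it is the
network path, not the code, that middleboxes break. A minimal
loopback sketch in Python, standard library only (loopback, so no
middlebox can interfere):

```python
# Exchange one datagram over the loopback interface -- the kind of
# plain UDP traffic that port-filtering middleboxes historically
# dropped on the open internet.
import socket

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # port 0: let the OS pick one
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", addr)

data, _ = receiver.recvfrom(1024)    # blocks until the datagram lands
print(data)                          # b'hello'

sender.close()
receiver.close()
```

On the open internet the sendto would often simply vanish at a
middlebox, which is the behavior HTTP/3 adoption is hoped to change.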