• Re: Web considered harmful

    From Scientific (she/her)@21:1/5 to All on Fri Dec 17 20:33:00 2021
    On 3/22/14 12:43 PM, mw wrote:
    Web considered harmful
    ======================

    Over the past decade, the internet has seen a transition from
    single-task protocols to the web, to the extent that new functionality
    is often exposed only as a web API with a proprietary protocol.

    While the base protocol (HTTP) and the information serializations
    (HTML, XML, JSON) are standardized, the methods for extracting
    information from the received data vary from website to website.

    The solution in the 1990s was to define a standardized protocol,
    e.g. IMAP or NNTP, which could be used to access email or news in a
    uniform manner.

    For interfacing with, say, google mail, however, a client application
    has to speak the google mail API, which is incompatible with the mail
    API of any other provider. This transition is turning the internet
    into a collection of walled gardens, with the obvious drawback that
    most websites -- if an API is present at all -- offer only the
    official client implementation of said API. Mostly there will be a
    few closed-source implementations provided by the vendor, most
    commonly a combination of the following:

    * a website (often with mandatory javascript)

    * a mobile website (possibly without javascript, but optimized for
    small screens and thus not very practical on a desktop browser and
    often not exposing all available features)

    * an Android or iPhone app (sometimes not exposing all available
    features, and restricted to a single platform)

    leaving users little choice if they are using a different platform or
    want to collect their data in a unified format.

    Even worse is receiving information from websites where no API
    exists. There is no standard for logging into websites which have a
    mandatory username/password login prompt, and implementations have to
    handle cookies, Referer headers (ridiculously, many websites mandate
    one for XSRF protection even though the standard makes them optional)
    and site-specific form locations to which POST and GET requests must
    be made in a site-specific order.
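
    For example, a login script for one hypothetical site might look like
    the following (Python, with made-up URLs and form fields); note that
    none of it transfers to the next site:

        import requests

        # Hypothetical site: every URL and form field below is specific
        # to this one site and must be rediscovered for the next one.
        LOGIN_URL = "https://example.com/login"

        session = requests.Session()  # carries cookies across requests

        resp = session.post(
            LOGIN_URL,
            data={"username": "alice", "password": "hunter2"},
            # Many sites reject the POST without a Referer header, even
            # though the standard makes it optional.
            headers={"Referer": LOGIN_URL},
        )
        resp.raise_for_status()

        # Only now can the data be fetched, from a site-specific
        # location, in a site-specific format.
        print(session.get("https://example.com/messages/export").text)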

    For the most part, there has been no effort to change any aspect of
    this problem, which has existed for more than 10 years. On the
    contrary, companies have, one after another, discontinued support for
    open web standards such as RSS/Atom.

    Conclusion: The web as it is now is harmful to the open standard
    culture of the internet.

    Related readings (please expand): https://www.gnu.org/philosophy/javascript-trap.html

    Comments and discussion would be appreciated.

    I have noticed that after all these years too -- I fucking hate the
    modern Internet. I fucking hate how social media has taken over us, I
    fucking hate how hard it is to do anything on the modern Web.

    I will take the good ol' times of internetworking on the Unix command
    line in the 80s over this modern crap every day.

    --
    There is no verifiable evidence that gender dysphoria can be treated in
    other ways than transitioning. None whatsoever.
    Gender identity becomes unchangeable by age 4, something transphobes
    fail to understand.
    Scaring trans people away from transitioning and repressing their
    identities *IS* conversion therapy.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ant@21:1/5 to science@is.truth on Sat Dec 18 06:05:21 2021
    "Scientific (she/her)" <science@is.truth> wrote:
    On 3/22/14 12:43 PM, mw wrote:

    [mw's original post quoted in full; snipped -- see above.]

    I have noticed that after all these years too -- I fucking hate the
    modern Internet. I fucking hate how social media has taken over us, I
    fucking hate how hard it is to do anything on the modern Web.

    I will take the good ol' times of internetworking on the Unix command
    line in the 80s over this modern crap every day.

    You can still use the lynx web browser. ;) However, most web sites
    don't work with it. :(

    --
    Cold & windy winter rain storm came & left a big mess! Dang coldness,
    colony, works, strikes, software upgrades, free games, spams,
    (scam/fraud)s, greeds, inflations, illness, life, etc. again. :(
    Note: A fixed width font (Courier, Monospace, etc.) is required to
    see this signature correctly.
      /\___/\    Ant(Dude) @ http://aqfl.net & http://antfarm.home.dhs.org.
     / /\ /\ \   Please nuke ANT if replying by e-mail.
    | |o   o| |
       \ _ /
        ( )

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Doc O'Leary@21:1/5 to science@is.truth on Sat Dec 18 19:01:16 2021
    For your reference, records indicate that
    "Scientific (she/her)" <science@is.truth> wrote:

    On 3/22/14 12:43 PM, mw wrote:

    Quite a necro, but I approve! :-)

    I have noticed that after all these years too -- I fucking hate the
    modern Internet. I fucking hate how social media has taken over us, I
    fucking hate how hard it is to do anything on the modern Web.

    I’m right there with you. One of my projects for 2022 is going to be to
    move away from the web as a primary means of sending or receiving
    information. I’m looking at things like Jekyll to get away from having
    a heavy stack for my site(s), but even that might be too closely tied to
    the way the modern web works.

    I will take the good ol' times of internetworking on the Unix command
    line in the 80s over this modern crap every day.

    Well, it’s not like everything was perfectly executed back then, either.
    For example, no standardization on configuration files has been a
    constant annoyance for decades. But there is a lot to be said for
    text file formats of increasing complexity based on need. I mean, web
    browsers do *so* much these days, yet if you hand them a bit of
    Markdown they’re left clueless?

    --
    "Also . . . I can kill you with my brain."
    River Tam, Trash, Firefly

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From David@21:1/5 to Doc O'Leary on Sat Jan 29 19:45:22 2022
    XPost: comp.infosystems.gemini

    On 2021-12-18, Doc O'Leary wrote:
    On 2021-12-17, Scientific (she/her) wrote:
    On 2014-03-22, mw wrote [re-adding full quote]:

    [mw's original post quoted in full; snipped -- see above.]

    Quite a necro, but I approve! :-)

    I have noticed that after all these years too -- I fucking hate the
    modern Internet. I fucking hate how social media has taken over us, I
    fucking hate how hard it is to do anything on the modern Web.

    I’m right there with you. One of my projects for 2022 is going to be
    to move away from the web as a primary means of sending or receiving
    information. I’m looking at things like Jekyll to get away from
    having a heavy stack for my site(s), but even that might be too
    closely tied to the way the modern web works.

    I will take the good ol' times of internetworking on the Unix command
    line in the 80s over this modern crap every day.

    Well, it’s not like everything was perfectly executed back then,
    either. For example, no standardization on configuration files has
    been a constant annoyance for decades. But there is a lot to be said
    for text file formats of increasing complexity based on need. I mean,
    web browsers do *so* much these days, yet if you hand them a bit of
    Markdown they’re left clueless?

    At least there's reader mode, but that's like using uBlock Origin
    instead of serving only what's needed.

    I'm surprised that Gemini managed to get quite popular within like one
    or two years and Firefox still cannot render Markdown natively.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From meff@21:1/5 to science@is.truth on Sat Jan 29 20:28:36 2022
    On 2021-12-17, Scientific (she/her) <science@is.truth> wrote:
    On 3/22/14 12:43 PM, mw wrote:
    I have noticed that after all these years too -- I fucking hate the
    modern Internet. I fucking hate how social media has taken over us, I
    fucking hate how hard it is to do anything on the modern Web.

    You're trying to solve an emotional or social problem with a
    technical solution. More people like the Web than the curmudgeons who
    don't. People have voted with their feet.

    I will take the good ol' times of internetworking on the Unix command
    line in the 80s over this modern crap every day.

    Ableists only apply?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From rtr@21:1/5 to David on Sun Jan 30 08:48:46 2022
    XPost: comp.infosystems.gemini

    David <david@arch.invalid> writes:

    On 2021-12-18, Doc O'Leary wrote:
    On 2021-12-17, Scientific (she/her) wrote:
    On 2014-03-22, mw wrote [re-adding full quote]:
    [mw's original post quoted in full; snipped -- see above.]

    Quite a necro, but I approve! :-)

    I have noticed that after all these years too -- I fucking hate the
    modern Internet. I fucking hate how social media has taken over us, I
    fucking hate how hard it is to do anything on the modern Web.

    I’m right there with you. One of my projects for 2022 is going to be
    to move away from the web as a primary means of sending or receiving
    information. I’m looking at things like Jekyll to get away from
    having a heavy stack for my site(s), but even that might be too
    closely tied to the way the modern web works.

    I will take the good ol' times of internetworking on the Unix command
    line in the 80s over this modern crap every day.

    Well, it’s not like everything was perfectly executed back then,
    either. For example, no standardization on configuration files has
    been a constant annoyance for decades. But there is a lot to be said
    for text file formats of increasing complexity based on need. I mean,
    web browsers do *so* much these days, yet if you hand them a bit of
    Markdown they’re left clueless?

    At least there's reader mode, but that's like using uBlock Origin
    instead of serving only what's needed.

    I'm surprised that Gemini managed to get quite popular within like one
    or two years and Firefox still cannot render Markdown natively.


    I think one of the main strengths of gemini is that it's simple
    enough that it allows a lot of people to dip their toes into it and
    implement servers and clients for it, but it's also modern enough
    that we don't have to deal with esoteric behavior such as with
    gopher.
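
    To illustrate: a complete (if unpolished) Gemini fetch fits in a few
    lines of Python. The host is only an example, and certificate
    checking is relaxed here because many Gemini servers use self-signed
    certificates with trust-on-first-use:

        import socket
        import ssl

        # Gemini: open TLS to port 1965, send the URL plus CRLF, read a
        # one-line status header followed by the body. That is the whole
        # protocol for a simple fetch.
        host = "gemini.circumlunar.space"  # example host
        url = f"gemini://{host}/"

        context = ssl.create_default_context()
        context.check_hostname = False      # self-signed certs are common
        context.verify_mode = ssl.CERT_NONE

        with socket.create_connection((host, 1965)) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                tls.sendall((url + "\r\n").encode("utf-8"))
                response = b""
                while chunk := tls.recv(4096):
                    response += chunk

        header, _, body = response.partition(b"\r\n")
        print(header.decode("utf-8"))  # e.g. "20 text/gemini"
        print(body.decode("utf-8"))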

    --
    Give them an inch and they will take a mile.
    --
    gemini://rtr.kalayaan.xyz

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From news@zzo38computer.org.invalid@21:1/5 to David on Sun Jan 30 17:11:57 2022
    XPost: comp.infosystems.gemini

    David <david@arch.invalid> wrote:
    On 2021-12-18, Doc O'Leary wrote:
    On 2021-12-17, Scientific (she/her) wrote:
    On 2014-03-22, mw wrote:

    Over the past decade, the internet has seen a transition from
    single-task protocols to the web, to the extent that new functionality
    is often exposed only as a web API with a proprietary protocol.

    While the base protocol (HTTP) and the information serializations
    (HTML, XML, JSON) are standardized, the methods for extracting
    information from the received data vary from website to website.

    That is the case; also, these formats are more complicated than they
    should be in some ways, but unfortunately also lack some things (e.g.
    JSON supports only Unicode text and only floating-point numbers, not
    binary data or 64-bit integers (unless encoded); HTTP, HTML, and XML
    have more problems).
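
    The usual workarounds look something like this (a Python sketch; the
    field names are made up):

        import base64
        import json

        blob = bytes(range(8))   # arbitrary binary data
        big = 2**63 - 1          # not exactly representable as a double

        # JSON has no binary type, and many parsers read every number as
        # a 64-bit float, so binary data is commonly base64-encoded and
        # large integers are commonly carried as strings.
        doc = json.dumps({
            "blob_b64": base64.b64encode(blob).decode("ascii"),
            "big_int": str(big),
        })

        decoded = json.loads(doc)
        assert base64.b64decode(decoded["blob_b64"]) == blob
        assert int(decoded["big_int"]) == big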

    The solution in the 1990s was to define a standardized protocol,
    e.g. IMAP or NNTP, which could be used to access email or news in a
    uniform manner.

    Yes, and we can still do such things as needed. I also have some
    other ideas that I mention further below.

    We can still use protocols such as NNTP, IRC, etc.; we can also make
    up new protocols if they are needed. Multiple protocols for accessing
    the same messages would also work.

    I would want to promote supporting any suitable protocols, file
    formats, etc., but this isn't common.

    For interfacing with, say, google mail, however, a client application
    has to speak the google mail API, which is incompatible with the mail
    API of any other provider. This transition is turning the internet
    into a collection of walled gardens, with the obvious drawback that
    most websites -- if an API is present at all -- offer only the
    official client implementation of said API. Mostly there will be a
    few closed-source implementations provided by the vendor, most
    commonly a combination of the following:

    leaving users little choice if they are using a different platform or
    want to collect their data in a unified format.

    True. Sometimes specialized formats will be needed for some applications
    (and existing formats may be unsuitable), but they should be documented,
    and conversion software could be available if appropriate.

    Even worse is receiving information from websites where no API exists.

    It is bad, yes. In that case it is necessary to do without, but some
    web pages have other obstructive things that can get in the way even
    if you are just trying to view them normally, too.

    There is no standard for logging into websites which have a mandatory
    username/password login prompt, and implementations have to handle
    cookies, Referer headers (ridiculously, many websites mandate one for
    XSRF protection even though the standard makes them optional) and
    site-specific form locations to which POST and GET requests must be
    made in a site-specific order.

    Actually there is (HTTP basic/digest auth), but it isn't commonly
    used, and most web browsers do not provide the user much control over
    it (such as a command to log out, options to persist sessions, etc).
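
    For example, with Python's requests library it is just the following
    (the URL and credentials are hypothetical):

        import requests
        from requests.auth import HTTPDigestAuth

        # HTTP Basic auth: credentials go in a standard Authorization
        # header; no site-specific form, cookie dance, or CSRF token.
        resp = requests.get("https://example.com/inbox",
                            auth=("alice", "hunter2"))
        print(resp.status_code)

        # Digest auth is the same idea with a challenge/response step.
        resp = requests.get("https://example.com/inbox",
                            auth=HTTPDigestAuth("alice", "hunter2"))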

    For the most part, there has been no effort to change any aspect of
    this problem, which has existed for more than 10 years. On the
    contrary, companies have, one after another, discontinued support for
    open web standards such as RSS/Atom.

    Conclusion: The web as it is now is harmful to the open standard
    culture of the internet.

    I agree. However, a standard being open does not automatically make
    it good (but it does make it better than a proprietary system).

    Related readings (please expand):
    https://www.gnu.org/philosophy/javascript-trap.html

    One of the things that article says is: "Browser users also need a
    convenient facility to specify JavaScript code to use instead of the
    JavaScript in a certain page." I very much agree with this; it is a
    very important feature. Furthermore, there may be some things that a
    user might want their alternative scripts to do that the ones
    included in the document cannot do (e.g. access other programs and
    files on the user's computer, bypass CORS, etc).

    They also mention Java, Flash, Silverlight, etc. It is true that
    JavaScript is not the only way; furthermore, Java and JavaScript are
    only programming languages, which are not themselves bad, but I think
    embedding them in documents in this way is bad (but common). There is
    WebAssembly, too. So I will just call these programs in the document
    "document scripts" instead (as opposed to JavaScript code which is
    part of the web browser itself, etc).

    Even if a program is free software, the user does not necessarily
    want to execute that program on their computer, so the above is
    important, as are such things as whitelisting (possibly with
    cryptographic hashes to identify scripts). (User-specified
    whitelisting should also be how "secure contexts" are implemented;
    the existing implementation is no good. Actually, whitelisting by
    cryptographic hash solves both spies tampering with data over non-TLS
    connections and the server operator changing the script to something
    undesirable over TLS; secure contexts fail to solve the latter.)

    Some of the criteria for nontrivial scripts are a bit strange, such
    as the criterion that arrays cannot have more than fifty elements.
    (An actual memory management system to restrict memory allocation
    might be better. It could also restrict execution time, etc., as
    needed.)

    Also, in some cases it may be desirable to change the definition of
    some functions before the script is executed.

    Free JavaScript code is insufficient, though. There will also need to
    be ways to make the data interoperable, including outside of the web
    browser.

    Also, even if a script is allowed to run, if it requests (for
    example) camera access, the browser should allow the user to specify
    the command or device filename to use as input. This way, web apps
    that use the camera can work even if you do not have one. The same is
    true for other things, such as audio input/output, MIDI, game
    controllers, etc. It is even true for keyboard commands, so that a
    page does not override the browser's keyboard commands, and allows
    user customization, etc.

    Another thing to consider, other than scripts, is CSS. I had the idea
    of "meta CSS", to allow the end user to customize the interpretation
    of CSS and all of the priorities, etc. ARIA also helps a bit (or at
    least it would, if it were implemented; I mention this a bit more
    below). For example, one thing that a user might want to do is to
    skip animations (at least, I often find CSS animations to be
    annoying, and a waste of energy). Another would be to specify rules
    that are disabled in the presence of other rules (for example,
    sometimes you might want CSS).

    I have noticed that after all these years too -- I fucking hate the
    modern Internet. I fucking hate how social media has taken over us, I
    fucking hate how hard it is to do anything on the modern Web.

    I agree; it is difficult to do many things. But, I don't use Facebook, etc.

    I'm right there with you. One of my projects for 2022 is going to be
    to move away from the web as a primary means of sending or receiving
    information. I'm looking at things like Jekyll to get away from
    having a heavy stack for my site(s), but even that might be too
    closely tied to the way the modern web works.

    I will take the good ol' times of internetworking on the Unix command
    line in the 80s over this modern crap every day.

    Yes, it is better. Modern designs have problems; one is that
    command-line access does not work very well, and there are many
    others, too, including not letting the user specify what they want,
    and assuming things other than what the user specified, etc. Programs
    also do not work together very well, unlike on UNIX, where pipes,
    etc. can be used to combine programs.

    Well, it's not like everything was perfectly executed back then,
    either. For example, no standardization on configuration files has
    been a constant annoyance for decades. But there is a lot to be said
    for text file formats of increasing complexity based on need. I mean,
    web browsers do *so* much these days, yet if you hand them a bit of
    Markdown they're left clueless?

    There are a few reasons why they would not implement Markdown, one of
    which is that there are a few different variants, so they aren't
    always compatible.

    It is common that they implement the bad stuff; some of the good
    features, though, are not implemented, and some good features are
    even being removed, too.

    However, one feature I find useful amid this mess is the web
    developer console. Even without being a web developer, it is useful
    as an end user, too. In a few cases, the document.evaluate command
    (XPath) might be able to extract data.

    At least there's reader mode, but that's like using uBlock Origin
    instead of serving only what's needed.

    There are some other problems with the reader mode too.

    I would want to implement an "ARIA view" mode, which mostly ignores
    the CSS (possibly with a few exceptions, such as still paying
    attention to whether or not it specifies a fixed-pitch font) in
    favour of using most HTML commands (except those specifying colours)
    and ARIA properties to render the document. (For example, some web
    applications use custom widgets but have the suitable ARIA
    properties; then standard widgets can be displayed in place of the
    custom ones. Simply disabling CSS doesn't work; I have tried.)

    One of my ideas is also to have request/response overriding in the
    client software, configurable by the user. This would make many other
    options unnecessary, such as cookies, language, etc.; it is a unified
    method which does all of this and a lot more, including things that
    we have not thought of yet (if the end user can think of them).
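
    Something like this can already be approximated with an intercepting
    proxy; here is a sketch of it as a mitmproxy addon (the particular
    headers are only examples of what a user might configure):

        # Sketch of user-configured request/response overriding as a
        # mitmproxy addon; run with: mitmproxy -s override.py
        from mitmproxy import http

        def request(flow: http.HTTPFlow) -> None:
            # The user, not the website, decides what the client sends.
            flow.request.headers["Accept-Language"] = "en"
            if "Cookie" in flow.request.headers:
                del flow.request.headers["Cookie"]    # never send cookies

        def response(flow: http.HTTPFlow) -> None:
            # Responses can be rewritten on the way in, too.
            if "Refresh" in flow.response.headers:
                del flow.response.headers["Refresh"]  # no forced reloads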

    Another thing that could be done is alternative providers. In this way, it
    is possible to provide things in many ways without being locked in and
    without being restricted to specific complicated software, etc.

    There is also one more thing I considered in the case of HTTP, which
    would allow you to serve Markdown, MathML, FLIF, FLAC, etc., and
    which allows better user customization, accessibility, efficiency (if
    native implementations are available), possibly reduced bandwidth
    requirements, etc. It is a new response header, which can occur any
    number of times and names alternative representations of the
    document. If the response format is not understood, the client can
    load one of those instead, without changing the current document URL.
    This way, if the user has enabled this feature (the user should
    always be allowed to disable or override stuff; the request/response
    overriding above already does this in this case), then it would
    automatically just work as far as the user can see, without needing
    to do anything special, etc.
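
    To be clear, no such header exists today; but a client-side sketch of
    the idea in Python, with "Alternate-Representation" as a made-up name
    for it, might be:

        # Hypothetical: "Alternate-Representation" is a made-up name for
        # the proposed header; no browser or server implements it.
        headers = [
            ("Content-Type", "text/markdown"),
            ("Alternate-Representation", 'text/html; url="/page.html"'),
            ("Alternate-Representation", 'application/pdf; url="/page.pdf"'),
        ]

        SUPPORTED = {"text/html"}  # formats this client understands

        def pick_fallback(headers):
            # If the primary type is unsupported, pick an advertised
            # alternative; the document URL itself would not change.
            ctype = next(v for k, v in headers if k == "Content-Type")
            if ctype.split(";")[0].strip() in SUPPORTED:
                return None
            for name, value in headers:
                if name == "Alternate-Representation":
                    if value.split(";")[0].strip() in SUPPORTED:
                        return value  # client fetches the url="..." part
            return None

        print(pick_fallback(headers))  # -> text/html; url="/page.html"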

    Web browsers (and other programs) need better user control, instead
    of removing good features and adding bad ones, or assuming that the
    user wanted something other than what is specified, etc. I think the
    UNIX philosophy is much better.

    Some way to specify common links for data and alternative protocols
    should also be provided (possibly <link rel="alternate"> might do).
    The alternative protocols might not have a MIME type, but can still
    specify the URL.

    It is unfortunate that fixing it involves adding more things like
    that instead of just making it simpler, but it seems necessary to me.

    Fortunately, much of the above is not needed in the case of Gemini,
    which does not have these problems. However, I think that the Gemini
    format and protocol are perhaps a bit too simple (while Markdown is
    too complicated, and HTTP and HTML are too complicated, and PDF is
    too complicated, etc.; FTP is also bad, but for other reasons). But
    for most of the things that Gemini is used for, it is probably OK
    (although, in addition to the current specification, an
    "insecure-gemini" scheme should also be implemented, which is the
    same but without TLS and in which 6x responses are not allowed).

    I may have other things to write, but will do so later, instead of now.

    --
    Don't laugh at the moon when it is day time in France.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Doc O'Leary@21:1/5 to news@zzo38computer.org.invalid on Thu Feb 3 04:04:10 2022
    XPost: comp.infosystems.gemini

    For your reference, records indicate that
    news@zzo38computer.org.invalid wrote:

    There are a few reasons why they would not implement Markdown, one of
    which is that there are a few different variants, so they aren't
    always compatible.

    Neither are all the variants of HTML compatible, but you presumably
    wouldn’t argue that as a reason browsers shouldn’t handle *any* HTML,
    right? My point is that there are many document formats that have a
    more or less direct conversion to features that are supported by
    HTML, yet feeding one to a “modern” browser that has kitchen-sink
    support for just about everything else under the sun leaves them
    dumbfounded. I mean, a basic CSV file should be trivially easy to
    display as any other table would be, but is there any major browser
    that does that?
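
    The conversion really is trivial; this Python sketch is roughly all a
    browser would have to do:

        import csv
        import html
        import io

        def csv_to_html_table(text: str) -> str:
            # Split rows, escape cells, wrap them in table markup --
            # treating the first row as a header.
            rows = list(csv.reader(io.StringIO(text)))
            out = ["<table>"]
            for i, row in enumerate(rows):
                tag = "th" if i == 0 else "td"
                cells = "".join(
                    f"<{tag}>{html.escape(cell)}</{tag}>" for cell in row)
                out.append(f"<tr>{cells}</tr>")
            out.append("</table>")
            return "\n".join(out)

        print(csv_to_html_table("name,age\nalice,42\nbob,7\n"))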

    It is common that they implement the bad stuff; some of the good
    features, though, are not implemented, and some good features are
    even being removed, too.

    What can be said to be bad or good is in the eye of the beholder. I
    personally dislike the focus on publisher-controlled presentation.
    CSS was supposed to move us away from that, but most browsers don’t
    make it easy to override sites so that the visitor can define their
    own unique view of a usable web.

    It is unfortunate that fixing it involves adding more things like
    that instead of just making it simpler, but it seems necessary to me.

    Well, I’d say it’s only “necessary” in the sense that some people
    can’t see beyond bloating one app until it does everything they need.
    I can easily see a tool developed with the Unix Philosophy in mind,
    but I can also see that most users wouldn’t actually use it, because
    they are quite happy living in an online world where the presentation
    is controlled by someone else whose aim is continued engagement.


    --
    "Also . . . I can kill you with my brain."
    River Tam, Trash, Firefly

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From news@zzo38computer.org.invalid@21:1/5 to droleary@2017usenet1.subsume.com on Thu Feb 3 11:54:36 2022
    XPost: comp.infosystems.gemini

    One problem in general is that software is not designed for advanced
    users. Computer software should be designed for advanced users.

    Doc O'Leary <droleary@2017usenet1.subsume.com> wrote:
    There are a few reasons why they would not implement Markdown, one of
    which is that there are a few different variants, so they aren't
    always compatible.

    Neither are all the variants of HTML compatible, but you presumably
    wouldn’t argue that as a reason browsers shouldn’t handle *any* HTML,
    right? My point is that there are many document formats that have a
    more or less direct conversion to features that are supported by
    HTML, yet feeding one to a “modern” browser that has kitchen-sink
    support for just about everything else under the sun leaves them
    dumbfounded. I mean, a basic CSV file should be trivially easy to
    display as any other table would be, but is there any major browser
    that does that?

    It is a valid point. It ought to be possible to make extensions in a
    web browser to implement whatever format you want, including
    overriding its built-in handling of any format, and including
    extensions written in native code (loaded from .so files, or using
    pipes) that the end user can set up if wanted. (The same should be
    true for character encodings and protocols too, in addition to file
    formats, audio filters, I/O interfaces, etc.)

    Furthermore, I had mentioned the possibility that, if the end user
    has not disabled document scripts, a document can still be displayed
    even if there is no handling of its file format built in or
    configured by the user, by means of extra HTTP headers.

    What can be said to be bad or good is in the eye of the beholder. I
    personally dislike the focus on publisher-controlled presentation.
    CSS was supposed to move us away from that, but most browsers don’t
    make it easy to override sites so that the visitor can define their
    own unique view of a usable web.

    I agree with you; I also dislike the focus on publisher-controlled
    presentation. Even if the CSS can be overridden (or disabled), this
    is not good enough. That is one reason why I think ARIA is important
    for fixing it.

    There could also be meta-CSS, which includes codes that can be
    specified only by the user and not by the publisher, can use CSS
    codes as additional criteria, and can change the meaning of certain
    CSS properties, too.

    If a new browser must be written, another alternative is just to not
    implement CSS at all, maybe. Some things will not work without CSS,
    but if you have HTML and ARIA, and the possibility of user
    customizations (even if through its own simplified variant of CSS
    that only the end user can use), then it might be suitable for most,
    maybe.

    Another feature I would want is to remove many animations.

    It is unfortunate that fixing it involves adding more things like
    that instead of just making it simpler, but it seems necessary to me.

    Well, I’d say it’s only “necessary” in the sense that some people
    can’t see beyond bloating one app until it does everything they need.

    It makes sense to have different things in different programs, but it
    is sometimes suitable to have multiple protocols/formats available in
    one interface, even if it calls external programs to do so.

    For example, IRC can be a separate program, but it can make sense to
    support HTTP, HTML, Gemini, Gopher, etc. together in one program,
    although I think it might be better to have the core program support
    none of these, only an interface which calls extensions to implement
    them. This way, you can use the links between them, bookmark them,
    etc.

    Additionally, the end user should be able to define pipes, etc. for
    I/O, which makes it better. This is what many older UNIX programs do,
    as I would hope, too. (For example, I had designed a music-playing
    program that just writes to stdout; you can pipe it to aplay to play
    it back, or to sox to convert it, etc. The web browser ought to be
    designed in such a way, too.)
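
    A sketch of that design in Python, assuming the usual aplay/sox flags
    for raw 16-bit, 44.1 kHz, mono PCM:

        import math
        import struct
        import sys

        # Write raw PCM to stdout; playback and conversion are left to
        # other programs:
        #   python tone.py | aplay -f S16_LE -r 44100 -c 1
        #   python tone.py | sox -t raw -r 44100 -e signed -b 16 -c 1 - out.wav
        RATE = 44100
        for n in range(RATE * 2):  # two seconds of a 440 Hz tone
            sample = int(32000 * math.sin(2 * math.pi * 440 * n / RATE))
            sys.stdout.buffer.write(struct.pack("<h", sample))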

    (Other programs do this too: the UNIX shell executes other programs
    and uses pipes to combine them; the SQLite command shell loads SQLite
    extensions to use their functions and virtual tables in one
    interface; picture editor or sound editor GUI programs load plug-ins
    for file formats; and so on. So the web browser should do so, too.)

    I can easily see a tool developed with the Unix Philosophy in mind,
    but I can also see that most users wouldn’t actually use it, because
    they are quite happy living in an online world where the presentation
    is controlled by someone else whose aim is continued engagement.

    Multiple programs can be made for similar purposes, and I think that
    is what is needed. Unfortunately, the WWW is rather difficult and
    extra stupid. But I would hope that it can be done (even if some
    features are excluded; I can live with that, and actually
    deliberately want to exclude some features, and to have some of them
    implemented in an entirely different way than the existing
    implementations do).

    Programs with subsets/supersets of features, and with different sets
    of features, are also possible; this should not exclude such a
    possibility.

    The lack of capabilities of WebExtensions is problematic. One thing
    that would partially help is to allow loading native-code extensions
    (.so files; or you can sometimes use pipes, which can mean you might
    not need to write an extension for the web browser at all). Such
    native-code extensions could call back into the JavaScript interface,
    too.

    Extensions added through the extension catalog should not be allowed
    to load native code; to do that, you must install the extension
    yourself instead.

    --
    Don't laugh at the moon when it is day time in France.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From meff@21:1/5 to news@zzo38computer.org.invalid on Fri Feb 4 02:38:31 2022
    XPost: comp.infosystems.gemini

    On 2022-02-03, news@zzo38computer.org.invalid <news@zzo38computer.org.invalid> wrote:
    One problem in general is that software is not designed for advanced
    users. Computer software should be designed for advanced users.

    One could say this about _anything_, no? Cars should be made for
    advanced users, tools should be made for advanced users, kitchens
    should be made for advanced users, and so on. I think the reality is
    that most humans are not advanced users of most things.

    Largely I think this thread is about technology people lamenting a
    past where the net was only for other technology people. But the net
    is infinitely wide. There's space for everyone on here. There doesn't
    need to be gatekeeping on the net. We're not running out of internet
    any time soon.

    For example, IRC can be a separate program, but it can make sense to
    support HTTP, HTML, Gemini, Gopher, etc. together in one program,
    although I think it might be better to have the core program support
    none of these, only an interface which calls extensions to implement
    them. This way, you can use the links between them, bookmark them,
    etc.

    With HTTP/2 and HTTP/3 this doesn't necessarily need to be true.
    HTTP/2 and HTTP/3 are good enough at this point to give you a duplex
    channel.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Doc O'Leary@21:1/5 to news@zzo38computer.org.invalid on Fri Feb 4 20:49:37 2022
    XPost: comp.infosystems.gemini

    For your reference, records indicate that
    news@zzo38computer.org.invalid wrote:

    One problem in general is that software is not designed for advanced
    users. Computer software should be designed for advanced users.

    Underlying that, it is often the case that software is not designed
    *by* advanced users. Which is to say that even if the developers are
    tech rock stars, they are usually answering to some MBA who doesn’t
    have a clue what it means to have software that is well-architected.

    If a new browser must be written, another alternative is just to not
    implement CSS at all, maybe. Some things will not work without CSS,
    but if you have HTML and ARIA, and the possibility of user
    customizations (even if through its own simplified variant of CSS
    that only the end user can use), then it might be suitable for most,
    maybe.

    I think the kitchen-sink nature of the modern web is just too brittle to
    *not* need a completely restructured browser. Trying to jam everything
    into HTML, including ARIA, is not a great approach. I mean, if there are
    parts of a web page that are semantically navigation links, I’m not sure
    why that is getting served up as part of the page content in the first
    place, never mind layering CSS on top of it to display it in some
    particular way that is not in the viewer’s best interest.

    Another feature I would want is to remove many animations.

    Auto-load videos (especially with sound) are something I could do
    without, too. I remember when there used to be a click-to-play
    extension that disabled Adobe Flash, but now that multimedia is
    “standard” on the modern web, it has become harder and harder to
    eliminate such things, especially on mobile platforms.

    Another feature along those lines would be to put a limit on how much
    data you’ll allow a page to load. There is no web page that I want to
    visit sight-unseen that requires 400MB of data to be loaded and
    consumes 2GB of RAM.

    It makes sense to have different things in different programs, but it
    is sometimes suitable to have multiple protocols/formats available in
    one interface, even if it calls external programs to do so.

    Sure. Even browsers themselves these days spin up additional
    processes to sandbox pages for security and UI responsiveness. The
    problem remains that, for the modern web, things are all
    fundamentally controlled by the remote server. So long as that
    transaction is more about rendering a page a certain way rather than
    transferring information for the user to do with as they please, the
    web will increasingly become bogged down by its own weight.

    For example, IRC can be a separate program, but it can make sense to
    support HTTP, HTML, Gemini, Gopher, etc. together in one program,
    although I think it might be better to have the core program support
    none of these, only an interface which calls extensions to implement
    them. This way, you can use the links between them, bookmark them,
    etc.

    I’ve always liked the idea of a common UI over some kind of
    middleware. I mean, whether it’s email or Usenet or Reddit or chat, I
    should be able to use *whatever* software I like for viewing messages
    in a conversation. But I do acknowledge that most people are simply
    unable or unwilling to separate the content from its presentation.

    --
    "Also . . . I can kill you with my brain."
    River Tam, Trash, Firefly

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Doc O'Leary@21:1/5 to meff on Sat Feb 5 18:42:12 2022
    XPost: comp.infosystems.gemini

    For your reference, records indicate that
    meff <email@example.com> wrote:

    Largely I think this thread is about technology people lamenting a
    past where the net was only for other technology people. But the net
    is infinitely wide. There's space for everyone on here. There doesn't
    need to be gatekeeping on the net. We're not running out of internet
    any time soon.

    I would argue somewhat the opposite. We *are* definitely running out
    of Internet that is free and open for people. That especially applies
    to the web, where large corporations have exercised vast power to
    manipulate people into acting against their own best interest.
    Complaints of “gatekeeping” on Usenet ring hollow; if the “space”
    provided by Facebook and Twitter is more to your liking, go there and
    try to have this kind of discussion.

    With HTTP/2 and HTTP/3 this doesn't necessarily need to be true.
    HTTP/2 and HTTP/3 are good enough at this point to give you a duplex
    channel.

    HTTP/3 is so different from HTTP/2 that they shouldn’t even be
    discussed as related protocols. It leaves me stepping back even
    further from the request semantics and questioning what people are
    even looking to accomplish. Too many things (e.g., microservice APIs)
    are jammed through HTTP simply because web stacks are so common, not
    because they’re a good way to get the job done.

    So, if anything, I’m lamenting the past where the web was *just* the web.
    It was a particular kind of information system, exchanging mainly HTML documents, that people could easily read and link to. Then it lost sight
    of the Unix Philosophy and tried to become everything to everybody. So
    (again, in full acknowledgement of the irony of discussing this on Usenet
    when so many people have had their attention absorbed by web forums
    controlled by social media companies) I ask you: what do you think the
    WWW *shouldn’t* do?

    --
    "Also . . . I can kill you with my brain."
    River Tam, Trash, Firefly

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From meff@21:1/5 to Doc O'Leary on Sat Feb 5 23:24:15 2022
    XPost: comp.infosystems.gemini

    On 2022-02-05, Doc O'Leary <droleary@2017usenet1.subsume.com> wrote:
    I would argue somewhat the opposite. We *are* definitely running out
    of Internet that is free and open for people.

    I agree with your sentiment but not your diagnosis. Getting people to
    care about a free and open Web is the fight that's being lost here.
    The Internet is as it always was. Tier 1s are peering and asking for
    transit, as are Tier 2s and Tier 3s. While IPv4 addresses have become
    expensive due to exhaustion, IPv6 /64s cost peanuts. You can get an
    IP address and send a packet to another IP address any day (well,
    depending on whether the ISPs have put the recipient behind CGNAT or
    not). Unfortunately _people_ don't care about the freedom and
    openness anymore.

    It's up to us to _educate_ folks about what's lacking. I don't find
    this complaining-behind-closed-doors behavior particularly conducive
    to that, though. We need to remind people about why free and open
    communication is important, no matter whether the captor is a
    corporation or a government. You can't achieve that by calling names;
    in fact, people are even less likely to listen to you if you call
    them names.

    That especially applies to the web, where large corporations have
    exercised vast power to manipulate people into acting against their
    own best interest. Complaints of “gatekeeping” on Usenet ring hollow;
    if the “space” provided by Facebook and Twitter is more to your
    liking, go there and try to have this kind of discussion.

    I don't find it gatekeeping as much as complaining. My father loves
    to lament times gone by, but his memory conveniently edits away all
    the downsides. Again, I find this behavior unproductive and
    closed-minded. You'll never get people to care about freedom if you
    start out by insulting them or complaining about them. My father
    remains unpopular at dinner parties.

    HTTP/3 is so different from HTTP/2 that they shouldn’t even be
    discussed as related protocols. It leaves me stepping back even
    further from the request semantics and questioning what people are
    even looking to accomplish. Too many things (e.g., microservice APIs)
    are jammed through HTTP simply because web stacks are so common, not
    because they’re a good way to get the job done.

    The authors of QUIC (the standard that eventually became HTTP/3) had
    started by trying to create a non-Web protocol from the ground up.
    The trouble was middleboxes. Middleboxes would throw away anything
    that wasn't on a small set of explicitly allowed ports (HTTP, HTTPS,
    SMTP) or wasn't just TCP traffic. I fully admit, IMO, that Google
    used their influence to jam their vision of the future of network
    transit into the IETF, which is why QUIC was chosen as HTTP/3, but
    HTTP/3 did start out trying to be a different way of transiting
    packets over the net. Unfortunately, ISPs do not want to upgrade or
    evolve their middleboxes in any way.

    So, if anything, I’m lamenting the past where the web was *just* the
    web. It was a particular kind of information system, exchanging
    mainly HTML documents, that people could easily read and link to.
    Then it lost sight of the Unix Philosophy and tried to become
    everything to everybody. So (again, in full acknowledgement of the
    irony of discussing this on Usenet when so many people have had their
    attention absorbed by web forums controlled by social media
    companies) I ask you: what do you think the WWW *shouldn’t* do?

    I don't think it should or should not do anything. I am not an
    architect of humanity. I am not God. I'm fine with humans doing what
    they will. The Web is only as useful as the entities that produce
    content for it and the entities that consume content on it. I would
    like to see a world where humans once again understand why early
    Internet pioneers fought so hard for neutral networks, but at the end
    of the day I recognize that my views are minority ideas and that all I
    can do is try to sway hearts and minds, not tell others what to
    do. Most importantly I may be _wrong_ and the others may be right. I
    respect the will of other free humans.

    I'd like to try to meet others in the middle. That might mean
    offering Web interfaces for Usenet, or writing about the forgotten
    parts of the Net that still have posters like you and me. One
    "carrot" I like to offer folks is freedom from censorship:
    unmoderated newsgroups have nobody telling you what is and is not
    verboten. Nobody can systematically silence you. Others may killfile
    you, but nobody has power over your voice on Usenet the way Reddit
    can just ban people and entire communities. The same goes for other
    net technologies like email.

    But it's also important to understand why the status quo exists
    (instead of just getting angry at it). CGNAT makes P2P technology
    nearly impossible on the web. Email is overrun with spam. Mobile
    phones consume too much battery to keep persistent connections open.
    Most middleboxes block UDP packets. ISPs prioritize downlinks over
    uplinks and offer terrible QoS on uplinks. Most non-Web traffic is
    unencrypted and leaks personal information to middleboxes. The web
    has succeeded because it was relatively simple for ISPs to operate,
    so most of the complexity was pushed up to the application protocol
    (with stateful kludges like cookies).

    I'm hoping that if HTTP/3 can actually become a net standard that
    middleboxes respect, we can _finally_ start sending UDP packets,
    which would be more convenient for mobile devices and for many
    protocols. Wireguard tunnels (or Zerotier) and services built atop
    them, like Tailscale, have brought E2EE IP tunnels to people in an
    accessible way. Meshnets like CJDNS and Yggdrasil are out there which
    can tunnel over regular IPv4 connections. I use my energy to educate
    my friends and family about the importance of a free and open
    Internet and to encourage more tinkering-happy friends of mine to
    play around with the "real" Internet, the one with IP packets flowing
    freely between hosts.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Doc O'Leary@21:1/5 to meff on Sun Feb 6 19:21:53 2022
    XPost: comp.infosystems.gemini

    For your reference, records indicate that
    meff <email@example.com> wrote:

    The Internet is as it always was.

    It clearly isn’t, nor should anyone expect it to be. A *crapload* has
    changed since the start of Eternal September. Non-technical people
    don’t care *at all* about things like IPv6 or HTTP/3, though. They
    don’t even care about the application layer, and couldn’t even begin
    to tell you anything about how the WWW works beyond opening their
    browser and entering a URL.

    It's up to us to _educate_ folks about what's lacking.

    Nope. You need to talk with more non-technical people. They are
    already aware of the ills of the modern Internet, but they feel
    helpless to do anything about it. I have friends who know that
    WhatsApp is toxic, but can’t bring themselves to abandon it because
    the network effect is too strong. Other friends leave their smart
    phones at the door along with their shoes when they get home, because
    notifications of all kinds have become too demanding of their
    attention. People *know* they’re being manipulated, but the FOMO is
    overpowering for them.

    You can't achieve that by calling names; in fact, people are even
    less likely to listen to you if you call them names.

    Who advocated that? I *will* say someone is doing something wrong if I
    think they’re doing something wrong, though. And there is *a lot* wrong
    with the modern Internet.

    I don't find it gatekeeping as much as complaining. My father loves
    to lament times gone by, but his memory conveniently edits away all
    the downsides. Again, I find this behavior unproductive and
    closed-minded. You'll never get people to care about freedom if you
    start out by insulting them or complaining about them. My father
    remains unpopular at dinner parties.

    That’s sad. You have been successfully manipulated into thinking that
    criticism is closed-minded and should be viewed as unpopular. You’ve
    fallen for the relentless positivity that pushes social media
    engagement. The world will never get better if people are unable or
    unwilling to face our problems head on. Maybe your father’s problem
    is that he’s choosing to attend vacuous dinner parties?

    The trouble was middleboxes. Middleboxes would throw away anything
    that wasn't on a small set of explicitly allowed ports (HTTP, HTTPS,
    SMTP) or wasn't just TCP traffic.

    And that is a problem that definitely should be solved, but the *right* solution is not to ham-fistedly jam even *more* under the umbrella of
    the WWW. If you want to make the argument of “is as it always was”,
    you can’t just roll over for every power play to co-opt standards that
    Google makes.

    Most importantly I may be _wrong_ and the others may be right.

    But you can’t actually sort that out unless you take a position in the
    first place. And solutions can both be wrong *and yet* popular. Yes,
    people are free to do what they will, but part of that should be the
    adult responsibility of, as you have done, acknowledging that they *can*
    be wrong.

    Nobody can systematically silence
    you. Others may killfile you, but nobody has power over your voice on
    Usenet the way Reddit can just ban people and entire
    communities. The same goes for other net technologies like email.

    Yet another thing that isn’t “is as it always was”. Modern email,
    despite still being based on open protocols, is largely controlled by
    gatekeepers like Google and Amazon. You *will* be systematically
    silenced for reasons of their choosing. Worse, cloud providers are
    more than happy to mix traffic from abusive customers in with
    legitimate users, turning them into human shields.

    I'm hoping that if HTTP/3 can actually become a net standard that
    middleboxes respect, we can _finally_ start sending UDP packets,

    And while I can respect that as the ends, I don’t accept that the means
    of achieving it is respectable. History has shown that ISPs are more
    than willing to drag their feet or completely torpedo technology advances
    just because it is easier to do nothing new. If you truly want to open innovation back up, as I keep saying, you should *not* be asking for
    changes that can be restricted to *just* the WWW.

    You mention spam, and that is another *great* example of how problems are
    not getting solved on the modern Internet. I have *actual* solutions for
    spam, which is why I can give a valid email on my Usenet posts. But the
    big guys don’t really want to eliminate spam, because it gives them too
    much control over users. Some people look at what Google is doing and
    actually think they represent best practices!

    So, no, I don’t really expect adding more to WWW standards is going to
    make things better for anyone. Neither do I think nostalgia for a past Internet is particularly productive. My argument remains that we need
    to be looking at what is right and wrong about what we’re doing, and
    make changes for the better. For me, that means moving away from the
    WWW to systems that aren’t trying to act as the be-all solution for everything online.

    My aim for 2022 is to downgrade my web pages to be mostly static and
    ideally serverless. I’m going to see if I can move away from
    HTML-only and go with simpler text formats like Markdown, CSV, and
    YAML. I’ve done similar projects in the past when I abandoned Drupal,
    so I know it can be done. The upside of browsers having a
    kitchen-sink approach is that you can turn it around on itself and
    force it to function almost like a usable information system! :-)
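
    A minimal version of that build step is a few lines of Python
    (assuming the third-party markdown package, and made-up directory
    names):

        import pathlib

        import markdown  # third-party: pip install markdown

        # Turn a directory of Markdown files into static HTML pages; the
        # result can be hosted anywhere that serves plain files.
        SRC = pathlib.Path("pages")
        OUT = pathlib.Path("site")
        OUT.mkdir(exist_ok=True)

        for page in SRC.glob("*.md"):
            body = markdown.markdown(page.read_text(encoding="utf-8"))
            doc = f"<!DOCTYPE html>\n<title>{page.stem}</title>\n{body}\n"
            (OUT / f"{page.stem}.html").write_text(doc, encoding="utf-8")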

    --
    "Also . . . I can kill you with my brain."
    River Tam, Trash, Firefly

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)