• https

  • From Ivan Shmakov@21:1/5 to All on Sun Nov 27 07:07:49 2016
    XPost: alt.html

    JJ <jj4public@vfemail.net> writes:
    On Sat, 19 Nov 2016 11:31:51 +0100, Athel Cornish-Bowden wrote:

    [Cross-posting to news:comp.infosystems.www.misc.]

    I'm encountering more and more sites that use https for pages where
    no particular need for security is evident. This seems to be a very
    recent thing (within the past few months). It is a considerable
    nuisance as my browser refuses to open pages if it can't establish a
    secure connection. Any suggestions as to why authors do this?

    Probably due to pressure from digital-certificate propaganda and
    scaremongering. The digital certificate providers are the ones that
    benefit the most.

    Well, we have all sorts of community-driven things now; like,
    say, a free encyclopedia. This even extends to areas where
    security is of great importance -- like an operating system.

    It's no surprise that efforts to create a community-driven CA
    began, too. Sadly, after more than a decade, they still have had
    far too little luck gaining general recognition.

    I choose to rely on their X.509 certificates nevertheless.
    If anything, after the Heartbleed controversy, theirs look more
    trustworthy than StartCom's.
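
    Pointing a TLS client at their root is usually a one-liner; a
    minimal Python sketch, with community-ca.pem standing in for a
    locally saved copy of that root certificate:

      import ssl
      import urllib.request

      # Use the community CA's root as the *only* trust anchor.
      # ("community-ca.pem" is a hypothetical local path to its
      # self-signed root certificate.)
      ctx = ssl.create_default_context(cafile="community-ca.pem")

      # Servers whose certificates don't chain up to that root now
      # fail verification; everything else proceeds as usual.
      with urllib.request.urlopen("https://www.example.org/",
                                  context=ctx) as resp:
          print(resp.status, resp.reason)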

    HTTPS is only effective against man-in-the-middle attacks. Any
    hacker who wants to brute-force a site login or perform an SQL
    injection exploit doesn't need to bother with the HTTPS encryption,
    because anyone can access the site at the protocol level.

    The "passive eavesdropper" case (as in: government surveillance;
    or just one's employer or school "getting curious") is generally
    distinguished from MitM -- but is no less an important attack
    that can be mitigated with encryption in general, including TLS.

    --
    FSF associate member #7257 58F8 0F47 53F5 2EB2 F6A5 8916 3013 B6A0 230E 334A

  • From Ivan Shmakov@21:1/5 to All on Sun Nov 27 08:24:22 2016
    XPost: alt.html

    Lewis <g.kreme@gmail.com.dontsendmecopies> writes:
    Athel Cornish-Bowden <acornish@imm.cnrs.fr> wrote:

    [Cross-posting to news:comp.infosystems.www.misc.]

    Summary: I like my little proxy oh so much. When HTTPS
    interferes with my intent to use it, I deem that a problem
    to solve, not a feature to value.

    I'm encountering more and more sites that use https for pages where
    no particular need for security is evident.

    The requirement for HTTP/2 is TLS.

    Could you please cite the relevant RFC section on this?
    At the very least, libcurl supports non-TLS HTTP/2.
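
    FWIW, cleartext HTTP/2 ("h2c") exists at the protocol level, and
    not only in libcurl. A rough Python sketch of a prior-knowledge
    h2c request using the third-party hyper-h2 package (not libcurl),
    against a hypothetical local server at localhost:8080 that is
    known to speak h2c:

      import socket

      import h2.connection
      import h2.events

      # Hypothetical cleartext HTTP/2 server; no TLS anywhere below.
      HOST, PORT = "localhost", 8080

      sock = socket.create_connection((HOST, PORT))
      conn = h2.connection.H2Connection()
      conn.initiate_connection()          # client connection preface
      sock.sendall(conn.data_to_send())

      conn.send_headers(1, [
          (":method", "GET"),
          (":path", "/"),
          (":scheme", "http"),
          (":authority", HOST),
      ], end_stream=True)
      sock.sendall(conn.data_to_send())

      body = b""
      done = False
      while not done:
          data = sock.recv(65535)
          if not data:
              break
          for event in conn.receive_data(data):
              if isinstance(event, h2.events.DataReceived):
                  body += event.data
                  conn.acknowledge_received_data(
                      event.flow_controlled_length, event.stream_id)
              elif isinstance(event, h2.events.StreamEnded):
                  done = True
          sock.sendall(conn.data_to_send())

      sock.close()
      print(body.decode("utf-8", "replace"))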

    Most sites are going secure because there is no reason not to.

    For a site operator? Perhaps. But there, the operator's best
    interests and the users' may well diverge.

    For instance, I can easily imagine a retrocomputing hobbyist
    wanting to read Wikipedia with, say, Arachne (on a box running
    FreeDOS) -- and that browser doesn't seem to support TLS at all.

    Similarly, there was that case last year when Wikipedia got
    blacklisted in Russia. Previously, only specific articles
    (URIs) were blacklisted, but with the WMF going full-TLS,
    it became impossible for the authorities and ISPs to comply
    with court decisions without blacklisting the sites as a whole.

    (Although in this specific case, it makes me wonder if the
    legislators behind those laws ever looked into that little
    document entitled "Constitution of the Russian Federation".)

    Finally, there's a general interest in adapting the Web to one's
    own preferences. (Or, at times, in "unscrewing the creativeness"
    of certain site operators -- to put it bluntly.)

    There's a variety of tools for that purpose, including
    Greasemonkey and, to an extent, NoScript extensions for Firefox
    (to deal with JavaScript or lack thereof), Stylish (to tame
    CSS), various "AdBlock" things, etc. Unfortunately, such
    software is most often specific to a certain user agent
    (or engine); for instance, I was unable to find a NoScript-like
    extension for Chromium when I needed one about a year ago.

    An obvious cross-browser solution to such problems would be
    to "filter" the resources retrieved from the remote through
    some software before handing them to the user agent. Which is,
    technically, exactly the kind of thing TLS is meant to prevent.

    ... Although workarounds are still possible -- like running
    'sslstrip' (which still requires one to take care of HSTS). Or:

    For a few sites I visit often, I get them through a CGI Perl script
    on localhost that fetches the page and edits it to expunge bloat and
    garbage.

    -- Mike Spencer, in news:comp.misc (25 Oct 2016,
    news:87wpgxnjcs.fsf@bogus.nodomain.nowhere.)
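
    Something along those lines is easy enough to put together. Below
    is a rough Python counterpart (a sketch, not the script quoted
    above): a tiny localhost HTTP server that fetches the requested
    path from one hard-coded upstream over TLS, strips <script>
    elements, and hands the rest to the browser; the upstream host,
    the port, and the notion of "bloat" are all placeholders.

      import re
      import urllib.request
      from http.server import BaseHTTPRequestHandler, HTTPServer

      UPSTREAM = "https://en.wikipedia.org"   # placeholder upstream
      SCRIPT_RE = re.compile(rb"<script\b.*?</script>",
                             re.DOTALL | re.IGNORECASE)

      class FilterHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              # Fetch the corresponding upstream resource over TLS.
              with urllib.request.urlopen(UPSTREAM + self.path) as resp:
                  body = resp.read()
                  ctype = resp.headers.get("Content-Type", "text/html")
              # "Expunge bloat": here, just delete the scripts.
              if ctype.startswith("text/html"):
                  body = SCRIPT_RE.sub(b"", body)
              self.send_response(200)
              self.send_header("Content-Type", ctype)
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)

      if __name__ == "__main__":
          # E. g., browse http://localhost:8000/wiki/HTTPS to get the
          # filtered copy of the corresponding upstream page.
          HTTPServer(("127.0.0.1", 8000), FilterHandler).serve_forever()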

    This seems to be a very recent thing (within the past few months).
    It is a considerable nuisance as my browser refuses to open pages if
    it can't establish a secure connection.

    Why is your browser unable to establish a secure connection?

    In my case, I choose not to trust some of the well-known CAs.
    (Although I admit I'm not anywhere near my goal of minimizing
    the attack surface by shortening the list down to perhaps
    a dozen or so CAs that I cannot /avoid/ relying upon, while
    applying "security exceptions" on a case-by-case basis.)

    Sure enough, if the remote happens to /also/ have a
    Strict-Transport-Security: header configured, a compliant user
    agent will then forbid me /any/ access to the resource.

    (Clearly a case where RFC compliance wins over common sense.)
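
    For reference, the policy itself is just one response header; a
    quick Python peek at it (example.org is a placeholder and may not
    send one at all):

      import urllib.request

      with urllib.request.urlopen("https://example.org/") as resp:
          hsts = resp.headers.get("Strict-Transport-Security")

      # A typical value is "max-age=31536000; includeSubDomains":
      # for that many seconds a compliant agent rewrites http:// to
      # https:// for the host and refuses to offer a "security
      # exception" when certificate verification fails.
      print(hsts)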

    Any suggestions as to why authors do this?

    Authors don't. There is no "make this page secure" HTML code.
    Security is set by the server admin.

    Who at times coincides with the author, however.

    --
    FSF associate member #7257 58F8 0F47 53F5 2EB2 F6A5 8916 3013 B6A0 230E 334A
