• I think we should have a standard for browser extensions that allows extensions to inspect and drop requests for unwanted ad resources.
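
  As a sketch of the kind of capability I mean: the WebExtensions webRequest API (not itself a W3C standard) already lets Firefox extensions do exactly this. The blocklist below is a made-up placeholder:

  ```typescript
  // Sketch modeled on the Firefox WebExtensions webRequest API.
  // "adHosts" is a hypothetical blocklist, not part of any spec.
  declare const browser: any; // provided by the extension runtime

  const adHosts = new Set(["ads.example.com", "tracker.example.net"]);

  browser.webRequest.onBeforeRequest.addListener(
    (details: { url: string }) => {
      const host = new URL(details.url).hostname;
      // Returning { cancel: true } drops the request before it is sent.
      return adHosts.has(host) ? { cancel: true } : {};
    },
    { urls: ["<all_urls>"] }, // inspect every outgoing request
    ["blocking"]              // required for the return value to take effect
  );
  ```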
  • AFAICT, the W3C produces 2 types of standards:

    - standards they write themselves that everyone ignores
    - standards they copy from the WHATWG

    • > standards they write themselves that everyone ignores

      CSS, WAI-ARIA, SVG, WebGPU, WebAuthn...and a large number of APIs that are referenced as part of the HTML spec but developed and standardized by different W3C groups.

      > standards they copy from the WHATWG

      Not for six years now: https://www.w3.org/blog/2019/w3c-and-whatwg-to-work-together...

      Lots of HN people need to update their priors about modern web standardization work.

    • I think the RDF standards have produced many useful tools for those who work with graph data. And the W3C is a useful coordination place for new standards like Verifiable Credentials[0], Decentralized Identifiers[1], and JSON-LD[2], which are all being used in ActivityPub, Bluesky, and a lot of other decentralized projects.

      [0] https://en.wikipedia.org/wiki/Verifiable_credentials
      [1] https://en.wikipedia.org/wiki/Decentralized_identifier
      [2] https://en.wikipedia.org/wiki/JSON-LD
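
      For the unfamiliar, a JSON-LD document is ordinary JSON plus an @context that maps short property names onto globally unique IRIs, which is what lets specs like these share vocabulary. A minimal sketch (the values are made up):

      ```typescript
      // A minimal JSON-LD document, written here as a TS object literal.
      // The @context maps the short keys onto schema.org IRIs, so "name"
      // unambiguously means https://schema.org/name.
      const person = {
        "@context": "https://schema.org",
        "@type": "Person",
        name: "Alice",
        url: "https://alice.example",
      };
      ```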

    • The CSS specs are maintained by the W3C and are widely implemented by browsers. Likewise for the WAI-ARIA, WCAG, MathML, and SVG specs.

      The XML and related specs are implemented by various applications and libraries, even if web browsers dislike them. They are used heavily in document publishing workflows built around formats like JATS.

      SVG is widely supported in vector graphics applications and rendering tools.

      And the WHATWG hasn't just co-opted W3C specs; it has also co-opted encoding and URLs, among other things, from places like the IETF RFCs.

    • Which is ActivityPub?
      • An incompatible attempt at reformulating the Mastodon protocol. If you've ever actually tried to work with the protocol, you'll know the standard only loosely describes it. If you attempt to implement it by following the standard, you won't be compatible with anything else, because everything else implements the Mastodon protocol instead (and calls it ActivityPub).
        • Touché. Mastodon is the Internet Explorer of ActivityPub. There are other, non-Mastodon extensions, and some ActivityPub-spec functionality that Mastodon eschews but other implementations support, but overall this is an accurate summary. Especially regarding the C2S protocol: most apps just use the Mastodon API instead (see https://github.com/mastodon/mastodon/issues/10520).
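
          To make the gap concrete, here's a rough sketch of both flows. The hostname, actor path, and token are placeholders, and since the ActivityPub spec leaves C2S client auth out of scope, a Bearer token is assumed:

          ```typescript
          const TOKEN = "placeholder-oauth-token";

          // What most clients actually speak: the Mastodon REST API.
          await fetch("https://mastodon.example/api/v1/statuses", {
            method: "POST",
            headers: {
              Authorization: `Bearer ${TOKEN}`,
              "Content-Type": "application/json",
            },
            body: JSON.stringify({ status: "Hello, fediverse!" }),
          });

          // ActivityPub C2S per the W3C spec: POST a Create activity to
          // the actor's outbox. Mastodon does not implement this endpoint.
          await fetch("https://mastodon.example/users/alice/outbox", {
            method: "POST",
            headers: {
              Authorization: `Bearer ${TOKEN}`,
              "Content-Type":
                'application/ld+json; profile="https://www.w3.org/ns/activitystreams"',
            },
            body: JSON.stringify({
              "@context": "https://www.w3.org/ns/activitystreams",
              type: "Create",
              object: { type: "Note", content: "Hello, fediverse!" },
            }),
          });
          ```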
      • If that's your best example of a well-used W3C-native standard, thanks for helping me prove my point.
  • Standards start at Google nowadays...
    • … And that's clearly a major concern, because my front-end developer colleagues treat everything that Google does as the one true way of the web. No matter whether it has been accepted into a standards body, or whether the Chrome implementation is buggy, that is their target, and every other browser is an afterthought at best.
      • Historically nothing has changed, just who's on top: Mosaic, Netscape, IE, Chrome…
      • I think this misses the point. If the vast majority of your users use something Chromium-based, that's where you should put most of your effort. It doesn't matter whether it's the right way; your users only care about whether it works for them. They don't care about the technicalities.
        • While true and a pragmatic approach, that's another part of the same root problem.
    • Considering the HTML served by google.com has 19 warnings and a Content-Security-Policy error for a page that only has a text field and a logo, I'm gonna take my chances and apply to become a W3C invited expert.
      • Surely they have pretty good reasons for doing things that way, don't you think?

        That page is probably among the top 3 most analyzed, optimized and A/B tested webpages in the world.

        • That's not my question to answer, though.

          Google controls the browser, the page, the CDN, AND to a large extent the very standards that the browser has to comply with.

          If what you claim is true, with all of this authority, why can't they write a compliant web page?

          • That's precisely why it seems likely to be deliberate; they could be playing 3D chess.

            It could be a way to keep some kind of competitive edge, or a fingerprinting strategy, or something else altogether.

          • > If what you claim is true, with all of this authority, why can't they write a compliant web page?

            I'm not sure why you'd think that they "can't" write a compliant web page. It's obvious they can, just as it's obvious they've been paying a bunch of experts top dollar for decades to think about and test exactly what to write in this page's code. It's also obvious they've taken into account the basic fact that every character* they add costs them a measurable amount of money to serve, given their scale.

            It's therefore pretty obvious that they're deliberately choosing to write a non-compliant web page. Presumably because, among the multiple billions of users they serve this page to, a high-enough-to-matter portion is still using old and/or non-compliant web browsers, and they don't want to cut them out.

            * past certain packet-length cutoffs

  • Yeah, everyone apparently failed to learn from the IE lesson, and served the Web to Google on a plate.

    They even ship Chrome with their applications, because people can't be bothered to learn either native UI frameworks or portable Web standards.