The Three Pillars of JavaScript Bloat

(43081j.com)

62 points | by onlyspaceghost 2 hours ago

7 comments

  • zdc1 1 hour ago
    A lot of this basically reads to me like hidden tech debt: people aren't updating their compilation targets to ESx, people aren't updating their packages, package authors aren't updating their implementations, etc.

    Ancient browser support is a thing, but ES5 has been supported everywhere for like 13 years now (as per https://caniuse.com/es5).
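    To make the stale-targets point concrete, bumping the compilation target is usually a one-line config change. A sketch, assuming a typical TypeScript setup (the exact file and field vary by build tool):

    ```jsonc
    // tsconfig.json (excerpt): stop emitting ES5-era downleveled output
    {
      "compilerOptions": {
        "target": "ES2020" // instead of the legacy "ES5"
      }
    }
    ```

    Babel-based builds read a Browserslist query instead (e.g. a `browserslist` field in package.json), but the idea is the same: tell the toolchain which engines actually need support.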

    • userbinator 55 minutes ago
      The newer version is often even more bloated. This whole article just reinforces my opinion of "WTF is wrong with JS developers" in general: a lot of mostly mindless trendchasing and reinventing wheels by making them square. Meanwhile, I look back at what was possible 2 decades ago with very little JS and see just how far things have degraded.
      • michaelchisari 17 minutes ago
        A standard library can help, but js culture is not built in a way that lends to it the way a language like Go is.

        It would take a well-respected org pushing a standard library that has clear benefits over "package shopping."

    • anematode 48 minutes ago
      The desire to keep things compatible with even ES6, let alone ES5 and before, is utterly bizarre to me. Then you see folks who unironically want to maintain compatibility with node 0.4, in 2025, and realize it could be way worse....

      Ironically, what often happens is that developers configure Babel to transpile their code to some ancient version, the output is bloated (and slower to execute, since passes like regenerator have a lot of overhead), and then the website doesn't even work on the putatively supported ancient browsers because of the use of recent CSS properties or JS features that can't be polyfilled.
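      The overhead is easy to see side by side. A simplified sketch of what a regenerator-style transform does to `async`/`await` (real Babel output is considerably longer; `runGenerator` here stands in for the shipped runtime helper):

      ```javascript
      // Native: one line of syntax, optimized by the engine.
      async function getUserNative(fetchUser) {
        const user = await fetchUser();
        return user.name;
      }

      // Roughly what a regenerator-style transform emits: the async body
      // becomes a generator driven by a runtime helper shipped in the bundle.
      function runGenerator(genFn) {
        return function (...args) {
          const gen = genFn(...args);
          return new Promise((resolve, reject) => {
            function step(method, value) {
              let result;
              try {
                result = gen[method](value);
              } catch (err) {
                return reject(err);
              }
              if (result.done) return resolve(result.value);
              Promise.resolve(result.value).then(
                (v) => step("next", v),
                (e) => step("throw", e)
              );
            }
            step("next", undefined);
          });
        };
      }

      const getUserTranspiled = runGenerator(function* (fetchUser) {
        const user = yield fetchUser(); // was: await fetchUser()
        return user.name;
      });
      ```

      Both behave the same, but the transpiled path ships the helper, allocates a generator per call, and bypasses the engine's native async fast paths.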

      I've even had a case at work where a polyfill caused the program to break. iirc it was a shitty polyfill of the exponentiation operator ** that didn't handle BigInt inputs.
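      A naive `**` polyfill along those lines fails exactly this way, because `Math.pow` coerces its arguments to Number and BigInt refuses that coercion (hypothetical sketch, not the actual polyfill in question):

      ```javascript
      // Naive "polyfill": swap the ** operator for Math.pow.
      function naivePow(base, exp) {
        return Math.pow(base, exp); // Math.pow coerces both args to Number
      }

      naivePow(2, 10); // 1024 -- Number inputs look fine

      // ...but BigInt inputs throw, where native ** works:
      try {
        naivePow(2n, 10n);
      } catch (e) {
        console.log(e.name); // "TypeError" -- cannot convert a BigInt to a number
      }
      console.log(2n ** 10n); // 1024n
      ```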

      • fragmede 38 minutes ago
        Just how old an Android device in the developing world do you not want to support? Life's great at the forefront of technology, but there's a balancing act to be able to support older technology vs the bleeding edge.
        • anematode 32 minutes ago
          I like the sentiment, but building a website that can actually function in that setting isn't a matter of mere polyfills. You need to cut out the insane bloat like React, Lottie, etc., and just write a simple website, at which point you don't really need polyfills anyway.

          In other words, if you're pulling in e.g. regenerator-runtime, you're already cutting out a substantial part of the users you're describing.

      • hsbauauvhabzb 29 minutes ago
        I’ve been very lost trying to understand the ecosystem between ES versions, TypeScript, and everything else. It ends up being a weird battle between seemingly unrelated things like require() vs import vs async when all I want to do is compile. All while I’m utterly confused by the build tools: npm vs whatever other ones are out there, vite vs whatever other ones are out there. ‘Oh, Babel? I’ve heard the name but have no idea what it does’ ends up being my position on like 10 build packages.

        This isn’t the desire of people to build legacy support, it’s a broken, confusing and haphazard build system built on the corpses of other broken, confusing and haphazard build systems.

  • burntoutgray 1 hour ago
    I have a single pillar, admittedly for in-house PWAs: upgrade to the current version of Chrome; then, if your problem persists, we'll look into it.
  • sipsi 0 minutes ago
    i suggest jpeg.news dot com
  • turtleyacht 1 hour ago
    It would be interesting to extend this project where opt-in folks submit a "telemetry of diffs," to track how certain dependencies needed to be extended, adapted, or patched; those special cases would be incorporated as future features and new regression tests.

    Someday, packages may just be "utility-shaped holes" that are filled in and published on the fly. Package adoption could come from 80/20 agents [1] exploring these edges (security notwithstanding).

    However, as long as new packages inherit dependencies according to a human author's whims, that "voting" cycle has not yet been replaced.

    [1] https://news.ycombinator.com/item?id=47472694

  • sheept 1 hour ago
    I wonder if this means there could be a faster npm install tool that pulls from a registry of small utility packages that can be replaced with modern JS features, to skip installing them.
    • seniorsassycat 28 minutes ago
      Not sure about faster, but you could do something with overrides, especially pnpm overrides since they can be configured with plugins. Build a list of packages that can be replaced with modern stubs.

      It couldn't inline them, but it could replace ponyfills with wrappers for native impls and drop the fallback. It could provide simple modern implementations of is-string, and dedupe multiple major versions, though that raises the question: what breaking change led to a new major version, and why?
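      As a sketch of what such a stub could look like (hypothetical replacement for the `is-string` package; on modern engines the whole package collapses to a typeof check plus the cross-realm String-object case):

      ```javascript
      // Modern stub for `is-string`: typeof covers primitives, and the
      // Object.prototype.toString tag covers boxed/cross-realm String objects.
      function isString(value) {
        return (
          typeof value === "string" ||
          Object.prototype.toString.call(value) === "[object String]"
        );
      }

      console.log(isString("hi"));           // true
      console.log(isString(new String(""))); // true
      console.log(isString(42));             // false
      ```

      A pnpm `overrides` entry could then point the package name at a local module exporting this function, so every transitive dependent picks up the stub.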

  • skydhash 1 hour ago
    Fantastic write up!

    And we're seeing Rust happily going down the same path, especially with the micro-packages.
