456 points by onlyspaceghost 1 day ago | 267 comments | View on ycombinator
stevoski about 20 hours ago |
I feel like part of the blame for the situation is that JavaScript has always lacked a standard library which contains the "atomic architecture" style packages. (A standard library wouldn't solve everything, of course.)
andai 1 day ago |
The main cause of bloat is not polyfills or atomic packages. The cause of bloat is bloat!
I love this quote by Antoine de Saint-Exupéry (author of the Little Prince):
"Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away."
Most software is not written like that. It's not asking "how can we make this more elegant?" It's asking "what's the easiest way to add more stuff?"
The answer is `npm i more-stuff`.
zdc1 1 day ago |
Ancient browser support is a thing, but ES5 has been supported everywhere for like 13 years now (as per https://caniuse.com/es5).
rtpg about 24 hours ago |
Like seriously... at 50 million downloads maybe you should vendor some shit in.
Packages like this which have _7 lines of code_ should not exist! The metadata of the lockfile is bigger than the minified version of this code!
At one point in the past, something like 5% of create-react-app's dep list was all from one author who had built out their own little depgraph in libraries they controlled. That person also included download counts on their GitHub page. They have since "fixed" the main entrypoint to the rat's nest, thankfully.
https://www.npmjs.com/package/has-symbols
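The actual has-symbols source differs, but the kind of check such a micro-package wraps is small enough to vendor directly; a hypothetical inline sketch:

```javascript
// Hypothetical inline version of a symbol-support check; the real
// has-symbols package differs, but the idea fits in a few lines you
// could copy into your own codebase instead of taking on a dependency.
function hasNativeSymbols() {
  if (typeof Symbol !== 'function') return false;
  const sym = Symbol('test');
  // Native symbols have typeof 'symbol' and box into Symbol wrappers.
  return typeof sym === 'symbol' && Object(sym) instanceof Symbol;
}

console.log(hasNativeSymbols()); // true on any modern engine
```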
kigiri about 9 hours ago |
One to generate zip files, one for markdown parsing, one for connecting to Postgres, etc. Most of them have no sub-dependencies.
We always reach first for what the Node.js standard library has, and glue together our own small, specific pieces of code when needed.
The app is very stable, and we have very few of the frustrations I used to have before. Note that we used to have way more dependencies, but we removed them bit by bit.
Now I would whitelist anything from the Deno std lib; they did a great job with that. Even if you don't use Deno, with whatever your runtime provides plus Deno std you never need more than a few packages to build anything.
JS is doing pretty well if you are mindful about it.
AltruisticGapHN about 18 hours ago |
Seriously what kind of business today needs to support ES3 browsers? Even banking sites should refuse to run on such old devices out of security concerns.
prinny_ about 18 hours ago |
Nobody is arguing that what we currently have is great or that we shouldn't look to improve it. Reducing it to "JS developers bad" is an embarrassing statement that just shows ignorance, not only of the topic at hand, but of an engineering mindset in general.
g947o about 16 hours ago |
> There is a user in the JavaScript community who goes around adding "backwards compatibility" to projects. They do this by adding 50 extra package dependencies to your project, which are maintained by them.
wheattoast about 18 hours ago |
Easy enough for y’all with techie salaries, but as one of the millions of poor folks whose paychecks barely (or don’t even) pay the bills, it’d be really nice if we didn't have to junkheap our backbreakingly expensive hardware every few years just cuz y’all are anorexically obsessed with lean code, and find complex dependencies too confusing/bothersome to maintain.
procaryote about 21 hours ago |
Instead they've elevated it to a cultural pillar and think they've come up with a great innovation. It's like talking to anti-vaxxers.
DanielHB about 15 hours ago |
> [...]
> Each of these having only one consumer means they’re equivalent of inline code but cost us more to acquire (npm requests, tar extraction, bandwidth, etc.).
It costs FAR more than dep install time. It has a runtime cost too, especially in frontend code using bundlers, where it also costs extra bundle space and extra build time.
tylerchilds about 13 hours ago |
for example, javascript runs in a browser or on microcontrollers. you can write code that works for both natively [1].
TypeScript: a mechanism that needs to compile first into javascript.
React: a mechanism that needs to compile first into javascript.
Configuration-du-jour: depending on how you need to string your TypeScript and React together, there's a thing you need to manage your javascript managers. Vite is the best option in this field, since it recognizes that exposing tools to fine-tune how to optimize the javascript resulting from your typescript and react is a terrible idea, one that leads to mass fragmentation on a global scale over what it even means to "spin up a js project".
In conclusion: is javascript a compile target like assembly, or a language that people can handcode to eke performance out of, like assembly?
est about 24 hours ago |
For personal projects I always prompt the AI to write JS directly, and never introduce the Node.js stack unless I absolutely have to.
Turns out you don't always need Node.js/React to make a functional SPA.
wiseowise about 20 hours ago |
That’s awesome. Could be hooked as a pre-commit for agents to do the grunt work of migration.
skydhash 1 day ago |
And we're seeing rust happily going down the same path, especially with the micro packages.
turtleyacht 1 day ago |
Someday, packages may just be "utility-shaped holes" that are filled in and published on the fly. Package adoption could come from 80/20 agents [1] exploring these edges (security notwithstanding).
However, as long as new packages inherit dependencies according to a human author's whims, that "voting" cycle has not yet been replaced.
stephenr about 24 hours ago |
For $client we've taken a very minimal approach to JavaScript, particularly on customer facing pages. An upcoming feature finally replaces the last jQuery (+ plugin) dependent component on the sales page with a custom implementation.
That change shaved off ~100K (jQuery plus a plugin removed), and for most projects now that probably seems like nothing.
The sales page after the change is now just 160K of JS.
The combination of not relying on JS for everything and preferring use-case-specific implementations where we do, means we aren't loading 5 libraries and using 1% of each.
I'm aware that telling most js community "developers" to "write your own code" is tantamount to telling fish to "just breathe air".
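Many of those use-case-specific replacements are one-liners on the modern platform. A hedged sketch of the kind of mapping involved (illustrative only, not $client's actual code):

```javascript
// Illustrative only: common jQuery utilities and their platform
// equivalents, the kind of swap that removes a ~100K dependency.
// $.extend({}, a, b)        -> Object.assign({}, a, b)
// $('.item')                -> document.querySelectorAll('.item')
// $(el).addClass('active')  -> el.classList.add('active')
const defaults = { retries: 3, timeout: 1000 };
const options = Object.assign({}, defaults, { timeout: 5000 });

console.log(options); // { retries: 3, timeout: 5000 }
```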
wonnage about 20 hours ago |
Bundlers handle this by automatically creating bundles for shared modules. But if you optimize to avoid all shared modules, you end up with hundreds of tiny files. So most bundlers enforce a minimum size limit. That's probably fine for a small app. But one or more of these things happens:
1. Over time, everybody at the company tends to join one giant SPA because it's the easiest way to add a new page.
2. Code splitting works so well you decide to go ham and code split all of the things: modals, below-the-fold content, tracking scripts, etc.
Now you'll run into situations where 20 different unrelated bundles happen to share a single module, but that module is too small for the bundler to split out, and so you end up downloading it N times.
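The size threshold described here is a real bundler knob; in webpack, for example, it is `splitChunks.minSize` (the value below is roughly webpack 5's default, used as an assumption in this sketch):

```javascript
// webpack.config.js (sketch): splitChunks only extracts a shared
// module into its own chunk once it crosses minSize; anything
// smaller is duplicated into every bundle that imports it.
module.exports = {
  optimization: {
    splitChunks: {
      chunks: 'all',
      minSize: 20000, // bytes; roughly webpack 5's default threshold
    },
  },
};
```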
grishka about 22 hours ago |
But the real cause of JS bloat is the so-called "front-end frameworks". Especially React.
First of all, why would you want to abstract away the only platform your app runs on? What for? That just changes the shape of your code but it ends up doing the same thing as if you were calling browser APIs directly, just less efficiently.
Second of all, what's this deal with mutating some model object, discarding the exact change that was made, and then making the "framework" diff the old object with the new one, call your code to render the "virtual DOM", then diff that, and only then update the real DOM tree? This is such an utterly bonkers idea to me. Like, you could just modify your real DOM straight from your networking code, you know?
Seriously, I don't understand modern web development. Neither does this guy, who spent an hour and some trying to figure out React from first principles, using much the same approach I myself apply to new technologies: https://www.youtube.com/watch?v=XAGCULPO_DE
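The direct approach described above, sketched with a stand-in element so it runs anywhere (in a real page, `el` would come from something like `document.getElementById`):

```javascript
// Direct manipulation: the code that knows what changed performs a
// targeted mutation on the real DOM, no virtual-DOM diff in between.
function applyCount(el, count) {
  el.textContent = `Count: ${count}`;
}

// Stand-in for a real DOM element, so this sketch runs outside a browser.
const el = { textContent: '' };
applyCount(el, 3);
console.log(el.textContent); // Count: 3
```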
pjmlp about 22 hours ago |
There will be almost no bloat to worry about.
onion2k about 21 hours ago |
The other two, atomic architecture and ponyfills, are simply developer inexperience (or laziness). If you're not looking at the source of a package and considering if you actually need it then you're not working well enough. And if you've added code in the past that the metrics about what browsers your visitors are using show isn't needed any more, then you're not actively maintaining and removing things when you can. That's not putting the user first, so you suck.
People keep telling me the approach I am taking won't scale or will be hard to maintain, yet my experience has been that things stay simple and easy to change in a way I haven't experienced in dependency-heavy projects.
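On the ponyfill point above: once your metrics say old engines are gone, the package can often be replaced by a guarded one-liner. A hypothetical example, assuming only a depth-1 flatten is ever needed:

```javascript
// Inline fallback instead of a ponyfill package; the depth-1
// requirement here is a hypothetical simplification.
function flat1(arr) {
  return typeof arr.flat === 'function'
    ? arr.flat()          // native, ES2019+
    : [].concat(...arr);  // tiny fallback for older engines
}

console.log(flat1([1, [2, 3], [4]])); // [ 1, 2, 3, 4 ]
```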