I agree with what you're saying, of course, simplicity is better, etc.
But the nav on your blog is a terrible example.
Firstly, you don't get to just click on the links to go to where you want to go, you first have to click the three-lines button, even on a desktop with an enormous screen.
And secondly, despite your claims about an "enhanced experience with a modern browser", it seems to work exactly as if there was no enhancement at all? I click the three-lines menu and it takes me to a new page listing the links I can click. The "X" button to "close" the menu navigates me back particularly quickly, but that is all that I can tell that is unusual.
I'm using Firefox 136 on Ubuntu.
And in any event, this is all unnecessary, because you can make a nav by just putting a bunch of links at the top of the page, like HN does.
Wow, I love it when people discover these NEW ways of doing things.
Not a fan of the nav, but loved the transitions. Applied them to my website (bespoke C++ static generator + template library): https://vittorioromeo.com/
I've been trying to steer LLMs towards the Astro framework due to this - the island concept seems like a perfect fit
https://docs.astro.build/en/concepts/islands/
I have a question: After clicking on a blog in the listing page ("Collective Speed is..."), the page navigated to that particular blog. What CSS transitions are used to convert that title to a header? I saw some animation which pushed that title to become a header. How does that work? I'm curious
That'd be the View Transition API. Specifically, it is a cross-document view transition. https://developer.chrome.com/docs/web-platform/view-transiti...
Really awesome new(ish) feature of the web platform. It doesn't yet have full support in Firefox unfortunately, but is supported in Chrome and Safari.
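For anyone curious how little is involved: a minimal sketch of the cross-document case, following the linked Chrome docs. The selector and `post-title` name are illustrative, not taken from the site in question.

```css
/* Both the listing page and the article page opt in: */
@view-transition {
  navigation: auto;
}

/* Give the title element on the old page and the header on the new page
   the same view-transition-name, and the browser morphs one into the
   other during navigation. Each name must be unique within a page, so
   a listing with many titles needs per-item names. */
.post-title,
.article-header h1 {
  view-transition-name: post-title;
}
```

No JavaScript is required for the basic morph; unsupported browsers simply fall back to an ordinary navigation.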
Thanks
This is close to how things used to be, in the time before server-side includes.
OK...and what does that look like on a desktop browser?
Because if I click on a menu button on a desktop browser, I generally don't expect it to take over the entire page with a menu.
This seems like an example of unhelpfully mobile-centric website design, which has been becoming more prevalent in recent years.
I just tried it on their website, using the desktop browser, and the experience is absolutely OK: you just get the menu as in any web app, and you can close it to go back, etc. Just an old-school page which is blazing fast ... because it is an old-school page. It renders faster than a typical animation to open a sidebar.
But you don't need to open a menu to navigate to another page on an old school web page. Web pages in the 00s just showed you links to other parts of the website on a navbar that is always there. I agree this website is optimized for phones and works poorly on desktop — there is absolutely no reason to hide your links behind a burger menu when I have more than enough pixels on my monitor for all your links.
I agree, but it's not intrinsic to the approach of less JS and more pages.
You should of course not have a menu button on a desktop view. There is plenty of space to show the menu without hiding it behind a button.
Maybe it is you who are mobile centric?
Isn't this just HATEOAS, as espoused by libraries like htmx, Datastar, Hotwire, etc.?
https://htmx.org/essays/hateoas/
When I first saw the title, I thought "great! Someone else has (re)discovered hypermedia".
But they have gone in a completely bewildering direction. Rather than swap/morph html fragments in, they're doing full page navigations and using view transitions to make it look smooth.
Worse, they are manipulating the history to cover these blasphemous tracks.
Datastar would make this particularly simple - just include whatever menu/nav stuff that you need and show/hide/toggle it with a few signals/attributes.
No, they're not related. The submission's article is about CSS view transitions. HATEOAS seems to involve interacting with an API, but linking between pages on a blog does not require this.
HATEOAS does not require interacting with an API.
As the article in the comment you are replying to makes very clear, this couldn't be more wrong
isn't this just the old school way of making a static website? it's nice and chill and peaceful
Would love to hear anecdata from others but I'd say...not really? I was a kid in those days but there's no way I'd make a server round trip for /menu/ to open a menu.
the go-to method is using frames.
I dunno, it wants to challenge our dependence on javascript and then to make it work it needs to inject a “back” behavior into a normal link?
JS and fallbacks for menus are a solved issue. This is just another form of LLM Dunning-Kruger derangement where you think the LLM-suggested solution is novel because you haven't encountered it before, or because you fundamentally don't understand the underlying problems that we have already solved.
Yeah, I don't think the menu should've been a separate page. It can be made JavaScript-less as a dialog opened by the popover HTML attributes,[0] and the escape key would be able to close it.
[0]: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...
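A minimal sketch of that idea, using the Popover API from the MDN link above (the `id` is illustrative). A popover with the default `auto` behavior closes on Escape and on clicking outside, with zero JavaScript:

```html
<button popovertarget="site-nav" aria-label="Menu">☰</button>

<nav id="site-nav" popover>
  <a href="/">Home</a>
  <a href="/blog/">Blog</a>
  <a href="/about/">About</a>
</nav>
```

The button toggles the popover via `popovertarget`; no handlers, no round trip, and focus and light-dismiss come from the browser.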
Could be achieved with a checkbox input and a hidden div too
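For completeness, the classic "checkbox hack" version — state lives in a hidden checkbox and CSS does the toggling (class names are illustrative):

```html
<input type="checkbox" id="nav-toggle" class="nav-toggle">
<label for="nav-toggle" aria-label="Menu">☰</label>

<nav class="menu">
  <a href="/">Home</a>
  <a href="/blog/">Blog</a>
</nav>

<style>
  .nav-toggle { position: absolute; left: -9999px; } /* visually hidden */
  .menu { display: none; }
  .nav-toggle:checked ~ .menu { display: block; }
</style>
```

Worth noting that this announces as a checkbox to screen readers, so the popover or `<dialog>` route is the more accessible of the two.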
I'm not at all against using some js for the menu, but a native dialog or popover is definitely a nice idea here.
I guess it doesn't have to use JavaScript for the back behavior. It could use a server-side rendered referrer if that hasn't been stripped by the browser?
You say that JavaScript and fallbacks for menus are a solved issue, but the number of menus that are just an absolute clusterfuck on the web today is ridiculous. They're really not a solved issue. Progressive enhancement is hard to do, genuinely hard in some cases.
On balance, while this is not without flaws, it's interesting. Accessibility, deep linking, reduction in cognitive load for the developer. There's some merit here.
I'm unsure why you think this was an LLM-suggested solution.
...because the opening line of the blog post says he's been "building websites with LLMs", and then attempts to cutely redefine that abbreviation as "Lots of Little htMl pages" in a parenthetical.
It's, um. Not the best kind of communication, and very easily leads to this kind of misunderstanding.
Why do you think this is LLM?
[dead]
[flagged]
Your pre-emptive hostility seems rather unwarranted.
This article is my usual go-to and lists several reasons why JavaScript might not be available, and thus why you shouldn't take it for granted: https://piccalil.li/blog/a-handful-of-reasons-javascript-won...
I feel compelled to add:
- the user explicitly disabled JavaScript
- the browser does not support JavaScript (I sometimes view websites using elinks)
AFAIK screen readers also work better without JavaScript, so it's also an accessibility issue.
JavaScript doesn't affect screen readers at all unless you dynamically add content without the proper ARIA roles. It is trivial to correct.
As I just said, users who explicitly disable JavaScript cannot even use Google Search. Why should I accommodate those users when even Google refuses to do so? They are actively choosing to have a limited web experience. The vast majority of the internet is completely broken for them.
[flagged]
> A spotty connection hasn’t loaded the dependencies correctly

Either they load or they don't. How would the dependencies load "incorrectly"?
Let's say you have 5-7 dependencies to load, but 3 of them timed out because your train entered the tunnel. Your app ends up in an incorrect state, fails silently, and the UX degrades unpredictably. This is where conversion often drops visibly, and it's the reason SSR is now a go-to solution for any marketing website.
Why am I loading dependencies from 5-7 places? Why is my website not using a bundler if it has so many varied dependencies? Why do we not expect the user to understand that they are in a tunnel without internet?
Regardless, this isn't really restricted to the usage of JavaScript. The website would likely have pretty bad UX if only half of the CSS loaded correctly, but no one programs defensively around it being absent.
Have you ever developed an enterprise-scale frontend application optimized for conversion targets? It feels like you have not. You may ship your own code in a bundle, yes, but all the integrations come on top of that. The chatbot, the tracker, the A/B-testing logic, etc. are all loaded separately from your service provider's CDN.
A user opening a web page is not expecting a full-blown app with multi-second loading times. If that happens, they bounce, and you lose revenue. The web is supposed to have a very short time to first contentful paint and a very short time to interactive; the shorter the better, and less than 0.5s is the goal. It can deliver that if built properly. Many SPAs and bulky JS apps are built for developer convenience, not for end users. The only real use case for an SPA is when you deal with a lot of local data: a spreadsheet, a document or image editor, a diagram tool (but then Wasm is probably a better choice).
You may say you are not building an enterprise-grade frontend. But if you are small enough, you don't need an SPA either.
Go on. How do I have no idea what I'm talking about? Why is it okay for a website to break simply because the analytics don't load? Why do you think that's good design? How is my personal, lived experience less valuable than yours?
Is it just that you're ashamed that you have made such poorly designed web apps that can't handle a few broken HTTP calls?
Is it just that you can't simply accept that JavaScript is a requirement for the modern web which is what this entire discussion is hinged upon?
>How do I have no idea what I'm talking about?
You dismissed A/B testing as unnecessary. That is sufficient for this judgement. A/B tests mostly run on the happy-path scenario of a customer: if an A/B test breaks, the company loses money at light speed.
The loading-related issues overall may eat 0.5-1% of the revenue. That is not something that should be an afterthought.
Lol, okay. I didn't know that every single customer was going to go through a tunnel as they loaded the page.
I didn't dismiss A/B testing. I'm just saying that, if the analytics don't load on the client, you should already have A loaded and ready to render. It's literally just a matter of a try/catch, and you shouldn't be waiting to load this stuff on the client-side anyways if this is truly supposed to be the "Happy Path".
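To make the try/catch point concrete, a hedged sketch (the loader function is hypothetical, standing in for whatever fetches the A/B script): the default variant is always available, and the experiment only takes over if its code actually arrives.

```javascript
// Render variant "A" by default; only switch if the A/B script loads.
// loadAbScript is an assumed async loader that resolves to the
// experiment config, or rejects on timeout/failure.
async function pickVariant(loadAbScript) {
  try {
    const ab = await loadAbScript();   // may reject on a flaky connection
    return ab.variant ?? "A";          // experiment decides, when present
  } catch {
    return "A";                        // tunnel case: default experience
  }
}
```

With this shape, a loader that rejects (the train-in-a-tunnel case) still yields the working "A" page rather than a silently broken state.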
Yes, I know that legacy software like Google Tag Manager requires client-side integration, but I would argue that is an orthogonal concern. You don't need to use that for your A/B testing. It's pretty easy to integrate this stuff into SSR-- especially if you stream in the HTML. This is why cookies exist.
And, again, none of this changes the central concept of this comment thread: JavaScript is necessary for the modern web experience.
> chatbot, tracker, A/B testing logic etc
Literally none of those things are necessary for a working website. If your site breaks when your analytics don't load, then that's just horrible design at any scale.
Apparently you have no idea what you are talking about.
A normal person would immediately think "dang, page didn't load before I entered the tunnel. Guess I'll wait til I'm out again and refresh".
And if they're deliberately going somewhere where there's no signal for an extended period of time, and really want it to work, they'll ensure they've loaded everything before doing so.
And I say this as someone who is developing a PWA for people with low-end phones and very inconsistent connections. I'm very cognizant of and empathetic to their situation.
Anecdotal evidence does not beat statistics and user research. Bounce rate has an inverse correlation with loading speed. People with low intent do not refresh; they simply don't come back, and instead look elsewhere or just move on. Telling you this as someone who built their first commercial website in 1999 and was a hyperscaler B2C startup CTO. Let's not measure the length of credentials.
To clarify, you're saying we should be jumping through convoluted hoops - full page navigation + js to rewrite history, all so that you can avoid a very minimal amount of js to show/hide a nav menu - for low intent people who are frequently entering tunnels?
Something like Datastar would enable this with like two html attributes, and only require 10kb of js (and would also allow for endless other things via declarative html).
https://data-star.dev/reference/attributes#data-show
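A rough sketch of what that looks like, based on the linked Datastar attribute reference — treat the exact attribute spellings as approximate rather than authoritative:

```html
<div data-signals="{open: false}">
  <button data-on-click="$open = !$open" aria-label="Menu">☰</button>
  <nav data-show="$open">
    <a href="/">Home</a>
    <a href="/blog/">Blog</a>
  </nav>
</div>
```

A signal holds the open/closed state, the button flips it, and `data-show` toggles the nav's visibility — no round trip, no history rewriting.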
> To clarify, you're saying we should be jumping through convoluted hoops
Good that you asked. Did I say anything like that in my comment above?
Don't worry, it's only in the HN bubble that adding a 500ms round trip to open a damn menu is acceptable and commended.
> I'm just tired of being downvoted every single time I mention that JavaScript is necessary on the modern web
Downvotes should give you a hint: the few users who know what JavaScript is don't like it, and the rest, if they learned, most likely wouldn't like it either. Your attitude shows that you don't care.
My attitude shows that JavaScript is necessary for the modern web experience! No one has successfully argued against this yet-- nor have they even really tried! You're all just mad about my tone without even discussing the content as if this was a kindergarten class. This is absurd.
Let me get this right, you're saying that people on HackerNews don't know about JavaScript-- one of the most popular programming languages in the world?
Lots of important points already posted, from reliability over accessibility to SEO. To add a personal reason: speed. I hate slow computer things. If I open a page that I'm not forced to use for work (the way I am with Google Ads or LinkedIn or similar horror shows), and loading it takes longer than a second, I just bounce.
I built and maintain a static site for a company, and it's just wonderful. It opens instantly (which the search engines like), rebuilding and deploying is trivial, and I am not forced to update every week because the JS framework has a new vuln. (There are a few dynamic, JS-based parts that load lazily and fail gracefully.)
None of this refutes my post whatsoever. The amount of JavaScript to open a hamburger menu is literally around 500 bytes. It can easily gracefully degrade just like your last sentence said. You're going to load much, much more than that by following TFA. It loads an entire new HTML page! This argument just doesn't work here.
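For the record, roughly what that ~500 bytes looks like (element names are illustrative). The nav is a plain list of links by default, so it works with no JS at all; the script only adds the collapse behavior:

```html
<button id="nav-btn" hidden aria-expanded="true" aria-controls="nav">☰</button>
<nav id="nav">
  <a href="/">Home</a>
  <a href="/blog/">Blog</a>
</nav>

<script>
  const btn = document.getElementById('nav-btn');
  const nav = document.getElementById('nav');
  btn.hidden = false;   // reveal the toggle only once JS is running
  nav.hidden = true;    // collapse only once JS is running
  btn.addEventListener('click', () => {
    nav.hidden = !nav.hidden;
    btn.setAttribute('aria-expanded', String(!nav.hidden));
  });
</script>
```

If the script never loads, the links are simply always visible — graceful degradation in the other direction.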
There's nothing to refute in your post; you only told the audience that you don't get OP, and people tried to explain the appeal to you. What you do with that information is your thing.
None of what you said explained the appeal of OP's technique! Using JavaScript doesn't have to affect your reliability, accessibility, SEO, or any of the things you said. How am I supposed to do anything with useless information?
If anything, hiding the hamburger menu behind an HTTP response is *less accessible*!
i'd say it really depends on the website you're building and what you define as "using javascript"
are you comparing fully client rendering vs SSR without client JS at all? are you building a marketing site or a dashboard type tool?
there is subtlety to why each decision would be made, based on these.
i think HN tends to demonize JS because we're all well aware of how bad the state of a lot of web apps has become, with developers only knowing react and having no background in web fundamentals or accessibility.
i see no reason to completely shun JS, just use it intentionally and not reflexively.
Thank you so much for having one of the few measured takes in this thread. I'm not advocating for throwing React and NPM at every problem, but a little bit of hand-written JS is rarely an issue.
> when it's much easier to just build a nice website using JavaScript
I'm currently building a web-based tool that uses dynamic forms for UI, without the help of of a framework (yeah I know; I have reasons). This is the result: https://github.com/KaliedaRik/sc-filter-builder/blob/main/js...
It's not "easier" using Javascript; raw Javascript websites are a nightmare to build, maintain and reason about. It is "easier" with Javascript + current-favourite-framework-of-the-day.
Also: accessibility, SEO, the all-new Generative Engine Optimization (GEO) thing, etc.
Why wouldn't google work without javascript? Just use some forms, submit the text the user entered, get back a page with the search results...
OP is right. A quick google (hehe) shows they require it as of 2024.
But you don't need JS to make a search engine work. I'm pretty sure I've used links2 in the past to google things just fine. Google Search would work fine without JS, Google just wants to track people
Google is an advertising company. Not a search engine company.
They don't want your patronage if they can't track you and commodify you.