The Internet was originally designed to survive a nuclear war. Now we've deliberately downgraded it so that it can't survive a football game.
Decentralised infrastructure: good
Centralised infrastructure: bad
Good and bad for you, of course. For the big companies selling and controlling this stuff, it's vice versa.
Just stay alert and don't chain yourself to big-tech dependencies. The reason Git is great is its decentralised nature. If you've come that far, why cripple yourself by running your traffic through a single American company like Cloudflare?
Allowing scripting on websites (in the mid-90s) was a completely wrong decision. And an outrage. Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust. That’s completely unacceptable; it’s fundamentally flawed.
Of course, you can disable scripts in the browser. But some sites are so broken that they no longer work without them, because the developers apparently assume people only view their pages with JavaScript enabled.
It would have been so much better if we had simply decided back in the ’90s that executable programs and HTML don’t belong together. The world would be so much better today.
> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust
It would've been cool if we could know whether site X served the same JS as before. Something like a system (maybe even decentralized) where people could upload hashes of a site's JS files. Someone could even review them and post their opinions. But mainly you'd know you're getting the same JS as before: that the site hasn't been hacked and that you're not being targeted personally. If a file needs to update, the site could note in a changelog something like "updated the JS file used for collapsing comments to fix a bug", and users could push that to the system.
Especially important for banking sites and webmail.
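Browsers already have a page-author version of this in Subresource Integrity (the `integrity` attribute on script tags); the idea above essentially extends it with independent, third-party observation. The verification step itself is easy to sketch. A minimal TypeScript example, where SCRIPT_URL and KNOWN_HASH are made-up placeholders and the expected hash would come from the shared database:

```typescript
// Minimal sketch of the verification step. SCRIPT_URL and KNOWN_HASH are
// hypothetical placeholders; in the described system the expected hash
// would be fetched from the shared (possibly decentralised) hash database.
const SCRIPT_URL = "https://example.com/static/app.js";
const KNOWN_HASH = "<previously recorded SHA-256 hex digest>";

async function sha256Hex(data: ArrayBuffer): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", data);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

async function checkScript(): Promise<void> {
  const body = await (await fetch(SCRIPT_URL)).arrayBuffer();
  const actual = await sha256Hex(body);
  if (actual !== KNOWN_HASH) {
    console.warn(`JS changed: got ${actual}, expected ${KNOWN_HASH}`);
  }
}
```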
Stepping back, it's pretty ridiculous that I need to download executable code, often bloated, solely to view read-only content. Just render the thing on the backend and send it to the client.
For me personally, the most infuriating example of this is the Azure Updates[1] page, which my job requires me to check nearly daily to see what's reaching EoL, what's new, and so on.
A couple of years ago they redeveloped it as a single-page application (SPA).
The original server-rendered version of it worked just fine, but it "had" to be made into an interactive client-side monstrosity that loads many times slower for "reasons".
It doesn't even load successfully about a quarter of the time. It shows items in reverse order (entries from 2013 first), which is some sort of async loading bug. They will never fix this. It's been there for two years already, it'll be there for a decade more, mark my words.
On a poor connection, it sometimes takes about a minute to load.
The links are JavaScript handlers rather than real links, so "open in new tab" doesn't work.
Etc...
All of this to enable client-side filtering, which is a non-feature nobody ever wanted. A simple server-side filter would do the same thing, faster; see the sketch at the end of this comment.
And anyway, the filtering is broken! If I click the "New or updated" filter, it drops down an empty selection with no options. Clicking anything else doesn't change what is shown!
While developing this over-engineered monstrosity, they took the original site offline for "maintenance"!
Hilariously, despite Azure having multiple CDN products, the Azure Updates page doesn't correctly use their own CDN and marks almost everything as "no-cache, no-store", causing 2.5 MB (after compression) to be re-transferred every time, despite the assets using unique signed URLs with SHA-256 hashes in them!
This is the state of web dev in the 2020s: a multi-trillion-dollar software company can't hire developers who know anything other than SPA development!
This commonly used page has spectacularly poor web engineering, and this is from a company that sells a web app platform, a CDN, and the ASP.NET web app development framework!
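For contrast, here is roughly how little code the server-side filtering alternative takes. A sketch in TypeScript on Node's built-in http module; the update list and the `status` query parameter are invented stand-ins, not Azure's actual data model:

```typescript
// Sketch of server-side filtering: the server applies the filter and sends
// back plain HTML, so the client needs no JavaScript at all.
import { createServer } from "node:http";

// Invented sample data standing in for the real update feed.
const updates = [
  { title: "Service A retirement announced", status: "retirement" },
  { title: "Service B now generally available", status: "new" },
];

createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  const status = url.searchParams.get("status"); // e.g. /?status=new
  const rows = updates
    .filter((u) => !status || u.status === status)
    .map((u) => `<li>${u.title}</li>`)
    .join("\n");
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end(`<ul>\n${rows}\n</ul>`);
}).listen(8080);
```

And because the filters are plain URLs like /?status=new, "open in new tab" works for free.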
My favorite part of that site, besides it loading incredibly fast, is that even though it carries an ad (for a wholly owned subsidiary), the ad is hard-coded in the HTML.
> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust.
JavaScript and WebAssembly programs are always executed in a sandboxed VM, without read access to the host OS files (unless, of course, you grant it).
Enabling scripting was a necessary step for interactive websites. Without it, a full page load would be required every time you upvote a Hacker News comment. In my opinion, the real problem is that browsers allow too many connections to third-party domains, which are mostly ads and trackers. Those should require user-approved permissions instead of being the default.
The Triptych Proposals [1] cover a lot of common use cases for submitting information to a server and updating part of a page. Something like that should have been possible to implement early in web history (I perceive some similarity to frames).
Modern CSS (and some newer HTML features) also reduces the need for scripting.
I very much doubt that "Enabling scripting was a *necessary* step for interactive websites" (emphasis added). It may well have been the most convenient and fastest way to get the functionality to the most users: with JavaScript, each website could provide functionality without waiting for it to be implemented by all browsers.
However distribution of power also leads to more complex trust relationships (even if one is confident that sandboxing is effective). Independent implementation also leads to more complexity overall.
In the world we have now, limiting XMLHttpRequest and Fetch to the same host as the current page would be great. But if that had always been the limitation, I fear that the adware peddlers would have just gotten proficient at shipping PHP packages/extensions that you could run on the same server as your site, and we'd be in largely the same situation, except that blocking the stuff would be harder than it is for us today.
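The restriction itself is almost trivial to express; the hard part would have been ever shipping it as the default. A sketch of what an opt-in same-origin-only wrapper could look like (`guardedFetch` is an invented name, not a real browser API):

```typescript
// Hypothetical same-origin-only wrapper illustrating the restriction
// described above. `guardedFetch` is a made-up name, not a browser API.
async function guardedFetch(input: string, init?: RequestInit): Promise<Response> {
  const target = new URL(input, location.href);
  if (target.origin !== location.origin) {
    throw new Error(`blocked cross-origin request to ${target.origin}`);
  }
  return fetch(target, init);
}

// guardedFetch("/vote?id=42")                  -> allowed (same origin)
// guardedFetch("https://tracker.example/p.js") -> throws before any bytes move
```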
Why can't MY browser send some random JS to THEIR website? If it's safe for me to run some stranger's code, shouldn't it be safe for strangers to run my code?
There is obviously huge demand for scripting on websites. There is no single authority on what gets allowed on the web; if the existing orgs hadn't implemented it, someone else would have, and users would have moved over when they saw they could access new, more capable, interactive pages.
The 49MB webpage just shows what our priorities are. It shows the target audience has fast internet that can load this without issues. On my average home connection in Australia, I can download a 49MB page in 0.3 seconds. We spend time optimising for what matters to the end user.
Disable not just JavaScript, but also CSS. I'm not kidding. Many websites actually contain all the content in HTML but use CSS to hide and then use JavaScript to show it again.
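If disabling CSS entirely is too drastic, a rough console snippet can undo the most common variant of that pattern, assuming the content is hidden with inline `display: none` (stylesheet-based hiding would need more work):

```typescript
// Rough sketch: force-show elements that were hidden inline with CSS and
// would normally only be revealed again by the site's JavaScript.
document
  .querySelectorAll<HTMLElement>('[style*="display:none"], [style*="display: none"]')
  .forEach((el) => {
    el.style.display = "";
  });
```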
We should decouple the publishing of papers from academic careers completely.
Papers should then no longer generate reputation or money for their authors. To achieve that, we must anonymize the authors.
All scientists get some (paid) time to write papers, if they want. What they write, and whether they publish it, is not known to anybody. They are trusted to write something of value in that time.
Universities can come up with other ways of judging which professors to hire: interviews, teaching demonstrations, or a non-public application essay describing the candidate's past research and discoveries.
The value, to society, to your field, and to your institution, of being a scholar is to create new knowledge. New knowledge has no value unless you disseminate it, that is, publish it.
Another necessity is the public (usually within the field) examination of the knowledge, including discussion and debate. Knowledge is merely embryonic without those things: undeveloped, not at all reliable. That is difficult without the author being able to respond. And others want to expand and build on the work, which often benefits greatly from contacting the author.
In the modern (post-positivist?) approach to science, the world accepts that a paper is written by a human who has a perspective and, despite their best intentions, biases. You can't evaluate any knowledge without knowing its source, in science or elsewhere. The first element of a citation is the author, not the title or journal (though I don't know why that came to be historically).
And the latter is a reason any LLM author should be identified.
In a normal and sane world, a scientist is a nerd about their field. They are highly interested in new thoughts and insights. When a new paper in their field is published, they try hard to find the time to read it. The reason is: every paper is written by enthusiasts who want to add something of value, new insights, to the discussion. Proving or disproving theories, adding puzzle pieces to the general picture.
That is the normal situation, which is the foundation of the progression of civilisation.
But some people have installed incentive systems that sabotage this. They are sabotaging civilisation itself.
Bad grammar is disrespect.
Underlings have to swallow that disrespect. It is just a power game.
The next level is simply to insult everyone, and everyone will still remain submissive.
And if you insult people, and get rewarded by submission, one reaction is to amp up the insults.
After all, you don't know the limits of your power until someone quits. So abuse people, exhibit outlandish public behavior, say racist or otherwise objectionable things...every person who remains on your payroll is a sign of how powerful you are.
This is not a common tactic, but it's a highly visible tactic, and it's not hard to find some notable examples out there right now.