Hacker News | s20n's comments

I'm sure this is really smart but boy is this a pain to read. I even tried holding the orbs in hopes of reading the text but it kept reflowing so much that I gave up after 5 minutes.

Edit: I just realized that clicking once freezes the orbs.


I was clicking and clicking hoping for the reflow madness to stop.

Thanks for this tip -- BTW we need to click _each_ orb.

Occasionally a previously untamed orb will start making its presence known until it is stopped. OK, I think I have been able to catch every orb now. Now on to reading about the future of text layout.


Structure and Computer of Interpretation Programs

> I’ve been using Copilot - and more recently Claude - as a sort of “spicy autocomplete” and occasional debugging assistant for some time, but any time I try to get it to do anything remotely clever, it completely shits the bed.

This seems like a really disingenuous statement. If Claude can write an entire C compiler that is able to compile the Linux kernel, I think it has already surpassed an unimaginable threshold for "cleverness".


You mean the one that can’t compile hello world?


But the article says "our human ancestors", which implies they are not talking about other hominins.

Edit: Okay, I just found that "human" can also refer to other hominids.

from: https://www.merriam-webster.com/dictionary/human

- a bipedal primate mammal (Homo sapiens) : a person

- broadly : hominid


Of course, just as the highlight tool is used for redaction, wingdings is used for encryption!


Not Cyrillic?


I used i3 for the longest time and I'd say a Wayland-based alternative like sway or miracle is a better choice nowadays. Even KDE Plasma recently dropped X11 support [1], so going forward most apps will target Wayland first.

Migrating my i3 config to sway hardly took any effort. I was also able to get rid of a lot of Xorg-specific configuration from various X11 dotfiles and put it directly in the sway config (such as natural scrolling).
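For example, natural scrolling went from an xorg.conf.d snippet to a couple of lines in the sway config. A minimal sketch (the `type:touchpad` matcher is one way to do it; exact device identifiers can be listed with `swaymsg -t get_inputs`):

```
# ~/.config/sway/config -- replaces the old xorg.conf.d InputClass snippet
input type:touchpad {
    natural_scroll enabled
}
```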

[1]: https://itsfoss.com/news/kde-plasma-to-drop-x11-support/


I've been using Emacs 30 on my Android tablet for a few months now with a Bluetooth keyboard. Needless to say, you can't really leverage eglot, so it's basically a no-go for any meaningful software development. I've been using it for org-mode, and it is fantastic for that.


Not to criticize you - I also use eglot and it's great - but let me mention that people have been doing pretty meaningful software development for several decades now, and LSPs are, I don't know, 5 years old?

There's a saying in my language, "the appetite grows while you eat"...


I think it's a fair complaint. You're on a setup with bad ergonomics as it is (tablet + Bluetooth keyboard). Dealing with that and no LSP is rough. I'd be happy writing code on a desktop without an LSP, though I'd be happiest with both.


I did my share of coding on a Commodore 64 (have you seen that keyboard?) with a cassette tape as the only external storage, no debugger (just a very poor BASIC variant), and (of course) a mono CRT TV set as a monitor. No internet, of course, just a few books/magazines.

Kids these days... ;-)


I think the C64 had a fine keyboard? It's mostly a standard layout and a lot chunkier than the small Bluetooth keyboards that tend to cause wrist issues. I also began coding in the CRT days so idk why that would be a barrier, I guess you mean for resolution? My issues are ergonomic not functionality oriented.


Is there an Android app that does Waypipe or wprs to forward a remote Emacs (with eglot/LSP) to your Android tablet?


If you've got it installed as suggested in the article, with its own termux installation, can't you compile the LSPs there and use them with eglot?


what is preventing you from using eglot on android?


The F-Droid build of Emacs runs on stock Android, which doesn't have a real Linux environment that you can install arbitrary binaries onto. You can switch to a Termux-ish proot environment and do X forwarding or TUI Emacs, but those are shenanigans.
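If you do go the shenanigans route, the usual sketch (assuming Termux's proot-distro package, which is what I've seen people use) looks something like this:

```shell
# Inside Termux, not the F-Droid Emacs itself
pkg install proot-distro
proot-distro install debian   # any supported distro works
proot-distro login debian     # now you can apt install language servers
```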


Not having gpg-agent is a huge deal breaker for me. I feel gpg-agent doesn't get enough love. Not only can it do all the ssh-agent operations, it can also be used with gpgme-json[1] to do web authentication with your [A] key. It's truly a shame that hardly any applications leverage the powerful cryptography afforded by GPG.

[1]: https://manpages.debian.org/trixie/gpgme-json/gpgme-json.1.e...


I knew about gpgme-json, but I didn't know you could do web auth with that. I thought the use case was mainly Mailvelope. How does that work?


I want to know as well. I just read the gpgme-json page posted, but it doesn't include anything about WebAuthn (aka passkeys).

Can you use GPG-agent for non-resident passkey challenges?

I also have Yubikey setup, but haven't thought of this.


> Not only can it do all the ssh-agent operations

It cannot. It doesn't work with PKCS#11 PIV. In general, GPG's behavior with SmartCards is idiotic and interferes with many other applications.

It's good that people don't use GPG more often and I can just purge it from my systems.


What do you mean? I use GPG with SSH (or SSH with GPG) all the time, and I need gpg-agent for that. GPG's agent replaces ssh-agent and serves SSH keys derived from your GPG key.

Can you do this with Age? If not, then I am going to stick to GPG.
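For anyone curious, the gpg-agent side is wired up roughly like this (a sketch of the standard setup; keygrip values come from `gpg -K --with-keygrip`):

```shell
# ~/.gnupg/gpg-agent.conf
enable-ssh-support

# In your shell profile: point SSH tools at gpg-agent's socket
unset SSH_AGENT_PID
export SSH_AUTH_SOCK="$(gpgconf --list-dirs agent-ssh-socket)"

# Then list the keygrip of your authentication [A] subkey in
# ~/.gnupg/sshcontrol so the agent serves it to SSH
```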


I'm unsure what was unclear. It simply does not provide PIV support and it interferes with other software that wants to utilise SmartCards.

Can Age interfere with all SmartCard usage? No clue.


Oh well, let us just agree that comparing Age to GPG is silly, ergo "Switching from GPG to Age" is silly, unless it is "Switching from GPG to Age for file encryption".

Age doesn't do signing, key infrastructure, or email. Minisign/signify only sign. None are GPG replacements. They're partial feature subsets that are simpler because they do less.

So, to summarize these tools:

- Age: Only does file encryption, no signing, no key management infrastructure, no email integration

- Minisign/Signify: Only signing, no encryption

- GPG: Encryption, signing, key management, email integration, multiple recipients, subkeys, revocation certificates, web of trust (even if unused), smart card support, etc.

You cannot simply switch from GPG to Age unless you are only doing file encryption. If that is the case, then sure, you can.


One distinction is that compilers generally translate from a higher-level language to a lower-level language, whereas transpilers translate between two languages that are very close in abstraction level. For example, a program that translated x86 assembly to RISC-V assembly would be considered a transpiler.


The article we are discussing has "Transpilers Target the Same Level of Abstraction" as "Lie #3", and it clearly explains why that is not true of the programs most commonly described as "transpilers". (Also, I've never heard anyone call a cross-assembler a "transpiler".)


I don't really agree with their argument, though. Pretty much all the features that Babel deals with are syntax sugar, in the sense that if they didn't exist, you could largely emulate them at runtime by writing a bit more code or using a library. The sugar adds a layer of abstraction, but it's a very thin layer, enough that most JavaScript developers could compile (or transpile) the sugar away in their head.
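To make the "thin layer" point concrete, here is roughly what desugaring optional chaining looks like (desugared by hand here, but it's the same shape of output a tool like Babel produces):

```javascript
// Sugar: optional chaining (ES2020)
const user = { profile: { name: "Ada" } };
const sugared = user?.profile?.name;

// Desugared by hand, the way a transpiler would spell it out
const desugared =
  user === null || user === undefined
    ? undefined
    : user.profile === null || user.profile === undefined
      ? undefined
      : user.profile.name;

console.log(sugared, desugared); // Ada Ada
```

Any JavaScript developer could do this rewrite in their head, which is what makes the sugar feel like such a thin layer.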

On the other hand, C to Assembly is not such a thin layer of abstraction. Even the parts that seem relatively simple can change massively as soon as an optimisation pass is involved. There is a very clear difference in abstraction layer going on here.

I'll give you that these definitions are fuzzy. Nim uses a source-to-source compiler, and the difference in abstraction between Nim and C certainly feels a lot smaller than the difference between C and Assembly. But the C that Nim generates is, as I understand it, very low-level, and behaves a lot closer to assembly, so maybe in practice the difference in abstraction is greater than it initially seems? I don't think there's a lot of value in trying to make a hard-and-fast set of rules here.

However, it's clear that there is a certain subset of compilers that aim to do source-to-source desugaring transformations, and that this subset of compilers have certain similarities and requirements that mean it makes sense to group them together in some way. And to do that, we have the term "transpiler".


Abstraction layers are close to the truth, but I think it's just slightly off. It comes down to the fact that transpilers are considered source-to-source compilers, but one man's intermediate code is another man's source code. If you logically consider neither the input nor the output to be "source code", then you might not consider it to be a transpiler, for the same reasons that an assembler is rarely called a compiler even though assemblers can have compiler-like features: consider LLVM IR, for example. This is why a cross-assembler is not often referred to as a transpiler.

Of course, terminology is often tricky: the term "recompiler" is often used for this sort of thing, even though neither the input nor the output is generally considered "source code", probably because such tools are designed to construct a result as similar as possible to what you would get if you were able to recompile the source code for another target. This seems to contrast fairly well with "decompiler", as a recompiler may perform similar reconstructive analysis to a decompiler but ultimately outputs more object code. Not that I am an authority on anything here, but I think these terms ultimately do make sense and reconcile with each other.

When people say "Same Level of Abstraction", I think what they are expressing is that they believe the input and output programming languages are of a similar level of expressiveness, though it isn't always exact, and the example of compiling down constructs like async/await shows how this isn't always cut and dried. That doesn't imply, though, that source-to-source translations are necessarily trivial: a transpiler that tries to compile Go code to Python would have to deal with non-trivial transformations even though Python is arguably a higher level of abstraction and expressiveness, not lower. The issue isn't necessarily the abstraction level or expressiveness, it's just an impedance mismatch between the source language and the destination language. It also doesn't mean that the resulting code is readable or not readable, only that the code isn't considered low-level enough to be bytecode or "object code".

You can easily see how there is some subjectivity here, but usually things fall far enough away from the gray area that there isn't much need to worry about this. If you can decompile Java bytecode and .NET IL back to nearly full-fidelity source code, does that call into question whether they're "compilers" or whether the bytecode is really object code? I think in those cases it gets close, and more specific factors start to play into the semantics. To me this is nothing unusual with terminology and semantics; they often get a lot more detailed as you zoom in, which becomes necessary when you get close to boundaries. And that makes it easier to just apply a tautological definition in some cases: for Java and .NET, we can say their bytecode is object code because that's what the developers already consider it to be. Not as satisfying, but a useful shortcut: if we are already willing to accept this in other contexts, there's not necessarily a good reason to question it now.

And to come full circle, most compilers are not considered transpilers, IMO, because their output is considered to be object code or intermediate code rather than source code. And again, the distinction is not exact, because the intermediate code is also Turing-complete, also has a human-readable representation, and people can and do write code in assembly. But Brainfuck is also Turing-complete, and that doesn't mean that Brainfuck and C are similarly expressive.


> Lie #3: Transpilers Target the Same Level of Abstraction

> This is pretty much the same as (2). The input and output languages have the syntax of JavaScript but the fact that compiling one feature requires a whole program transformation gives away the fact that these are not the same language

It is not really the same as (2); you can't cherry-pick the example of Babel and generalise it to every transpiler ever. There are several transpilers which translate from one high-level language to another, such as Kotlin to Swift, i.e. targeting the same level of abstraction.

I wonder what this person would say about macro expansion in Scheme; maybe that should also be considered a compiler as per their definition.


BabelJS is the central example of "transpilers"; if BabelJS lacks some purported defining attribute of "transpilers", that definition is unsalvageable, even if there are other programs commonly called "transpilers" that do have that attribute.

