Hacker News
Ask HN: AI to Replace Compiled Languages?
1 point by exodys 18 hours ago | 5 comments
Consider the following scenario: Bob is making some compiled software that will run on a host machine. Bob knows which operating systems he wants to support, and knows the ISAs those operating systems support. Bob is also a huge vibe coder, and Bob learned that 'assembly is the best language you can write in, as you have the most control' (this is debatable, but it gives Bob his modus operandi).

So Bob builds his project, vibe-coding subroutine after subroutine until it is complete. Bob then realizes something: he didn't have to compile a single piece of code. He just had to assemble it.

Is this as far as AI can go? Presumably, it could spit out correct binaries directly if it were trained well enough.





I don't see why it's not possible.

In your example scenario, it sounds like Bob is vibe-coding a project in assembly language, and then using an existing assembler to create the final binary?

I have not tried it, but I would imagine that current LLMs are relatively weak at generating assembly language, just due to less thorough training compared to higher-level languages; even if so, that's surmountable.

As for what I think you are suggesting, having the LLM also do the assembly step: again, in theory, sure, but I would think just using the existing known-good assembler would be preferable to training an LLM to convert assembly language to binary. I'm not sure what you would gain in terms of either speed or overall productivity by having the LLM itself do the assembler step.
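Not anyone's actual code, just a toy illustration of why the known-good assembler wins: assembling is a deterministic table lookup, so identical source always yields identical bytes, with no sampling involved. (Hypothetical two-instruction subset; the byte values happen to be the real x86 encodings of nop and ret, used only for flavor.)

```python
# Toy "assembler" for a hypothetical two-instruction subset.
# The encodings are the real x86 bytes for nop/ret, used only for flavor.
OPCODES = {"nop": b"\x90", "ret": b"\xc3"}

def assemble(source: str) -> bytes:
    # Deterministic lookup: the same source always produces the same bytes.
    return b"".join(OPCODES[line.strip()]
                    for line in source.splitlines() if line.strip())

prog = "nop\nnop\nret\n"
assert assemble(prog) == assemble(prog)   # no sampling, fully reproducible
assert assemble(prog) == b"\x90\x90\xc3"
```

An LLM doing this step would be a sampled approximation of a lookup table we already have exactly.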


I didn't think there would be a gain in speed or productivity, just a (possibly ominous) idea of 'cutting out the middleman'. Granted, that middleman is very important.

Natural language is not the best language for formal descriptions of projects, but it is the one humans use most day to day. Perhaps a chain like this would be the start of on-demand programs.


This will essentially always be more expensive than generating code a compiler can use, because the compiler and intermediate representation rule out, ex ante, a vast logical space of binaries that will crash.
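A minimal sketch of that ex ante rejection, using a made-up mini-language rather than any real IR: the checker refuses to emit anything for a program that jumps to an undefined label, so that entire class of crashing binaries is ruled out before a single byte exists.

```python
# Hypothetical mini-language: "label X" defines a target, "jmp X" jumps to it.
def check_jumps(instrs):
    labels = {i.split()[1] for i in instrs if i.startswith("label")}
    for i in instrs:
        if i.startswith("jmp") and i.split()[1] not in labels:
            raise ValueError("undefined label: " + i.split()[1])

check_jumps(["label start", "jmp start"])   # well-formed, passes

try:
    check_jumps(["jmp nowhere"])            # rejected before any bytes are emitted
except ValueError as e:
    print(e)                                # undefined label: nowhere
```

A model emitting raw bytes has no such gate; every invalid output has to be caught after the fact, if at all.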

This idea is grounded in fantasy.

You understand that one wrong byte can mean the program crashes? Why use a tool that is fundamentally probabilistic to directly generate assembly, which must be exactly correct?
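To make that concrete with a sketch (the four magic bytes are the real ELF magic; everything else, including the loadable() check, is a simplified stand-in for what a real OS loader does): flipping a single bit in a single byte is enough to make a binary unloadable.

```python
# Real ELF files start with the magic bytes \x7fELF; the rest of this
# header and the loadable() check are simplified stand-ins for a real loader.
header = bytearray(b"\x7fELF" + b"\x02\x01\x01" + b"\x00" * 9)

def loadable(h: bytearray) -> bool:
    return bytes(h[:4]) == b"\x7fELF"

assert loadable(header)
header[0] ^= 0x01            # one flipped bit in one byte...
assert not loadable(header)  # ...and the loader rejects the whole binary
```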

Why?


You are right that this is an error-prone process, but these tools, as they are used now, are much like how compilers were used in the past. It is, after all, another layer of abstraction.

Is it foolproof? No. I do think the idea has some merit, though.



