Rust happens to be an extremely good tool. There are definitely situations where it absolutely sucks. For example, Zed is a heroic effort, but look at the code and you'll see that we still haven't figured out how to do Rust UIs.
We may disagree on the premise that humans are generally incapable of correct and safe manual memory management, but that's a degree of distrust I hold for myself. You may have never written a memory bug in your life, but I have, and that renders me completely incompetent.
If a project in an unsafe language has ever had a memory bug (I'm looking at you, Bun), the maintainers objectively have a track record of not being capable of manual memory management. You wouldn't put a person who has a track record of crashing buses at the wheel of a school bus.
And Rust isn't the only memory-safe language. You can turn to Java, Go, C#, Type/JavaScript, and a whole bunch of others. Rust just so happens to have OCaml tendencies and other things that make it a joy to read and write, so that's definitely preference on my part. One of these days I'll learn OCaml and possibly drop Rust :)
> You may have never written a memory bug in your life, but I have, and that renders me completely incompetent.
This feels overly binary. Memory management bugs are just one class of bug; there have been many other classes of bug leading to security issues or defects.
If you apply the standard "has ever written a bug" → "completely incompetent", you will have to stop using software, and, if you think about it, most other technology too.
Memory safety is a very useful trait for a language though, and as you say provided by a whole bunch of different languages nowadays
Even the statement that (100% safe) Rust does not have memory bugs/mutable aliasing is not always true.
It's well known that Rust has difficulty representing graph-like memory structures, and people have taken to using arrays of `Node`s to represent graphs, where each edge is stored as an index into the array rather than as a pointer to another node.
This is both compact and fast, but the approach sidesteps the borrow checker.
If you had a method that takes 2 mutable `Node` references as parameters, the borrow checker would complain if they pointed to the same struct. If you pass 2 indices instead, it won't.
Likewise, since liveness is tracked by user logic, you can refer to stale, deallocated `Node`s or ones that haven't been initialized yet.
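A minimal, hypothetical sketch of that pattern (my own example, not from any particular project): edges are plain `usize` indices into the node array, so the aliasing between them is invisible to the borrow checker.

```rust
#[derive(Debug)]
struct Node {
    edges: Vec<usize>, // indices into the nodes array, not references
}

// Takes two *indices*; the compiler cannot tell whether they alias.
// A signature taking (&mut Node, &mut Node) on the same node would be
// rejected at compile time; this self-loop is accepted without complaint.
fn add_edge(nodes: &mut [Node], from: usize, to: usize) {
    nodes[from].edges.push(to);
}

fn build_self_loop() -> Vec<Node> {
    let mut nodes = vec![Node { edges: vec![] }];
    add_edge(&mut nodes, 0, 0); // from == to: aliasing the checker never sees
    nodes
}

fn main() {
    let nodes = build_self_loop();
    assert_eq!(nodes[0].edges, vec![0]);
    // Nothing stops a stale index either: pushing `7` above would compile
    // fine even though nodes.len() == 1 -- liveness is on the user now.
}
```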
I've had people argue this is not a true memory bug, since you're not causing 'real' memory faults. But most of the time `malloc` in C is also just a function that hands you pointers into chunks of pre-allocated memory, without asking the OS for more.
I know from experience some people see this criticism as an attack on their favourite language and instantly rebuke it.
But I'd like to argue that there's something there: the fact that 'memory allocation existing outside Rust' and 'memory allocation existing inside Rust' behave differently is an interesting dichotomy that needs to be resolved, and that resolution might improve Rust's (or some successor language's) memory model.
The difference is the checking, and actual enforcement of it.
Go ahead and use `get_unchecked` if you want C-like behavior. But the safety note tells you the potential issues:
> Safety
> Calling this method with an out-of-bounds index is undefined behavior even if the resulting reference is not used.
> You can think of this like `.get(index).unwrap_unchecked()`. It's UB to call `.get_unchecked(len)`, even if you immediately convert to a pointer. And it's UB to call `.get_unchecked(..len + 1)`, `.get_unchecked(..=len)`, or similar.
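A minimal contrast of the two, using only the std slice API:

```rust
fn main() {
    let v = [10, 20, 30];

    // Checked access: out-of-bounds gives you None, never UB.
    assert_eq!(v.get(2), Some(&30));
    assert_eq!(v.get(3), None);

    // Unchecked access: you promise the index is in bounds and the
    // bounds check is elided. `v.get_unchecked(3)` here would be
    // undefined behavior, even if the reference were never used.
    let x = unsafe { *v.get_unchecked(1) };
    assert_eq!(x, 20);
}
```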
> what's the justification of not using a memory-safe language
Use Go, Java or Fil-C, and memory safety is achieved at the expense of runtime performance. Tracing garbage collectors make your programs run slower and use more RAM.
With Rust you pay with complexity. Rust has new, weird syntax (lifetimes, HRTBs, etc.) and invisible borrow-checker state that you've gotta understand and keep track of while programming. Rust is a painful language to learn, because lots of seemingly valid programs won't pass the borrow checker, and it takes a while to internalise those rules.
I personally think the headache of rust is worth it. But I can totally understand why people come to the opposite conclusion.
Rust's memory safety constructs do also impose a (much smaller) runtime performance penalty. Every `Arc`/`Rc` is a memory-safe abstraction with a runtime cost, since Rust has no way to prove cyclic reference graphs are safe at compile time.
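A small self-contained illustration of that limitation, using only `std::rc`: two `Rc`s pointing at each other keep both strong counts above zero, so neither is ever freed; the usual fix is to make one direction a `Weak`.

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

struct Cyclic {
    other: RefCell<Option<Rc<Cyclic>>>,
}

// Builds a two-node reference cycle and returns both strong counts.
// Each node is kept alive by the other, so dropping the local handles
// at the end of this function leaks both allocations.
fn leaky_cycle() -> (usize, usize) {
    let a = Rc::new(Cyclic { other: RefCell::new(None) });
    let b = Rc::new(Cyclic { other: RefCell::new(None) });
    *a.other.borrow_mut() = Some(Rc::clone(&b));
    *b.other.borrow_mut() = Some(Rc::clone(&a));
    (Rc::strong_count(&a), Rc::strong_count(&b))
}

fn main() {
    assert_eq!(leaky_cycle(), (2, 2)); // both > 1: the cycle will leak

    // The conventional fix: a Weak edge doesn't keep its target alive.
    let a = Rc::new(5);
    let w: Weak<i32> = Rc::downgrade(&a);
    drop(a);
    assert!(w.upgrade().is_none()); // target freed despite the Weak
}
```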
> It's a particularly bad one though because it always leads to UB, which means you can't say anything about what happens next.
This is also why memory safety is table-stakes when it comes to formal verification of the underlying program logic. You can't solve logic bugs (even where that's known to be feasible, such as for tightly self-contained, library-like features) without solving memory safety first.
Do note that with modern compilers it's surprisingly hard to accidentally write something that is always guaranteed to write to address 0. Because it is UB, an optimizing compiler is allowed to assume it doesn't happen. This can lead to seemingly crazy things, like a variable that is set to zero giving you something completely different when you deref through it. If a variable is first set to zero on all code paths, and complex logic then usually sets it to something else before it is dereferenced, the compiler is allowed to conclude that the path where it is accessed without first being reassigned never happens. It can then notice that the initial write of zero is dead, because the variable is never read before being set to something else, and eliminate it.
I assume this is a product of sufficiently advanced compilers. Other LLVM languages almost certainly suffer from this too, including Zig, Swift, and unsafe Rust.
> Rust happens to be an extremely good tool. There
Sir (or ma’am), you stole literally the line I came to write in the comments!
To anyone new picking up Rust, beware of shortcuts (unwrap() and expect() when used unwisely). They are fine for prototyping but will leave your app brittle, as it will panic whenever things do not go the expected way. So learn early on to handle all pathways in a way that works well for your users.
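A tiny sketch of what "handling all pathways" looks like in practice; `parse_port` is a hypothetical helper of my own, contrasting `?`/`Result` propagation with an `unwrap()` that would panic on bad input:

```rust
use std::num::ParseIntError;

// unwrap() version, for contrast -- panics on any bad input:
// fn port_or_panic(s: &str) -> u16 { s.parse().unwrap() }

// Propagating the error with `?` keeps the failure visible to callers
// instead of crashing the app.
fn parse_port(s: &str) -> Result<u16, ParseIntError> {
    let port: u16 = s.trim().parse()?;
    Ok(port)
}

fn main() {
    assert_eq!(parse_port("8080"), Ok(8080));
    assert!(parse_port("eight").is_err()); // no panic, just an Err to handle
}
```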
Also, if you're looking for a simpler experience (like Rust but less verbose), Swift is phenomenal. It does not have a tracing GC; it uses ARC automatically. I spent months building a layer on top of Rust that removed ownership and borrow considerations, only to realize Swift does it already, and really well! Swift also has a stable ABI, making it great for writing apps with compiled dynamic components such as plugins and extensions. Its cross-platform story is much better today, and you can expect similar performance on all OSes.
For me personally, this relegates Rust to single-threaded tasks, as I would happily take the 20% performance hit with Swift for the flexibility I get when multithreading. My threads can share mutable references without fighting the borrow checker; it's just a bad use case for Rust (one it was not designed for). Part of my work is performance-critical, so that often becomes a bottleneck for me, but it shouldn't be a problem for anyone else using RwLock<Arc<…>>. Anyway, they're both great languages, and for a CLI tool or utility you can't go wrong with either.
> If a project in an unsafe language has ever had a memory bug (I'm looking at you, Bun), the maintainers objectively have a track record of not being capable of manual memory management
That's an interesting way to navigate the world. Do you hold this attitude towards other professionals? For example, if a lawyer ever lost a case by misinterpreting a law, do they have a track record of not being capable of practicing law, and should they be disbarred?
There were (and most likely still are) memory bugs even in the Rust standard library[0]. By your logic, the standard library maintainers objectively can't handle unsafe blocks.
It's not really that interesting. For instance, we've seemingly decided that various blue collar workers are incapable of not falling to their deaths and so have come up with OSHA and various other national equivalents. Drivers are incapable of not crashing and so we started including air bags. Woodworkers seemingly can't stop cutting their fingers off using a table saw and so we came up with SawStop.
I've been writing Rust for half a decade now and I firmly believe that it's just not good for UI. Global state and a model that lends itself to inheritance just don't fit the language.
We had Delphi and VB thirty years ago and the native UIs were pretty good. The web brought a massive regression in UI programming, functionality and usability that we generally haven't recovered from yet. Not every app can be a web site.
Say what you want, but for a modern UI, localization and accessibility are part of the minimum feature set. Those two massively increase complexity (UTF-8, IME, Braille, text-to-speech, etc.).
The big issue I'm talking about is cross OS UI Toolkit. Great your UI supports IME on Windows and Mac. Now do that for Linux, BSD, etc.
I'm talking about accessibility and localization as part of a GUI toolkit.
Take localization. Any doofus with CSV and regex can localize a binary. How do you localize dynamic things like "Hi (Sir) PJMLP, you have (0 unread messages)"?[1]
In JS I can always use Intl.PluralRules. Where are my plural rules in, say, C#'s Avalonia (or whatever GUI is hot in C# these days, I can't keep track)?
The issue isn't a checklist; it's the quality of what's available. The complexity there is essential, because languages are complex beasts, and mitigations for disability are getting better and hence more complex[2].
[1] Why is localizing this a problem? Some languages are language neutral, some have honorific speech, which changes other words, and some have different ways of counting (Hello Welsh. Nice that you have ordinals for zero, one, two, few, many, and more), some have genders, some don't, some have but don't use it in some cases, ad infinitum.
[2] There is a huge Mariana Trench in the quality of accessibility for software that offers a magnifying glass and software that offers text-to-speech.
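The plural-rules problem from [1] can be sketched in a few lines. This is a hand-rolled illustration only (the category names are real CLDR ones, and the Welsh boundaries below follow CLDR's published cardinal rules for exact matches, simplified); real code should use a CLDR-backed localization library rather than a hand-written match.

```rust
#[derive(Debug, PartialEq)]
enum Plural { Zero, One, Two, Few, Many, Other }

// English collapses to two categories...
fn english(n: u64) -> Plural {
    if n == 1 { Plural::One } else { Plural::Other }
}

// ...while Welsh (per CLDR) uses all six, which is why
// "(0 unread messages)" can't be localized with a format
// string and an if/else.
fn welsh(n: u64) -> Plural {
    match n {
        0 => Plural::Zero,
        1 => Plural::One,
        2 => Plural::Two,
        3 => Plural::Few,
        6 => Plural::Many,
        _ => Plural::Other,
    }
}

fn main() {
    assert_eq!(english(0), Plural::Other); // "0 messages"
    assert_eq!(welsh(0), Plural::Zero);    // a distinct Welsh form
    assert_eq!(welsh(6), Plural::Many);
}
```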
Sure. Show me how Windows 3.* supported Unicode, i18n (internationalisation), l10n (localisation), a11y (accessibility), with special focus on CLDR plural rules.
Which will be interesting since Unicode 1.0 came around 1991. And CLDR in 2003. And Windows 3.x came in 1990.
I'm not saying it is impossible, just highly unlikely that MSFT implemented those as early as 3.x. They could have taken part in those early efforts.
If you are so keen in Windows 3.x, remember before standards each OS did its own thing, and I can gladly show you the proprietary APIs used by Windows 3.x.
> If a project in an unsafe language has ever had a memory bug (I'm looking at you, Bun), the maintainers objectively have a track record of not being capable of manual memory management. You wouldn't put a person who has a track record of crashing busses at the wheel of a school bus.
Hmm... A bug report from near a decade ago, where the bug was fixed within days. Not sure what your point is. If anything, it shows how much Rust cares about memory safety, because elsewhere it wouldn't be a compiler bug in the first place.
Being so absolutist is silly but their counter argument is very weak. Can I invalidate any memory-safe language by dredging up old bug reports? Java had a bug once; I guess it's over, everyone back to C. The argument is so thin it's hard to tell what they're trying to say.
It's just as reductive as the person they're replying to.
> Being so absolutist is silly but their counter argument is very weak.
The entire point is that being so absolutist is silly.
The comment reflects the previous poster's logic back at them so they (or others) can hopefully see how little sense it makes.
You seem to be trying to see some additional argument about rust being bad/invalid, but there isn't one... The reason that argument is, indeed, "very weak" and "so thin", as you say, is that it isn't even there at all.
It seems odd to me to put this much effort into misunderstanding what people are saying. You just end up talking past everyone, essentially talking to no one about nothing.
If it wasn't obvious from my ramble, my Rust concerns are pragmatic, not absolutist. The only absolutism is that for memory safety to be truly upheld, you can't half-ass it (Zig) or ignore it (C).
I've been experimenting with Qt and Rust (thanks to Claude Code, which drastically reduces the headache of Rust nuances for me; I'm not a Rust expert by any means).
I discovered cxx-qt, which is maintained by some Qt maintainers, who are all employed at KDAB. I had no idea KDAB or this project existed. It's been very smooth so far.
I can honestly say the barrier to building a GUI is very low with Claude, much to the dismay of others, but it beats me building an Electron app.
I believe when people talk about Rust UI, most people assume it's cross-platform. Developing an app just focused on Mac or Windows is a completely different problem. In fact, one could easily argue that you should never use Rust for those single platform apps.
> you'll see that we still haven't figured out how to do Rust UIs
This is really a symptom of the horrendous desktop GUI API landscape. I'd argue that Rust is syntactically actually very well suited to writing GUI applications - and better than most given the fearless concurrency you get with it.
macOS is committed to making sure that only developers using Mac hardware and Apple languages write GUIs; it feels deliberately combative toward the prospect of cross-platform applications.
Windows? How do you even make a Windows GUI these days? Win32? WinUI? WinForms? The examples on the Microsoft website don't even compile when using the supported languages.
Linux is pretty okay, if it weren't for Linux GUI programming being a mess of DEs. GTK-rs and Qt are standard, but you'll never have something that looks consistent in every environment.
The only hope is WASM, but the standards body is busy figuring out how to make web assembly replace docker containers or something. It's barely usable and I've been dying to rewrite all my web applications in Rust.
This is why Electron is everywhere, it's the best we can do without going insane.
Is there a difference between C++ and Java/Go/etc. if you enforce at code review that C++ use only automatic memory management (smart pointers, containers, etc.)? I guess the only difference would be that C++ can have the diamond problem, which is solved in a specific way but is relatively easy to spot with compilers. Otherwise...
Imo the strong point of rust is compile error if you try to use an obj after move (unlike c++ with undef behavior and I guess it should be the same for java/c#), or that you can't modify a container if you hold a ref/pointer to some of its elements/range which may cause invalidation in C++ case due to realloc
Yes there is. RAII is not a full replacement for GC and you will shoot yourself in the foot if you treat it as such. The design of C++ also includes many unpatchable holes in the standard library which WILL cause errors and UB.
You take a reference to a vector element, which you later accidentally invalidate by pushing to the same vector.
You move out of a unique_ptr, but then you accidentally use it again.
You create a cycle with shared_ptr causing a memory leak.
If you std::sort a list of floats with NaNs (among other things) you can stomp over memory. The comparator must impose a strict weak ordering; otherwise you get UB.
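For contrast, here is a sketch of how the first two of those footguns look from the Rust side; the commented-out lines are the ones the compiler rejects (example code of my own, not from any project mentioned here):

```rust
// C++ footgun 1: a reference into a vector invalidated by push_back.
fn push_after_borrow() -> usize {
    let mut v = vec![1, 2, 3];
    let first = &v[0];
    // v.push(4); // rejected: can't borrow `v` mutably while `first` is live
    let copy = *first; // copying the value out ends the shared borrow
    v.push(copy + 3);  // now mutation is fine; no dangling reference possible
    v.len()
}

// C++ footgun 2: use after move.
fn use_after_move() -> i32 {
    let a = Box::new(5);
    let b = a; // ownership moves to `b`
    // println!("{}", a); // rejected: use of moved value -- a compile error, not UB
    *b
}

fn main() {
    assert_eq!(push_after_borrow(), 4);
    assert_eq!(use_after_move(), 5);
}
```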
> Is there a difference between c++ and java/go/etc if you enforce at code review for C++ to use only auto memory management like smart ptrs, containers, etc?
Smart pointers and containers are nowhere near memory safe, just enforcing their use gets you nowhere. `std::vector::operator[](size_t)` doesn't check bounds, `std::unique_ptr::operator*()` doesn't check null.
> Imo the strong point of rust is compile error if you try to use an obj after move (unlike c++ with undef behavior
The state of a value after being moved is defined by the move constructor. It is unspecified by the spec, but it's generally not undefined behavior.
The hardened runtime improves things, but it's still a far cry from memory safety. For example `std::vector::erase` has no hardened precondition. And of course the rest of the language around the stdlib is still entirely unsafe, not just the C parts.
In theory, that will be taken care of with contracts and further revisions.
In practice, it depends on how the contracts drama unfolds.
However I do agree it is still not quite there.
Still, C++ isn't going anywhere anytime soon, so any improvement is welcomed, even rustc has to gain from it.
I don't expect any RIR for GCC and LLVM happening any time soon, not only due to existing code, also due to all universities, and companies that contribute to upstream and aren't racing to adopt Rust.
As I said, unique_ptr (like any of the others) does not check for null. So you can do this, and it will cause a segfault:
    std::unique_ptr<int> a;
    printf("%d\n", *a);
Similarly unique_ptr::operator[] doesn't (and can't) do bounds checking, for example:
    std::unique_ptr<int[]> a = std::make_unique<int[]>(2);
    printf("%d\n", a[2]);
There's also no attempt to limit the construction of invalid smart pointers, for example:
    int num;
    std::unique_ptr<int> a(&num);
Smart pointers simplify memory management, and they're slightly harder to misuse than regular pointers, but they simply make no attempt at providing memory safety.
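For comparison, the Rust equivalents of the null and out-of-bounds cases above, using only std types:

```rust
fn main() {
    // There is no null Box. "Maybe absent" must be spelled Option, and
    // the compiler forces you to handle None before any dereference.
    let maybe: Option<Box<i32>> = None;
    assert!(maybe.is_none());
    assert_eq!(maybe.map(|b| *b), None);

    // Heap slices are bounds-checked: out of range is a deterministic
    // panic via `a[2]`, or a None via `.get(2)` -- never a silent read
    // past the allocation.
    let a: Box<[i32]> = vec![1, 2].into_boxed_slice();
    assert_eq!(a.get(2), None);
    assert_eq!(a[1], 2);
}
```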
Yes, because code review isn't common; in most companies it is at the same level as writing documentation or unit tests.
Unless there is some DevOps freedom to at least put something like Sonar or clang-tidy on the build pipeline, breaking PRs that don't play by the rules, and even then you cannot prevent everything via static analysis rules.
I think it's (mostly) sufficient to have a regex on the git change-set for "new", "malloc", "calloc" keywords to catch most of such stuff, if you have such a policy.
Documentation / UT are harder to define (what is good documentation? is UT covering everything?), but usage of manual memory handling can be spotted relatively easily and automatically. There can be some exceptions for 3rd-party lib interaction if it's absolutely necessary, but detecting such occurrences and keeping track of them is relatively easy.
See, already there you missed all the C language constructs that C++ is copy-paste compatible with, and should only be used inside unsafe code blocks.
Which in C++ good practices means type safe abstractions not exposing any kind of C style strings, arrays, casts, pointer arithmetic,....
Unfortunately still relatively rare; some of us who were in the C++ Striking Force in the 1990's Usenet flamewars already advocated for such practices, most of them already possible with C++ARM, no need for modern, post-modern, rococo, baroque, or whatever C++ style is going on with C++26 now.
Zig would be an interesting contender back in the 1990's between Object Pascal and Modula-2, nowadays we know better.
For me, while Go is definitely better than Oberon(-2) and Oberon-07, some of its design decisions are kind of meh; still, I will advocate for it in certain contexts, see the TinyGo and TamaGo efforts.
As old ML fanboy, you can find such tendencies on plenty of languages not only OCaml. :)
I see Rust as a great way to have made affine types more mainstream; however, I'd rather see the mix of automatic resource management + strong type systems as the better way forward.
Which is even being acknowledged by Rust's steering group, see Roadmap 2026 proposals.
Sora is already a flop. People are sick of slop and are getting good at identifying it. Grok is the only player that has any semblance of success in the visual gen market, only because they do the one thing that will always make money.
I'm a Rust fanboy, but I conceded to Go a long time ago as the ideal language to write MCPs in. I know Rust can do a musl build, but the fact that it's the de facto default in Go goes a long way.
Back to the article: I've written a few MCPs, and the fact that MCP uses JSON is incredibly unfortunate. In one recent project (not an MCP) I cut the token count (not character count) of truly unavoidable context to ~60% just by reformatting it as markdown.
> I need to get my car washed; should I drive or walk to the car wash that is 100m away?
> Walking 100 m is generally faster, cheaper, and better for the environment than driving such a short distance. If you have a car that’s already running and you don’t mind a few extra seconds, walking also avoids the hassle of finding parking or worrying about traffic.
This is awesome news, and maybe a leap forward to tide us over, but solid-state batteries are also usually hyped for safety reasons: puncturing one with a nail won't result in a fire (or some other accident). The article doesn't mention that, only that they've mitigated safety concerns from dendrites.
You should be comparing to AMD, at that. There's hope for the upcoming Intel chips, but anything current isn't competitive. Furthermore, M* is good for a specific form factor, but don't for a second suggest that 4 P-cores will outdo the 16 hyper-threaded cores on a 9950X.
Two weeks with Rust and you're still fighting with the compiler. I think the LLM pulled a lot of weight selling the language, it can help smooth over the tricky bits.
idk man it's rare to fight the compiler once you've used Rust for long enough unless you're doing something that's the slightest bit complex with async.
You get so good at schmoozing the compiler that you start to create actual logic bugs faster.
That goes for almost every language. I recall my first couple of weeks with various compiled language and they all had their 'wtf?' moments when a tiny mistake in the input generated reams of output. But once you get past that point you simply don't make those mistakes anymore. Try missing a '.' in a COBOL program and see what happens. Make sure there is enough paper in the box under LPT1...
What people get wrong is that you don't just trip balls and get cured. Re-integration therapy is vital for lasting effects. Grabbing some shrooms and digging in is recreation, which is perfectly fine, but don't fool yourself or anyone else by suggesting it's for treatment.
I was a depressed teenager a long time ago and I am almost certain mushrooms made things worse.
I didn't need mushrooms. I needed therapy, friends, a social life, a sex life, goals, something to look forward to in the real world.
All I found on mushrooms at the time were horrible existential loops that just made things more hopeless. I would read about people having these peak wonderful experiences or McKenna alien experiences and just get more depressed that even the mushrooms didn't help me.
It is almost blasphemous in this space to say what actually ended up changing my life were SSRIs. A little prozac fixed something that was just chemically wrong in my head.
What seems obvious is there is enormous variability in people's brain chemistry so the tool to fix the problem has to be quite specific for the individual.
Yes. For example, IV Ketamine can yield not only immediate relief in a chemical sense, the treatment itself results in a fully-aware, balls-tripping, metaphor and symbolism-filled, time and space-warping experience in an entirely fictional space. With thoughtful guidance prior-to and after each experience, a series of them can, for example repeat a message until you "get it," or each may deliver a component of a profoundly larger message when they are combined, weeks later. What you do with it all will determine what you get from it.
I think there's simply so much value in being able to see the same thing in so many different perspectives that you never have considered possible at all in your life before.
This is particularly true of a deep psychedelic experience "inside" with IV Ketamine.
Your own internal processing will still determine how you perceive a perspective change, but specific to this idea in particular, you may for example, within, suddenly find it obvious to think of things as being made of something different than in the outside world reality (and this sort of "change of bases" may reveal some kind of truth not otherwise visible.) You may see something as formed of language instead of molecules and atoms, or vice-versa.
That's not exactly it. Light gets redshifted instead of slowing down, because light will be measured to be the same speed in all frames of reference. So even though we can't actually observe it yet, light traveling towards us still moves at c.
It's a different story entirely for matter. Causal and reachable are two different things.
Regardless, such extreme redshifting would make communication virtually impossible - but maybe the folks at Blargon 5 have that figured out.