I feel the same way, but I think my feelings would change if I didn't actually think the person was good enough to deserve having their writing immortalized, as in this case. Of course, we only have his side, but the GP doesn't seem to think his dad was a good person, and the dad wrote some hurtful things in the diary about someone they cared about, which I feel is justification for their actions.
>Knowledge distillation works like this: you take a large model, have it perform tasks with detailed reasoning, then feed those reasoning traces to a smaller model until the student learns to mimic the teacher. The smaller model ends up far more capable than if you’d trained it from scratch on the same data. Apple can now do this with the full Gemini, not just their own in-house models, and the distilled output runs locally. No internet required.
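To make the mechanic above concrete, here is a minimal sketch of the soft-target objective commonly used in distillation: the student is trained to match the teacher's temperature-softened output distribution rather than raw labels. All names and numbers here are illustrative, not anything from Apple's or Google's actual setup.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about wrong-but-plausible answers.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on softened distributions; minimizing this
    # drives the student to mimic the teacher's full output distribution.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Illustrative logits: a student whose outputs resemble the teacher's
# incurs a much smaller distillation loss than one that diverges.
teacher = [4.0, 1.0, 0.5]
student_far = [0.1, 2.0, 1.0]
student_close = [3.5, 1.2, 0.4]
print(distillation_loss(student_far, teacher) >
      distillation_loss(student_close, teacher))  # True
```

In full training pipelines this term is usually mixed with an ordinary cross-entropy loss on hard labels, and for LLMs the "soft targets" are the teacher's token distributions over its reasoning traces.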
No freaking way. AI companies see this as tantamount to pirating their models. There is no way Google is not explicitly banning this in the agreement that allows Apple to use their models.
Yeah, it's weird they even included that. It reads like a psych shelf exam question testing whether you know the connection between marijuana use and acute psychosis. But still, it is difficult to completely rule out the AI as a possible catalyst for it.
Yes, and the natural extension is that a lot of what people do day to day is not driven by intelligence; it is just reusing a known solution to a presented problem in a bespoke manner. And that is exactly what AI excels at.
> This is focused product development and craftsmanship which is very different from Vibe coding something. So let this be a reminder to all the "I can vibe code this or that in a weekend". Good products / experiences take time.
How do you know? There is no public git repo whose history one can inspect; he could have coded this in one weekend and spent the rest of the time on noncoding activities. He could also have built the entire thing by prompting, without any hands-on coding at all. The fact that it is a web app on a SaaS platform (the thing LLM-assisted coding is best at) doesn't inspire confidence.
I signed up and gave the product a spin, and it's clear that it's not some vibe-coded weekend project. Clearly a lot of effort has gone into it, and the OP was also clear that they've spent 10 months on this.
I hate it when people use commentary articles as fake sources for their points. It's even more aggravating when the "journalists" are making points that play to the ignorance and outrage of the reader, since they know those readers are the easiest to bait for clicks and mislead. For instance, how does Anthropic claiming that its total revenue since January 2025 is $5 billion contradict the claim that its expected run-rate revenue for 2026 is $19 billion?
> how does Anthropic claiming that its total revenue since January 2025 is $5 billion contradict the claim that its expected run-rate revenue for 2026 is $19 billion?
Isn’t the “exceeding $5BN” comment lifetime revenue? … on $30BN raised (edit: previously said spent), or something ridiculous.
A lot of the commentary on the frontier model companies compares how much money they’ve spent to the relatively small amount they’ve made in return, along with the skepticism, fed by almost continuous reporting, that deploying AI in a variety of situations doesn’t seem to yield favorable business outcomes. OpenAI shifting to enterprise / coding type stuff this week also seems potentially informative. Is gen AI actually useful for anything but code? Signs keep pointing to no… and even then, we’re in the early stages of figuring out how to build with it without destroying everything, something Amazon just recognized as possible with their recent shopping outage.
If you don't wanna pay, Library Genesis has the first edition (2004), but if you didn't find the examples at least modestly interesting in themselves, is this even your bag? As a Linux sysadmin and occasional writer of lousy C programs, I often consult NetBSD's source tree when I want good examples that aren't as complex as GNU's, so I expect to come back to these.
Judging by the publisher's sample,[1] the second edition (2025) looked like a worthwhile upgrade, so I ordered it. Much of the material is in the manpages, but this presents it with better explanations.
Statically linking GPL/LGPL libraries into MacOS or Windows binaries contaminates the application with those licenses' terms, and this is why wxwidgets adds an exception waiving the disclosure requirement.
Also, if you are looking for a VueJS cross-platform GUI framework for most Desktop and Mobile platforms (modern MacOS hardware and developer account is a requirement):
"contaminated by GPL/LGPL code"? really? if you want to make profit off of your code buy for several thousands of dollars commercial libraries and you are in the clear. if you don't want to pay for libraries you have to accept their conditions
I usually prefer releasing under Apache 2.0 license, as I can't predict what people will need 10 years from now.
People use LGPL libraries in commercial software all the time, as dynamically linked shared objects (.so) do not contaminate the application's license. The instant someone statically links an LGPL/GPL lib into an application binary, its source must also carry a compatible open license. Note this obligation differs from publishing patches back to the main lib branch.
It gets messy when people release libraries under multiple licenses. Porting games and applications from *nix systems can be non-trivial. Best regards =3
Yup. And right now I'm straight-up breaking Claude's TOS by modifying OpenCode to still accept tokens. But I only have a few days left and don't care if they ban me. I'm using what I paid for.