iOS 18.1 and macOS 15.1 -- the versions with the full Apple Intelligence -- are both in beta, and there are enormous numbers of people blogging and vlogging about their experiences. And yes, we all realize that on-device models aren't going to compete with trillion-parameter models, but people are finding them pretty useful.

They just don't feel it's release-ready, so they're refining. Apple has done this with major releases over several iterations now: a couple of features are held back for the .1 release while it's perfected.



At least in Photos, the model is not what one would consider working. A quick example: "Halloween with [x]" works, because Halloween is recognized as a specific date. It works well. Christmas throws it for a loop: Christmas isn't treated as a time but as a state. So it'll show photos that look appropriately Christmassy, but the picks appear semi-random. It'll return one of about five shots taken in quick succession, yet the one it picks is someone wearing a Santa hat in profile, while the other four are straight on and, one would think, more likely matches.

I don't think anyone can say with a straight face that a model unable to properly grok the biggest holiday of the year in many Western countries is good enough.


That isn't "Apple Intelligence" (which is the generative stuff held over to the .1 releases). What you're describing is inference metadata + basic search logic and has been in iOS and macOS for several major versions now, constantly improving.

And given that Christmas resolves to a date (it actually offers the autocomplete "Christmas Day" to make it easier, then simply makes the search criterion a calendar date), for me it literally shows all photos taken on Dec 25. I guess mileage varies.
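For what it's worth, once the query resolves to "Christmas Day", the search is essentially a creation-date range fetch. Here's a minimal PhotoKit sketch of that reduction (my illustration, not Apple's actual implementation; the photosTaken helper is hypothetical, and it assumes photo-library authorization has already been granted):

    import Photos

    // Sketch: resolve "Christmas Day" to a calendar date, then fetch every
    // asset whose creationDate falls inside that one-day window.
    // Assumes PHPhotoLibrary authorization was already requested elsewhere.
    func photosTaken(on day: DateComponents) -> PHFetchResult<PHAsset> {
        let calendar = Calendar.current
        let start = calendar.date(from: day)!                          // midnight starting the day
        let end = calendar.date(byAdding: .day, value: 1, to: start)!  // midnight ending it

        let options = PHFetchOptions()
        options.predicate = NSPredicate(format: "creationDate >= %@ AND creationDate < %@",
                                        start as NSDate, end as NSDate)
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        return PHAsset.fetchAssets(with: options)
    }

    // Usage: all photos taken on Dec 25, 2023.
    let christmas = photosTaken(on: DateComponents(year: 2023, month: 12, day: 25))
    print("Found \(christmas.count) photos")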


Well, except Apple specifically calls out this exact photo and video search function on their page titled "Apple Intelligence." Sure they've had some basic search already, but in their demos and advertising, they promise that Apple Intelligence is used to find photos by descriptions.

>Search for photos and videos in the Photos app simply by describing what you’re looking for. Apple Intelligence can even find a particular moment in a video clip that fits your search description and take you right to it.

https://www.apple.com/apple-intelligence/


This exact search functionality?

If we accept Apple's claim that Apple AI is coming in beta in fall 2024 (or, per the Canada page, December 2024), and I'm on the release of iOS 18, which by that restriction is not Apple AI enabled, then I can already do complex semantic search -- which means that functionality can't be "Apple AI", right? And person-plus-date search has been in iOS for at least two prior generations as well: you've been able to search by people in images, places, events, and text appearing in the image, along with broad categorizations like "sunset" and "beach".
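Those broad categorizations are the kind of thing Apple's on-device Vision framework has exposed to developers for years. A sketch of the general technique only (I'm not claiming this is what Photos runs internally; the broadLabels helper name is mine):

    import Vision

    // Sketch: ask Vision for broad scene labels ("beach", "sunset", ...)
    // for an image on disk, keeping only reasonably confident ones.
    func broadLabels(for url: URL, minimumConfidence: Float = 0.5) throws -> [String] {
        let request = VNClassifyImageRequest()
        let handler = VNImageRequestHandler(url: url, options: [:])
        try handler.perform([request])
        return (request.results ?? [])
            .filter { $0.confidence >= minimumConfidence }
            .map { $0.identifier }
    }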

However, when you're typing in a search, for each term it tries to contextualize it via selections. For instance, if you already had "{Person} on " and then typed Christmas, it lets you pick whether you mean Christmas the event, Christmas an "object", or Christmas the literal text (an icon of a couple of lines of text in a photo frame). I suspect the poster unintentionally selected Christmas-the-text, and it gave them images where that text appears somewhere in the image. Out of curiosity I did the same, and it returned a set of images I thought had to be mistakes, but somewhere on each one the text "Christmas" could be found. In one it was wildly distorted cursive on a tablecloth hanging over the edge of a shelf, which is impressive.
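That text-in-image matching, distorted cursive included, is exactly what Vision's text recognizer is good at. Again just a sketch of the technique, not Apple's pipeline (imageContains is a hypothetical helper):

    import Vision

    // Sketch: does the query string appear anywhere in the photo's recognized text?
    func imageContains(text query: String, at url: URL) throws -> Bool {
        let request = VNRecognizeTextRequest()
        request.recognitionLevel = .accurate   // slower, but better on stylized/cursive text

        let handler = VNImageRequestHandler(url: url, options: [:])
        try handler.perform([request])

        return (request.results ?? []).contains { observation in
            observation.topCandidates(1).first?
                .string.localizedCaseInsensitiveContains(query) ?? false
        }
    }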


'perfected' is an interesting word choice.

I'd be more inclined to use 'iterated'.


"iterated" implies that there is no improvement. Why do you thing 15.1 wouldn't be an improvement over 15.0? I do agree with you that "perfected" is also not the correct word choice. I think I would have gone with "refined" or "improved"


Unless it can be completely turned off, I will never upgrade, and I guess I will be selling my year-old M3 Max in favor of some shitty PC (or I’ll eventually just run Asahi once it supports my hardware well).


Apple's pretty decent about putting toggle switches on stuff; for instance, you don't have to enable iCloud or even associate the machine with an Apple account if you don't care for Find My, remote erase, etc.

But I'm with you: since Apple signaled going all-in on "an assistant that has access to everything", I switched to an Android with the intention of never enabling Google services, and certainly not the voice assistant. Unfortunately I've found it too annoying to go completely without Google: I've read that RCS messaging won't work with an unlocked bootloader, nor will precise location, so I'm stuck with some evil in the name of feature parity.


Why is this the line for you? macOS is already doing plenty of things it sounds like you wouldn't like.


I don’t want an LLM taking 8GB of RAM to do things I don’t value.


You can turn it off very easily.



