Are you considering support for LLMs?
Could models like DeepSeek, Doubao, or GPT-4o be integrated directly, the way Craft does, or via plugins the way Obsidian or Pal do?
We have support for Apple Intelligence (e.g. Writing Tools), but not yet full support for an LLM.
What types of functions were you looking for? Semantic search of your content? The ability to ask questions about your notes?
Apple Intelligence should provide basic features like proofreading, summarizing, etc., but it won’t support global queries across your content.
Just look at Craft: you use a slash command to invoke an assistant and do whatever you want with AI.
I’m not sure how pertinent it would be. The text features are covered by Apple Intelligence (proofread, summarize, etc.), which will be available to more people and languages in April.
For content queries, App Intents are also a good way to give Siri access to your notes (once Siri becomes more capable, in 2026).
Full LLM support like Craft’s would imply a substantial extra cost. The requests would be expensive for the Agenda team, so they would have to create a dedicated subscription.
Gemini, Siri, ChatGPT agentic mode, etc. will soon have access to the content of apps (if the user allows it, of course!).
So they will be the best way to give an LLM access to ALL of our content (not only Agenda), with the possibility of cross-referencing content between apps. If each app develops its own LLM system, it will be more expensive for customers and less efficient.
The future of LLMs is system-wide models (ideally running locally) handling most daily tasks. That will be more useful and less expensive! The heavier usages will stay online, but those don’t apply to a notes app like Agenda.
So I don’t think the Agenda team should give time and resources to a “pretty soon obsolete” use of LLMs.
Wise words. This is pretty much our feeling too at this point.
For the record, we have added support for writing tools, and the new Intents. Not everyone has these yet, but I’m sure Apple will roll them out sooner rather than later.
The open question is how well the new intents work with the data in Agenda. We have done our bit, but we need to see how well it works in practice. If it is subpar, we can revisit our approach.
The LLM stuff is moving so fast at the moment, it’s hard to see what a lasting approach is. Last week OpenAI seemed a safe bet; this week it’s the Chinese models. You just can’t nail it down, and I don’t think anyone really knows where it will end.
Apple Intelligence, starting in iOS 18.4/macOS 15.4, will be able to use capabilities exposed by applications such as Agenda via App Intents and App Entities. That includes content within the applications – I hope that you’ll be taking full advantage of it.
Additionally, Siri will have true on-screen awareness via hooks in the display system, and the ability to take actions in and across apps as described in the snippet below. Hoping that Agenda will be able to take full advantage. Also hoping that the betas will be available in the EU, so you can make the magic happen.
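For anyone curious what exposing app content via App Intents and App Entities actually involves, here’s a minimal sketch. All the type and property names (`NoteEntity`, `OpenNoteIntent`, etc.) are hypothetical illustrations, not Agenda’s actual API:

```swift
import AppIntents
import Foundation

// Hypothetical entity exposing a note to Siri / Apple Intelligence.
struct NoteEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID

    @Property(title: "Title")
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

// Query the system uses to resolve notes by identifier.
struct NoteQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [NoteEntity] {
        // In a real app, fetch the matching notes from the app's store;
        // stubbed out here.
        return []
    }
}

// An intent Siri can invoke to open a given note.
struct OpenNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Note"

    @Parameter(title: "Note")
    var note: NoteEntity

    func perform() async throws -> some IntentResult {
        // Navigate to the note inside the app.
        return .result()
    }
}
```

Once entities and intents like these are declared, the system (and, with iOS 18.4+, Apple Intelligence) can discover them and act on the app’s content without the app shipping its own LLM.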
Yep, we already added this intents support in iOS 18.0. We have been waiting on Apple so we can see how it works in practice.