The final day of Nick Milo's Linking Your Thinking Conference happened yesterday. When I saw the lineup for the conference, I was particularly interested to see that Christian Houmann, the developer of the popular plugin QuickAdd, was going to give a talk on how he utilises the power of AI in his Obsidian vault.
With much of my attention already on this topic, thanks to my experiments with reducing the time spent on pointless housekeeping tasks for my university degree, I decided to attend Christian's section of the conference to hear what he had to say. After all, Christian developed QuickAdd's AI Assistant, which revolutionised the way I work in almost all areas, so he was bound to add insight to an already promising topic. This article covers some of Christian's use cases for AI within Obsidian, along with some of his more unique and original tips on the subject.
QuickAdd AI Assistant Explained
QuickAdd is an Obsidian plugin made to automate the addition of notes, the insertion of content into notes, and the application of templates in your vault. The plugin can also run macros, which are chains of QuickAdd commands executed one after the other. Recently, Christian added an AI Assistant, which makes it possible to prompt an AI for output from within one of these macros. Additionally, you can customise the prompt templates you use and keep them in a folder in your vault so that your macros can reference them repeatedly.
By using this flexibility, and even chaining prompts together, the possibilities for using AI to augment and improve your own thinking are very wide. Christian demonstrated an MoC generator, a book summarizer and a YouTube video summarizer, which I will cover in the following paragraphs, but I have already used the plugin to create my own outline generation system and a macro that turns university content into flashcards for revision. Christian also has more examples of use cases in his blog post on the plugin's update.
Dev’s Tip I — Few-Shot Prompting
The concept of few-shot prompting is that you provide the AI with examples of correctly generated responses, so as to better inform the output it produces. This is especially useful with QuickAdd's AI Assistant, because all the prompts are stored as templates in a folder, giving you more flexibility to create longer prompts that are more specific to what you want generated. Along the same lines, zero-shot and one-shot prompting are when you provide the AI with no examples, or a single example, to work with. The most effective choice is usually to pick a few examples of the best possible responses and include these in the prompt, hence the tip being 'few-shot prompting'.
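To make this concrete, here is what a few-shot prompt template stored in a vault folder might look like. The examples and the `{{value}}` placeholder syntax are my own illustration, not taken from Christian's talk:

```
Rewrite the note content below as a flashcard question.

Example input: Photosynthesis converts light energy into chemical energy.
Example output: What kind of energy does photosynthesis convert light energy into?

Example input: Mitochondria produce ATP through cellular respiration.
Example output: Which organelle produces ATP, and through what process?

Input: {{value}}
Output:
```

The two worked examples are the "shots": they show the model the exact shape of answer you want before it sees your real input.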
Use Case I — Map of Content Generator
Using DataviewJS, Christian can generate a set of links to all the notes that link to a note he specifies. In his AI MoC generator, the macro prompts Christian to specify a note, builds the list of linked notes, and then turns those linked notes into a series of coherent paragraphs about the topic. The links are woven throughout the paragraphs and can be read like normal text. The purpose of this function is to give context to related topics, which can better inform or inspire your own knowledge and ideas.
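The backlink-gathering step can be sketched in plain JavaScript. This is a standalone illustration with a hypothetical in-memory vault; inside Obsidian, Christian's actual script would use Dataview's own API (such as a note's inlinks) rather than the stand-in functions below:

```javascript
// Hypothetical in-memory vault: note name -> array of outgoing wiki-links.
const vault = {
  "Evergreen Notes": ["Zettelkasten", "Note-taking"],
  "Zettelkasten": ["Note-taking"],
  "Spaced Repetition": ["Note-taking", "Flashcards"],
};

// Collect every note that links to the target note,
// mirroring the backlink list Dataview provides inside Obsidian.
function notesLinkingTo(target, notes) {
  return Object.entries(notes)
    .filter(([, links]) => links.includes(target))
    .map(([name]) => name);
}

// Format the backlinks as Obsidian wiki-links, ready to feed into an AI prompt.
function formatAsLinks(names) {
  return names.map((n) => `[[${n}]]`).join("\n");
}

console.log(formatAsLinks(notesLinkingTo("Note-taking", vault)));
```

The list of `[[wiki-links]]` this produces is what the macro would then hand to the AI, asking it to weave the linked notes into coherent paragraphs.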
Christian noted that the essence of the QuickAdd plugin is to be a tool that does its job automating tasks in your vault when you need, but stays out of the way for you to focus on the tasks that require a large amount of human input. He also specifies that this use case is only really relevant if you plan to use the generated MoC. The key here is to focus on efficiency at all times, so that you don’t find that you are creating unnecessary tasks and prompts that don’t inform your work in any way.
Dev’s Tip II — Build an Intellectual Sparring Partner
You can make the AI prompts as simple or as complex as you need. There is potential to extract a lot of value from a very simple prompt. Christian demonstrated one that gave feedback, suggestions and alternative perspectives on a given input. The prompt was very simple, but the responses hugely informative, inspiring additions that would have given more depth and interest to a piece of writing. Acting as an 'intellectual sparring partner', this approach ensures that you are still writing your own original content while harnessing the depth and insight of AI at the same time.
Use Case II — Book Summarizer
Christian uses Readwise, a service that aggregates all the highlights you might have taken across various platforms. This could include Kindle, a read-it-later tool such as Instapaper, or even interesting Twitter threads. Christian developed a script that scrapes the highlights of a specific piece of content from his own Readwise account. Since the introduction of the AI Assistant, however, he can take the workflow a little further.
Christian now retrieves his highlights using the same script, but feeds them into an AI prompt that generates a concise summary of the piece's main points from the highlights. This is placed at the top of the content note, allowing original thoughts to be added underneath the summary. Being a believer in reading to take action on what's been learned, Christian also added a section to the macro that feeds the highlights to the AI again, but this time outputs a list of actionable steps inspired by the highlights. Because the highlights come from Christian's own Readwise account, these steps form a customised procedure for implementing the advice from the book, making the summaries even more valuable.
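The two-pass structure of this macro, the same highlights fed into two different prompts, can be sketched as follows. The prompt wording and function names are hypothetical, not Christian's actual templates:

```javascript
// Example highlights, as they might come back from a Readwise export.
const highlights = [
  "Deep work requires long, uninterrupted blocks of time.",
  "Schedule every minute of your working day in advance.",
];

// First pass: ask for a concise summary of the main points.
function buildSummaryPrompt(hl) {
  return `Summarize the main points of these highlights concisely:\n\n${hl.join("\n")}`;
}

// Second pass: ask for actionable steps derived from the same highlights.
function buildActionsPrompt(hl) {
  return `Turn these highlights into a short list of actionable steps:\n\n${hl.join("\n")}`;
}

// Each prompt would be sent to the AI Assistant in turn, and the two
// responses placed at the top of the content note.
console.log(buildSummaryPrompt(highlights));
console.log(buildActionsPrompt(highlights));
```

The key design point is reuse: one retrieval step, two cheap prompt variations over the same input.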
It’s also worth noting that you can add annotations to your highlights in many circumstances in the apps that Readwise aggregates its content from. Briefly discussed in the conference, I see potential in creating a prompt that recognises any statement preceded by ‘Note:’ as the thoughts and opinions of the one doing the highlighting, and incorporates these thoughts accordingly into the generated summary.
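A first step towards that idea, separating the reader's own annotations from the raw highlights before prompting, might look like this. The 'Note:' convention and the function are my own sketch, not something demonstrated at the conference:

```javascript
// Separate reader annotations (lines starting with "Note:") from raw
// highlights, so a prompt can treat the reader's own thoughts differently.
function splitHighlights(lines) {
  const highlights = [];
  const notes = [];
  for (const line of lines) {
    if (line.startsWith("Note:")) {
      notes.push(line.slice("Note:".length).trim());
    } else {
      highlights.push(line);
    }
  }
  return { highlights, notes };
}
```

The prompt could then instruct the AI to weight the `notes` array as the highlighter's own opinions when generating the summary.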
Dev’s Tip III — Alter System Prompt Settings
You can change the default behaviour of the AI in the default system prompt dialogue in the QuickAdd AI Assistant settings. At the moment it's set to format its outputs in Obsidian markdown syntax, using LaTeX for equations and symbols. You can play with these to create the output you desire, but a more interesting setting is the 'temperature' of the AI. Varying the temperature changes its creativity, with 0 being the lowest and 1 the highest setting. What this does in reality is alter the randomness of the next word the AI generates, but it serves to produce something resembling creativity in the answer, letting you get more or less abstract responses depending on what you want from the model.
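To show where temperature actually lives, here is a minimal sketch of an OpenAI-style chat-completion request body. The exact fields QuickAdd sends under the hood are an assumption on my part; this just illustrates the parameter:

```javascript
// Build a chat-completion request body with a configurable temperature.
// Lower temperature -> more deterministic output; higher -> more varied.
function buildRequest(prompt, temperature) {
  return {
    model: "gpt-3.5-turbo",
    messages: [
      { role: "system", content: "Format all output in Obsidian markdown." },
      { role: "user", content: prompt },
    ],
    temperature, // e.g. 0.2 for factual summaries, 0.9 for brainstorming
  };
}

console.log(JSON.stringify(buildRequest("Summarize this note.", 0.2), null, 2));
```

In practice you might keep temperature low for summarizers, where faithfulness matters, and raise it for the 'sparring partner' style prompts, where you want surprising angles.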
Use Case III — YouTube Video Summarizer
Created by Christian just a few hours before the conference, this macro summarizes YouTube videos in a similar way to the book summarizer, except with a longer and more complicated DataviewJS script that calls out to a website to fetch the transcript of the video you link to. Because of the token limit enforced on prompts, this is only useful for shorter videos or with models that support larger contexts, such as GPT-4, but hopefully larger contexts will become more readily available in the near future so we can all summarize our favourite YouTubers' videos. The video summarizer uses the same prompt as the book summarizer for the concise roundup of main points, and the same section for generating actionable steps.
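One common way around the token limit, sketched below, is to split a long transcript into chunks that each fit a rough token budget and summarize them separately. This is my own workaround sketch, not part of Christian's macro, and the four-characters-per-token rule is only a heuristic for English text:

```javascript
// Roughly estimate tokens as characters / 4, a common heuristic for English.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Split a transcript into chunks, each within the given token budget,
// breaking on sentence boundaries so chunks stay coherent.
function chunkTranscript(transcript, maxTokens) {
  const sentences = transcript.split(/(?<=[.!?])\s+/);
  const chunks = [];
  let current = "";
  for (const sentence of sentences) {
    if (current && estimateTokens(current + " " + sentence) > maxTokens) {
      chunks.push(current);
      current = sentence;
    } else {
      current = current ? current + " " + sentence : sentence;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Each chunk's summary could then be concatenated and summarized once more, a simple map-reduce pattern that trades one long prompt for several short ones.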
Dev’s Tip IV — Use AI Assistant as You Would Lego Blocks
Illustrated well by both summarizer examples, the same prompt template can be referenced in different macros to produce customised outputs. Christian likens this to Lego blocks, which you can customise individually, but whose greatest flexibility and functionality comes when you combine them with other blocks. This means you can build a bank of high-quality prompts performing common functions, and chain them together into customised workflows for specific use cases when required. As I explained earlier, it's this flexibility that sets QuickAdd's AI Assistant apart from pretty much any other AI capability in other Obsidian plugins, and indeed in many other PKM applications in general.
Conclusion
That's it for my summary of Christian Houmann's presentation at the recent LYT Conference. Thanks to Nick Milo for organising the conference; I will be checking out the replays of some of the other interviews and demonstrations. Thanks again to Christian for the continued development of such a useful plugin; it's exciting to imagine what the future holds for PKM and AI working together. If you learned something from this article, give a clap and consider following. It's much appreciated!