thih9 8 days ago

Note that this is a very simple library and not a very efficient one. E.g., for the code that filters an array, it runs N prompts, one per item[1]:

    `You are a filter agent.\nYour job is to return whether an item matches the criteria: ${criteria}\nRespond only with true or false.`

It's a cool demo, but I wouldn't use that in production; IMO having that code in a separate library offers little benefit and increases the risk of misuse.
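
Roughly, the pattern amounts to a loop like this sketch (not the library's actual code; the OpenAI client, the model name, and the way the item gets appended to the prompt are stand-ins I'm assuming for illustration):

    import OpenAI from "openai";

    const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

    // Stand-in for whatever completion call the library actually makes.
    async function callLLM(prompt: string): Promise<string> {
      const res = await client.chat.completions.create({
        model: "gpt-4o-mini", // arbitrary model choice for this sketch
        messages: [{ role: "user", content: prompt }],
      });
      return res.choices[0].message.content?.trim() ?? "";
    }

    // One request per element: filtering N items costs N round trips.
    async function llmFilter<T>(items: T[], criteria: string): Promise<T[]> {
      const out: T[] = [];
      for (const item of items) {
        const answer = await callLLM(
          `You are a filter agent.\nYour job is to return whether an item matches the criteria: ${criteria}\nRespond only with true or false.\nItem: ${JSON.stringify(item)}`
        );
        if (answer.toLowerCase() === "true") out.push(item);
      }
      return out;
    }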

[1]: https://github.com/montyanderson/incant/blob/73606e826d6e5b0...

voidUpdate 8 days ago

So this is just asking an LLM to filter or select from an array? Where do the magic spells come in?

helloplanets 8 days ago

How does this differ from function calling? For example, here's the basic enum example from the Gemini function-calling docs:

> color_temp: { type: Type.STRING, enum: ['daylight', 'cool', 'warm'], description: 'Color temperature of the light fixture, which can be `daylight`, `cool` or `warm`.', }

https://ai.google.dev/gemini-api/docs/function-calling?examp...
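
For reference, the fuller shape of that docs example looks roughly like this (paraphrased from memory, so the model name and exact field names may be slightly off):

    import { GoogleGenAI, Type } from "@google/genai";

    const ai = new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY });

    // Function declaration whose argument is constrained to an enum.
    const setLightColor = {
      name: "set_light_color",
      description: "Sets the color temperature of a light fixture.",
      parameters: {
        type: Type.OBJECT,
        properties: {
          color_temp: {
            type: Type.STRING,
            enum: ["daylight", "cool", "warm"],
            description: "Color temperature of the light fixture.",
          },
        },
        required: ["color_temp"],
      },
    };

    const response = await ai.models.generateContent({
      model: "gemini-2.5-flash",
      contents: "Make the living room light feel warmer.",
      config: { tools: [{ functionDeclarations: [setLightColor] }] },
    });

    // The model replies with a structured call whose arguments must come
    // from the declared enum, rather than with free text.
    console.log(response.functionCalls);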

  • supermatt 8 days ago

    It’s the inverse of function calling. Here the function is calling the LLM, not vice versa.

marcus_holmes 8 days ago

I'm curious how the hallucination-free guarantee works? Does it only guarantee that the output is a subset of the input?
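
My guess at the mechanism, purely an assumption and not the library's actual code, is that whatever the model returns gets intersected with the original input, so nothing outside it can appear:

    // Sketch of the guessed mechanism: post-filter the model's answer
    // against the input, so the result is always a subset of it.
    // (Assumes primitive/identity equality for Set membership.)
    function constrainToInput<T>(input: T[], modelOutput: T[]): T[] {
      const allowed = new Set(input);
      return modelOutput.filter((item) => allowed.has(item));
    }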

In the case of the male names, if I include a gender-neutral name like "Sam", does the filter include it because it can be a male name, or exclude it because it can also be a female name? Can I set this to be inclusive or exclusive?

Looks interesting, though. Nice work.

jollyllama 8 days ago

vibecoding is a hell of a drug

supermatt 8 days ago

> no hallucinations possible

It can still hallucinate a response that is defined in the filter.

E.g. if you have a filter with the names of capital cities [“London”, “Paris”, “Madrid”], and you ask “What is the capital of France?”, it could respond “Madrid”.

  • ycombinatrix 8 days ago

    Is that a hallucination, or is it just plain wrong?

    • supermatt 8 days ago

      An AI hallucination is any response that contains false or misleading information presented as fact. So a wrong answer is a hallucination.