[Obs AI Assistant] Expose recall function as API #185058
dgieselaar merged 5 commits into elastic:main
Conversation
/ci
```ts
if (typeof disableFunctions === 'object') {
  systemFunctions = systemFunctions.filter((fn) => disableFunctions.except.includes(fn.name));
}
```
I find this API weird. What about keeping `disableFunctions: boolean` and then having a separate `allowedFunctions: string[]`? By default all functions are included; with `disableFunctions=true` no functions are included; and with `allowedFunctions=func1,func2` only the specified functions are included.
`allowedFunctions` and `disableFunctions` should be mutually exclusive (although we don't have to enforce it).
Btw, we could also have `disallowedFunctions`, but I'm not sure that's really necessary.
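For illustration, the suggested shape could look something like the sketch below. The function and option names are hypothetical, not the actual plugin API; it only demonstrates the proposed semantics (default = everything, `disableFunctions=true` = nothing, `allowedFunctions` = whitelist).

```typescript
interface FunctionDefinition {
  name: string;
}

// Hypothetical helper showing the proposed option semantics.
function selectFunctions(
  all: FunctionDefinition[],
  opts: { disableFunctions?: boolean; allowedFunctions?: string[] } = {}
): FunctionDefinition[] {
  if (opts.disableFunctions) {
    // disableFunctions=true: no functions are included
    return [];
  }
  const allowed = opts.allowedFunctions;
  if (allowed) {
    // allowedFunctions=[...]: only the listed functions are included
    return all.filter((fn) => allowed.includes(fn.name));
  }
  // default: all functions are included
  return all;
}
```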
Agree that it is a little weird, but I use this structure precisely so we don't have to add any type or runtime guards around it. Plus, we might indeed end up wanting to disable only specific functions (which would be `disableFunctions.only`). But maybe it's not worth it, WDYT?
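The "no extra guards" point relies on TypeScript narrowing a union type through a plain `typeof` check. A minimal self-contained sketch (the type alias and surrounding names are assumed, not the PR's exact code):

```typescript
// Union type: either a blanket boolean or "disable everything except these".
type DisableFunctions = boolean | { except: string[] };

function filterSystemFunctions(
  systemFunctions: Array<{ name: string }>,
  disableFunctions: DisableFunctions
): Array<{ name: string }> {
  if (typeof disableFunctions === 'object') {
    // TypeScript has narrowed the type here, so `.except` is safe
    // without any additional runtime validation.
    return systemFunctions.filter((fn) => disableFunctions.except.includes(fn.name));
  }
  // Plain boolean: true disables all functions, false keeps them all.
  return disableFunctions ? [] : systemFunctions;
}
```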
@sorenlouv I will just merge this in, in the interest of time, but I'm happy to make this change separately if you feel it's worth it.
```ts
const mappedQueries = queries.map((query) =>
  typeof query === 'string' ? { boost: 1, text: query } : query
);
```
When is `query` a string and not an object?
Leftover from merging your work into mine I think. Will fix!
@elasticmachine merge upstream
💛 Build succeeded, but was flaky
cc @dgieselaar
Exposes a POST /internal/observability_ai_assistant/chat/recall endpoint for the Investigate UI. It is mostly just moving stuff around, plus some small refactorings and a new way to generate short ids. Previously we were using indexes for scoring suggestions; we now generate a short but unique id (i.e. 4-5 chars), which produces a fairly distinctive token that strengthens the relationship between the id and the object while still allowing for quick output. LLMs are slow to generate UUIDs, while indexes are very generic and the LLM might not pay much attention to them.
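The short-id idea could be sketched as follows. This is a hypothetical illustration of the approach described above, not the PR's actual implementation; the alphabet, id length, and helper names are all assumptions.

```typescript
// Illustrative alphabet for short ids; lowercase alphanumerics keep the
// tokens cheap for an LLM to emit.
const ALPHABET = 'abcdefghijklmnopqrstuvwxyz0123456789';

// Generate a random short id (4 chars by default).
function shortId(length: number = 4): string {
  let id = '';
  for (let i = 0; i < length; i++) {
    id += ALPHABET[Math.floor(Math.random() * ALPHABET.length)];
  }
  return id;
}

// Attach a unique short id to each suggestion, regenerating on the
// (rare) collision so ids stay unique within the batch.
function assignShortIds<T>(docs: T[]): Array<T & { id: string }> {
  const seen = new Set<string>();
  return docs.map((doc) => {
    let id = shortId();
    while (seen.has(id)) {
      id = shortId();
    }
    seen.add(id);
    return { ...doc, id };
  });
}
```

With 36^4 ≈ 1.7M possible 4-char ids, collisions within a handful of recall suggestions are rare, and each id is a distinctive token rather than a generic index the LLM might gloss over.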