
[Serverless] Preconfigured Connectors: adds new connectors and updates existing one#242791

Merged
alvarezmelissa87 merged 10 commits into elastic:main from alvarezmelissa87:preconfig-connector-updates
Dec 3, 2025
Conversation

@alvarezmelissa87 (Contributor) commented Nov 12, 2025

Summary

Serverless and on-prem Kibana pull the preconfigured connector configurations from these YAML files. ESS changes are made in the cloud repo; a PR has also been raised for that.

Related issue #243094
This PR is part of the work to support multiple managed LLMs.

This PR:

  • Adds "General Purpose LLM v2", which uses .gp-llm-v2-chat_completion as the inference id, and "General Purpose LLM v3", which uses .gp-llm-v3-chat_completion as the inference id.
    • It should be okay to add these before the inference endpoints are available, since any connector whose endpoint does not exist is filtered out of the connectors returned to Kibana. See relevant code.
    • Once the endpoint is available, the preconfigured connector will show up in the connectors list.
  • Renames "Elastic Managed LLM" to "General Purpose LLM v1".
    • This change also required updates in places that rely on the old name when checking whether a connector is internal/preconfigured.
    • In those cases, the new connector names have been added alongside the old one.
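The endpoint-availability behavior described above can be sketched as follows. The interface, function name, and the v1 inference id used here are illustrative assumptions, not Kibana's actual code:

```typescript
// Sketch of the filtering behavior described above. The interface, function
// name, and the v1 inference id ('.elastic-llm-v1') are illustrative
// assumptions, not Kibana's actual implementation.

interface PreconfiguredConnector {
  id: string;
  name: string;
  inferenceId: string;
}

// Connectors whose backing inference endpoint does not exist yet are
// dropped before the list is returned to Kibana.
function filterByAvailableEndpoints(
  connectors: PreconfiguredConnector[],
  existingEndpointIds: Set<string>
): PreconfiguredConnector[] {
  return connectors.filter((c) => existingEndpointIds.has(c.inferenceId));
}

const connectors: PreconfiguredConnector[] = [
  { id: 'General-Purpose-LLM-v1', name: 'General Purpose LLM v1', inferenceId: '.elastic-llm-v1' },
  { id: 'General-Purpose-LLM-v2', name: 'General Purpose LLM v2', inferenceId: '.gp-llm-v2-chat_completion' },
  { id: 'General-Purpose-LLM-v3', name: 'General Purpose LLM v3', inferenceId: '.gp-llm-v3-chat_completion' },
];

// Only v1's endpoint has been provisioned, so only v1 is surfaced.
const visible = filterByAvailableEndpoints(connectors, new Set(['.elastic-llm-v1']));
```

Once the v2/v3 endpoints are provisioned, the same filter would surface those connectors with no further Kibana change.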

Checklist

Check that the PR satisfies the following conditions.

Reviewers should verify this PR satisfies this list as well.

  • Any text added follows EUI's writing guidelines, uses sentence case text and includes i18n support
  • Documentation was added for features that require explanation or tutorials
  • Unit or functional tests were updated or added to match the most common scenarios
  • If a plugin configuration key changed, check if it needs to be allowlisted in the cloud and added to the docker list
  • This was checked for breaking HTTP API changes, and any breaking changes have been approved by the breaking-change committee. The release_note:breaking label should be applied in these situations.
  • Flaky Test Runner was used on any tests changed
  • The PR description includes the appropriate Release Notes section, and the correct release_note:* label is applied per the guidelines
  • Review the backport guidelines and apply applicable backport:* labels.
@alvarezmelissa87 alvarezmelissa87 self-assigned this Nov 12, 2025
@alvarezmelissa87 alvarezmelissa87 added release_note:enhancement :ml backport:skip This PR does not require backporting Feature:Inference UI ML Inference endpoints UI and AI connector v9.3.0 Feature: AI Infra Relating to the AI Assistant flow and any work impacting/involving the AI/Inference Connector labels Nov 12, 2025
@alvarezmelissa87 alvarezmelissa87 force-pushed the preconfig-connector-updates branch from 5cf73af to 37451e2 Compare November 18, 2025 18:45
@alvarezmelissa87 alvarezmelissa87 changed the title [Serverless] Preconfigured Connectors: add new connector and update existing one Nov 18, 2025
@alvarezmelissa87 alvarezmelissa87 marked this pull request as ready for review November 18, 2025 20:23
@alvarezmelissa87 alvarezmelissa87 requested review from a team as code owners November 18, 2025 20:23
@elasticmachine (Contributor)

Pinging @elastic/ml-ui (:ml)

@alvarezmelissa87 (Contributor, Author)

/ci

@paul-tavares paul-tavares requested review from joeypoon and removed request for paul-tavares November 19, 2025 14:01
@Samiul-TheSoccerFan (Contributor) left a comment

Left a few questions in the PR. Are the inference endpoints also preconfigured, or only the connectors?

```diff
 };
-export const INTERNAL_INFERENCE_CONNECTORS = ['Elastic-Managed-LLM'];
+export const INTERNAL_INFERENCE_CONNECTORS = [
+  'Elastic-Managed-LLM',
```
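The hunk above is truncated in the review view. A sketch of what the expanded constant and its use might look like after this PR; the list contents are assumptions based on the PR description, not the actual file:

```typescript
// Sketch only: the list contents are assumed from the PR description, which
// says the new names were added alongside the legacy one wherever Kibana
// checks whether a connector is internal/preconfigured.
const INTERNAL_INFERENCE_CONNECTORS = [
  'Elastic-Managed-LLM', // legacy name, kept so existing checks keep working
  'General-Purpose-LLM-v1',
  'General-Purpose-LLM-v2',
  'General-Purpose-LLM-v3',
];

function isInternalInferenceConnector(connectorId: string): boolean {
  return INTERNAL_INFERENCE_CONNECTORS.includes(connectorId);
}
```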
Contributor

Same question: do we need Elastic-Managed-LLM in here and in the following file?

Contributor

Ya, it looks like this PR removes Elastic-Managed-LLM and adds General-Purpose-LLM-v{1,2,3}.

Anyone who has a saved object that references Elastic-Managed-LLM won't have access to it after this change; it will be returned as "not found". I'm not sure what, if anything, might hold a reference. I don't think these can be used directly as actions from alerting rules; if they can, then I think this will be a problem.

Or perhaps these connector IDs are never referenced directly, just used internally?

@alvarezmelissa87 (Contributor, Author) commented Nov 24, 2025

@pmuellr - thanks for taking a look! 🙏
This updates the name of the preconfigured connector -Elastic-Managed-LLM to General-Purpose-LLM-v1 - the underlying inference endpoint and model are unchanged. I also updated the places in kibana that reference the connector id directly to look for both the old and new name. We've updated the naming/connector ids for preconfigured connectors in the past and it should not present a problem. The connector ids are internal - it's the underlying inference endpoint ids that are used.

The additional preconfigured connectors added (v2, v3) will only be returned by the actions client getAll once the backing inference endpoint exists, so no issue there either.

Contributor

Sounds good. I'd go ahead and try testing an upgrade, if you haven't already, by running

```shell
yarn es serverless --license trial --projectType $PROJECT_TYPE
```

then run Kibana from main and do something to get a reference to Elastic-Managed-LLM, then kill Kibana, then run the same from your PR to see if everything survived "migration":

```shell
yarn start --no-base-path --serverless $PROJECT_TYPE
```
@alvarezmelissa87 alvarezmelissa87 added the ci:project-deploy-security Create a Security Serverless Project label Nov 21, 2025
@alvarezmelissa87 alvarezmelissa87 removed the request for review from TinaHeiligers November 21, 2025 20:40
@MichelLosier (Contributor) left a comment

Looks good from Fleet!

@alvarezmelissa87 (Contributor, Author)

@elasticmachine merge upstream

@tomsonpl (Contributor) left a comment

Defend Workflows code changes LGTM 👍

@pmuellr (Contributor) left a comment

ResponseOps changes LGTM. I noted some concerns here: #242791 (comment)

Sounds like the connector IDs are never persisted anywhere (unlike other saved objects), so when the existing connector Elastic-Managed-LLM disappears once this PR is merged, nothing will break.

```yaml
    inferenceId: ".gp-llm-v2-chat_completion"
    providerConfig:
      model_id: "gp-llm-v2"
  General-Purpose-LLM-v1:
```
Member

We want Rainbow Sprinkles to be the "first" in this list, so that it's what's used by default for various AI tools. But I have no issue with adding more models.

@alvarezmelissa87 (Contributor, Author) commented Nov 25, 2025

Thank you for taking a look, @seanstory! 🙏

So in this PR I updated the places where the old name was being specifically checked (for things like order/defaults) to make sure both names are recognized, in case of timing issues with the cloud changes.

I will go through and ensure the default is unchanged for the assistants/Playground. Other entry points I may have missed will need to be addressed specifically. I will also have a follow-up to update all the copy that references the old name, but I was trying to keep this PR smaller and isolate the required changes for better tracking.

The plan is to update the default to v2 once this change is in and all teams are happy with the changes. Aiming for before FF 9.3.

Member

To be clear, I don't care about the name. What I care about is the order of this list for xpack.actions.preconfigured.

A number of AI features in Kibana predate any settings/configurations for when you might have multiple LLM connectors. Instead, they do something naive, like `llmConnection = listConnectors().filter(c => c.type === 'llm')[0]`, so it's important that we keep the expected default LLM as the "first" in the list.

It's possible that I'm out of touch, and that the new v3/v2 LLMs are already what we want the "default" to be. If that's the case, I retract this comment thread.
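The ordering concern above can be illustrated with a small sketch; the types and names are invented for the example, not actual Kibana code:

```typescript
// Illustration of the failure mode described above: features that predate
// multi-LLM support simply take the first connector of type 'llm', so the
// order of xpack.actions.preconfigured determines the effective default.

interface Connector {
  id: string;
  type: string;
}

function pickDefaultLlm(connectors: Connector[]): Connector | undefined {
  // Naive selection: the first LLM connector in the list wins.
  return connectors.filter((c) => c.type === 'llm')[0];
}

// With v1 listed first, naive consumers keep the expected default.
const listed: Connector[] = [
  { id: 'General-Purpose-LLM-v1', type: 'llm' },
  { id: 'General-Purpose-LLM-v2', type: 'llm' },
  { id: 'General-Purpose-LLM-v3', type: 'llm' },
];
```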

@alvarezmelissa87 (Contributor, Author)

Good point! To be safe, I updated the order so it's v1, v2, v3 in 0146ed4.

@seanstory seanstory dismissed their stale review November 25, 2025 19:01

I'm dismissing my review so that I'm not blocking you. As long as someone from Inference is happy with this and how it changes the "default LLM connector" for AI assistants and Agent Builder, I'm sure it's fine. :)

@alvarezmelissa87 alvarezmelissa87 added the ci:project-deploy-observability Create an Observability project label Nov 25, 2025

@alvarezmelissa87 (Contributor, Author)

/ci


@alvarezmelissa87 (Contributor, Author)

@elasticmachine merge upstream

@elasticmachine (Contributor) commented Dec 3, 2025

💚 Build Succeeded

Metrics

Async chunks

Total size of all lazy-loaded chunks that will be downloaded as the user navigates the app

id | before | after | diff
securitySolution | 11.1MB | 11.1MB | +78.0B

Page load bundle

Size of the bundles that are downloaded on every page load. Target size is below 100kb

id | before | after | diff
elasticAssistant | 307.4KB | 307.5KB | +41.0B

History

cc @alvarezmelissa87

@alvarezmelissa87 alvarezmelissa87 merged commit e161e4e into elastic:main Dec 3, 2025
12 checks passed
@alvarezmelissa87 alvarezmelissa87 deleted the preconfig-connector-updates branch December 3, 2025 19:48
alvarezmelissa87 added a commit that referenced this pull request Dec 8, 2025
## Summary

Related issue #243094
Follow-up to PR #242791; naming request updated.

This PR is part of the work to support multiple managed LLMs. This PR
moves away from generic naming and updates names for preconfigured
connectors.

Because these configurations are in YAML files, the connector id cannot use a period without escaping it.
There seems to be a way to escape special chars, using something like
`preconfigured.["sonnet-3.7"].actionTypeId`, but it hasn't been used so
far in Kibana and would need to be tested.

Given the time-sensitive nature of this change, I updated the id and
replaced the period with a dash in the PRs, e.g.
`Anthropic-Claude-Sonnet-3.7` becomes `Anthropic-Claude-Sonnet-3-7`.
From what I can see in Kibana, there is no logic that depends on the id
matching the name exactly, so this solution should be fine.
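The dash-for-period workaround can be sketched as a tiny helper; the function name is hypothetical, and in practice the PR simply edits the ids in the YAML by hand:

```typescript
// Hypothetical helper mirroring the manual workaround described above: YAML
// keys for preconfigured connector ids can't contain an unescaped period,
// so periods in a model name are replaced with dashes to form the id.
function toYamlSafeConnectorId(name: string): string {
  return name.replace(/\./g, '-');
}
```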


---------

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
JordanSh pushed a commit to JordanSh/kibana that referenced this pull request Dec 9, 2025
…existing one (elastic#242791)

@peteharverson peteharverson changed the title [Serverless] Preconfigured Connectors: add new connectors and update existing one Dec 15, 2025

Labels

backport:skip This PR does not require backporting ci:project-deploy-elasticsearch Create an Elasticsearch Serverless project ci:project-deploy-observability Create an Observability project ci:project-deploy-security Create a Security Serverless Project Feature: AI Infra Relating to the AI Assistant flow and any work impacting/involving the AI/Inference Connector Feature:Inference UI ML Inference endpoints UI and AI connector :ml release_note:enhancement Team:Fleet Team label for Observability Data Collection Fleet team v9.3.0