Configuring the LLM secret key
You must configure the Kubernetes secret for the large language model (LLM) provider in the Kubernetes namespace where you installed the Konveyor Operator.
You must create an LLM API key secret in your Kubernetes cluster so that Konveyor AI can create the resources required to run the Solution Server. If the secret is not configured, Konveyor AI does not create those resources.
Procedure
- Create a credentials secret named kai-api-keys in the konveyor-ai project.
  - For Amazon Bedrock as the provider, type:

    kubectl create secret generic kai-api-keys -n konveyor-ai \
      --from-literal=AWS_ACCESS_KEY_ID=<YOUR_AWS_ACCESS_KEY_ID> \
      --from-literal=AWS_SECRET_ACCESS_KEY=<YOUR_AWS_SECRET_ACCESS_KEY>

  - For Azure OpenAI as the provider, type:

    kubectl create secret generic kai-api-keys -n konveyor-ai \
      --from-literal=AZURE_OPENAI_API_KEY='<YOUR_AZURE_OPENAI_API_KEY>'

  - For Google as the provider, type:

    kubectl create secret generic kai-api-keys -n konveyor-ai \
      --from-literal=GEMINI_API_KEY='<YOUR_GOOGLE_API_KEY>'

  - For OpenAI-compatible providers, type:

    kubectl create secret generic kai-api-keys -n konveyor-ai \
      --from-literal=OPENAI_API_BASE='https://example.openai.com/v1' \
      --from-literal=OPENAI_API_KEY='<YOUR_OPENAI_KEY>'

    You can also set the base URL as the kai_llm_baseurl variable in the Tackle custom resource.
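Kubernetes stores the values of a generic secret base64-encoded, which is what you see if you inspect the result with kubectl get secret kai-api-keys -n konveyor-ai -o yaml. A minimal local sketch of that encoding round trip, using a placeholder key value (the placeholder is illustrative, not a real key):

```shell
# Placeholder key value (hypothetical; substitute your real provider key)
KEY='<YOUR_OPENAI_KEY>'

# Encode the value the way Kubernetes stores it in the secret's data map
ENCODED=$(printf '%s' "$KEY" | base64)

# Decoding recovers the original value, which is what the Solution Server
# receives when the secret is injected as an environment variable
printf '%s' "$ENCODED" | base64 -d
```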
- (Optional) Force a reconcile so that the Konveyor Operator picks up the secret immediately:

    kubectl patch tackle tackle -n konveyor-ai --type=merge -p \
      '{"metadata":{"annotations":{"konveyor.io/force-reconcile":"'"$(date +%s)"'"}}}'
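The patch works because changing any annotation value updates the Tackle resource's metadata, which the Operator watches, triggering a fresh reconcile; $(date +%s) supplies a new epoch timestamp on every run so the annotation value always changes. A sketch of how the payload is assembled (the annotation name is taken from the command above; no cluster access is needed to inspect it):

```shell
# Build the merge-patch payload locally. The epoch timestamp differs on
# every invocation, so repeated patches always modify the annotation.
STAMP=$(date +%s)
PATCH='{"metadata":{"annotations":{"konveyor.io/force-reconcile":"'"$STAMP"'"}}}'

# Print the JSON that kubectl would send with --type=merge -p
echo "$PATCH"
```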