Inference
Setup for a simple use case:
- Create and set up a Cloud resource connection (a CLI sketch follows this list).
- Pick the target Vertex AI LLM endpoint (see the CREATE MODEL statement below).
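If you set up the connection from the command line, the following sketch uses the bq and gcloud CLIs. It assumes the project, region, and connection names used in this note (project-a, asia-northeast1, connection-b); SERVICE_ACCOUNT_EMAIL is a placeholder you copy from the bq show output.

# Create the Cloud resource connection in the same region as the dataset
bq mk --connection --project_id=project-a --location=asia-northeast1 \
  --connection_type=CLOUD_RESOURCE connection-b

# Look up the connection's service account
bq show --connection project-a.asia-northeast1.connection-b

# Grant that service account permission to call Vertex AI
gcloud projects add-iam-policy-binding project-a \
  --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
  --role="roles/aiplatform.user"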
How to set up a multilingual text generation model in BigQuery
You can choose the endpoint from the Vertex AI models supported by BigQuery ML remote models.
-- Remote model that calls the Vertex AI endpoint through the Cloud resource connection
CREATE OR REPLACE MODEL `llm.text_inference_model`
REMOTE WITH CONNECTION `project-a.asia-northeast1.connection-b`
OPTIONS (ENDPOINT = 'gemini-1.5-flash-preview-0514');
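As a quick sanity check, you can call the new model with an inline prompt before pointing it at a table; this is a minimal sketch and the prompt text is arbitrary.

SELECT ml_generate_text_llm_result
FROM ML.GENERATE_TEXT(
  MODEL `llm.text_inference_model`,
  (SELECT 'Say hello in Japanese.' AS prompt),
  STRUCT(0.5 AS temperature, TRUE AS flatten_json_output));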
How to run inference from BigQuery and return the result as JSON
The output column is wrapped in a STRUCT and serialized with TO_JSON_STRING.
SELECT
  TO_JSON_STRING(STRUCT(
    ml_generate_text_llm_result AS result
  )) AS json_output
FROM
  ML.GENERATE_TEXT(
    MODEL `llm.text_inference_model`,
    (
      SELECT
        -- Prepend the instruction text to each row's content to build the prompt
        CONCAT('''Hello World''', content) AS prompt,
        *
      FROM
        -- Placeholder: a source table with id and content columns
        -- (the connection is only referenced in the CREATE MODEL statement)
        `project-a.llm.source_table`
      WHERE
        id = "baz"
    ),
    STRUCT(
      0.5 AS temperature,
      TRUE AS flatten_json_output
    )
  );
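If the JSON output is saved to a table, the generated text can be pulled back out with JSON_VALUE. A small sketch, where `project-a.llm.results_table` is a hypothetical table holding the json_output column produced by the query above.

-- Read the generated text back out of the serialized STRUCT
SELECT
  JSON_VALUE(json_output, '$.result') AS generated_text
FROM
  `project-a.llm.results_table`;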