Snowflake Native App
Introduction
Honeydew enables deep integration with Snowflake through a Snowflake Native Application.
It can be installed from the Snowflake Marketplace Listing.
When the native app is installed, the Honeydew API can be accessed directly from a Snowflake connection or the web interface to do things such as:
- Query data based on a semantic query
- Generate Snowflake SQL query from a semantic query
- Consume any metadata such as field names and descriptions
- Update the semantic layer definitions
Installation
Honeydew account
Honeydew native application requires a Honeydew account. If you don’t have one yet, schedule a 20-min onboarding here. Review the initial setup documentation for a full list of setup steps.
Native App installation name
Note the name you have given to the application (e.g. SEMANTIC_LAYER_ENTERPRISE_EDITION) and replace SEMANTIC_LAYER_ENTERPRISE_EDITION in all examples below with the actual application name.
You can find the names of the installed applications by running the following command:
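For example, with Snowflake's standard command:

```sql
SHOW APPLICATIONS;
```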
Create API Key
In the Honeydew application, navigate to the Settings page and generate an API Key to be used for this integration. Copy the generated API Key and API Secret values.
Set API Credentials
Set the Honeydew API credentials for the Honeydew application, replacing api_key and api_secret with the generated API Key and API Secret values:
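A hedged sketch of this step — the procedure name and schema are assumptions, not the documented signature; check the application's setup instructions for the exact call:

```sql
-- Hypothetical procedure name and schema; replace with the documented call
CALL SEMANTIC_LAYER_ENTERPRISE_EDITION.CONFIG.SET_API_CREDENTIALS('api_key', 'api_secret');
```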
Create Honeydew API Integration
Create a Snowflake integration to the Honeydew API:
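A sketch using Snowflake's external access integration syntax; the schema holding the network rules is an assumption and may differ in your installation:

```sql
CREATE EXTERNAL ACCESS INTEGRATION HONEYDEW_API_ACCESS_INTEGRATION
  ALLOWED_NETWORK_RULES = (
    SEMANTIC_LAYER_ENTERPRISE_EDITION.CONFIG.API_NETWORK_RULE,
    SEMANTIC_LAYER_ENTERPRISE_EDITION.CONFIG.AUTH_NETWORK_RULE
  )
  ENABLED = TRUE;
```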
This integration allows the Honeydew native application to access the Honeydew API.
The API_NETWORK_RULE and AUTH_NETWORK_RULE network rules are created automatically by the Honeydew native application setup process. They point to the following endpoints:
- API_NETWORK_RULE: api.honeydew.cloud
- AUTH_NETWORK_RULE: auth.honeydew.cloud
Make sure to name the integration HONEYDEW_API_ACCESS_INTEGRATION - do not change that name.
Grant Integration Access
Grant the Honeydew API integration to the Honeydew application:
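Using Snowflake's standard grant syntax (replace the application name with yours):

```sql
GRANT USAGE ON INTEGRATION HONEYDEW_API_ACCESS_INTEGRATION
  TO APPLICATION SEMANTIC_LAYER_ENTERPRISE_EDITION;
```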
Enable External Access
Enable the Honeydew API integration for all Honeydew code functions and procedures:
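A hedged sketch, assuming an ENABLE_EXTERNAL_ACCESS setup procedure; the actual procedure name and schema may differ, so check the app's setup instructions:

```sql
-- Hypothetical procedure name; replace with the documented call
CALL SEMANTIC_LAYER_ENTERPRISE_EDITION.CONFIG.ENABLE_EXTERNAL_ACCESS();
```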
You will receive the following result upon success:
Grant Native App Access
Optionally, grant native app access to any additional Snowflake roles:
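For example, using Snowflake's application role grant syntax; APP_PUBLIC is a common application role name but is an assumption here, as is the target role:

```sql
-- Replace APP_PUBLIC with the application role your app exposes,
-- and ANALYST_ROLE with the role that should get access
GRANT APPLICATION ROLE SEMANTIC_LAYER_ENTERPRISE_EDITION.APP_PUBLIC TO ROLE ANALYST_ROLE;
```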
Only the ACCOUNTADMIN role has the CREATE INTEGRATION privilege by default. The privilege can be granted to additional roles as needed.
Upgrade
The Honeydew native application is upgraded automatically when new versions are released.
To upgrade the Honeydew application manually, follow these steps:
Upgrade
Upgrade the Honeydew application:
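Using Snowflake's standard application upgrade command:

```sql
ALTER APPLICATION SEMANTIC_LAYER_ENTERPRISE_EDITION UPGRADE;
```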
Re-enable API Integration
Re-enable the Honeydew API integration for all Honeydew code functions and procedures (some functions or procedures may have been added as part of the upgrade):
Re-grant Access
Re-grant the Honeydew native application all of the access it was originally granted.
Native App Access
Grant native app access to any additional Snowflake roles:
Usage Examples
Note that most calls require choosing the workspace and working branch. The convention in the examples below is to set those as session variables, e.g.:
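For example (the variable names here are assumptions; match them to whatever the sample calls expect):

```sql
SET WORKSPACE = 'my_workspace';  -- replace with your workspace name
SET BRANCH = 'prod';             -- replace with your working branch
```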
Workspace and Branches
List all workspaces and branches:
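A hedged sketch, assuming a SHOW_WORKSPACES procedure in the app's API schema (the actual name may differ):

```sql
-- Hypothetical procedure name; replace with the documented call
CALL SEMANTIC_LAYER_ENTERPRISE_EDITION.API.SHOW_WORKSPACES();
```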
Create a new branch named branch for a given workspace:
Reload a given workspace and branch:
Reload all existing workspaces:
Reload a given workspace and branch for all users:
Reload all existing workspaces for all users:
Schema
Parameters
List all global parameters in the given workspace and branch:
Entities
List all entities in the given workspace and branch:
List all entity relations in the given workspace and branch:
Fields Metadata
List all fields in the given workspace and branch:
List all fields in the given workspace and branch, for a specific domain:
Fields include both metrics and attributes.
Fields Add/Update/Delete
Change attribute field of entity entity:
Change metric field of entity entity:
Domains
List all domains in the given workspace and branch:
Queries
Get data from a semantic query
The following stored procedure allows you to run a semantic query and get the resulting data.
It is also possible to generate only the SQL (see the next section).
You might need to grant access for the native application to the relevant data, for example:
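Using Snowflake's standard grant syntax; the database and schema names below are placeholders:

```sql
-- Replace MY_DB and MY_SCHEMA with the objects backing your semantic layer
GRANT USAGE ON DATABASE MY_DB TO APPLICATION SEMANTIC_LAYER_ENTERPRISE_EDITION;
GRANT USAGE ON SCHEMA MY_DB.MY_SCHEMA TO APPLICATION SEMANTIC_LAYER_ENTERPRISE_EDITION;
GRANT SELECT ON ALL TABLES IN SCHEMA MY_DB.MY_SCHEMA TO APPLICATION SEMANTIC_LAYER_ENTERPRISE_EDITION;
```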
If Honeydew parameters are used, their default values will be used. To control parameter values, generate the SQL and set parameter values within it - see the next section.
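A hedged sketch of a semantic data query; the procedure name and argument shapes are assumptions, not the documented signature:

```sql
-- Hypothetical call shape; replace with the documented signature
CALL SEMANTIC_LAYER_ENTERPRISE_EDITION.API.GET_DATA(
  $WORKSPACE, $BRANCH,
  ['customer.region'],                     -- attributes
  ['orders.total_revenue'],                -- metrics
  ['orders.order_date >= ''2024-01-01'''] -- filters
);
```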
Attributes and metrics may either refer to named fields in the semantic layer, or to new ad-hoc calculations based on them.
Metric ad-hoc expressions can do anything a metric can do. In particular, they may use qualifiers such as FILTER (WHERE ...) and GROUP BY (...) to create ad-hoc filtered and/or partial metrics.
All the filters apply. To allow better performance optimizations, it is recommended to pass multiple filters that all apply rather than a single one with multiple conditions joined by AND.
When passing filters to the native app, you may use both attributes and metrics for filtering.
- Attributes filter data similar to how WHERE behaves in SQL: only rows that match the expression are returned.
- When a metric is used in a filter (e.g. entity.count > 0), it is grouped by the query attributes before filtering, similar to how HAVING behaves in SQL: only rows whose metric aggregation matches are returned.
Generate SQL for a query
The following function generates SQL for an ad-hoc semantic query:
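A hedged sketch of such a call; the function name and argument shapes are assumptions, not the documented signature:

```sql
-- Hypothetical function name; replace with the documented call
SELECT SEMANTIC_LAYER_ENTERPRISE_EDITION.API.GET_SQL(
  $WORKSPACE, $BRANCH,
  ['customer.region'],      -- attributes
  ['orders.total_revenue'], -- metrics
  []                        -- filters
);
```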
Using parameters with generated SQL
If a query uses Honeydew parameters, they will be generated as session variables in the query and can be set with SET.
This is typically used for automation, with code calling the API setting parameter values.
A sample Snowflake stored procedure that sets parameters can look like:
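A minimal sketch in Snowflake Scripting; the parameter name start_date is a placeholder, and the generated SQL is passed in as text:

```sql
CREATE OR REPLACE PROCEDURE RUN_WITH_PARAMETER(GENERATED_SQL STRING, START_DATE STRING)
RETURNS TABLE()
LANGUAGE SQL
EXECUTE AS CALLER  -- session variables must be set in the caller's session
AS
$$
DECLARE
  res RESULTSET;
BEGIN
  -- Set the session variable the generated SQL refers to
  EXECUTE IMMEDIATE 'SET start_date = ''' || START_DATE || '''';
  -- Run the SQL that was generated from the semantic query
  res := (EXECUTE IMMEDIATE :GENERATED_SQL);
  RETURN TABLE(res);
END;
$$;
```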
BI SQL Interface wrapper
It is possible to use the native application as a wrapper around the live SQL interface:
Get data from ad-hoc SQL:
Get compiled SQL query from ad-hoc SQL:
Dynamic Datasets
Metadata
List all dynamic datasets in the given workspace and branch:
Add/Update/Delete
Add or update a dynamic dataset dataset in a given workspace and branch, for a specific domain:
Delete a dynamic dataset dataset from a given workspace and branch:
Get Data or SQL
Get the data for a dynamic dataset dataset:
Get the SQL query for a dynamic dataset dataset:
Deployment
Deploy the dynamic dataset dataset according to its deployment settings:
Plaintext questions to data (using Cortex LLMs)
Ask the semantic layer plaintext questions that are translated to the correct Snowflake query:
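A hedged sketch, assuming an ASK_QUESTION entry point; both the name and the signature are assumptions:

```sql
-- Hypothetical procedure name; replace with the documented call
CALL SEMANTIC_LAYER_ENTERPRISE_EDITION.API.ASK_QUESTION(
  $WORKSPACE, $BRANCH,
  'What was total revenue by region last month?'
);
```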
Returns a JSON object containing the following keys:
- llm_response: A descriptive response from the language model
- llm_response_json: The JSON representation of the semantic query
- perspective: A JSON representation of the attributes, metrics, and filters of the query
- sql: The generated SQL query
- error: An error message if an error was encountered
Only the llama3.1-405b and mistral-large2 models demonstrated adequate performance. Other models may not perform as expected.