Live Connection

Create a data source in Tableau Online, Tableau Server or Tableau Desktop to connect to Honeydew.

Use the JDBC parameters to configure a connection to the SQL interface. After connecting, choose the catalog based on the workspace and branch you would like to use.

A Honeydew domain is presented in the Tableau data model as a single virtual table.

Workspaces are presented as multiple catalogs in Tableau - one for each workspace and branch.

Each branch is a separate data source. Different branch data sources can be used interchangeably with Tableau worksheets.

Metadata Sync

When Metadata Sync is enabled, Honeydew synchronizes the following additional metadata to the Tableau Data Source:

  • Field type (metric or attribute) - every attribute is defined as a Tableau dimension, and every metric as a measure.
  • Field display names. When no display name is set, Honeydew creates an automatic user-friendly name.
  • Field folders. When no folder is set, Honeydew uses the entity name as the folder name.
  • Field hidden property.
  • Field descriptions.
  • Field format and semantic role (through metadata configuration).
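
For illustration, here is a sketch of how these properties might be set on a field in the Honeydew schema. The key names (display_name, folder, hidden, description) are illustrative and may not match the actual schema keys:

attributes:
  - name: c_custbal
    display_name: Customer Balance    # synced as the Tableau field name
    folder: Customer                  # synced as the Tableau field folder
    hidden: false                     # synced as the Tableau hidden property
    description: Current balance of the customer account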

You can also access field metadata from Tableau through the native application API, which lets you read the metadata into a Tableau workbook.

Tableau-specific Metadata

You can add Tableau-specific metadata that specifies the default formatting or the semantic role to an attribute or a metric in the schema:

...
metadata:
  - name: tableau
    metadata:
      - name: default-format
        value: C1033%
      - name: semantic-role
        value: '[Geographical].[Latitude]'

Some examples of default-format values (see the Tableau documentation):

  • C1033%
  • 'n#,##0;-#,##0'
  • *dd/mmm/yy
  • p0.0%

Some examples of semantic-role values:

  • [Geographical].[Latitude]
  • [Geographical].[Longitude]
  • [City].[Name]
  • [State].[Name]
  • [Country].[ISO3166_2]
  • [ZipCode].[Name]

Tableau Catalog

Honeydew can also synchronize metadata (including descriptions and back-links to Honeydew definitions) with Tableau Catalog.

When enabled, this is in addition to the metadata synchronized with the Tableau Data Source.

Tableau Catalog Sync requires the Tableau Data Management feature to be enabled in your Tableau deployment.

Dimensions

Any Honeydew attribute can be used as a Tableau dimension.

Measures

Any Honeydew metric can be used as a Tableau measure.

To use a metric, use the “use metric in Tableau” functionality to generate its expression based on its type. Metrics are defined in Tableau as regular measures - any aggregation will return the actual metric computation.

For example, SUM([Lineitem Median Revenue]) in Tableau would translate into calculating the median revenue.

If you want to use a metric as a dimension, define it as a calculated attribute in Honeydew first.
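
For example, a per-customer count could be modeled as a calculated attribute so that Tableau treats it as a dimension. This is a hypothetical sketch; the key names and expression syntax are illustrative:

attributes:
  - name: customer_order_count
    # A per-customer value; as an attribute it becomes a Tableau dimension, not a measure
    sql: COUNT(orders.o_orderkey)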

Filters

Any Honeydew attribute or metric can be used as a filter.

Filters are automatically pushed down to the source in order to optimize performance.

Cross-entity queries

Any entity in Honeydew can be used in a Tableau Workbook.

Joins are automatically constructed by Honeydew. They are based on the relationships between entities that are used in a particular Tableau Sheet.

There is no need to use Tableau Level of Detail calculations. Honeydew automatically maintains the entity granularity for each metric.

For example, an average balance metric defined in the customer entity will always be computed at the customer level of detail. There is no need for a LOD field such as { FIXED [customer.c_custkey]: AVG([customer.c_custbal]) }.
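
A sketch of how such a metric might look on the customer entity (the key names are illustrative):

entity: customer
metrics:
  - name: average_balance
    # Aggregates at the customer entity's granularity; no Tableau LOD expression is needed
    sql: AVG(customer.c_custbal)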

More complex levels of detail are possible with grouped metrics.

Calculated Fields

Custom fields are supported, including Bins, Groups, custom dimension aggregations and custom field expressions.

If you want to reuse a custom field defined in Tableau, save it in Honeydew as a named attribute or metric.

That way, the code to compute it moves to the shared semantic layer.
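
For example, a Tableau calculated field such as SUM([Revenue]) - SUM([Cost]) could be saved as a named Honeydew metric instead. A hypothetical sketch; the field names and keys are illustrative:

metrics:
  - name: gross_profit
    # Previously a Tableau calculated field, now shared through the semantic layer
    sql: SUM(lineitem.revenue) - SUM(lineitem.cost)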

Live Connection Limitations

  1. Conditional and Top/Bottom filters are not supported. Instead, build them as a calculated attribute in Honeydew and use it as a regular filter in Tableau (see the sketch after this list).
  2. Joins on Honeydew datasets in the Data Source definition are not supported. Instead, add the required entities to the dynamic dataset, and Honeydew will determine the needed joins on the fly for each user query in a Tableau Sheet.
  3. Level of Detail expressions are not supported. There is no need for them to control granularity, as it is handled automatically. For complex use cases, use grouped metrics.
  4. For performance considerations when working with a live connection, see performance acceleration in Honeydew.
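
As referenced in limitation 1, a Top/Bottom filter can be rebuilt as a calculated attribute and then applied as a regular Tableau filter. A hypothetical sketch; the key names and expression syntax are illustrative:

attributes:
  - name: is_top_100_customer
    # Flags the 100 customers with the highest total spend; filter on this attribute in Tableau
    sql: RANK() OVER (ORDER BY customer.total_spend DESC) <= 100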

Extracts

Tableau Extracts can be built on top of any dynamic dataset in Honeydew.

They can be done in two modes:

  1. On a TABLE or a VIEW managed by Honeydew (requires setting up Snowflake delivery)
  2. By using the Snowflake Native Application

With a VIEW or a TABLE

Dynamic datasets can be set up for deployment to Snowflake.

The resulting VIEW or TABLE can be used for a Tableau Extract.

With Snowflake Native Application

Create a Snowflake connection, and in the initial SQL add the following code to read from a dynamic dataset:

-- Retrieve the dynamic dataset from the given workspace and branch
CALL SEMANTIC_LAYER_ENTERPRISE_EDITION.API.SELECT_FROM_DYNAMIC_DATASET('workspace', 'branch', 'dataset');
-- Store the result in a temporary table that the extract will read from
CREATE TEMPORARY TABLE dataset AS SELECT * FROM table(result_scan(last_query_id()));

Or to dynamically create a semantic query:

CALL SEMANTIC_LAYER_ENTERPRISE_EDITION.API.SELECT_FROM_FIELDS(
        -- workspace / branch
        'workspace', 'branch',
        -- domain (can be NULL)
        'domain',
        -- dimensions - can be ad-hoc expressions
        ['entity.attr', ...],
        -- metrics - can be named or ad-hoc expressions
        ['entity.metric', 'SUM(entity.measure)', ...],
        -- filters - can be named or ad-hoc expressions
        ['entity.filter', 'entity.attr > 0', 'entity.name like ''%cheese%''', ...]
    );
CREATE TEMPORARY TABLE dataset AS SELECT * FROM table(result_scan(last_query_id()));

This assumes a workspace, a working branch in it (usually prod), and a dynamic dataset named dataset.

This calls a stored procedure to retrieve dynamic dataset data, and stores it in a temporary table used to build the extract.

Then create a Custom SQL data source with the query:

SELECT * FROM dataset