Did you know you can send Snowflake Cortex AI insight summaries (and table/view reports) to Slack & Teams, directly from Snowflake?

Imagine topline analytics delivered right to your team’s workspace, encouraging them to log in to Snowflake or your BI tools — no files, no hassle.

I’ve made it easy: one function works for both Slack and Teams. If you’re an admin, just copy-paste the code, and you’re set. I’ve done all the heavy lifting 😎.

Why I’ve built this

  1. Serve users where they work: in Slack and Teams. This is the only guide you'll need for sending AI summaries and table data to both.
  2. We already use Astrato's Scheduled PDF & Excel Reporting internally, which is excellent for sending pixel-perfect, rich reports. This function complements it with short, snappy, native messages in Slack or Teams: far less detailed, but impactful nevertheless.

❄️ The code — all plug and play

To allow Snowflake network access, you need the ACCOUNTADMIN role.

Network Access: Enable access for Slack & Teams.



-- 1a. Setup network rule to access external source
CREATE OR REPLACE NETWORK RULE MESSENGER_WEBHOOKS_RULE
MODE = EGRESS
TYPE = HOST_PORT
VALUE_LIST = ('hooks.slack.com','astratoviz.webhook.office.com');

-- 1b. Set up the access integration, using the rule above
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION MESSENGER_WEBHOOKS
ALLOWED_NETWORK_RULES = (MESSENGER_WEBHOOKS_RULE)
ENABLED=TRUE;

You can simply copy & paste, and start running. Just be sure you have access to Snowflake Cortex LLM functions.

How it works:

  1. This stored procedure sends formatted messages to a specified Slack or Microsoft Teams webhook, using either table data or an AI-generated summary.
  2. Input Parameters:
  • table_name: The name of the Snowflake table or view to query.
  • webhook_url: The Slack or Teams webhook URL to send the message.
  • format: The message format, either AISUMMARY (to generate an AI-based summary) or TABLE (to display table data).
  • custom_prompt (optional): A custom prompt for the AI-generated summary. Pass an empty string to use the default prompt.

3. Detects Webhook Type & Runs the Query:

  • Checks whether the provided URL is a Slack webhook (hooks.slack.com); otherwise it assumes Microsoft Teams.
  • Builds and executes a dynamic SQL query: in TABLE mode, the first 4 columns and up to 23 rows; in AISUMMARY mode, up to 1,000 rows.
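The two steps above boil down to a substring check and a dynamic SELECT. A minimal sketch in plain Python (function names are illustrative, not part of the procedure):

```python
def detect_messenger(webhook_url):
    """Classify a webhook URL as Slack or Teams.

    Anything that is not a Slack incoming-webhook host is assumed
    to be a Microsoft Teams connector URL, mirroring the procedure.
    """
    return "slack" if "hooks.slack.com" in webhook_url else "teams"


def build_select(table_name, column_names, row_limit):
    """Assemble the dynamic SELECT the procedure runs (sketch)."""
    cols = ", ".join(column_names)
    return (f"SELECT {cols} FROM {table_name} "
            f"ORDER BY {column_names[0]} LIMIT {row_limit}")


# e.g. build_select("CARS", ["MAKE", "MODEL"], 23)
# → "SELECT MAKE, MODEL FROM CARS ORDER BY MAKE LIMIT 23"
```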

4.a AISUMMARY Option:

  • Converts the result set into a string.
  • Sends the data string along with a custom or default prompt to Snowflake Cortex LLM to generate an AI summary.
  • Formats the summary appropriately for Slack or Teams using mrkdwn for Slack or Adaptive Cards for Teams.
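Converting Markdown emphasis to Slack mrkdwn needs care: naively chaining `.replace('**', '*').replace('*', '_')` would turn the freshly created bold markers into italics. A sentinel character avoids the collision (a sketch; the function name is illustrative):

```python
def to_mrkdwn(text):
    """Convert Markdown bold/italic to Slack mrkdwn.

    Markdown: **bold**, *italic*  ->  Slack mrkdwn: *bold*, _italic_
    A sentinel keeps the two substitutions from interfering.
    """
    sentinel = "\x00"
    text = text.replace("**", sentinel)   # protect bold markers
    text = text.replace("*", "_")         # remaining single * = italics
    return text.replace(sentinel, "*")    # restore bold as Slack *


# e.g. to_mrkdwn("**bold** and *ital*") → "*bold* and _ital_"
```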

4.b TABLE Option:

  • Formats the table data as blocks for Slack or Adaptive Cards for Teams.
  • Iterates through each row of data to construct a well-structured table or card.
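The TABLE path for Slack can be sketched as a small builder: a bold header of field names, then a "section" with up to four fields per data row, separated by dividers (a simplified sketch of the approach; the helper name is illustrative):

```python
def rows_to_blocks(column_names, rows):
    """Build Slack Block Kit blocks from tabular rows (sketch).

    Mirrors the procedure's TABLE path: bold column headers,
    then one section block of up to four fields per data row,
    each followed by a divider.
    """
    blocks = [
        {"type": "section",
         "fields": [{"type": "mrkdwn", "text": f"*{c}*"} for c in column_names[:4]]},
        {"type": "divider"},
    ]
    for row in rows:
        blocks.append({
            "type": "section",
            "fields": [
                {"type": "mrkdwn", "text": str(v) if v is not None else "N/A"}
                for v in row[:4]
            ],
        })
        blocks.append({"type": "divider"})
    return blocks
```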

5. Sends Message:

  • Sends the generated content (either summary or table) to the specified webhook (Slack or Teams) via an HTTP POST request using the requests package.

6. Handles Success or Failure:

  • Returns a message indicating whether the message was sent successfully or if there was an error, including the HTTP status code and response text.

ℹ️ Data Limits in Table mode:

A limit of 23 rows and 4 columns is set. This comes from Slack's per-message block limits (and besides, why clog up your channels?). You can raise it for Teams, although I'm not sure of Teams' limit.
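If you do need more than 23 rows, one option is to split the result set into batches and post several messages, each safely under Slack's per-message block cap. A sketch (the helper name is illustrative):

```python
def chunk_rows(rows, size=23):
    """Split a result set into webhook-sized batches (sketch).

    Each batch can be posted as its own message, keeping every
    message within Slack's block limit at the procedure's default
    of 23 rows per table message.
    """
    return [rows[i:i + size] for i in range(0, len(rows), size)]


# e.g. 50 rows → batches of 23, 23 and 4
```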

Create procedure


CREATE OR REPLACE PROCEDURE format_and_send_message(
   table_name STRING,
   webhook_url STRING,
   format STRING,  -- "AISUMMARY" or "TABLE"
   custom_prompt STRING   -- New optional parameter for prompt customization
)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.8'
PACKAGES = ('snowflake-snowpark-python', 'requests')
HANDLER = 'format_and_send_message'
EXTERNAL_ACCESS_INTEGRATIONS = (MESSENGER_WEBHOOKS)
AS
$$
import json
import requests
import snowflake.snowpark as snowpark

def format_and_send_message(session, table_name, webhook_url, format, custom_prompt=None):
   # Force table_name to uppercase
   table_name = table_name.upper()

   # Detect if it's a Slack webhook by checking for "https://hooks.slack.com"
   is_slack = 'hooks.slack.com' in webhook_url

   # Query to get the column names dynamically
   get_columns_query = f"""
       SELECT COLUMN_NAME
       FROM INFORMATION_SCHEMA.COLUMNS
       WHERE TABLE_NAME = '{table_name}'
        AND TABLE_SCHEMA IN (SELECT TABLE_SCHEMA FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = '{table_name}'
                             UNION
                             SELECT TABLE_SCHEMA FROM INFORMATION_SCHEMA.VIEWS WHERE TABLE_NAME = '{table_name}')
       ORDER BY ORDINAL_POSITION
   """
   
   columns_result = session.sql(get_columns_query).collect()

   # Initialize an array to hold the column names
   column_names = [row['COLUMN_NAME'] for row in columns_result]

   # If format is 'TABLE', require and keep only the first 4 columns
   if format.upper() == 'TABLE':
       if len(column_names) < 4:
           raise ValueError("TABLE mode requires a table with at least 4 columns.")
       column_names = column_names[:4]

   # Define the row limit based on the format
   row_limit = 23 if format.upper() == 'TABLE' else 1000

   # Build the SQL query dynamically based on the column names and row limit
   sql_command = f"""
   SELECT
       {', '.join(column_names)}
   FROM {table_name}
   ORDER BY {column_names[0]}
   LIMIT {row_limit}
   """

   # Execute the dynamic SQL query and fetch data
   result_set = session.sql(sql_command).collect()

   # Option 1: Generate an AI Summary (if format is AISUMMARY)
   if format.upper() == 'AISUMMARY':
       # Convert the result set into a string for AI summary
       data_string = '\n'.join(
           ', '.join(str(row[col]) for col in column_names)
           for row in result_set
       )

       # Use the custom prompt if provided, otherwise use the default prompt
       if custom_prompt:
           ai_prompt = custom_prompt + ': ' + data_string
       else:
           ai_prompt = 'Summarize this data in an analytics/insights format: ' + data_string

       # Escape single quotes so the prompt embeds safely in the SQL string
       safe_prompt = ai_prompt.replace("'", "''")

       # Send the data to Snowflake Cortex LLM for summarization
       summary_query = f"""
       SELECT SNOWFLAKE.CORTEX.COMPLETE(
           'mixtral-8x7b',
           '{safe_prompt}'
       )
       """
       summary_result = session.sql(summary_query).collect()

       # Get the AI summary result
       ai_summary = summary_result[0][0]

       # Adjust mrkdwn formatting for Slack: protect bold markers first,
       # so the single-asterisk replace doesn't turn fresh bold into italics
       if is_slack:
           formatted_summary = ai_summary.replace('**', '\x00').replace('*', '_').replace('\x00', '*')
           message_payload = {
               "blocks": [
                   {
                       "type": "section",
                       "text": {
                           "type": "mrkdwn",
                           "text": f"*AI Summary for {table_name}:*\n{formatted_summary}"
                       }
                   }
               ]
           }
       else:
           # For Teams, just use plain markdown or rich text
           message_payload = {
               "type": "message",
               "attachments": [
                   {
                       "contentType": "application/vnd.microsoft.card.adaptive",
                       "content": {
                           "type": "AdaptiveCard",
                           "version": "1.2",
                           "body": [
                               {"type": "TextBlock", "text": f"AI Summary for {table_name}", "weight": "bolder", "size": "medium"},
                               {"type": "TextBlock", "text": ai_summary, "wrap": True}
                           ]
                       }
                   }
               ]
           }
   
   # Option 2: Output a table (if format is TABLE)
   else:
       if is_slack:
           blocks = []

           # Add a header block (field names)
           blocks.append({
               "type": "header",
               "text": {
                   "type": "plain_text",
                   "text": f"📊 Data Report from {table_name} [Built by Piers Batchelor from Astrato Analytics]",
                   "emoji": True
               }
           })

           # Add a field section for dynamically retrieved column headers
           blocks.append({
               "type": "section",
               "fields": [
                   { "type": "mrkdwn", "text": f"*{column_names[0]}*" },
                   { "type": "mrkdwn", "text": f"*{column_names[1]}*" },
                   { "type": "mrkdwn", "text": f"*{column_names[2]}*" },
                   { "type": "mrkdwn", "text": f"*{column_names[3]}*" }
               ]
           })

           # Add a divider to separate headers from data
           blocks.append({ "type": "divider" })

           # Loop through the result set and format the data into rows (sections)
           for row in result_set:
               row_values = [str(row[col]) if row[col] is not None else "N/A" for col in column_names]

               blocks.append({
                   "type": "section",
                   "fields": [
                       { "type": "mrkdwn", "text": row_values[0] },
                       { "type": "mrkdwn", "text": row_values[1] },
                       { "type": "mrkdwn", "text": row_values[2] },
                       { "type": "mrkdwn", "text": row_values[3] }
                   ]
               })

               # Add a divider between rows
               blocks.append({ "type": "divider" })

           # Convert the blocks array into a JSON string
           message_payload = {
               "blocks": blocks
           }

       else:
           # Adaptive Card message with columns for Teams
           columns_headers = [
               {
                   "type": "Column",
                   "items": [{"type": "TextBlock", "text": f"{column_names[0]}", "weight": "bolder"}],
                   "width": "auto"
               },
               {
                   "type": "Column",
                   "items": [{"type": "TextBlock", "text": f"{column_names[1]}", "weight": "bolder"}],
                   "width": "auto"
               },
               {
                   "type": "Column",
                   "items": [{"type": "TextBlock", "text": f"{column_names[2]}", "weight": "bolder"}],
                   "width": "auto"
               },
               {
                   "type": "Column",
                   "items": [{"type": "TextBlock", "text": f"{column_names[3]}", "weight": "bolder"}],
                   "width": "auto"
               }
           ]

           rows = []
           # Add each row of data into the column set
           for row in result_set:
               row_values = [str(row[col]) if row[col] is not None else "N/A" for col in column_names]
               rows.append({
                   "type": "ColumnSet",
                   "columns": [
                       {"type": "Column", "items": [{"type": "TextBlock", "text": row_values[0]}], "width": "auto"},
                       {"type": "Column", "items": [{"type": "TextBlock", "text": row_values[1]}], "width": "auto"},
                       {"type": "Column", "items": [{"type": "TextBlock", "text": row_values[2]}], "width": "auto"},
                       {"type": "Column", "items": [{"type": "TextBlock", "text": row_values[3]}], "width": "auto"}
                   ]
               })

           # Adaptive Card message with columns
           message_payload = {
               "type": "message",
               "attachments": [
                   {
                       "contentType": "application/vnd.microsoft.card.adaptive",
                       "content": {
                           "type": "AdaptiveCard",
                           "version": "1.2",
                           "body": [
                               {"type": "TextBlock", "text": f"📊 Data Report from {table_name}", "weight": "bolder", "size": "medium"},
                               {"type": "ColumnSet", "columns": columns_headers}
                           ] + rows
                       }
                   }
               ]
           }

   # Send the message using the webhook URL
   response = requests.post(
       webhook_url,
       headers={'Content-Type': 'application/json'},
       data=json.dumps(message_payload)
   )
   
   # Check if the request was successful
   if response.status_code == 200:
       return "Message sent successfully."
   else:
       return f"Failed to send message. Status Code: {response.status_code}, Response: {response.text}"

$$;

Example Call

CALL format_and_send_message(
    /* Table/View name:       */ 'CARS',
    /* Webhook (Slack/Teams): */ 'https://astratoviz.webhook.office.com/webhookb2/036f7721-21a0-43db-acb2-bb59ae9d3ddb@5753bd06-fa16-42b2-b8e1-f3c5b849d021/IncomingWebhook/56323407dc284515b613ed4a0534bf09/9368cdc6-6046-4568-b746-0bd106c10f36/V2EVzk22GNH9wJ0dTdg-7NplmzM0IUNHDVeB6eC--90bE1',
    /* [AISUMMARY/TABLE]:     */ 'AISUMMARY',
    /* Custom Prompt:         */ '''This data contains a summary of... tell us in total if volume has ⬆️increased or ⬇️decreased'''
);

⚠️ Troubleshooting

If Cortex LLM says “model not found,” allow cross-region use.

ALTER ACCOUNT SET CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION';

Bonus code: Set a task

  • WAREHOUSE = my_warehouse: This is the warehouse that will run the task. Replace my_warehouse with the name of your Snowflake warehouse.
  • SCHEDULE = 'USING CRON 0 9 * * 1 UTC': This CRON expression runs the task every Monday at 9 AM UTC (9 AM UK time in winter, 10 AM during British Summer Time).
  • CALL: The task executes the format_and_send_message() procedure, sending the usage report to the provided Slack webhook.

CREATE OR REPLACE TASK weekly_message_task
  WAREHOUSE = my_warehouse
  SCHEDULE = 'USING CRON 0 9 * * 1 UTC'  -- Every Monday at 9 AM UTC
  COMMENT = 'Send usage report message to Slack every week.'
AS
  CALL format_and_send_message('vw_usage', 'https://hooks.slack.com/services/xxx/yyy/zzz', 'AISUMMARY', '');

ALTER TASK weekly_message_task RESUME;

Appendices: setting up Slack & Teams

You may not be a Slack or Teams admin; if so, share this with your admins to help you out. If you are an admin, all the info below will get you going!

Setting up a Slack App & Webhook

See Slack's guide to creating an app and incoming webhook at api.slack.com.

Setting up Teams Webhook connectors

To set up a webhook to Teams, you’ll need to follow these general steps:

Create a webhook in Teams:

  • Go to the channel where you want to receive notifications.
  • Click on the three dots (ellipsis) next to the channel name.
  • Select “Connectors.”
  • Search for “Incoming webhook” and add it.
  • Give the webhook a name and configure its settings (e.g., icon, color).
  • Click “Create.”
  • Copy the webhook URL.
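Once you have the URL, you can smoke-test it by POSTing a minimal Adaptive Card payload like the one below (a sketch; the function name is illustrative). Send it with Content-Type `application/json`, e.g. via `requests.post`:

```python
import json


def teams_test_payload(text):
    """Build a minimal Adaptive Card payload for a Teams Incoming Webhook.

    POSTing this JSON to the copied webhook URL should render
    a single-line test card in the channel.
    """
    return json.dumps({
        "type": "message",
        "attachments": [{
            "contentType": "application/vnd.microsoft.card.adaptive",
            "content": {
                "type": "AdaptiveCard",
                "version": "1.2",
                "body": [{"type": "TextBlock", "text": text, "wrap": True}],
            },
        }],
    })
```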

Piers Batchelor
Education
Apr 4, 2025
