Blip is a messaging and chatbot platform that helps businesses create conversational experiences for customer communication. It provides tools for building chatbots, managing messaging channels, and automating customer interactions to improve engagement and support.

Configuring Blip as a Source

In the Sources tab, click on the “Add source” button located on the top right of your screen. Then, select the Blip option from the list of connectors. Click Next and you’ll be prompted to add your access.

1. Add account access

You’ll need the following credentials from your Blip account:
  • Authorization token: the token used to authenticate against the API. Note that it must be an HTTP token, generated using the Connect using HTTP option. For details on generating it, see the Blip documentation.
  • Company identifier (contract ID): The company identifier (also known as contract ID) used to send commands through the API. Its value can be identified as being part of your URL, in the following format: https://{contract_id}.http.msging.net/commands.
  • Start date: Records created or updated after the start date will be extracted from the source. Format: YYYY-MM-DD.
Once you have all the required credentials, add the account access and click Next.
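If you want to sanity-check the credentials before adding them, a request to the commands endpoint can be sketched in Python. This is illustrative only, not part of the connector setup: the `Key` authorization scheme and the command body shape are assumptions to verify against the official Blip HTTP documentation.

```python
def build_blip_command(contract_id: str, token: str, uri: str):
    """Assemble the endpoint, headers, and body for a Blip HTTP command.

    The endpoint follows the URL format shown above; the "Key" scheme and
    the command fields are assumptions to confirm in the Blip docs.
    """
    endpoint = f"https://{contract_id}.http.msging.net/commands"
    headers = {
        "Authorization": f"Key {token}",  # the HTTP token from "Connect using HTTP"
        "Content-Type": "application/json",
    }
    # Minimal command body; the resource URI varies per stream and is illustrative.
    body = {"id": "1", "method": "get", "uri": uri}
    return endpoint, headers, body

endpoint, headers, body = build_blip_command("acme", "abc123", "/tickets")
print(endpoint)
```

Posting `body` as JSON to `endpoint` with those headers (e.g. with `requests.post`) should return a command response if the token and contract ID are valid.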

2. Select streams

Choose which data streams you want to sync - you can select all streams or pick specific ones that matter most to you.
Tip: you can find a stream more quickly by typing its name.
Select the streams and click Next.

3. Configure data streams

Customize how your data will appear in your catalog: select the layer where the data will be placed, a folder to organize it inside that layer, a name for each table (which will contain the fetched data), and the sync type.
  • Layer: choose between the existing layers on your catalog. This is where you will find your new extracted tables as the extraction runs successfully.
  • Folder: a folder can be created inside the selected layer to group all tables being created from this new data source.
  • Table name: we suggest a name, but feel free to customize it. You have the option to add a prefix to all tables at once and make this process faster!
  • Sync Type: you can choose between INCREMENTAL and FULL_TABLE.
    • Incremental: each extraction fetches only data created or updated since the last run and appends it - useful if, for example, you want to keep every record ever fetched, even ones later removed at the source.
    • Full table: each extraction fetches the current state of the data and replaces the table - useful if, for example, you don’t want deleted records lingering in your catalog.
Once you are done configuring, click Next.
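The two sync types can be sketched as follows. This is a toy Python illustration of the semantics, under the assumption (stated in the Start date credential above) that incremental extraction tracks a created/updated cursor; the function and variable names are hypothetical.

```python
from datetime import date

# Toy source: records keyed by id, each with an "updated" date.
SOURCE = [
    {"id": 1, "updated": date(2024, 1, 1)},
    {"id": 2, "updated": date(2024, 2, 1)},
    {"id": 3, "updated": date(2024, 3, 1)},
]

def full_table_sync():
    """FULL_TABLE: replace the destination with the current source state."""
    return list(SOURCE)

def incremental_sync(cursor: date):
    """INCREMENTAL: fetch only records created/updated after the cursor,
    then advance the cursor (mirroring the Start date behaviour)."""
    new_rows = [r for r in SOURCE if r["updated"] > cursor]
    new_cursor = max((r["updated"] for r in new_rows), default=cursor)
    return new_rows, new_cursor

rows, cursor = incremental_sync(date(2024, 1, 15))
print(len(rows))
```

A record deleted at the source stays in an incremental table (it was fetched once and never overwritten) but disappears from a full-table sync on the next run.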

4. Configure data source

Describe your data source for easy identification within your organization, not exceeding 140 characters. To define your Trigger, consider how often you want data to be extracted from this source. This decision usually depends on how frequently you need the new table data updated (every day, once a week, or only at specific times). Optionally, you can define some additional settings:
  • Configure Delta Log Retention and determine for how long we should store old states of this table as it gets updated. Read more about this resource here.
  • Determine when to execute an Additional Full Sync. This will complement the incremental data extractions, ensuring that your data is completely synchronized with your source every once in a while.
Once you are ready, click Next to finalize the setup.

5. Check your new source

You can view your new source on the Sources page. If needed, trigger an extraction manually by clicking the arrow button. Your data will appear in your Catalog after at least one successful source run.

Streams and Fields

Below you’ll find all available data streams from Blip and their corresponding fields:
Stream for tracking daily message activity metrics. Key fields:

Field | Type | Description
id | String | Unique identifier for the report
interval_start | String | Start time of the reporting interval
interval_end | String | End time of the reporting interval
count | Integer | Number of active messages in the interval

Stream for real-time agent performance metrics. Key fields:

Field | Type | Description
sync_date | String | When the metrics were collected
identity | String | Agent identifier
status | String | Current agent status
is_enabled | Boolean | Whether the agent is enabled
opened_tickets | Integer | Number of open tickets
agent_name | String | Name of the agent
break_duration_in_seconds | Integer | Duration of break time
current_status_date_time | String | When the current status was set
closed_tickets | Integer | Number of closed tickets
average_attendance_time | String | Average time spent on tickets
average_response_time | String | Average time to respond
tickets_count | Integer | Total number of tickets handled

Stream for daily agent performance metrics. Key fields:

Field | Type | Description
id | String | Unique identifier for the report
sync_date | String | Date of the report
identity | String | Agent identifier
status | String | Agent status
is_enabled | Boolean | Whether the agent is enabled
opened_tickets | Integer | Number of open tickets
agent_name | String | Name of the agent
break_duration_in_seconds | Integer | Duration of break time
current_status_date_time | String | When the current status was set
closed_tickets | Integer | Number of closed tickets
average_attendance_time | String | Average time spent on tickets
average_response_time | String | Average time to respond
average_first_response_time | String | Average time for first response
average_wait_time | String | Average wait time for customers
tickets_count | Integer | Total number of tickets handled

Stream for overall ticket performance metrics. Key fields:

Field | Type | Description
sync_date | String | When the metrics were collected
max_queue_time | String | Maximum time in queue
max_first_response_time | String | Maximum time for first response
max_without_first_response_time | String | Maximum time without first response
avg_queue_time | String | Average time in queue
avg_first_response_time | String | Average time for first response
avg_wait_time | String | Average wait time
avg_response_time | String | Average response time
avg_attendance_time | String | Average attendance time
tickets_per_attendant | Integer | Number of tickets per agent

Stream for detailed ticket information. Key fields:

Field | Type | Description
id | String | Unique identifier for the ticket
sequential_id | Integer | Sequential ticket number
owner_identity | String | Identity of the ticket owner
customer_identity | String | Identity of the customer
customer_domain | String | Domain of the customer
agent_identity | String | Identity of the assigned agent
provider | String | Service provider
status | String | Current ticket status
storage_date | String | When the ticket was stored
open_date | String | When the ticket was opened
close_date | String | When the ticket was closed
status_date | String | When the status was last updated
rating | Integer | Customer rating
team | String | Assigned team
unread_messages | Integer | Number of unread messages
closed | Boolean | Whether the ticket is closed
closed_by | String | Who closed the ticket
tags | String | Tags associated with the ticket
first_response_date | String | When the first response was sent
priority | Integer | Ticket priority level
is_automatic_distribution | Boolean | Whether automatically distributed
distribution_type | String | Type of distribution

Stream for daily ticket statistics. Key fields:

Field | Type | Description
id | String | Unique identifier for the report
date | String | Date of the report
waiting | Integer | Number of tickets waiting
open | Integer | Number of open tickets
closed | Integer | Number of closed tickets
closed_attendant | Integer | Tickets closed by attendants
closed_client | Integer | Tickets closed by clients
transferred | Integer | Number of transferred tickets
missed | Integer | Number of missed tickets
in_attendance | Integer | Number of tickets in attendance
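Note that the duration fields above (average_response_time, avg_queue_time, and similar) are HH:MM:SS strings rather than numbers. The SQL queries below convert them to seconds with SPLIT_PART; the same conversion can be sketched in Python (an illustrative helper, not part of the connector):

```python
def duration_to_seconds(value: str) -> float:
    """Convert an 'HH:MM:SS' duration string to seconds.

    Mirrors the SPLIT_PART arithmetic used in the SQL examples below;
    the seconds part is parsed as float in case of fractional seconds.
    """
    hours, minutes, seconds = value.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

print(duration_to_seconds("01:02:03"))
```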

Use Cases for Data Analysis

Here are some valuable business intelligence use cases when consolidating Blip data, along with ready-to-use SQL queries that you can run on Explorer.

1. Agent Performance Analysis

Track agent productivity and response times. Business Value:
  • Monitor individual agent performance
  • Identify training needs
  • Optimize workload distribution
  • Improve customer response times

SQL code

WITH
	agent_daily_stats AS (
		SELECT
			"identity",
			DATE_TRUNC ('day', "sync_date") AS "report_date",
			"opened_tickets",
			"closed_tickets",
			"tickets_count",
			CAST(
				SPLIT_PART ("average_response_time", ':', 1) AS DOUBLE
			) * 3600 + CAST(
				SPLIT_PART ("average_response_time", ':', 2) AS DOUBLE
			) * 60 + CAST(
				SPLIT_PART ("average_response_time", ':', 3) AS DOUBLE
			) AS "avg_response_seconds",
			CAST(
				SPLIT_PART ("average_first_response_time", ':', 1) AS DOUBLE
			) * 3600 + CAST(
				SPLIT_PART ("average_first_response_time", ':', 2) AS DOUBLE
			) * 60 + CAST(
				SPLIT_PART ("average_first_response_time", ':', 3) AS DOUBLE
			) AS "avg_first_response_seconds",
			CAST(
				SPLIT_PART ("average_attendance_time", ':', 1) AS DOUBLE
			) * 3600 + CAST(
				SPLIT_PART ("average_attendance_time", ':', 2) AS DOUBLE
			) * 60 + CAST(
				SPLIT_PART ("average_attendance_time", ':', 3) AS DOUBLE
			) AS "avg_attendance_seconds",
			"break_duration_in_seconds"
		FROM
			"nekt_raw"."blip_agents_daily_report"
		WHERE
			"sync_date" >= CURRENT_DATE - INTERVAL '30' DAY
	),
	agent_summary AS (
		SELECT
			"identity",
			COUNT(DISTINCT "report_date") AS "active_days",
			SUM("closed_tickets") AS "total_closed_tickets",
			SUM("tickets_count") AS "total_tickets_handled",
			AVG("avg_response_seconds") AS "avg_response_time_seconds",
			AVG("avg_first_response_seconds") AS "avg_first_response_time_seconds",
			AVG("avg_attendance_seconds") AS "avg_attendance_time_seconds",
			SUM("break_duration_in_seconds") / 3600.0 AS "total_break_hours"
		FROM
			agent_daily_stats
		GROUP BY
			"identity"
	)
SELECT
	"identity",
	"active_days",
	"total_closed_tickets",
	"total_tickets_handled",
	ROUND(
		CAST("total_closed_tickets" AS DOUBLE) / NULLIF("total_tickets_handled", 0) * 100,
		2
	) AS "resolution_rate",
	ROUND(
		CAST("total_tickets_handled" AS DOUBLE) / NULLIF("active_days", 0),
		1
	) AS "avg_daily_tickets",
	ROUND("avg_response_time_seconds" / 60.0, 2) AS "avg_response_time_minutes",
	ROUND("avg_first_response_time_seconds" / 60.0, 2) AS "avg_first_response_time_minutes",
	ROUND("avg_attendance_time_seconds" / 60.0, 2) AS "avg_attendance_time_minutes",
	ROUND("total_break_hours", 2) AS "total_break_hours"
FROM
	agent_summary
ORDER BY
	"total_tickets_handled" DESC

2. Ticket Resolution Analysis

Analyze ticket resolution patterns and identify bottlenecks. Business Value:
  • Improve ticket resolution efficiency
  • Reduce customer wait times
  • Identify common issues
  • Optimize team allocation

SQL code

WITH daily_metrics AS (
  SELECT
    DATE_TRUNC('day', date) AS report_date,
    waiting,
    open,
    closed,
    closed_attendant,
    closed_client,
    transferred,
    missed,
    in_attendance,
    closed + waiting + open + in_attendance AS total_tickets,
    ROUND(
      CAST(closed AS DOUBLE) / NULLIF(closed + waiting + open + in_attendance, 0) * 100,
      2
    ) AS resolution_rate
  FROM nekt_raw.blip_tickets_daily_report
  WHERE date >= CURRENT_DATE - INTERVAL '30' DAY
),
ticket_timing AS (
  SELECT
    id,
    DATE_TRUNC('day', open_date) AS created_date,
    status,
    priority,
    team,
    CASE
      WHEN close_date IS NOT NULL 
      THEN DATE_DIFF('minute', open_date, close_date)
    END AS resolution_time_minutes
  FROM nekt_raw.blip_tickets
  WHERE open_date >= CURRENT_DATE - INTERVAL '30' DAY
)
SELECT
  dm.report_date,
  dm.total_tickets,
  dm.waiting AS tickets_waiting,
  dm.open AS tickets_open,
  dm.closed AS tickets_closed,
  dm.transferred AS tickets_transferred,
  dm.missed AS tickets_missed,
  dm.resolution_rate AS daily_resolution_rate,
  ROUND(AVG(tt.resolution_time_minutes), 2) AS avg_resolution_time_minutes,
  COUNT(DISTINCT CASE WHEN tt.priority >= 2 THEN tt.id END) AS high_priority_tickets,
  COUNT(DISTINCT CASE WHEN tt.resolution_time_minutes > 120 THEN tt.id END) AS long_resolution_tickets
FROM daily_metrics dm
LEFT JOIN ticket_timing tt ON dm.report_date = tt.created_date
GROUP BY
  dm.report_date,
  dm.total_tickets,
  dm.waiting,
  dm.open,
  dm.closed,
  dm.transferred,
  dm.missed,
  dm.resolution_rate
ORDER BY dm.report_date DESC

Implementation Notes

Data Quality Considerations

  • Monitor agent status changes for accurate reporting
  • Validate ticket resolution times for outliers
  • Ensure consistent rating data collection
  • Track message delivery status
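For the outlier check on resolution times above, a simple z-score filter is one option. The helper below is an illustrative Python sketch: the function name and the two-standard-deviation threshold are arbitrary choices, not part of the connector.

```python
from statistics import mean, stdev

def flag_resolution_outliers(minutes, z=2.0):
    """Return resolution times more than `z` sample standard deviations
    above the mean; too few or constant values yield no outliers."""
    if len(minutes) < 2:
        return []
    mu, sigma = mean(minutes), stdev(minutes)
    if sigma == 0:
        return []
    return [m for m in minutes if (m - mu) / sigma > z]

print(flag_resolution_outliers([10, 12, 11, 9, 13, 500]))
```

Flagged values can then be inspected manually or excluded from averages before reporting.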

Automation Opportunities

  • Schedule daily agent performance reports
  • Set up alerts for long wait times
  • Automate customer satisfaction reporting
  • Generate team workload distribution reports
These SQL transformations provide a foundation for comprehensive Blip analytics. Customize the date ranges, filters, and metrics based on your specific business requirements and reporting needs.

Skills for agents

Download Blip skills file

Blip connector documentation as plain markdown, for use in AI agent contexts.