Google Search Console is a free web service offered by Google that helps you monitor, maintain, and troubleshoot your site’s presence in Google Search results. It provides insights into how your website performs in search results, including data on clicks, impressions, click-through rates, and search queries.

Configuring Google Search Console as a Source

In the Sources tab, click on the “Add source” button located on the top right of your screen. Then, select the Google Search Console option from the list of connectors. Click Next and you’ll be prompted to add your access.

1. Add account access

You’ll need to authorize Nekt to access your Google Search Console data. Click on the Google Authorization button and log in with your Google account. Grant the necessary permissions for the property you want to extract data from. The following configurations are available:
  • Site domain: The property domain configured in Google Search Console (for example, example.com).
  • Start Date: The earliest date from which records will be synced.
  • Lookback Window: (Default: 3 days) Number of days to backfill on each run to capture retroactive corrections and updates from Google Search Console.
The authenticated Google account must have access to the configured site domain.
Once you’re done, click Next.
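As a rough illustration of how the Lookback Window interacts with the Start Date, the sketch below (a hypothetical helper, not Nekt's actual code) computes the first date requested on each run:

```python
from datetime import date, timedelta

def extraction_start(last_synced: date, start_date: date, lookback_days: int = 3) -> date:
    """First date to request on the next run: re-fetch the last
    `lookback_days` days so retroactive corrections from Google
    Search Console are captured, but never reach back before the
    configured Start Date."""
    window_start = last_synced - timedelta(days=lookback_days)
    return max(window_start, start_date)

# With the default 3-day lookback, a run after syncing up to June 10
# re-requests data from June 7 onward.
print(extraction_start(date(2024, 6, 10), date(2024, 1, 1)))  # → 2024-06-07
```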

2. Select streams

Choose which data streams you want to sync. For faster extractions, select only the streams that are relevant to your analysis. You can select all streams or only specific ones.
Tip: You can quickly find streams by typing their names.
Select the streams and click Next.

3. Configure data streams

Customize how you want your data to appear in your catalog. Select the layer where the data will be placed, a folder to organize it inside that layer, a name for each table (which will contain the fetched data), and the type of sync.
  • Layer: choose between the existing layers on your catalog. This is where you will find your new extracted tables as the extraction runs successfully.
  • Folder: a folder can be created inside the selected layer to group all tables being created from this new data source.
  • Table name: we suggest a name, but feel free to customize it. You have the option to add a prefix to all tables at once and make this process faster.
  • Sync Type: you can choose between INCREMENTAL and FULL_TABLE.
    • Incremental: each extraction fetches only the data added since the previous run.
    • Full table: each extraction fetches the complete current state of the data.
Once you are done configuring, click Next.
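The difference between the two sync types can be sketched as follows. This is an illustrative toy model only (Nekt's actual engine also reconciles updates within the lookback window), for a stream keyed by date:

```python
def apply_sync(existing_rows, fetched_rows, sync_type):
    """Toy model contrasting the two sync types for a date-keyed stream."""
    if sync_type == "FULL_TABLE":
        # Replace the table with the current state of the source.
        return list(fetched_rows)
    if sync_type == "INCREMENTAL":
        # Append only records whose key is not already present.
        seen = {row["date"] for row in existing_rows}
        return existing_rows + [r for r in fetched_rows if r["date"] not in seen]
    raise ValueError(f"Unknown sync type: {sync_type}")

existing = [{"date": "2024-06-01", "clicks": 10}]
fetched = [{"date": "2024-06-01", "clicks": 12}, {"date": "2024-06-02", "clicks": 7}]
```

With `INCREMENTAL`, only the 2024-06-02 row is appended; with `FULL_TABLE`, the table is replaced by both fetched rows.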

4. Configure data source

Describe your data source (up to 140 characters) so it is easy to identify within your organization. To define your Trigger, consider how often you want data to be extracted from this source. This usually depends on how frequently you need the table data updated (every day, once a week, or only at specific times). Optionally, you can define some additional settings:
  • Configure Delta Log Retention to determine how long we should store old states of this table as it gets updated. Read more about this resource here.
  • Determine when to execute an Additional Full Sync. This complements the incremental extractions, periodically ensuring that your data is fully synchronized with your source.
Once you are ready, click Next to finalize.

5. Check your new source

You can view your new source on the Sources page. If needed, manually trigger the source extraction by clicking on the arrow button. Once executed, your data will appear in your Catalog.
To see data in the Catalog, you need at least one successful source run.

Streams and Fields

Below you’ll find all available data streams from Google Search Console and their corresponding fields:
Daily Search Console performance aggregated by date.
Key Fields:
  • date - Report date
  • clicks - Number of clicks from Google Search
  • impressions - Number of impressions in Google Search
  • ctr - Click-through rate
  • position - Average search position
  • site_url - Search Console property domain configured in the source
Daily performance segmented by user country.
Key Fields:
  • date - Report date
  • country - Country code from Search Console
  • clicks - Number of clicks from Google Search
  • impressions - Number of impressions in Google Search
  • ctr - Click-through rate
  • position - Average search position
  • site_url - Search Console property domain configured in the source
Daily performance segmented by device type.
Key Fields:
  • date - Report date
  • device - Device type (for example, desktop, mobile, tablet)
  • clicks - Number of clicks from Google Search
  • impressions - Number of impressions in Google Search
  • ctr - Click-through rate
  • position - Average search position
  • site_url - Search Console property domain configured in the source
Daily performance segmented by page URL.
Key Fields:
  • date - Report date
  • page - Canonical page URL returned by Search Console
  • clicks - Number of clicks from Google Search
  • impressions - Number of impressions in Google Search
  • ctr - Click-through rate
  • position - Average search position
  • site_url - Search Console property domain configured in the source
Daily performance segmented by search query.
Key Fields:
  • date - Report date
  • query - Search query term
  • clicks - Number of clicks from Google Search
  • impressions - Number of impressions in Google Search
  • ctr - Click-through rate
  • position - Average search position
  • site_url - Search Console property domain configured in the source
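The five streams above differ only in the dimensions they group by. As a hedged sketch, each stream corresponds to a Search Analytics request body like the one built below (the dimension names follow the public Google Search Console API; the stream labels here are illustrative, not the connector's internal names):

```python
# Dimension sets behind each stream (keys are illustrative labels,
# not the connector's actual stream identifiers).
STREAM_DIMENSIONS = {
    "by_date": ["date"],
    "by_country": ["date", "country"],
    "by_device": ["date", "device"],
    "by_page": ["date", "page"],
    "by_query": ["date", "query"],
}

def build_query(stream: str, start_date: str, end_date: str) -> dict:
    """Request body for the Search Analytics `query` endpoint."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": STREAM_DIMENSIONS[stream],
    }

print(build_query("by_device", "2024-06-01", "2024-06-03"))
```

Each response row carries the requested dimensions plus the clicks, impressions, ctr, and position metrics listed above.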

Implementation Notes

  • All streams use incremental replication by date.
  • Primary keys include stream dimensions plus site_url.
  • The connector validates that the configured domain exists in the authenticated Google Search Console account before running.
  • The connector applies the configured lookback window on every run to backfill recent days.
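To make the primary-key note concrete, a record's key can be modeled as its stream dimensions plus site_url. This is a hypothetical helper matching the note above, not the connector's actual implementation:

```python
def primary_key(record: dict, dimensions: list[str]) -> tuple:
    """Uniquely identify a row by its stream dimensions plus site_url,
    so the same date/device combination can coexist across properties."""
    return tuple(record[d] for d in dimensions) + (record["site_url"],)

row = {"date": "2024-06-01", "device": "mobile",
       "clicks": 42, "site_url": "sc-domain:example.com"}
print(primary_key(row, ["date", "device"]))
# → ('2024-06-01', 'mobile', 'sc-domain:example.com')
```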

Skills for agents

Download Google Search Console skills file

Google Search Console connector documentation as plain markdown, for use in AI agent contexts.