To minimize latency between data changes in the connected tool and your system, and to prevent data drift, we recommend combining data-changed notifications with periodic full fetches. Best-practice implementation:
  • Listen to our data-changed webhook to receive notifications of data changes.
  • Fetch only updated data.
  • Periodically fetch all data to fully align your dataset and correct any potential data drift.
  • Combine both strategies for a setup that is robust and efficient.

Overview

Listening to data-changed webhook

We provide a webhook called data-changed. This is sent to your system whenever data has changed inside Kombo, for example after we finish syncing (full or delta sync). By listening to this webhook, you can receive updates from Kombo efficiently, allowing for the best possible UX. The webhook will look similar to this, with an array of models that have changed in our database since we last sent you that webhook:
{
  "id": "udcUMbfv9YkHqPY3uGiCpWsz",
  "type": "data-changed",
  "data": {
    "integration_id": "sandbox:7C9M5sXAQx2cNMpVHLojrFpL",
    "integration_tool": "sandbox",
    "integration_category": "LMS",
    "changed_models": [
      {
        "name": "lms_course_revisions"
      },
      {
        "name": "lms_courses"
      }
    ]
  }
}
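As a reference for handler code, the payload above can be given a minimal TypeScript shape. This is a sketch based solely on the example; the real payload may carry additional fields:

```typescript
// Minimal shape of the data-changed payload, based on the example above.
interface DataChangedWebhook {
  id: string;
  type: "data-changed";
  data: {
    integration_id: string;
    integration_tool: string;
    integration_category: string;
    changed_models: { name: string }[];
  };
}

// Narrow an incoming request body before acting on it.
function parseDataChanged(body: unknown): DataChangedWebhook | null {
  const b = body as DataChangedWebhook;
  if (b?.type !== "data-changed" || !Array.isArray(b?.data?.changed_models)) {
    return null;
  }
  return b;
}
```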
Simplified approach

To make things simple, you can, whenever you receive a new data-changed webhook, pull the models you’re interested in, independent of the models we’re telling you changed.

Recommended approach

You can also listen to the data models we’re telling you changed and pull data based on those models. For that, please look into the list of models that can appear. Based on this list of changed models, decide which requests you want to send. Keep in mind that models do not map 1:1 to our endpoints. For example, you might be using the /courses endpoint to fetch both courses and their associated skills.

Furthermore, after every successful fetch from us, store the timestamp of when the respective fetch started. Use this timestamp during your next fetch and pass it with your requests to the Kombo API as the updated_after query parameter. Kombo will then only return the entries that changed since that timestamp.

Good to know: The updated_after filter also considers changes in models that are returned by the endpoint as nested values. For example, the /course-progressions endpoint will include every progression that has had changes to related courses or users, even if the progression itself did not change. Read more below.

All models

You should call the endpoints you care about based on the models we tell you have changed. The following table shows endpoints connected to the relevant models that will be part of the data-changed webhook. We recommend the following approach:
  1. Decide which endpoints are relevant for your use case (e.g., if your system is centered around course completions, you primarily use Get course progressions).
  2. Map changed models to those endpoints using the table.
    For instance, if both lms_course_progressions and lms_courses are listed in the webhook, but you primarily care about progressions, you only need to call Get course progressions, since course data is included there too.
This helps avoid unnecessary API calls while still ensuring your data is up to date.
Endpoint                  Models
Get users                 lms_users
Get courses               lms_courses, lms_course_revisions, lms_course_providers, lms_skills, lms_join_revisions_skills
Get course progressions   lms_course_progressions, lms_courses, lms_course_revisions, lms_users
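The mapping in the table above can be sketched in code. The model lists mirror the table; the endpoint paths for users and courses are illustrative assumptions (only the course-progressions path is confirmed by the example code in this guide):

```typescript
// Models that can trigger each endpoint, mirroring the table above.
// Paths other than /lms/course-progressions are assumptions.
const ENDPOINT_MODELS: Record<string, string[]> = {
  "/lms/users": ["lms_users"],
  "/lms/courses": [
    "lms_courses",
    "lms_course_revisions",
    "lms_course_providers",
    "lms_skills",
    "lms_join_revisions_skills",
  ],
  "/lms/course-progressions": [
    "lms_course_progressions",
    "lms_courses",
    "lms_course_revisions",
    "lms_users",
  ],
};

// Given the changed models from a webhook and the endpoints you actually
// use, return only the endpoints worth calling.
function endpointsToFetch(
  changedModels: { name: string }[],
  relevantEndpoints: string[],
): string[] {
  const changed = new Set(changedModels.map((m) => m.name));
  return relevantEndpoints.filter((endpoint) =>
    ENDPOINT_MODELS[endpoint]?.some((model) => changed.has(model)),
  );
}
```

For example, if the webhook lists lms_course_progressions and lms_courses but you only use course progressions, this returns just the progressions endpoint.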

Preventing Data Drift

In addition to fetching data from Kombo’s endpoints in response to receiving the data-changed webhook, we recommend running a periodic full data fetch for the data models you care about. In most cases, a 7-day schedule is ideal. This helps to remedy any drift that may occur in your data, e.g. from accidental manual changes or lost webhooks. To implement this, perform GET requests for the Kombo data models you care about without passing the updated_after query parameter. Then upsert the data returned by Kombo and merge it with your existing data copy.
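As a sketch of the merge step, here is one way to reconcile a full fetch against your local copy, with an in-memory map standing in for your database table (the function and field names are illustrative, not part of the Kombo API):

```typescript
interface SyncedRecord {
  id: string;
  [key: string]: unknown;
}

// Upsert every fetched record into the local copy and report records that
// exist locally but were not returned by the full fetch (potential drift).
function reconcileFullFetch(
  local: Map<string, SyncedRecord>,
  fetched: SyncedRecord[],
): { upserted: number; staleIds: string[] } {
  const fetchedIds = new Set(fetched.map((r) => r.id));
  for (const record of fetched) {
    local.set(record.id, record); // insert new records, overwrite existing ones
  }
  const staleIds = [...local.keys()].filter((id) => !fetchedIds.has(id));
  return { upserted: fetched.length, staleIds };
}
```

How you handle stale records (delete, flag, or re-check) depends on your domain logic.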

Full Example Code

async function handleDataChangedWebhook(body) {
  // Verify webhook sender here

  // Assuming we are using prisma as our ORM
  const integration = await prisma.integration.findUniqueOrThrow({
    where: {
      kombo_integration_id: body.data.integration_id,
    },
    select: {
      id: true, // Need the customer ID for the update
      last_fetched_from_kombo_at: true,
      kombo_integration_id: true, // Also select the integration ID
    },
  })

  // Track when WE start fetching (not when Kombo synced from the LMS)
  const fetchStartDate = new Date() // This time should be in UTC
  const lastFetchStartDate =
    integration.last_fetched_from_kombo_at?.toISOString()

  // Fetch data only when data-changed webhook indicates a change
  const progressionsChanged = body.data.changed_models.some(
    entry => entry.name === 'lms_course_progressions',
  )
  if (progressionsChanged) {
    await fetchCourseProgressions(
      integration.kombo_integration_id,
      lastFetchStartDate,
    )
  }

  await prisma.integration.update({
    where: {
      id: integration.id,
    },
    data: {
      last_fetched_from_kombo_at: fetchStartDate,
    },
  })
}

async function handleCronJob(integration) {
  const fetchStartDate = new Date() // This time should be in UTC

  // Do not pass a starting date here; fetch everything
  // Fetch the models you care about (repeat for users, courses, progressions, ...)
  await fetchCourseProgressions(integration.kombo_integration_id)

  await prisma.integration.update({
    where: {
      id: integration.id,
    },
    data: {
      last_fetched_from_kombo_at: fetchStartDate,
    },
  })
}

async function fetchCourseProgressions(
  integrationId: string,
  updatedAfter?: string,
) {
  let cursor
  do {
    const resp = await axios.get(
      'https://api.kombo.dev/v1/lms/course-progressions',
      {
        headers: {
          Authorization: `Bearer ${KOMBO_API_KEY}`,
          'X-Integration-Id': integrationId, // Use the stored integration ID
        },
        params: {
          cursor: cursor,
          updated_after: updatedAfter,
        },
      },
    )

    cursor = resp.data.data.next

    // Implement your handling logic here
    // Usually, you will upsert the data into your database and build specific
    // domain logic here.
    await handleProgressionData(integrationId, resp.data.data.results)
  } while (cursor)
}

Understanding changed_at vs updated_after Behavior

A common source of confusion is understanding when records are returned by the updated_after filter and how this relates to each record’s changed_at timestamp. Here’s the key distinction:

Record-level changed_at Field

Each record has a changed_at timestamp that only updates when properties directly on that record change. For example:
  • If a course’s title changes, the course’s changed_at updates
  • If a progression’s status changes, the progression’s changed_at updates
  • However: If a course’s title changes, related progressions’ changed_at fields do NOT update

Endpoint Filtering with updated_after

The updated_after parameter works differently: it returns records when either the record itself or its nested data has been updated.

Example: Course Progressions Endpoint

When you call GET /course-progressions with updated_after, you’ll receive progressions if:
  1. Direct progression changes: The progression itself was modified (status, enrolled_at, etc.)
  2. Nested course revision changes: The course revision data was updated (title, description, etc.)
  3. Nested user changes: The user data was updated (name, email, etc.)
This means a progression can be returned even if its own changed_at timestamp hasn’t changed.

Concrete Scenario

Let’s say you call GET /course-progressions at 9:00 AM and get:
{
  "id": "prog123",
  "status": "IN_PROGRESS",
  "changed_at": "2023-10-01T08:00:00Z",
  "course_revision": {
    "id": "revision456",
    "title": "Introduction to Sales",
    "changed_at": "2023-10-01T08:00:00Z"
  }
}
At 10:00 AM, the course revision’s title is changed to “Advanced Sales Techniques” in the LMS. If you then call GET /course-progressions?updated_after=2023-10-01T09:00:00Z, you’ll receive:
{
  "id": "prog123",
  "status": "IN_PROGRESS",
  "changed_at": "2023-10-01T08:00:00Z", // <- Same timestamp!
  "course_revision": {
    "id": "revision456",
    "title": "Advanced Sales Techniques", // <- Updated data
    "changed_at": "2023-10-01T10:00:00Z" // <- New timestamp
  }
}
Key Point: The progression’s changed_at remains unchanged, but the progression is still returned because it contains updated nested course data.

Best Practice

When using updated_after filtering:
  1. Don’t assume a record was directly modified just because it’s returned
  2. Compare nested data to determine what actually changed
  3. Use the nested objects’ changed_at fields to identify which parts were updated
  4. Design your sync logic to handle both direct and indirect changes
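These rules can be sketched as a small classifier that uses the record-level changed_at fields to tell which parts of a returned progression actually changed. The field names follow the example payloads above; comparing ISO 8601 UTC strings lexicographically works here because they share a fixed format:

```typescript
interface Progression {
  id: string;
  changed_at: string;
  course_revision?: { id: string; changed_at: string };
  user?: { id: string; changed_at: string };
}

// Classify which parts of a returned progression changed since the given
// timestamp. ISO 8601 UTC strings of the same format compare correctly
// as plain strings.
function classifyChange(p: Progression, since: string): string[] {
  const parts: string[] = [];
  if (p.changed_at > since) parts.push("progression");
  if (p.course_revision && p.course_revision.changed_at > since)
    parts.push("course_revision");
  if (p.user && p.user.changed_at > since) parts.push("user");
  return parts;
}
```

Applied to the scenario above, a progression with changed_at of 08:00 and a course revision changed at 10:00, fetched with updated_after of 09:00, is classified as a nested course-revision change only.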

FAQ

Why do you not provide the changed data inside of data-changed?

We want to make integrating with Kombo as simple as possible for you. Since we recommend running occasional full fetches anyway, you can reuse most of your logic for both fetch types. It also lets you iterate over and upsert data at your own pace, instead of having to process one very large payload after an initial sync. By listening to just this single unified webhook, you are always guaranteed to benefit from our latest improvements for getting you the freshest data.

How often do you send the data-changed event? Do you debounce the webhook?

Yes, by default we debounce data-changed with a 30-second window. That means no matter how many events we’re receiving, you’ll receive at most one webhook every 30 seconds. This works as follows: The first time this event fires, we will pass it through to you right away. Then we will wait 30 seconds. All the events we receive in those 30 seconds will then be sent to you as a single update with the next webhook. For more details, please refer to the webhooks page.
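For illustration only, here is one plausible model of the debouncing described above. Kombo implements this on their side, so you never need to build it yourself, and the exact behavior may differ from this sketch:

```typescript
// Illustrative leading-edge debounce: the first event is delivered
// immediately; events arriving inside the window are merged into a single
// delivery at the end of the window, which in turn opens a new window.
// This is a model of the described behavior, not Kombo's actual code.
function simulateDebounce(eventTimes: number[], windowSeconds = 30): number[] {
  const deliveries: number[] = [];
  let windowEnd = -Infinity; // no window open yet
  let pendingFlushAt: number | null = null; // window end holding batched events
  for (const t of [...eventTimes].sort((a, b) => a - b)) {
    if (pendingFlushAt !== null && t >= pendingFlushAt) {
      deliveries.push(pendingFlushAt); // flush the batched webhook
      windowEnd = pendingFlushAt + windowSeconds;
      pendingFlushAt = null;
    }
    if (t >= windowEnd) {
      deliveries.push(t); // leading edge: deliver right away
      windowEnd = t + windowSeconds;
    } else {
      pendingFlushAt = windowEnd; // batch until the window closes
    }
  }
  if (pendingFlushAt !== null) deliveries.push(pendingFlushAt);
  return deliveries;
}
```

With events at seconds 0, 5, 10, and 50, this model delivers webhooks at 0, 30, and 60: the first event passes through immediately, the events at 5 and 10 are batched into the delivery at 30, and the event at 50 lands inside the new window, so it is delivered at 60.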