This guide assumes you are already familiar with the basics of fetching data from Kombo. If you need a refresher or want to see the full “from scratch” guide, check out Fetching Data.
We are moving away from the traditional “sync-finished” pattern towards a more modern, event-driven architecture. This guide will help you refactor your existing integration to be faster, more efficient, and more reliable.

The Shift in Thinking

Traditionally, integrations were process-driven:
  1. Kombo finishes a sync.
  2. We tell you “Sync is done”.
  3. You fetch everything (or a time-based delta) to see if anything changed.
The new approach is event-driven:
  1. Something changes in the underlying tool (candidate added, stage moved).
  2. We tell you “Data changed”.
  3. You fetch only what you need.
  4. (Recommended) You run a periodic cleanup job once a week as a safety net.
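For reference, the data-changed payload only needs a couple of fields for this flow. Here is a minimal sketch of its shape, based on the fields used by the handler later in this guide (see the webhook reference for the authoritative schema):
type DataChangedWebhookBody = {
  data: {
    integration_id: string // The integration the change belongs to
    changed_models: Array<{
      name: string // e.g. 'ats_applications'
    }>
  }
}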

Migration Strategy

You can migrate incrementally. You don’t need to stop your current sync logic immediately.

Phase 1: Implement the data-changed Handler

First, let’s implement the new webhook handler. You can do this alongside your existing sync-finished handler.
async function handleDataChangedWebhook(body) {
  // Verify webhook sender here

  // Assuming we are using prisma as our ORM
  const integration = await prisma.integration.findUniqueOrThrow({
    where: {
      kombo_integration_id: body.data.integration_id,
    },
    select: {
      id: true, // The integration record's internal ID, needed for the update below
      last_fetched_from_kombo_at: true,
      kombo_integration_id: true, // Also select the integration ID
    },
  })

  // Track when WE start fetching (not when Kombo synced from the ATS)
  const fetchStartDate = new Date() // This time should be in UTC
  const lastFetchStartDate =
    integration.last_fetched_from_kombo_at?.toISOString()

  // Fetch a model only when the webhook indicates it changed
  const applicationsChanged = body.data.changed_models.some(
    entry => entry.name === 'ats_applications',
  )
  if (applicationsChanged) {
    await fetchApplications(
      integration.kombo_integration_id,
      lastFetchStartDate,
    )
  }

  await prisma.integration.update({
    where: {
      id: integration.id,
    },
    data: {
      last_fetched_from_kombo_at: fetchStartDate,
    },
  })
}
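How you wire this handler up depends on your stack. As a rough sketch, assuming an Express app and a placeholder route that you register as the data-changed endpoint:
import express from 'express'

const app = express()
app.use(express.json())

// Placeholder route: register this URL for the data-changed webhook
app.post('/webhooks/kombo/data-changed', async (req, res) => {
  try {
    await handleDataChangedWebhook(req.body)
    res.sendStatus(200)
  } catch (error) {
    console.error('Handling data-changed webhook failed', error)
    res.sendStatus(500)
  }
})

// Your existing sync-finished endpoint keeps working in parallel during the migration
app.listen(3000)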

Phase 2: Add a Safety Net

While the event-driven approach is robust, webhooks can fail or get lost. To keep your data consistent over time, we recommend adding a periodic full fetch as a safety net. Create a cron job (or scheduled task) that runs once a week (e.g., Sunday at 2 AM).
async function handleCronJob(integration) {
  const fetchStartDate = new Date() // This time should be in UTC

  // Do not pass a starting date here; fetch everything
  // Fetch the models you care about (repeat for applications, candidates, jobs, ...)
  await fetchApplications(integration.kombo_integration_id)

  await prisma.integration.update({
    where: {
      id: integration.id,
    },
    data: {
      last_fetched_from_kombo_at: fetchStartDate,
    },
  })
}
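How you schedule this is up to your infrastructure. As one option, a minimal sketch using node-cron, iterating over the integrations stored in the Prisma model assumed above:
import cron from 'node-cron'

// Every Sunday at 02:00 (server time); adjust the schedule to your needs
cron.schedule('0 2 * * 0', async () => {
  const integrations = await prisma.integration.findMany()

  for (const integration of integrations) {
    try {
      await handleCronJob(integration)
    } catch (error) {
      // Don't let one failing integration block the rest of the weekly run
      console.error(`Weekly full fetch failed for ${integration.kombo_integration_id}`, error)
    }
  }
})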

Phase 3: Implement the Fetch Function

The fetchApplications function is shared between the webhook handler and the cron job. It handles pagination and passes the updated_after parameter when provided.
This part may be simplified by using one of our official server-side SDKs.
async function fetchApplications(integrationId: string, updatedAfter?: string) {
  let cursor
  do {
    const resp = await axios.get('https://api.kombo.dev/v1/ats/applications', {
      headers: {
        Authorization: `Bearer ${KOMBO_API_KEY}`,
        'X-Integration-Id': integrationId, // Use the stored integration ID
      },
      params: {
        cursor: cursor,
        updated_after: updatedAfter,
      },
    })

    cursor = resp.data.data.next

    // Implement your handling logic here
    // Usually, you will upsert the data into your database and build specific
    // domain logic here.
    await handleApplicationData(integrationId, resp.data.data.results)
  } while (cursor)
}
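What handleApplicationData does is entirely your domain logic. As a minimal sketch, assuming a hypothetical application table in your database keyed by the Kombo application ID:
async function handleApplicationData(integrationId, applications) {
  for (const application of applications) {
    // Upsert by the Kombo ID so re-fetching the same records stays idempotent
    await prisma.application.upsert({
      where: { kombo_application_id: application.id },
      update: {
        last_seen_at: new Date(),
        // Map whatever fields your domain logic needs (stage, outcome, ...)
      },
      create: {
        kombo_application_id: application.id,
        kombo_integration_id: integrationId,
        last_seen_at: new Date(),
      },
    })
  }
}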

Phase 4: Cutover

Once Phases 1 and 2 are deployed and you have verified that data-changed events are triggering fetches:
  1. Disable sync-finished: Stop listening to the sync-finished webhook.
  2. Disable remote-event-received: If you were using this, data-changed replaces it entirely.
  3. Rely on the Safety Net: Your weekly cron covers any edge cases (like missed webhooks or bugs).

FAQ

Is this a breaking change?

No. The sync-finished webhook still works. However, we are focusing all performance improvements on the data-changed pipeline. Migrating ensures you get the fastest, most reliable experience.

How do I know which models map to which endpoint?

We provide a comprehensive table in the Fetching Data guide.

What if I miss a webhook?

That is exactly what the “Safety Net” (Phase 2) is for. Even if your server is down for a day, the weekly full sync will pick up any missed records. You can also trigger a manual full sync at any time if you suspect drift.
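Because the cron handler above takes a single integration, the same function doubles as a manual full sync, for example from a one-off script or an internal admin endpoint:
// Hypothetical one-off resync for a single integration
const integration = await prisma.integration.findUniqueOrThrow({
  where: { kombo_integration_id: 'the-integration-to-resync' },
})

await handleCronJob(integration)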