We are improving how you fetch data from Kombo. If you are looking for the old guide, please visit Syncing Data (Old Approach). If you are looking to migrate to the new approach, please visit Migrate to Data Changed.
To minimize latency between data changes in the connected tool and your system, and to prevent data drift, we recommend combining data-changed notifications with periodic full fetches. Best-practice implementation:
  • Listen to our data-changed webhook to receive notifications of data changes.
  • Fetch only updated data.
  • Periodically fetch all data to fully align your dataset and correct any potential data drift.
  • Combine both strategies for a setup that is robust and efficient.

Overview

Listening to data-changed webhook

We provide a webhook called data-changed. This is sent to your system whenever data has changed inside Kombo. For example, after:
  • We finish syncing (full or delta sync)
  • We receive an update through a webhook
By listening to this webhook, you can receive updates from Kombo efficiently, allowing for the best possible UX. The webhook will look similar to this, with an array of models that have changed in our database since we last sent you that webhook:
{
  "id": "FhghqjnCi9WuAoLT8Z75CFcs",
  "type": "data-changed",
  "data": {
    "integration_id": "bombohr:ats-dev",
    "integration_tool": "bombohr",
    "integration_category": "ATS",
    "changed_models": [
      {
        "name": "ats_applications"
      }
    ]
  }
}
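In TypeScript, the payload above can be typed roughly as follows. This is a sketch derived from the example fields, not an official Kombo SDK type:

```typescript
// Shape of the data-changed webhook payload, inferred from the example above.
interface DataChangedEvent {
  id: string
  type: 'data-changed'
  data: {
    integration_id: string
    integration_tool: string
    integration_category: string
    changed_models: { name: string }[]
  }
}

// The example payload conforms to this interface:
const event: DataChangedEvent = {
  id: 'FhghqjnCi9WuAoLT8Z75CFcs',
  type: 'data-changed',
  data: {
    integration_id: 'bombohr:ats-dev',
    integration_tool: 'bombohr',
    integration_category: 'ATS',
    changed_models: [{ name: 'ats_applications' }],
  },
}
```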
Simplified approach

To keep things simple, whenever you receive a new data-changed webhook, you can pull all the models you’re interested in, independent of the models we’re telling you changed.

Recommended approach

You can also listen to the data models we’re telling you changed and pull data based on those models. For that, please look into the list of models that can appear. Based on this list of changed models, figure out which requests you would like to send. Keep in mind that models do not map 1:1 to our endpoints. For example, you might be using the /candidates endpoint to fetch both candidates and their applications.

Furthermore, after every successful fetch from us, store the timestamp of when that fetch started. During your next fetch, pass this timestamp to the Kombo API as the updated_after query parameter. Kombo will then only return the entries that changed since that timestamp. Read more about this here.

Good to know: The updated_after filter also considers changes in models that are returned by the endpoint as nested values. For example, the /candidates endpoint will include every candidate whose list of applications has changed, even if the candidate profile itself did not change.

All models

You should call the endpoints you care about based on the models we tell you have changed. The following table shows endpoints connected to the relevant models that will be part of the data-changed webhook. We recommend the following approach:
  1. Decide which endpoints are relevant for your use case (e.g., if your system is centered around applications, you primarily use Get applications).
  2. Map changed models to those endpoints using the table.
    For instance, if both ats_applications and ats_candidates are listed in the webhook, but you primarily care about applications, you only need to call Get applications, since candidate data is included there too.
This helps avoid unnecessary API calls while still ensuring your data is up to date.
  • Get applications: ats_applications, ats_application_stages, ats_candidates, ats_interviews, ats_jobs, ats_join_candidates_tags, ats_offers, ats_tags
  • Get candidates: ats_candidates, ats_application_stages, ats_applications, ats_jobs, ats_join_candidates_tags, ats_join_jobs_application_stages, ats_tags
  • Get interviews: ats_interviews, ats_applications, ats_candidates, ats_jobs, ats_join_interviews_users, ats_users
  • Get jobs: ats_jobs, ats_application_stages, ats_job_postings, ats_join_jobs_application_stages, ats_join_jobs_screening_questions, ats_join_jobs_users, ats_screening_questions, ats_users
  • Get offers: ats_offers, ats_applications, ats_candidates, ats_jobs
  • Get rejection reasons: ats_rejection_reasons
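The mapping above can be encoded directly as a lookup, so a webhook’s changed_models array can be translated into the endpoints worth calling. The constant mirrors the table; the function name and shape are illustrative, not part of any Kombo SDK:

```typescript
// Endpoint -> models that appear in its responses (mirrors the table above).
const ENDPOINT_MODELS: Record<string, string[]> = {
  'Get applications': ['ats_applications', 'ats_application_stages', 'ats_candidates', 'ats_interviews', 'ats_jobs', 'ats_join_candidates_tags', 'ats_offers', 'ats_tags'],
  'Get candidates': ['ats_candidates', 'ats_application_stages', 'ats_applications', 'ats_jobs', 'ats_join_candidates_tags', 'ats_join_jobs_application_stages', 'ats_tags'],
  'Get interviews': ['ats_interviews', 'ats_applications', 'ats_candidates', 'ats_jobs', 'ats_join_interviews_users', 'ats_users'],
  'Get jobs': ['ats_jobs', 'ats_application_stages', 'ats_job_postings', 'ats_join_jobs_application_stages', 'ats_join_jobs_screening_questions', 'ats_join_jobs_users', 'ats_screening_questions', 'ats_users'],
  'Get offers': ['ats_offers', 'ats_applications', 'ats_candidates', 'ats_jobs'],
  'Get rejection reasons': ['ats_rejection_reasons'],
}

// Of the endpoints you actually use, which need a refetch for this webhook?
function endpointsToRefetch(
  endpointsYouUse: string[],
  changedModels: { name: string }[],
): string[] {
  const changed = new Set(changedModels.map(m => m.name))
  return endpointsYouUse.filter(endpoint =>
    (ENDPOINT_MODELS[endpoint] ?? []).some(model => changed.has(model)),
  )
}
```

For the example from step 2: if you only use Get applications and the webhook lists ats_candidates, this returns ['Get applications'], because candidate changes surface in that endpoint's nested data.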

Preventing Data Drift

In addition to fetching data from Kombo’s endpoints in response to receiving the data-changed webhook, we recommend running a periodic full data fetch for the data models you care about. In most cases, a 7-day schedule is ideal. This helps to remedy any drift that may occur in your data, e.g. from accidental manual changes or lost webhooks. To implement this, perform GET requests for the Kombo data models you care about without passing the updated_after query parameter. Then upsert the data returned by Kombo and merge it with your existing data copy.
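The upsert-and-merge step of a full fetch can be sketched as a reconciliation pass: keep everything Kombo returned, and treat local records that no longer appear in the full fetch as drift to be removed. The record shape and the in-memory Map standing in for your database are illustrative assumptions:

```typescript
// Illustrative reconciliation of a full-fetch result against a local copy.
// `localById` stands in for your database table.
type KomboRecord = { id: string; [key: string]: unknown }

function reconcile(
  localById: Map<string, KomboRecord>,
  fullFetchResults: KomboRecord[],
): { merged: Map<string, KomboRecord>; deletedIds: string[] } {
  const merged = new Map<string, KomboRecord>()
  // Upsert: new records are inserted, drifted records are overwritten.
  for (const record of fullFetchResults) {
    merged.set(record.id, record)
  }
  // Anything present locally but absent from the full fetch has drifted
  // (e.g. deleted upstream, or created locally by accident) and can be removed.
  const deletedIds = [...localById.keys()].filter(id => !merged.has(id))
  return { merged, deletedIds }
}
```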

Full Example Code

import axios from 'axios'
import { PrismaClient } from '@prisma/client'

const prisma = new PrismaClient()
const KOMBO_API_KEY = process.env.KOMBO_API_KEY!

async function handleDataChangedWebhook(body) {
  // Verify webhook sender here

  // Assuming we are using Prisma as our ORM
  const integration = await prisma.integration.findUniqueOrThrow({
    where: {
      kombo_integration_id: body.data.integration_id,
    },
    select: {
      id: true, // Need the customer ID for the update
      last_fetched_from_kombo_at: true,
      kombo_integration_id: true, // Also select the integration ID
    },
  })

  const fetchStartDate = new Date() // This time should be in UTC
  const lastFetchStartDate =
    integration.last_fetched_from_kombo_at?.toISOString()

  // Fetch data only when data-changed webhook indicates a change
  const applicationsChanged = !!body.data.changed_models.find(
    entry => entry.name === 'ats_applications',
  )
  if (applicationsChanged) {
    await fetchApplications(
      integration.kombo_integration_id,
      lastFetchStartDate,
    )
  }

  await prisma.integration.update({
    where: {
      id: integration.id,
    },
    data: {
      last_fetched_from_kombo_at: fetchStartDate,
    },
  })
}

async function handleCronJob(integration) {
  const fetchStartDate = new Date() // This time should be in UTC

  // Do not pass a starting date here; fetch everything
  // Fetch the models you care about (repeat for applications, candidates, jobs, ...)
  await fetchApplications(integration.kombo_integration_id)

  await prisma.integration.update({
    where: {
      id: integration.id,
    },
    data: {
      last_fetched_from_kombo_at: fetchStartDate,
    },
  })
}

async function fetchApplications(integrationId: string, updatedAfter?: string) {
  let cursor: string | undefined
  do {
    const resp = await axios.get('https://api.kombo.dev/v1/ats/applications', {
      headers: {
        Authorization: `Bearer ${KOMBO_API_KEY}`,
        'X-Integration-Id': integrationId, // Use the stored integration ID
      },
      params: {
        cursor: cursor,
        updated_after: updatedAfter,
      },
    })

    cursor = resp.data.data.next

    // Implement your handling logic here
    // Usually, you will upsert the data into your database and build specific
    // domain logic here.
    await handleApplicationData(integrationId, resp.data.data.results)
  } while (cursor)
}
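The "Verify webhook sender here" step above could look like the following. This is a sketch that assumes an HMAC-SHA256 hex digest of the raw request body; check Kombo's webhook documentation for the exact header name, signing scheme, and where to find your webhook secret:

```typescript
import { createHmac, timingSafeEqual } from 'crypto'

// Sketch of webhook signature verification. The HMAC-SHA256-over-raw-body
// scheme is an assumption — confirm the exact format in Kombo's docs.
function isValidSignature(
  rawBody: string,
  signatureHeader: string,
  secret: string,
): boolean {
  const expected = createHmac('sha256', secret).update(rawBody).digest('hex')
  if (expected.length !== signatureHeader.length) return false
  // Constant-time comparison to avoid leaking the signature via timing.
  return timingSafeEqual(Buffer.from(expected), Buffer.from(signatureHeader))
}
```

Note that verification must run on the raw body bytes as received, before any JSON parsing or re-serialization, since the signature is computed over the exact payload.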

FAQ

Why do you not provide the changed data inside of data-changed?

We want to make integrating with Kombo as simple as possible for you. Since we recommend running occasional full fetches anyway, you can reuse most of your logic for both fetch types. It also allows you to iterate and upsert data at your own pace, rather than having to process a single very large payload after an initial sync. By listening to just this single unified webhook, you are always guaranteed to benefit from our latest improvements for getting you the freshest data.

How often do you send the data-changed event? Do you debounce the webhook?

Yes, by default we debounce data-changed with a 30-second window: no matter how many events we receive, you’ll get at most one webhook every 30 seconds. This works as follows: the first time the event fires, we pass it through to you right away. We then wait 30 seconds, and all the events we receive during that window are sent to you as a single update with the next webhook.