We are improving how you fetch data from Kombo. If you are looking for the old
guide, please visit Syncing Data (Old Approach). If you
are looking to migrate to the new approach, please visit Migrate to Data
Changed.
- Listen to our data-changed webhook to receive notifications of data changes.
- Fetch only updated data.
- Periodically fetch all data to fully align your dataset and correct any potential data drift.
- Combine both strategies for a setup that is robust and efficient.
Overview
Listening to data-changed webhook
We provide a webhook called `data-changed`. It is sent to your system whenever data has changed inside Kombo. For example, after:
- We finish syncing (full or delta sync)
- We receive an update through upstream webhooks
For near real-time updates, the connected tool must have upstream webhooks enabled for the integration. Some tools allow Kombo to enable these automatically, while others require a one-time manual step in the integration’s Setup Flow by the end-customer. If upstream webhooks are not enabled or not supported, `data-changed` will reflect updates after the next scheduled sync.

Fetching updated data

Whenever you are notified of changes, fetch the affected data from the relevant endpoints. A single call can often cover several models; for example, use the `/candidates` endpoint to fetch both candidates and their applications.
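As a sketch, handling a `data-changed` delivery might look like this. The payload shape used here (`data.changed_models`) is an assumption; check the webhook reference for the exact field names, and verify the signature header before trusting any delivery in production:

```python
import json

# Models our system cares about (adjust to your use case).
RELEVANT_MODELS = {"ats_applications", "ats_candidates"}

def handle_data_changed(raw_body: str) -> list[str]:
    """Parse a data-changed webhook body; return the models worth re-fetching.

    Assumes the payload nests changed model names under data.changed_models.
    """
    payload = json.loads(raw_body)
    changed = set(payload.get("data", {}).get("changed_models", []))
    # Only trigger fetches for models that matter to our system.
    return sorted(changed & RELEVANT_MODELS)

# Example delivery reporting candidate and tag changes:
body = json.dumps({"data": {"changed_models": ["ats_candidates", "ats_tags"]}})
print(handle_data_changed(body))  # ['ats_candidates']
```

In a real receiver you would acknowledge the delivery quickly (HTTP 2xx) and run the fetches asynchronously, so slow syncs don’t cause webhook retries.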
Furthermore, after every successful fetch from us, store the timestamp of when the respective fetch started. Use this timestamp during your next fetch and pass it with your requests to the Kombo API as the `updated_after` query parameter. Kombo will then only return the entries that changed since that timestamp. Read more about this here.
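The timestamp bookkeeping can be sketched like this (a minimal illustration; the in-memory dict stands in for persistent storage, and the base URL is shown for context only):

```python
from datetime import datetime, timezone
from urllib.parse import urlencode

# Stand-in for persistent storage (a database table in a real system).
_last_fetch_started: dict[str, str] = {}

def build_fetch_url(endpoint: str) -> str:
    """Build the fetch URL, passing the previous fetch's start time if known."""
    last = _last_fetch_started.get(endpoint)
    base = f"https://api.kombo.dev/v1{endpoint}"
    return f"{base}?{urlencode({'updated_after': last})}" if last else base

def record_fetch_start(endpoint: str) -> None:
    """Store when this fetch STARTED (not finished), so changes that happen
    while the fetch is running are still picked up next time."""
    _last_fetch_started[endpoint] = datetime.now(timezone.utc).isoformat(timespec="seconds")
```

Storing the start time rather than the completion time is what makes the scheme lossless: any change that lands mid-fetch falls after the stored timestamp and is returned on the next delta fetch.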
Good to know: The `updated_after` filter also considers changes in models that are returned by the endpoint as nested values. For example, the `/candidates` endpoint will include every candidate that has had changes to their list of applications, even if the candidate profile itself did not change.
All models
You should call the endpoints you care about based on the models we tell you have changed. The following table shows the endpoints connected to the relevant models that will be part of the data-changed webhook. We recommend the following approach:

- Decide which endpoints are relevant for your use case (e.g., if your system is centered around applications, you primarily use Get applications).
- Map changed models to those endpoints using the table.

For instance, if both ats_applications and ats_candidates are listed in the webhook, but you primarily care about applications, you only need to call Get applications, since candidate data is included there too.
| Endpoint | Models |
|---|---|
| Get applications | ats_applications, ats_application_stages, ats_candidates, ats_interviews, ats_jobs, ats_join_candidates_tags, ats_offers, ats_tags |
| Get candidates | ats_candidates, ats_application_stages, ats_applications, ats_jobs, ats_join_candidates_tags, ats_join_jobs_application_stages, ats_tags |
| Get interviews | ats_interviews, ats_applications, ats_candidates, ats_jobs, ats_join_interviews_users, ats_users |
| Get jobs | ats_jobs, ats_application_stages, ats_job_postings, ats_join_jobs_application_stages, ats_join_jobs_screening_questions, ats_join_jobs_users, ats_screening_questions, ats_users |
| Get offers | ats_offers, ats_applications, ats_candidates, ats_jobs |
| Get rejection reasons | ats_rejection_reasons |
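The mapping step can be sketched as a lookup against a hand-maintained subset of the table above (only two rows shown; extend it with the endpoints you actually use):

```python
# Which models each endpoint covers, copied from the table above.
ENDPOINT_MODELS: dict[str, set[str]] = {
    "Get applications": {
        "ats_applications", "ats_application_stages", "ats_candidates",
        "ats_interviews", "ats_jobs", "ats_join_candidates_tags",
        "ats_offers", "ats_tags",
    },
    "Get rejection reasons": {"ats_rejection_reasons"},
}

def endpoints_to_call(changed_models: list[str]) -> list[str]:
    """Return the endpoints whose responses may contain changed data."""
    changed = set(changed_models)
    return sorted(ep for ep, models in ENDPOINT_MODELS.items() if changed & models)

# Both models are covered by Get applications, so a single call suffices:
print(endpoints_to_call(["ats_candidates", "ats_applications"]))  # ['Get applications']
```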
Preventing Data Drift
In addition to fetching data from Kombo’s endpoints in response to the data-changed webhook, we recommend running a periodic full data fetch for the data models you care about. In most cases, a 7-day schedule is ideal. This helps remedy any drift that may occur in your data, e.g. from accidental manual changes or lost webhooks. To implement this, perform GET requests for the Kombo data models you care about without passing the `updated_after` query parameter. Then upsert the data returned by Kombo and merge it with your existing data copy.
Full Example Code
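A minimal end-to-end sketch combining the pieces above: delta fetches triggered by `data-changed`, plus a periodic full fetch to correct drift. The HTTP layer is stubbed out (`fetch_records` is a hypothetical helper wrapping the API call and pagination), so this illustrates the bookkeeping rather than a definitive implementation:

```python
from datetime import datetime, timedelta, timezone

FULL_SYNC_INTERVAL = timedelta(days=7)  # the cadence recommended above

class KomboSync:
    """Delta fetches driven by data-changed, plus periodic full syncs."""

    def __init__(self, fetch_records):
        # fetch_records(endpoint, updated_after) -> list[dict]; hypothetical
        # helper that performs the HTTP request and walks all pages.
        self.fetch_records = fetch_records
        self.last_fetch_started: dict[str, datetime] = {}
        self.last_full_sync = datetime.min.replace(tzinfo=timezone.utc)
        self.records: dict[str, dict] = {}  # local copy, keyed by record id

    def delta_fetch(self, endpoint: str) -> None:
        """Run when data-changed reports models served by this endpoint."""
        started = datetime.now(timezone.utc)
        since = self.last_fetch_started.get(endpoint)
        for record in self.fetch_records(endpoint, updated_after=since):
            self.records[record["id"]] = record  # upsert
        # Store the fetch START time so nothing slips between calls.
        self.last_fetch_started[endpoint] = started

    def maybe_full_sync(self, endpoints: list[str]) -> bool:
        """Every 7 days, re-fetch everything (no updated_after) to fix drift."""
        now = datetime.now(timezone.utc)
        if now - self.last_full_sync < FULL_SYNC_INTERVAL:
            return False
        for endpoint in endpoints:
            for record in self.fetch_records(endpoint, updated_after=None):
                self.records[record["id"]] = record
        self.last_full_sync = now
        return True
```

In production you would persist `last_fetch_started`, `last_full_sync`, and `records`, and run `maybe_full_sync` from a scheduler rather than inline.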
Understanding `changed_at` vs `updated_after` Behavior

A common source of confusion is understanding when records are returned by the `updated_after` filter and how this relates to each record’s `changed_at` timestamp. Here’s the key distinction:
Record-level `changed_at` Field

Each record has a `changed_at` timestamp that only updates when properties directly on that record change. For example:

- If a candidate’s `first_name` changes, the candidate’s `changed_at` updates
- If an application’s `current_stage_id` changes, the application’s `changed_at` updates
- However: if a candidate’s name changes, related applications’ `changed_at` fields do NOT update
Endpoint Filtering with `updated_after`

The `updated_after` parameter works differently: it returns records when either the record itself OR its nested data has been updated.

Example: Applications Endpoint

When you call GET /applications with `updated_after`, you’ll receive applications if:

- Direct application changes: The application itself was modified (stage change, rejection, etc.)
- Nested candidate changes: The candidate’s data was updated (name, email, etc.)
- Nested job changes: The job’s data was updated (title, status, etc.)

In the nested cases, you receive the application even though its own `changed_at` timestamp hasn’t changed.
Concrete Scenario

Let’s say you call GET /applications at 9:00 AM and store your local copy. Later that day, a candidate’s details are updated, while their application itself is not modified. When you next call GET /applications?updated_after=2023-10-01T09:00:00Z, you’ll receive that candidate’s application: its `changed_at` remains unchanged, but the application is still returned because it contains updated nested data.
Best Practice

When using `updated_after` filtering:

- Don’t assume a record was directly modified just because it’s returned
- Compare nested data to determine what actually changed
- Use the nested objects’ `changed_at` fields to identify which parts were updated
- Design your sync logic to handle both direct and indirect changes
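Following these practices, a record returned by an `updated_after` query can be classified by comparing its own `changed_at` (and those of its nested objects) against your last sync time. A minimal sketch; the nested field names (`candidate`, `job`) are assumptions about the response shape:

```python
from datetime import datetime, timezone

def classify_change(application: dict, last_sync: datetime) -> str:
    """Classify why a record came back from an updated_after query:
    'direct' (its own fields changed), 'nested' (only embedded data
    changed), or 'unknown'."""
    def changed_since(obj) -> bool:
        ts = obj.get("changed_at") if obj else None
        return ts is not None and datetime.fromisoformat(ts) > last_sync

    if changed_since(application):
        return "direct"
    # Nested object keys ("candidate", "job") are assumed for illustration.
    if any(changed_since(application.get(k)) for k in ("candidate", "job")):
        return "nested"
    return "unknown"

last_sync = datetime(2023, 10, 1, 9, 0, tzinfo=timezone.utc)
app = {
    "changed_at": "2023-09-30T12:00:00+00:00",                 # before last sync
    "candidate": {"changed_at": "2023-10-01T10:30:00+00:00"},  # after last sync
}
print(classify_change(app, last_sync))  # nested
```

This is the comparison step the best-practice list calls for: it never assumes a returned record was directly modified, and it uses the nested objects’ `changed_at` fields to pinpoint what actually changed.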