Importing items in bulk

Learn how to create and update items in bulk via the platform API using ingest_items, backfill_items, and fetch_job_status

🚧

Only available in API versions 2026-07 and later

Using the platform API, you can import large quantities of items into a monday.com board through an asynchronous, three-step workflow:

  1. Start an import job (ingest_items or backfill_items) and receive a pre-signed upload_url and job_id
  2. Upload the CSV file to the upload_url
  3. Poll job status with fetch_job_status
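The three steps above can be sketched as a single driver function. This is a minimal sketch, not API code: the three callables (`start_job`, `upload_csv`, `fetch_status`) are hypothetical placeholders for your own GraphQL and HTTP calls, and only the status names come from this document.

```python
import time

def run_import(start_job, upload_csv, fetch_status, poll_interval=10):
    """Drive the three-step import workflow.

    start_job()          -> dict with "job_id" and "upload_url"   (step 1)
    upload_csv(url)      -> uploads the CSV, raises on failure    (step 2)
    fetch_status(job_id) -> dict with a "status" field            (step 3)
    """
    TERMINAL = {"COMPLETED", "FAILED", "REJECTED"}

    job = start_job()                      # ingest_items or backfill_items
    upload_csv(job["upload_url"])          # must happen within 10 minutes
    while True:
        status = fetch_status(job["job_id"])
        if status["status"] in TERMINAL:
            return status
        time.sleep(poll_interval)          # poll roughly every 10 seconds
```

Injecting the callables keeps the control flow testable without any network access.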

The API exposes two ways to start an import. ingest_items is the default for almost every use case. backfill_items is intended for a one-time, account-admin-only initial load — for example, seeding a board before it goes into day-to-day use.


Choose how to import

| | ingest_items (recommended) | backfill_items |
| --- | --- | --- |
| Use for | Integrations, recurring imports, creating or updating items as part of normal board activity | A single large initial setup import before users or agents work in the board day-to-day |
| Maximum rows per job | 10,000 | 20,000 |
| Who can call it | Any user with board edit access via API (boards:write) | Account admin with boards:write |
| Update or skip existing items | Yes, using on_match | Not supported; always creates new items |
| Hourly item create/update budget | Counts toward the 19,000 items per account per hour budget | Does not consume this budget |
| Side effects on other monday features | None: automations and the Activity Log behave normally | Automations are not triggered, and item creation is not logged in the Activity Log |

Supported column types

Both endpoints support the following column types:

  • Date
  • Dropdown
  • Email
  • Link
  • Long Text
  • Number
  • People
  • Phone
  • Status
  • Text
  • Timeline

Rows that include unsupported column types may fail validation. See Column type validation for the per-type CSV format.
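Since unsupported column types can fail validation, it can pay to check the header before uploading. A minimal sketch: the `column_types` mapping is assumed to come from your own query of the board's columns, and the type identifier strings here are assumptions, not values confirmed by this document.

```python
# Hypothetical set of type identifiers for the supported column types above.
SUPPORTED_TYPES = {
    "date", "dropdown", "email", "link", "long_text",
    "numbers", "people", "phone", "status", "text", "timeline",
}

def unsupported_columns(header, column_types):
    """Return the header column IDs (name excluded) whose board column
    type is missing or not in the supported set."""
    return [
        col for col in header[1:]              # header[0] is "name"
        if column_types.get(col) not in SUPPORTED_TYPES
    ]
```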


Limitations

| Category | backfill_items | ingest_items |
| --- | --- | --- |
| Maximum rows per file | 20,000 | 10,000 |
| Maximum file size | 150 MB | 150 MB |
| Upload URL expiration | 10 minutes | 10 minutes |
| Report URL expiration | 10 minutes | 10 minutes |
| Mutation calls per account per hour | 100 (shared with ingest_items) | 100 (shared with backfill_items) |
| Item create/update budget per hour | Not consumed | 19,000 items per account |

Notes

  • File size is enforced after upload by the import service (not by the upload URL itself). Oversized files are rejected with failure_reason: FILE_TOO_LARGE.
  • backfill_items and ingest_items share the same hourly limit on starting jobs.
  • Cancellation of a running job is not supported.
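A client-side pre-flight check against these limits can catch oversized files before the upload step rejects them. The limit values below come from the table above; the function itself is only a sketch.

```python
LIMITS = {
    "ingest_items":   {"max_rows": 10_000, "max_bytes": 150 * 1024 * 1024},
    "backfill_items": {"max_rows": 20_000, "max_bytes": 150 * 1024 * 1024},
}

def preflight(endpoint, row_count, file_size_bytes):
    """Return a list of limit violations; an empty list means the file fits."""
    limits = LIMITS[endpoint]
    problems = []
    if row_count > limits["max_rows"]:
        problems.append(f"{row_count} rows exceeds {limits['max_rows']}")
    if file_size_bytes > limits["max_bytes"]:
        problems.append("file exceeds 150 MB")
    return problems
```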

Getting started

Pre-requisites

Step 1: Start the import job

  1. Retrieve the board_id and group_id where you want to import items by querying boards and groups.
  2. Call ingest_items unless you specifically need a one-time admin-only initial load via backfill_items.
  3. Save the returned job_id and upload_url. The upload_url is only valid for 10 minutes.

Ingest example (default path)

mutation {
  ingest_items(
    board_id: "1234567890"
    group_id: "topics"
    on_match: { behaviour: UPSERT, match_column_id: "email" }
  ) {
    job_id
    upload_url
  }
}
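From a client, the mutation above is sent as an ordinary GraphQL POST body. A sketch of building that body in Python (the helper name is ours; the mutation text mirrors the example above, and omitting on_match is assumed to mean every row creates a new item):

```python
import json

def build_ingest_request(board_id, group_id, match_column_id=None):
    """Build the JSON body for an ingest_items mutation.

    When match_column_id is given, UPSERT mode is requested; otherwise
    the on_match argument is omitted entirely.
    """
    on_match = (
        f'on_match: {{ behaviour: UPSERT, match_column_id: "{match_column_id}" }}'
        if match_column_id else ""
    )
    query = f'''
    mutation {{
      ingest_items(board_id: "{board_id}", group_id: "{group_id}" {on_match}) {{
        job_id
        upload_url
      }}
    }}'''
    return json.dumps({"query": query})

# POST this body to your GraphQL endpoint with your usual auth headers.
```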

Backfill example (one-time initial load only)

mutation {
  backfill_items(board_id: "1234567890", group_id: "topics") {
    job_id
    upload_url
  }
}

Step 2: Upload the CSV file

Upload your CSV to the upload_url returned in Step 1.

PUT <upload_url>
Content-Type: text/csv

<CSV file content>

A successful upload returns HTTP 200 with an ETag header.

HTTP/1.1 200 OK
ETag: "abc123def456"
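The PUT above needs no extra auth headers, because the pre-signed URL already encodes authorization. A stdlib-only sketch of constructing that request (the helper name is ours):

```python
import urllib.request

def build_upload_request(upload_url, csv_bytes):
    """Build the PUT request for step 2: raw CSV body, text/csv content type."""
    return urllib.request.Request(
        upload_url,
        data=csv_bytes,
        method="PUT",
        headers={"Content-Type": "text/csv"},
    )

# To send:  urllib.request.urlopen(build_upload_request(url, data))
# Check for HTTP 200 and the ETag header before moving on to step 3.
```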

Step 3: Monitor the import status

Poll fetch_job_status every ~10 seconds until the job reaches a terminal state (COMPLETED, FAILED, or REJECTED).

query {
  fetch_job_status(job_id: "7c9e6679-7425-40de-944b-e07fc1f90ae7") {
    ... on ItemsJobStatus {
      status
      counts {
        submitted
        invalid
        skipped
        created
        updated
        failed
      }
      progress_percentage
      failure_reason
      failure_message
      fully_imported
      report_created
      report_url
    }
  }
}

When report_created is true, download report_url promptly. The report URL expires after 10 minutes.
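A caller's handling of one fetch_job_status response can be sketched as a small decision function. The field names mirror the query above; the decision labels ("wait", "download_report", "done") are our own.

```python
TERMINAL = {"COMPLETED", "FAILED", "REJECTED"}

def next_action(status):
    """Map one job-status payload to what the client should do next."""
    if status["status"] not in TERMINAL:
        return "wait"                # poll again in ~10 seconds
    if status.get("report_created"):
        return "download_report"     # report_url expires in 10 minutes
    return "done"
```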


Reference

CSV format

  • UTF-8 encoding is required.
  • Use comma (,) as the column separator.
  • Wrap values that contain commas, double quotes, or newlines in double quotes. Escape an embedded double quote by doubling it ("").
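These quoting rules match what Python's csv module produces by default, so generating the file with csv.writer is an easy way to stay compliant:

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)                       # comma separator, minimal quoting
writer.writerow(["name", "text"])
writer.writerow(['Say "hi", then stop', "plain value"])

print(buf.getvalue())
# name,text
# "Say ""hi"", then stop",plain value
```

The value containing a comma and quotes is wrapped in double quotes with the embedded quotes doubled, exactly as required above.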

Headers

  • The first header value must be name.
  • The remaining header values must be exact board column_id values (case-sensitive).
  • Query columns to retrieve the correct column IDs for the target board.

Rows

  • The first value in each row is the item name.
  • The remaining values map to the column IDs in the header.
  • Empty values are allowed. Whitespace-only values are treated as empty.
  • In UPSERT mode (ingest_items only), an empty cell on a matched row is ignored — the existing column value is preserved.
  • In UPSERT mode, a cell containing exactly the string <NULL> clears the column value on the matched item.
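The per-cell UPSERT semantics in the last three bullets can be sketched as a tiny classifier (the function and action names are ours, for illustration only):

```python
def upsert_cell_action(raw_value):
    """Classify one CSV cell for a row that matched an existing item."""
    if raw_value.strip() == "":      # empty / whitespace-only: keep existing value
        return ("keep", None)
    if raw_value == "<NULL>":        # sentinel: clear the column value
        return ("clear", None)
    return ("set", raw_value)        # anything else overwrites the value
```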

Example

name,text,status,date,email,numeric,dropdown
Task 1,Description text,Working on it,2025-12-31,alice@example.com,100,"Option A, Option B"
Task 2,Another task,Done,2026-01-15,bob@example.com,200,Option C
Task 3,Third task,Stuck,2026-02-28,carol@example.com,300,
Task 4,,,,,,

Column type validation

| Column type | CSV value format | Validation rules |
| --- | --- | --- |
| Date | Valid date | ISO format (YYYY-MM-DD). Other formats may be normalized. |
| Dropdown | Comma-separated labels | Each label must already exist on the column and be active. All labels in a comma-separated list must be valid; otherwise the row fails. |
| Email | Valid email or Name <email> | Must be a valid email format. |
| Link | Valid URL or [Display Text](URL) | URL must start with http:// or https://. |
| Long Text | Text string | Empty values allowed. |
| Number | Numeric string | Integer or decimal. |
| People | Comma-separated identifiers, each one of: an email address (e.g. jeremy@example.com), user:<id> or user:<name>, team:<id> or team:<name>, or a bare name (Jeremy) resolved as user or team | Each identifier must resolve to exactly one active user or team in the account. Fails on no match, multiple matches, or resolution errors. Duplicate identifiers are deduped. Respects the column's max_people_allowed setting. |
| Phone | Phone number string | Various formats accepted; optional country hints. |
| Status | Label text (exact match) | Label must already exist on the column and be active. Case-sensitive. |
| Text | Text string | Empty values allowed. |
| Timeline | YYYY-MM-DD/YYYY-MM-DD for a range, or a single YYYY-MM-DD (sets from = to = date) | Both dates must be valid ISO dates. Other separators (dash with spaces, pipe, JSON) are not accepted. |
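Two of the stricter format rules above (Timeline separators and Link schemes) are easy to pre-check on the client. A sketch; the server remains the source of truth, and these helpers are our own, not part of the API:

```python
import re

DATE = r"\d{4}-\d{2}-\d{2}"

def valid_timeline(value):
    """Accept YYYY-MM-DD or YYYY-MM-DD/YYYY-MM-DD, nothing else."""
    return re.fullmatch(rf"{DATE}(/{DATE})?", value) is not None

def valid_link(value):
    """A Link URL must start with http:// or https://."""
    return value.startswith(("http://", "https://"))
```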

Best practices

  1. Prefer ingest_items for integrations and any import that should behave like normal board activity.
  2. Use backfill_items only for a planned, one-time initial load.
  3. Poll job status about every 10 seconds — not faster.
  4. Validate the CSV header against the board schema before uploading.
  5. Handle GraphQL errors from the API separately from HTTP errors from the file upload step.
  6. Download the report as soon as report_created is true.
  7. On RATE_LIMIT_EXCEEDED, wait using extensions.retryAfterMs instead of retrying immediately.
  8. For ingest, spread large volumes across the hour where possible to reduce ACCOUNT_CAPACITY_EXCEEDED rejections.

Troubleshooting

| Issue | Symptoms | Resolution |
| --- | --- | --- |
| Upload URL expired | HTTP 403 on the PUT upload (S3 returns "Request has expired") | Start a new job and upload within 10 minutes. |
| Invalid CSV format | Status REJECTED, failure_reason: INVALID_UPLOAD | Verify that the first header is name, all column IDs exist on the board, the file is UTF-8, the separator is comma, and embedded quotes are escaped (""). |
| Items not created | Status COMPLETED with high counts.invalid or counts.failed and low counts.created | Download the report from report_url and inspect per-row errors. Check column ID typos, status/dropdown label spelling and case, date format, and that referenced users or teams exist. |
| Status stuck at UPLOAD_PENDING | No progress after uploading the CSV | Confirm the upload returned HTTP 200 with an ETag header. If not, start a new job and re-upload. |
| Backfill permission denied | Status REJECTED, failure_reason: PERMISSION_DENIED, or a GraphQL error "Only admin users can perform this operation." | backfill_items requires an account admin. Use ingest_items instead. |
| File too large | Status REJECTED, failure_reason: FILE_TOO_LARGE | Split the file into chunks at or below 150 MB. |
| Hourly capacity exceeded (ingest) | Status REJECTED, failure_reason: ACCOUNT_CAPACITY_EXCEEDED | The job's valid row count would exceed the remaining 19,000 items-per-hour budget. Spread the load across the hour or wait for the budget to reset. |
| Rate limit on starting jobs | GraphQL error RATE_LIMIT_EXCEEDED (HTTP 429) on ingest_items or backfill_items | Honor extensions.retryAfterMs before retrying. The 100-call-per-hour limit is shared between ingest_items and backfill_items. |

Related reference pages