# CODING_FAILED — Coding step failed mid-run
## Summary

Stage 3 of the pipeline — assigning codes to verbatims — failed. The job's `data.error` field contains the specific cause. Any verbatims coded before the failure remain available, and only those are charged.
## HTTP status

`500 Internal Server Error`. Standard error envelope.
## Example response

```json
{
  "success": false,
  "error": {
    "code": "CODING_FAILED",
    "message": "Coding failed at batch 7 of 12. 600 of 1000 responses were coded successfully.",
    "request_id": "req_01HXJZK4ABCDEF",
    "doc_url": "https://docs.surveycoder.io/errors/coding-failed"
  }
}
```

## Why this happens

- An upstream LLM provider hit a transient outage.
- A specific verbatim triggered a content-policy refusal that wasn't retried in time.
- The codebook was edited mid-run (rare — usually blocked by `INVALID_STATE`).
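Regardless of the underlying cause, a client should detect this error from the envelope's `error.code` rather than the HTTP status alone. A minimal sketch in Python (the `is_coding_failed` helper is a hypothetical name, not part of the API):

```python
def is_coding_failed(body: dict) -> bool:
    """Return True when a response envelope signals a CODING_FAILED error.

    Checks the standard envelope shape: success=false plus error.code.
    """
    return (
        not body.get("success", True)
        and body.get("error", {}).get("code") == "CODING_FAILED"
    )


# Example: the envelope shown in "Example response" above.
envelope = {
    "success": False,
    "error": {
        "code": "CODING_FAILED",
        "message": "Coding failed at batch 7 of 12. "
                   "600 of 1000 responses were coded successfully.",
        "request_id": "req_01HXJZK4ABCDEF",
        "doc_url": "https://docs.surveycoder.io/errors/coding-failed",
    },
}
```

Keying on `error.code` keeps the check stable even if the HTTP status or message wording changes.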
## How to fix it

- Fetch the job: `GET /v1/jobs/{job_id}`. Partial results are in `data.partial_result`.
- To finish the remaining verbatims, call `POST /v1/code` again on the same question — the API skips already-coded responses by default.
- If the same job keeps failing at the same batch, isolate the offending verbatims (`data.failed_response_indices`) and either reword or skip them.
- Email support@surveycoder.io with the `request_id` if you see persistent failures across different questions.
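The recovery steps above can be sketched as a small decision helper. This is a non-authoritative sketch: `plan_recovery` and its `repeated_failure` flag are hypothetical names, while `data.partial_result` and `data.failed_response_indices` are the fields returned by `GET /v1/jobs/{job_id}`:

```python
def plan_recovery(job: dict, repeated_failure: bool = False) -> dict:
    """Decide the next step after a CODING_FAILED job, per the steps above.

    job: the payload from GET /v1/jobs/{job_id}.
    repeated_failure: set True when the same job has already failed
    at the same batch (hypothetical flag the caller tracks itself).
    """
    data = job.get("data", {})
    coded = data.get("partial_result", [])
    suspects = data.get("failed_response_indices", [])

    if repeated_failure and suspects:
        # Same batch keeps failing: reword or skip the offending verbatims.
        return {"action": "isolate", "suspects": suspects}

    # Otherwise simply re-run: POST /v1/code skips already-coded
    # responses by default, so only the remainder is processed (and billed).
    return {"action": "retry", "already_coded": len(coded)}
```

On the first failure this returns a plain retry; only once the same batch has failed repeatedly does it point at the specific verbatims to isolate.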