createJob

Available in VPC

Create a job.

Request

This section describes the request format. The method and URI are as follows:

Method URI
POST /api/v1/jobs

Request headers

For information about the headers common to all Data Flow APIs, see Data Flow request headers.
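
The curl example below passes these headers as literal placeholders. As a rough, non-authoritative illustration, the following Python sketch shows one way they could be generated, assuming the standard API Gateway signature v2 scheme (a Base64-encoded HMAC-SHA256 over the method, URI, timestamp, and access key); the header guide linked above is authoritative.

import base64
import hashlib
import hmac
import time


def make_headers(method: str, uri: str, access_key: str, secret_key: str) -> dict:
    # Assumption: signature v2 is Base64(HMAC-SHA256(secret_key,
    # "{method} {uri}\n{timestamp}\n{access_key}")) with a millisecond epoch timestamp.
    timestamp = str(int(time.time() * 1000))
    message = f"{method} {uri}\n{timestamp}\n{access_key}"
    digest = hmac.new(secret_key.encode("utf-8"), message.encode("utf-8"), hashlib.sha256).digest()
    return {
        "x-ncp-apigw-timestamp": timestamp,
        "x-ncp-iam-access-key": access_key,
        "x-ncp-apigw-signature-v2": base64.b64encode(digest).decode("utf-8"),
    }


# Example (placeholder credentials):
# headers = make_headers("POST", "/api/v1/jobs", "{Access Key}", "{Secret Key}")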

Request body

You can include the following data in the body of your request:

Field Type Required Description
name String Required Job name
  • Enter 3 to 100 characters using English letters, numbers, and the special characters "_" and "-".
  • The name must start with an English letter or "-".
nodes Array Required Job node information

Request example

The request example is as follows:

curl --location --request POST 'https://dataflow.apigw.ntruss.com/api/v1/jobs' \
--header 'x-ncp-apigw-timestamp: {Timestamp}' \
--header 'x-ncp-iam-access-key: {Access Key}' \
--header 'x-ncp-apigw-signature-v2: {API Gateway Signature}' \
--data '{
  "name": "job001",
  "nodes": [
        {
          "type": "SOURCE_OBS",
          "id": 169777*******,
          "name": "Object Storage",
          "parentNodeIds": [],
          "regionNo": "1",
          "bucketName": "aitems",
          "prefix": "dataflow1",
          "dataType": "CSV",
          "fieldList": [
            {
              "name": "id",
              "type": "string"
            },
            {
              "name": "name",
              "type": "string"
            },
            {
              "name": "description",
              "type": "string"
            }
          ]
        },
        {
          "type": "TRANSFORM_FILTER",
          "id": 169777*******,
          "name": "Filters",
          "parentNodeIds": [
            169777*******
          ],
          "filterType": "AND",
          "filterConditionList": [
            {
              "name": "name",
              "operator": "EQ",
              "value": "A"
            }
          ]
        },
        {
          "type": "TARGET_OBS",
          "id": 169777*******,
          "name": "Object Storage",
          "parentNodeIds": [
            169777*******
          ],
          "regionNo": "1",
          "bucketName": "aitems",
          "prefix": "dataflow1",
          "dataType": "CSV",
          "updateType" : "OVERWRITE",
          "fieldList": [
            {
              "name": "id",
              "type": "string"
            },
            {
              "name": "name",
              "type": "string"
            },
            {
              "name": "description",
              "type": "string"
            }
          ]
        }
      ]
}'
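
The nodes array above wires a three-node pipeline: each node has its own id, and every downstream node lists its inputs in parentNodeIds (source → filter → target). As a minimal sketch, the Python snippet below rebuilds the same request body with hypothetical placeholder ids so the wiring is easier to see; the real request uses the numeric node ids shown (masked) in the example.

import json

# Hypothetical node ids for illustration only.
SOURCE_ID = 1001
FILTER_ID = 1002
TARGET_ID = 1003

FIELDS = [{"name": n, "type": "string"} for n in ("id", "name", "description")]

body = {
    "name": "job001",
    "nodes": [
        {   # source node: read CSV objects under aitems/dataflow1
            "type": "SOURCE_OBS", "id": SOURCE_ID, "name": "Object Storage",
            "parentNodeIds": [], "regionNo": "1", "bucketName": "aitems",
            "prefix": "dataflow1", "dataType": "CSV", "fieldList": FIELDS,
        },
        {   # transform node: keep rows where name == "A"; parentNodeIds points at the source
            "type": "TRANSFORM_FILTER", "id": FILTER_ID, "name": "Filters",
            "parentNodeIds": [SOURCE_ID], "filterType": "AND",
            "filterConditionList": [{"name": "name", "operator": "EQ", "value": "A"}],
        },
        {   # target node: overwrite CSV output; parentNodeIds points at the filter
            "type": "TARGET_OBS", "id": TARGET_ID, "name": "Object Storage",
            "parentNodeIds": [FILTER_ID], "regionNo": "1", "bucketName": "aitems",
            "prefix": "dataflow1", "dataType": "CSV", "updateType": "OVERWRITE",
            "fieldList": FIELDS,
        },
    ],
}

print(json.dumps(body, indent=2))  # this JSON is what the curl example sends as --data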

Response

This section describes the response format.

Response body

The response body includes the following data:

Field Type Required Description
jobId String - Job ID
name String - Job name
description String - Job description
type String - Job type
  • DATAFLOW (default)
status String - Job status
  • RUNNABLE | RUNNING | DELETED | UNKNOWN | DRAFT | STOPPED | EDITING
    • RUNNABLE: job execution available
    • RUNNING: job execution in progress
    • DELETED: job being deleted or deletion completed
    • UNKNOWN: other
    • DRAFT: job edit incomplete
    • STOPPED: job being stopped
    • EDITING: job being edited (validation required)
nodes Array - Job node information
runCondition Object - Job execution option
runCondition.workerType String - Worker type
  • DEFAULT (default)
runCondition.numWorker Integer - Number of workers
  • 2 (default)
runCondition.timeout Integer - Execution timeout (minute)
  • Maximum time to wait for the result of a single job execution
  • 0-1440 (default: 360)
runCondition.nrn String - NAVER Cloud Platform resource identification value for job
runCondition.scriptPath String - Job execution script storage path
runCondition.logPath String - Job execution history storage path
createdDate String - Job creation date and time
  • ISO 8601 format (including UTC+9)
updatedDate String - Job modification date and time
  • ISO 8601 format (including UTC+9)

Response status codes

For response status codes common to all Data Flow APIs, see Data Flow API response status codes.

Response example

The response example is as follows:

{
      "jobId": "5Yns7J******",
      "name": "job001",
      "description": "",
      "type": "DATAFLOW",
      "status": "RUNNABLE",
      "nodes": [
        {
          "type": "SOURCE_OBS",
          "id": 169777,
          "name": "Object Storage",
          "parentNodeIds": [],
          "regionNo": "1",
          "bucketName": "aitems",
          "prefix": "dataflow1",
          "dataType": "CSV",
          "fieldList": [
            {
              "name": "id",
              "type": "string",
              "properties": []
            },
            {
              "name": "name",
              "type": "string",
              "properties": []
            },
            {
              "name": "description",
              "type": "string",
              "properties": []
            }
          ]
        },
        {
          "type": "TRANSFORM_FILTER",
          "id": 169777*******,
          "name": "Filters",
          "parentNodeIds": [
            169777*******
          ],
          "filterType": "AND",
          "filterConditionList": [
            {
              "name": "name",
              "operator": "EQ",
              "value": "A"
            }
          ]
        },
        {
          "type": "TARGET_OBS",
          "id": 169777*******,
          "name": "Object Storage",
          "parentNodeIds": [
            169777*******
          ],
          "regionNo": "1",
          "bucketName": "aitems",
          "prefix": "dataflow1",
          "dataType": "CSV",
          "updateType" : "OVERWRITE",
          "fieldList": [
            {
              "name": "id",
              "type": "string",
              "properties": []
            },
            {
              "name": "name",
              "type": "string",
              "properties": []
            },
            {
              "name": "description",
              "type": "string",
              "properties": []
            }
          ]
        }
      ],
      "runCondition": {
        "workerType": "DEFAULT",
        "numWorker": 2,
        "timeout": 2880,
        "nrn": "",
        "scriptPath": "dataflow-2706412-****/scripts/",
        "logPath": "dataflow-2706412-****/sparkHistoryLogs/"
      },
      "createdDate": "2025-03-21T10:02:02+09:00",
      "updatedDate": "2025-03-21T10:02:02+09:00"
}
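
As a small, non-authoritative sketch, the snippet below reads a few of the documented fields from a response like the one above (the raw string stands in for the body returned by the HTTP client) and branches on the status values listed earlier.

import json

# `raw` stands in for the createJob response body shown above.
raw = """{"jobId": "5Yns7J******", "status": "RUNNABLE",
          "runCondition": {"workerType": "DEFAULT", "numWorker": 2, "timeout": 2880}}"""
job = json.loads(raw)

if job["status"] == "RUNNABLE":              # validated and ready to run
    rc = job["runCondition"]
    print(f"Job {job['jobId']} ready: {rc['numWorker']} worker(s), timeout {rc['timeout']} min")
elif job["status"] in ("DRAFT", "EDITING"):  # definition incomplete or not yet validated
    print(f"Job {job['jobId']} needs further editing or validation before it can run")
else:
    print(f"Job {job['jobId']} is in state {job['status']}")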