Available in VPC
Edit job execution options.
Request
This section describes the request format. The method and URI are as follows:
Method | URI |
---|---|
PUT | /api/v1/jobs/{jobId}/executions |
Request headers
For information about the headers common to all Data Flow APIs, see Data Flow request headers.
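The `x-ncp-apigw-signature-v2` header is typically an HMAC-SHA256 signature over the request method, URI, timestamp, and access key. The following is a minimal sketch based on the commonly documented API Gateway v2 signing scheme; confirm the exact message format against the official signature guide before use:

```python
import base64
import hashlib
import hmac
import time

def make_signature(method: str, uri: str, timestamp: str,
                   access_key: str, secret_key: str) -> str:
    """Build an x-ncp-apigw-signature-v2 value (assumed v2 scheme):
    HMAC-SHA256 over "{method} {uri}\\n{timestamp}\\n{access_key}",
    base64-encoded."""
    message = f"{method} {uri}\n{timestamp}\n{access_key}"
    digest = hmac.new(secret_key.encode("utf-8"),
                      message.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Millisecond timestamp for the x-ncp-apigw-timestamp header.
timestamp = str(int(time.time() * 1000))
uri = "/api/v1/jobs/gqigvH******/executions"
# "{Access Key}" / "{Secret Key}" are placeholders for your credentials.
signature = make_signature("PUT", uri, timestamp, "{Access Key}", "{Secret Key}")
```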
Request path parameters
You can use the following path parameters with your request:
Field | Type | Required | Description |
---|---|---|---|
jobId | String | Required | Job ID |
Request body
You can include the following data in the body of your request:
Field | Type | Required | Description |
---|---|---|---|
jobCondition | Object | Required | Job execution option information |
jobCondition.workerType | String | Optional | Worker type |
jobCondition.numWorker | Integer | Optional | Number of workers |
jobCondition.timeout | Integer | Optional | Execution timeout (minutes) |
jobCondition.nrn | String | Required | NAVER Cloud Platform resource identification value for job |
jobCondition.scriptPath | String | Required | Job execution script storage path |
jobCondition.logPath | String | Required | Job execution history storage path |
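Per the table above, `jobCondition.nrn`, `jobCondition.scriptPath`, and `jobCondition.logPath` are required while the remaining fields are optional. A hypothetical client-side check of those required fields before sending the request could look like this (the helper name is an illustration, not part of the API):

```python
# Required fields of jobCondition, as listed in the request body table.
REQUIRED = ("nrn", "scriptPath", "logPath")

def validate_job_condition(body: dict) -> list[str]:
    """Return the missing required fields (empty list if the body is valid)."""
    condition = body.get("jobCondition")
    if not isinstance(condition, dict):
        return ["jobCondition"]
    return [f"jobCondition.{f}" for f in REQUIRED if not condition.get(f)]

body = {
    "jobCondition": {
        "workerType": "DEFAULT",          # optional
        "numWorker": 2,                   # optional
        "timeout": 360,                   # optional, in minutes
        "nrn": "nrn:PUB:IAM::...:Role/...",       # placeholder value
        "scriptPath": "dataflow-.../scripts/",     # placeholder path
        "logPath": "dataflow-.../sparkHistoryLogs/",
    }
}
assert validate_job_condition(body) == []
```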
Request example
The request example is as follows:
curl --location --request PUT 'https://dataflow.apigw.ntruss.com/api/v1/jobs/gqigvH******/executions' \
--header 'x-ncp-apigw-timestamp: {Timestamp}' \
--header 'x-ncp-iam-access-key: {Access Key}' \
--header 'x-ncp-apigw-signature-v2: {API Gateway Signature}' \
--header 'Content-Type: application/json' \
--data '{
"jobCondition": {
"workerType": "DEFAULT",
"numWorker": 2,
"timeout": 360,
"nrn": "nrn:PUB:IAM::*******:Role/********-0496-11f0-baf6-246e96591a38",
"scriptPath": "dataflow-2706412-****/scripts/",
"logPath": "dataflow-2706412-****/sparkHistoryLogs/"
}
}'
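The curl request above can also be sketched with Python's standard library; header values remain the same placeholders you must fill in yourself:

```python
import json
import urllib.request

job_id = "gqigvH******"  # placeholder job ID from the example above
url = f"https://dataflow.apigw.ntruss.com/api/v1/jobs/{job_id}/executions"

payload = {
    "jobCondition": {
        "workerType": "DEFAULT",
        "numWorker": 2,
        "timeout": 360,
        "nrn": "nrn:PUB:IAM::*******:Role/********-0496-11f0-baf6-246e96591a38",
        "scriptPath": "dataflow-2706412-****/scripts/",
        "logPath": "dataflow-2706412-****/sparkHistoryLogs/",
    }
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    method="PUT",
    headers={
        "x-ncp-apigw-timestamp": "{Timestamp}",
        "x-ncp-iam-access-key": "{Access Key}",
        "x-ncp-apigw-signature-v2": "{API Gateway Signature}",
        "Content-Type": "application/json",
    },
)
# response = urllib.request.urlopen(req)  # uncomment to actually send the PUT
```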
Response
This section describes the response format.
Response body
The response body includes the following data:
Field | Type | Required | Description |
---|---|---|---|
jobId | String | - | Job ID |
name | String | - | Job name |
description | String | - | Job description |
type | String | - | Job type |
status | String | - | Job status |
nodes | Array | - | Job node information |
runCondition | Object | - | Job execution option |
runCondition.workerType | String | - | Worker type |
runCondition.numWorker | Integer | - | Number of workers |
runCondition.timeout | Integer | - | Execution timeout (minutes) |
runCondition.nrn | String | - | NAVER Cloud Platform resource identification value for job |
runCondition.scriptPath | String | - | Job execution script storage path |
runCondition.logPath | String | - | Job execution history storage path |
createdDate | String | - | Job creation date and time |
updatedDate | String | - | Job modification date and time |
Response status codes
For response status codes common to all Data Flow APIs, see Data Flow API response status codes.
Response example
The response example is as follows:
{
"jobId": "gqigvH******",
"name": "job001",
"description": "",
"type": "DATAFLOW",
"status": "RUNNABLE",
"nodes": [
{
"type": "SOURCE_OBS",
"id": 169777*******,
"name": "Object Storage",
"parentNodeIds": [],
"regionNo": "1",
"bucketName": "aitems",
"prefix": "dataflow1",
"dataType": "CSV",
"fieldList": [
{
"name": "id",
"type": "string",
"properties": []
},
{
"name": "name",
"type": "string",
"properties": []
},
{
"name": "description",
"type": "string",
"properties": []
}
]
},
{
"type": "TRANSFORM_FILTER",
"id": 169777*******,
"name": "Filters",
"parentNodeIds": [
169777*******6
],
"filterType": "AND",
"filterConditionList": [
{
"name": "name",
"operator": "EQ",
"value": "A"
}
]
},
{
"type": "TARGET_OBS",
"id": 169777*******,
"name": "Object Storage",
"parentNodeIds": [
169777*******
],
"regionNo": "1",
"bucketName": "aitems",
"prefix": "dataflow1",
"dataType": "CSV",
"updateType": "OVERWRITE",
"fieldList": [
{
"name": "id",
"type": "string",
"properties": []
},
{
"name": "name",
"type": "string",
"properties": []
},
{
"name": "description",
"type": "string",
"properties": []
}
]
}
],
"runCondition": {
"workerType": "DEFAULT",
"numWorker": 2,
"timeout": 2880,
"nrn": "",
"scriptPath": "dataflow-****/scripts/",
"logPath": "dataflow-****/sparkHistoryLogs/"
},
"createdDate": "2025-03-19T15:03:42+09:00",
"updatedDate": "2025-03-20T13:07:34+09:00"
}
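In the response, each entry in `nodes` links to its upstream nodes through `parentNodeIds`, so the array describes the job's source-transform-target pipeline. A sketch of walking that structure, using a trimmed version of the example above with shortened placeholder IDs:

```python
import json

# Trimmed response body; node IDs shortened to 1/2/3 for illustration.
response_text = """
{
  "jobId": "gqigvH******",
  "status": "RUNNABLE",
  "nodes": [
    {"type": "SOURCE_OBS", "id": 1, "name": "Object Storage", "parentNodeIds": []},
    {"type": "TRANSFORM_FILTER", "id": 2, "name": "Filters", "parentNodeIds": [1]},
    {"type": "TARGET_OBS", "id": 3, "name": "Object Storage", "parentNodeIds": [2]}
  ],
  "runCondition": {"workerType": "DEFAULT", "numWorker": 2, "timeout": 2880}
}
"""

job = json.loads(response_text)
by_id = {n["id"]: n for n in job["nodes"]}

# Print each node's type together with the types of its parent nodes.
for node in job["nodes"]:
    parents = [by_id[p]["type"] for p in node["parentNodeIds"]]
    print(f'{node["type"]} <- {parents or "source"}')
```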