Available in VPC
Get details of a job.
Request
This section describes the request format. The method and URI are as follows:
Method | URI |
---|---|
GET | /api/v1/jobs/{jobId} |
Request headers
For information about the headers common to all Data Flow APIs, see Data Flow request headers.
Request path parameters
You can use the following path parameters with your request:
Field | Type | Required | Description |
---|---|---|---|
jobId | String | Required | Job ID |
Request example
The request example is as follows:
curl --location --request GET 'https://dataflow.apigw.ntruss.com/api/v1/jobs/gqigvH******' \
--header 'x-ncp-apigw-timestamp: {Timestamp}' \
--header 'x-ncp-iam-access-key: {Access Key}' \
--header 'x-ncp-apigw-signature-v2: {API Gateway Signature}'
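The same request can also be sent from code. The following Python sketch builds the authentication headers under the assumption that the standard API Gateway signature-v2 scheme applies (Base64-encoded HMAC-SHA256 over "{method} {uri}\n{timestamp}\n{access key}"); the credential placeholders, the secret key variable, and the `requests` client are illustrative and not part of this API's specification.

import base64
import hashlib
import hmac
import time

import requests  # assumed HTTP client; any client can be used

# Illustrative placeholders: replace with your own credentials and job ID.
ACCESS_KEY = "{Access Key}"
SECRET_KEY = "{Secret Key}"
JOB_ID = "gqigvH******"

def make_signature(method: str, uri: str, timestamp: str) -> str:
    # Assumed signature v2: Base64(HMAC-SHA256(secret, "{method} {uri}\n{timestamp}\n{accessKey}"))
    message = f"{method} {uri}\n{timestamp}\n{ACCESS_KEY}"
    digest = hmac.new(SECRET_KEY.encode("utf-8"), message.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

uri = f"/api/v1/jobs/{JOB_ID}"
timestamp = str(int(time.time() * 1000))  # current time in milliseconds

headers = {
    "x-ncp-apigw-timestamp": timestamp,
    "x-ncp-iam-access-key": ACCESS_KEY,
    "x-ncp-apigw-signature-v2": make_signature("GET", uri, timestamp),
}

response = requests.get("https://dataflow.apigw.ntruss.com" + uri, headers=headers)
print(response.status_code)
print(response.json())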
Response
This section describes the response format.
Response body
The response body includes the following data:
Field | Type | Required | Description |
---|---|---|---|
jobId | String | - | Job ID |
name | String | - | Job name |
description | String | - | Job description |
type | String | - | Job type |
status | String | - | Job status |
lastExecutionStatus | String | - | Last job execution status |
nodes | Array | - | Job node information |
runCondition | Object | - | Job execution option |
runCondition.workerType | String | - | Worker type |
runCondition.numWorker | Integer | - | Number of workers |
runCondition.timeout | Integer | - | Execution timeout (minutes) |
runCondition.nrn | String | - | NAVER Cloud Platform resource identification value for the job |
runCondition.scriptPath | String | - | Job execution script storage path |
runCondition.logPath | String | - | Job execution history storage path |
createdDate | String | - | Job creation date and time |
updatedDate | String | - | Job modification date and time |
lastExecutionDate | String | - | Last job execution date and time |
Response status codes
For response status codes common to all Data Flow APIs, see Data Flow API response status codes.
Response example
The response example is as follows:
{
"jobId": "gqigvH******",
"name": "job001",
"description": "",
"type": "DATAFLOW",
"status": "RUNNABLE",
"nodes": [
{
"type": "SOURCE_OBS",
"id": 169777*******,
"name": "Object Storage",
"parentNodeIds": [],
"regionNo": "1",
"bucketName": "aitems",
"prefix": "dataflow1",
"dataType": "CSV",
"fieldList": [
{
"name": "id",
"type": "string",
"properties": []
},
{
"name": "name",
"type": "string",
"properties": []
},
{
"name": "description",
"type": "string",
"properties": []
}
]
},
{
"type": "TRANSFORM_FILTER",
"id": 169777*******,
"name": "Filters",
"parentNodeIds": [
169777*******
],
"filterType": "AND",
"filterConditionList": [
{
"name": "name",
"operator": "EQ",
"value": "A"
}
]
},
{
"type": "TARGET_OBS",
"id": 169777*******,
"name": "Object Storage",
"parentNodeIds": [
169777*******
],
"regionNo": "1",
"bucketName": "aitems",
"prefix": "dataflow1",
"dataType": "CSV",
"updateType" : "OVERWRITE",
"fieldList": [
{
"name": "id",
"type": "string",
"properties": []
},
{
"name": "name",
"type": "string",
"properties": []
},
{
"name": "description",
"type": "string",
"properties": []
}
]
}
],
"runCondition": {
"workerType": "DEFAULT",
"numWorker": 2,
"timeout": 2880,
"nrn": "",
"scriptPath": "dataflow-****/scripts/",
"logPath": "dataflow-****/sparkHistoryLogs/"
},
"createdDate": "2025-03-19T15:03:42+09:00",
"updatedDate": "2025-03-20T13:07:34+09:00",
"lastExecutionDate": "2025-03-20T13:06:05+09:00"
}
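For reference, the following Python sketch reads a parsed response body like the one above into the fields described in the response body table; the `job` variable is assumed to hold the decoded JSON (for example, `response.json()` from the request sketch earlier).

# `job` is assumed to hold the parsed response body, e.g. from the request sketch above.
job = response.json()

# Top-level job metadata.
print(f"{job['jobId']} ({job['name']}): status={job['status']}, type={job['type']}")

# Each node lists its parent node IDs, so the job graph can be walked directly.
for node in job.get("nodes", []):
    parents = ", ".join(str(p) for p in node.get("parentNodeIds", [])) or "none"
    print(f"- {node['type']} (id={node['id']}, parents: {parents})")

# Execution options are nested under runCondition.
run = job.get("runCondition", {})
print(f"workers: {run.get('numWorker')} x {run.get('workerType')}, timeout: {run.get('timeout')} min")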