Available in VPC
Get table details.
Request
This section describes the request format. The method and URI are as follows:
Method | URI |
---|---|
GET | /api/v1/catalogs/{catalogId}/databases/{databaseName}/tables/{tableName} |
Request headers
For information about the headers common to all Data Catalog APIs, see Data Catalog request headers.
Request path parameters
You can use the following path parameters with your request:
Field | Type | Required | Description |
---|---|---|---|
catalogId | Integer | Required | Catalog ID |
databaseName | String | Required | Database name |
tableName | String | Required | Table name |
Request example
The request example is as follows:
```bash
curl --location --request GET 'https://datacatalog.apigw.ntruss.com/api/v1/catalogs/4**/databases/my_database/tables/employeelist' \
--header 'x-ncp-apigw-timestamp: {Timestamp}' \
--header 'x-ncp-iam-access-key: {Access Key}' \
--header 'x-ncp-apigw-signature-v2: {API Gateway Signature}'
```
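The same request can also be issued programmatically. The following is a minimal Python sketch, not a definitive implementation: it assumes the common API Gateway signing rule of a base64-encoded HMAC-SHA256 over `{method} {uri}\n{timestamp}\n{access key}` with the secret key, and it uses the third-party `requests` package. The key values and the `get_table` helper name are placeholders; confirm the exact signing rules in the Data Catalog request headers documentation.

```python
# Minimal sketch of calling this endpoint from Python.
# Assumption: the x-ncp-apigw-signature-v2 header is a base64-encoded HMAC-SHA256
# signature of "{method} {uri}\n{timestamp}\n{access key}" made with the secret key.
# Verify against the Data Catalog request headers guide before relying on this.
import base64
import hashlib
import hmac
import time

import requests

API_HOST = "https://datacatalog.apigw.ntruss.com"
ACCESS_KEY = "YOUR_ACCESS_KEY"   # placeholder
SECRET_KEY = "YOUR_SECRET_KEY"   # placeholder


def get_table(catalog_id: int, database_name: str, table_name: str) -> dict:
    uri = f"/api/v1/catalogs/{catalog_id}/databases/{database_name}/tables/{table_name}"
    timestamp = str(int(time.time() * 1000))  # epoch time in milliseconds

    # Sign "{method} {uri}\n{timestamp}\n{access key}" with the secret key.
    message = f"GET {uri}\n{timestamp}\n{ACCESS_KEY}"
    signature = base64.b64encode(
        hmac.new(SECRET_KEY.encode(), message.encode(), hashlib.sha256).digest()
    ).decode()

    response = requests.get(
        API_HOST + uri,
        headers={
            "x-ncp-apigw-timestamp": timestamp,
            "x-ncp-iam-access-key": ACCESS_KEY,
            "x-ncp-apigw-signature-v2": signature,
        },
    )
    response.raise_for_status()
    return response.json()
```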
Response
This section describes the response format.
Response body
The response body includes the following data:
Field | Type | Required | Description |
---|---|---|---|
pageNo | Integer | - | Page number |
pageSize | Integer | - | Number of items per page |
totalCount | Integer | - | Number of response results |
requestId | String | - | Request ID |
tableResponseList | Array | - | Table information |
tableResponseList
The following describes tableResponseList
.
Field | Type | Required | Description |
---|---|---|---|
catalogId | Integer | - | Catalog ID |
databaseName | String | - | Database name |
tableName | String | - | Table name |
dataFormat | String | - | Data format |
location | String | - | Table location |
description | String | - | Table description |
createTime | String | - | Table creation date and time |
updateTime | String | - | Update date and time |
properties | Object | - | Table property information |
properties.EXTERNAL | String | - | External storage status of the table |
properties.compressionType | String | - | Compressed file extension |
properties.clusterNo | String | - | DB service number |
properties.connectionId | String | - | Connection ID |
properties.connectionName | String | - | Connection name |
properties.created_time | String | - | Table creation date and time |
properties.dataFormat | String | - | Data format |
properties.dataType | String | - | Data type |
properties.delimiter | String | - | Delimiter |
properties.inputFormat | String | - | Data read format |
properties.isDirectory | String | - | Whether the scan target is a directory |
properties.last_modified_time | String | - | Update date and time |
properties.metadata_location | String | - | Metadata file path |
properties.numFiles | String | - | Total number of scanned files |
properties.objectstorageContentLength | String | - | Object length (bytes) |
properties.objectstorageContentType | String | - | Object data type |
properties.objectstorageLastModified | String | - | Object update date and time |
properties.outputFormat | String | - | Data output format |
properties.partitioningScheme | String | - | Partitioning scheme |
properties.scannerId | String | - | Scanner ID |
properties.scannerName | String | - | Scanner name |
properties.serializationLib | String | - | Serialization and deserialization (SerDe) library |
properties.skip.header.line.count | String | - | Number of header lines to skip |
properties.totalSize | String | - | Total size of scanned data (bytes) |
properties.transient_lastDdlTime | String | - | Table DDL update date and time |
properties.serde.escapeChar | String | - | Escape character used when parsing data |
properties.serde.quoteChar | String | - | Quote character used when parsing data |
properties.serde.separatorChar | String | - | Field separator used when parsing data |
The property information displayed may vary depending on the table type.
- For more information about the properties of Apache Iceberg tables, see Iceberg Table Metadata.
For RDB data
The following describes the additional properties displayed for RDB (MySQL, MSSQL, PostgreSQL) data. A small sketch for reading these keys follows the table.
Field | Type | Required | Description |
---|---|---|---|
properties.{dbType}Collation | String | - | String collation setting |
properties.{dbType}DataSize | String | - | Data size |
properties.{dbType}IndexSize | String | - | Index size |
properties.{dbType}Indexes | String | - | Index structure |
properties.{dbType}Rows | String | - | Number of rows (records) |
properties.{dbType}TableSize | String | - | Total table size |
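Because these property keys embed the database type, a client typically builds the key names dynamically. A minimal sketch, assuming the `properties` object from the response and a prefix value such as `mysql` (the exact prefix strings and casing are not listed in this document, so treat them as assumptions):

```python
# Hypothetical helper for reading the {dbType}-prefixed RDB properties.
# The db_type prefix (e.g., "mysql", "mssql", "postgresql") is an assumption;
# check a real response to confirm the exact key names the API returns.
def read_rdb_stats(properties: dict, db_type: str) -> dict:
    keys = ["Collation", "DataSize", "IndexSize", "Indexes", "Rows", "TableSize"]
    return {key: properties.get(f"{db_type}{key}") for key in keys}


# Example: read_rdb_stats(table["properties"], "mysql")
```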
For MongoDB data
The following describes the additional properties displayed for MongoDB data.
Field | Type | Required | Description |
---|---|---|---|
properties.mongodbAvgObjSize | String | - | Collection average object size |
properties.mongodbFreeStorageSize | String | - | Collection free storage capacity |
properties.mongodbIndexSize | String | - | Collection index size |
properties.mongodbIndexes | String | - | Collection index information |
properties.mongodbRowCount | String | - | Number of collection rows (records) |
properties.mongodbSize | String | - | Collection size |
properties.mongodbStorageSize | String | - | Storage size allocated to the collection |
properties.mongodbTotalSize | String | - | Total disk capacity of the collection |
Response status codes
For response status codes common to all Data Catalog APIs, see Data Catalog response status codes.
Response example
The response example is as follows:
```json
{
    "catalogId": 4**,
    "databaseName": "default",
    "tableName": "employeelist",
    "dataFormat": "csv",
    "location": "s3a://datacatalog-c***-e******f/my_database/employeeList/",
    "createTime": "2025-03-18T09:34:52+0900",
    "updateTime": "2025-03-18T09:42:28+0900",
    "properties": {
        "EXTERNAL": "TRUE",
        "compressionType": "",
        "created_time": "1742258092",
        "dataFormat": "csv",
        "dataType": "file",
        "inputFormat": "org.apache.hadoop.mapred.TextInputFormat",
        "isDirectory": "TRUE",
        "last_modified_time": "1742258548",
        "numFiles": "1",
        "objectstorageContentLength": "5095",
        "objectstorageContentType": "text/csv",
        "objectstorageLastModified": "1742257929",
        "outputFormat": "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat",
        "scannerId": "9**",
        "scannerName": "employee_list",
        "serializationLib": "org.apache.hadoop.hive.serde2.OpenCSVSerde",
        "skip.header.line.count": "1",
        "totalSize": "5095",
        "transient_lastDdlTime": "1742258092",
        "serde.escapeChar": "\\",
        "serde.quoteChar": "\"",
        "serde.separatorChar": ","
    }
}
```
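As a usage note, the sketch below continues the earlier Python example and reads a few fields from a payload shaped like the one above. The catalog ID, database name, and table name arguments are placeholders, and property values are returned as strings, so numeric fields need explicit conversion.

```python
# Continuing the earlier get_table() sketch; the arguments are placeholders.
table = get_table(400, "my_database", "employeelist")

print(table["tableName"], table["dataFormat"], table["location"])

# Property values are strings, so numeric fields need explicit conversion.
properties = table.get("properties", {})
total_size_bytes = int(properties.get("totalSize", "0"))
skipped_header_lines = int(properties.get("skip.header.line.count", "0"))
```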