public class Table extends TableInfo

Objects of this class are immutable. Operations that modify the table like update(com.google.cloud.bigquery.BigQuery.TableOption...) return a new object. To get a Table object with the most recent information use reload(com.google.cloud.bigquery.BigQuery.TableOption...). Table adds a layer of service-related functionality over TableInfo.
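As a minimal sketch of the immutability contract described above (assuming an existing BigQuery client and a table my_dataset.my_table; the identifiers are illustrative, not part of this class's API):

```java
BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
Table table = bigquery.getTable(TableId.of("my_dataset", "my_table"));

// update() does not mutate `table`; it returns a new Table object.
Table updated = table.toBuilder().setDescription("new description").build().update();

// reload() likewise returns a fresh object carrying the latest server-side state,
// leaving `table` and `updated` unchanged.
Table latest = updated.reload();
```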
Modifier and Type | Class and Description
---|---
static class | Table.Builder: A builder for Table objects.
Modifier and Type | Method and Description
---|---
Job | copy(String destinationDataset, String destinationTable, BigQuery.JobOption... options): Starts a BigQuery Job to copy the current table to the provided destination table.
Job | copy(TableId destinationTable, BigQuery.JobOption... options): Starts a BigQuery Job to copy the current table to the provided destination table.
boolean | delete(): Deletes this table.
boolean | equals(Object obj)
boolean | exists(): Checks if this table exists.
Job | extract(String format, List<String> destinationUris, BigQuery.JobOption... options): Starts a BigQuery Job to extract the current table to the provided destination URIs.
Job | extract(String format, String destinationUri, BigQuery.JobOption... options): Starts a BigQuery Job to extract the current table to the provided destination URI.
BigQuery | getBigQuery(): Returns the table's BigQuery object used to issue requests.
int | hashCode()
InsertAllResponse | insert(Iterable<InsertAllRequest.RowToInsert> rows): Inserts rows into the table.
InsertAllResponse | insert(Iterable<InsertAllRequest.RowToInsert> rows, boolean skipInvalidRows, boolean ignoreUnknownValues): Inserts rows into the table.
TableResult | list(BigQuery.TableDataListOption... options): Returns the paginated list of rows in this table.
TableResult | list(Schema schema, BigQuery.TableDataListOption... options): Returns the paginated list of rows in this table.
Job | load(FormatOptions format, List<String> sourceUris, BigQuery.JobOption... options): Starts a BigQuery Job to load data into the current table from the provided source URIs.
Job | load(FormatOptions format, String sourceUri, BigQuery.JobOption... options): Starts a BigQuery Job to load data into the current table from the provided source URI.
Table | reload(BigQuery.TableOption... options): Fetches the table's latest information.
Table.Builder | toBuilder(): Returns a builder for the table object.
Table | update(BigQuery.TableOption... options): Updates the table's information with this table's information.
Methods inherited from class com.google.cloud.bigquery.TableInfo
getCloneDefinition, getCreationTime, getDefaultCollation, getDefinition, getDescription, getEncryptionConfiguration, getEtag, getExpirationTime, getFriendlyName, getGeneratedId, getLabels, getLastModifiedTime, getNumBytes, getNumLongTermBytes, getNumRows, getRequirePartitionFilter, getSelfLink, getTableConstraints, getTableId, newBuilder, of, toString
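The inherited accessors above read the table's locally cached metadata and do not issue requests; for example (a sketch, assuming an existing Table instance named table):

```java
// Read a few inherited TableInfo accessors from the cached metadata.
TableId id = table.getTableId();
System.out.println("Table:   " + id.getDataset() + "." + id.getTable());
System.out.println("Created: " + table.getCreationTime());
System.out.println("Rows:    " + table.getNumRows());
```

Call reload(BigQuery.TableOption...) first if the cached values may be stale.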
public boolean exists()
Checks if this table exists.

Example of checking if the table exists.

boolean exists = table.exists();
if (exists) {
  // the table exists
} else {
  // the table was not found
}

Returns:
true if this table exists, false otherwise
Throws:
BigQueryException - upon failure

public Table reload(BigQuery.TableOption... options)
Fetches the table's latest information. Returns null if the table does not exist.
Example of fetching the table's latest information, specifying particular table fields to get.

TableField field1 = TableField.LAST_MODIFIED_TIME;
TableField field2 = TableField.NUM_ROWS;
Table latestTable = table.reload(TableOption.fields(field1, field2));
if (latestTable == null) {
  // the table was not found
}

Parameters:
options - table options
Returns:
Table object with latest information or null if not found
Throws:
BigQueryException - upon failure

public Table update(BigQuery.TableOption... options)
Updates the table's information with this table's information. A new Table object is returned.
Example of updating the table's information.

Table updatedTable = table.toBuilder().setDescription("new description").build().update();

Parameters:
options - table options
Returns:
Table object with updated information
Throws:
BigQueryException - upon failure

public boolean delete()
Deletes this table.
Example of deleting the table.

boolean deleted = table.delete();
if (deleted) {
  // the table was deleted
} else {
  // the table was not found
}

Returns:
true if the table was deleted, false if it was not found
Throws:
BigQueryException - upon failure

public InsertAllResponse insert(Iterable<InsertAllRequest.RowToInsert> rows) throws BigQueryException
Inserts rows into the table.
Streaming inserts reside temporarily in the streaming buffer, which has different availability characteristics than managed storage. Certain operations do not interact with the streaming buffer, such as list(TableDataListOption...) and copy(TableId, JobOption...). As such, recent streaming data will not be present in the destination table or output.
Example of inserting rows into the table.

String rowId1 = "rowId1";
String rowId2 = "rowId2";
List<RowToInsert> rows = new ArrayList<>();
Map<String, Object> row1 = new HashMap<>();
row1.put("stringField", "value1");
row1.put("booleanField", true);
Map<String, Object> row2 = new HashMap<>();
row2.put("stringField", "value2");
row2.put("booleanField", false);
rows.add(RowToInsert.of(rowId1, row1));
rows.add(RowToInsert.of(rowId2, row2));
InsertAllResponse response = table.insert(rows);
// do something with response

Parameters:
rows - rows to be inserted
Throws:
BigQueryException - upon failure

public InsertAllResponse insert(Iterable<InsertAllRequest.RowToInsert> rows, boolean skipInvalidRows, boolean ignoreUnknownValues) throws BigQueryException
Inserts rows into the table.
Streaming inserts reside temporarily in the streaming buffer, which has different availability characteristics than managed storage. Certain operations do not interact with the streaming buffer, such as list(TableDataListOption...) and copy(TableId, JobOption...). As such, recent streaming data will not be present in the destination table or output.
Example of inserting rows into the table, ignoring invalid rows.

String rowId1 = "rowId1";
String rowId2 = "rowId2";
List<RowToInsert> rows = new ArrayList<>();
Map<String, Object> row1 = new HashMap<>();
row1.put("stringField", 1);
row1.put("booleanField", true);
Map<String, Object> row2 = new HashMap<>();
row2.put("stringField", "value2");
row2.put("booleanField", false);
rows.add(RowToInsert.of(rowId1, row1));
rows.add(RowToInsert.of(rowId2, row2));
InsertAllResponse response = table.insert(rows, true, true);
// do something with response

Parameters:
rows - rows to be inserted
skipInvalidRows - whether to insert all valid rows, even if invalid rows exist. If not set, the entire insert operation will fail if the rows to be inserted contain an invalid row
ignoreUnknownValues - whether to accept rows that contain values that do not match the schema. The unknown values are ignored. If not set, rows with unknown values are considered to be invalid
Throws:
BigQueryException - upon failure

public TableResult list(BigQuery.TableDataListOption... options) throws BigQueryException
Returns the paginated list of rows in this table.
Example of listing rows in the table.

// This example reads the result 100 rows per RPC call. If there's no need to limit the number,
// simply omit the option.
Page<FieldValueList> page = table.list(TableDataListOption.pageSize(100));
for (FieldValueList row : page.iterateAll()) {
  // do something with the row
}

Parameters:
options - table data list options
Throws:
BigQueryException - upon failure

public TableResult list(Schema schema, BigQuery.TableDataListOption... options) throws BigQueryException
Returns the paginated list of rows in this table.
Example of listing rows in the table given a schema.

Schema schema = ...;
String field = "my_field";
Page<FieldValueList> page = table.list(schema);
for (FieldValueList row : page.iterateAll()) {
  row.get(field);
}

Parameters:
options - table data list options
Throws:
BigQueryException - upon failure

public Job copy(String destinationDataset, String destinationTable, BigQuery.JobOption... options) throws BigQueryException
Starts a BigQuery Job to copy the current table to the provided destination table. Returns the started Job object.
Example of copying the table to a destination table.

String datasetName = "my_dataset";
String tableName = "my_destination_table";
Job job = table.copy(datasetName, tableName);
// Wait for the job to complete.
try {
  Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
      RetryOption.totalTimeout(Duration.ofMinutes(3)));
  if (completedJob != null && completedJob.getStatus().getError() == null) {
    // Job completed successfully
  } else {
    // Handle error case
  }
} catch (InterruptedException e) {
  // Handle interrupted wait
}

Parameters:
destinationDataset - the user-defined id of the destination dataset
destinationTable - the user-defined id of the destination table
options - job options
Throws:
BigQueryException - upon failure

public Job copy(TableId destinationTable, BigQuery.JobOption... options) throws BigQueryException
Starts a BigQuery Job to copy the current table to the provided destination table. Returns the started Job object.
Example of copying the table to a destination table.

String dataset = "my_dataset";
String tableName = "my_destination_table";
TableId destinationId = TableId.of(dataset, tableName);
JobOption options = JobOption.fields(JobField.STATUS, JobField.USER_EMAIL);
Job job = table.copy(destinationId, options);
// Wait for the job to complete.
try {
  Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
      RetryOption.totalTimeout(Duration.ofMinutes(3)));
  if (completedJob != null && completedJob.getStatus().getError() == null) {
    // Job completed successfully.
  } else {
    // Handle error case.
  }
} catch (InterruptedException e) {
  // Handle interrupted wait
}

Parameters:
destinationTable - the destination table of the copy job
options - job options
Throws:
BigQueryException - upon failure

public Job extract(String format, String destinationUri, BigQuery.JobOption... options) throws BigQueryException
Starts a BigQuery Job to extract the current table to the provided destination URI. Returns the started Job object.
Example of extracting data to a single Google Cloud Storage file.

String format = "CSV";
String gcsUrl = "gs://my_bucket/filename.csv";
Job job = table.extract(format, gcsUrl);
// Wait for the job to complete
try {
  Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
      RetryOption.totalTimeout(Duration.ofMinutes(3)));
  if (completedJob != null && completedJob.getStatus().getError() == null) {
    // Job completed successfully
  } else {
    // Handle error case
  }
} catch (InterruptedException e) {
  // Handle interrupted wait
}

Parameters:
format - the format of the extracted data
destinationUri - the fully-qualified Google Cloud Storage URI (e.g. gs://bucket/path) where the extracted table should be written
options - job options
Throws:
BigQueryException - upon failure

public Job extract(String format, List<String> destinationUris, BigQuery.JobOption... options) throws BigQueryException
Starts a BigQuery Job to extract the current table to the provided destination URIs. Returns the started Job object.
Example of partitioning data to a list of Google Cloud Storage files.

String format = "CSV";
String gcsUrl1 = "gs://my_bucket/PartitionA_*.csv";
String gcsUrl2 = "gs://my_bucket/PartitionB_*.csv";
List<String> destinationUris = new ArrayList<>();
destinationUris.add(gcsUrl1);
destinationUris.add(gcsUrl2);
Job job = table.extract(format, destinationUris);
// Wait for the job to complete
try {
  Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
      RetryOption.totalTimeout(Duration.ofMinutes(3)));
  if (completedJob != null && completedJob.getStatus().getError() == null) {
    // Job completed successfully
  } else {
    // Handle error case
  }
} catch (InterruptedException e) {
  // Handle interrupted wait
}

Parameters:
format - the format of the extracted data
destinationUris - the fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) where the extracted table should be written
options - job options
Throws:
BigQueryException - upon failure

public Job load(FormatOptions format, String sourceUri, BigQuery.JobOption... options) throws BigQueryException
Starts a BigQuery Job to load data into the current table from the provided source URI. Returns the started Job object.
Example of loading data from a single Google Cloud Storage file.

String sourceUri = "gs://my_bucket/filename.csv";
Job job = table.load(FormatOptions.csv(), sourceUri);
// Wait for the job to complete
try {
  Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
      RetryOption.totalTimeout(Duration.ofMinutes(3)));
  if (completedJob != null && completedJob.getStatus().getError() == null) {
    // Job completed successfully
  } else {
    // Handle error case
  }
} catch (InterruptedException e) {
  // Handle interrupted wait
}

Parameters:
format - the format of the data to load
sourceUri - the fully-qualified Google Cloud Storage URI (e.g. gs://bucket/path) from which to load the data
options - job options
Throws:
BigQueryException - upon failure

public Job load(FormatOptions format, List<String> sourceUris, BigQuery.JobOption... options) throws BigQueryException
Starts a BigQuery Job to load data into the current table from the provided source URIs. Returns the started Job object.
Example of loading data from a list of Google Cloud Storage files.

String gcsUrl1 = "gs://my_bucket/filename1.csv";
String gcsUrl2 = "gs://my_bucket/filename2.csv";
List<String> sourceUris = new ArrayList<>();
sourceUris.add(gcsUrl1);
sourceUris.add(gcsUrl2);
Job job = table.load(FormatOptions.csv(), sourceUris);
// Wait for the job to complete
try {
  Job completedJob = job.waitFor(RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
      RetryOption.totalTimeout(Duration.ofMinutes(3)));
  if (completedJob != null && completedJob.getStatus().getError() == null) {
    // Job completed successfully
  } else {
    // Handle error case
  }
} catch (InterruptedException e) {
  // Handle interrupted wait
}

Parameters:
format - the format of the data to load
sourceUris - the fully-qualified Google Cloud Storage URIs (e.g. gs://bucket/path) from which to load the data
options - job options
Throws:
BigQueryException - upon failure

public BigQuery getBigQuery()
Returns the table's BigQuery object used to issue requests.

public Table.Builder toBuilder()
Returns a builder for the table object.
Overrides:
toBuilder in class TableInfo
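For illustration, the client returned by getBigQuery() can be reused for related requests instead of constructing a new one (a sketch; the dataset lookup shown is an illustrative assumption, not part of this class):

```java
// Reuse the table's own client for a related request.
BigQuery bigquery = table.getBigQuery();
Dataset dataset = bigquery.getDataset(table.getTableId().getDataset());
```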
Copyright © 2023 Google LLC. All rights reserved.