Constructor

new Table(dataset, id[, options])

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');

const table = dataset.table('my-table');

Parameters

Name Type Optional Description

dataset

Dataset

 

Dataset instance.

id

string

 

The ID of the table.

options

object

Yes

Table options.

Values in options have the following properties:

Name Type Optional Description

location

string

Yes

The geographic location of the table. By default, this value is inherited from the dataset. This can be used to configure the location of all jobs created through a table instance. It cannot be used to set the actual location of the table. This value will be superseded by any API responses containing location data for the table.

Property

createReadStream

Create a readable stream of the rows of data in your table. This method is simply a wrapper around Table#getRows.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

table.createReadStream(options)
  .on('error', console.error)
  .on('data', row => {})
  .on('end', function() {
    // All rows have been retrieved.
  });

//-
// If you anticipate many results, you can end a stream early to prevent
// unnecessary processing and API requests.
//-
table.createReadStream()
  .on('data', function(row) {
    this.end();
  });
See also

Tabledata: list API Documentation

Returns

ReadableStream 

Methods

copy(destination[, metadata][, callback]) → Promise

Copy data from one table to another, optionally creating that table.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');

const table = dataset.table('my-table');
const yourTable = dataset.table('your-table');

table.copy(yourTable, (err, apiResponse) => {});

//-
// See the <a href="http://goo.gl/dKWIyS">`configuration.copy`</a> object for
// all available options.
//-
const metadata = {
  createDisposition: 'CREATE_NEVER',
  writeDisposition: 'WRITE_TRUNCATE'
};

table.copy(yourTable, metadata, (err, apiResponse) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.copy(yourTable, metadata).then((data) => {
  const apiResponse = data[0];
});

Parameters

Name Type Optional Description

destination

Table

 

The destination table.

metadata

object

Yes

Metadata to set with the copy operation. The metadata object should be in the format of the configuration.copy property of a Jobs resource.

Values in metadata have the following properties:

Name Type Optional Description

jobId

string

Yes

Custom id for the underlying job.

jobPrefix

string

Yes

Prefix to apply to the underlying job id.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

apiResponse

object

 

The full API response.

Throws

Error 

If a destination other than a Table object is provided.

Returns

Promise 

copyFrom(sourceTables[, metadata][, callback]) → Promise

Copy data from multiple tables into this table.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

const sourceTables = [
  dataset.table('your-table'),
  dataset.table('your-second-table')
];

table.copyFrom(sourceTables, (err, apiResponse) => {});

//-
// See the <a href="http://goo.gl/dKWIyS">`configuration.copy`</a> object for
// all available options.
//-
const metadata = {
  createDisposition: 'CREATE_NEVER',
  writeDisposition: 'WRITE_TRUNCATE'
};

table.copyFrom(sourceTables, metadata, (err, apiResponse) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.copyFrom(sourceTables, metadata).then((data) => {
  const apiResponse = data[0];
});

Parameters

Name Type Optional Description

sourceTables

(Table or Array of Table)

 

The source table(s) to copy data from.

metadata

object

Yes

Metadata to set with the copy operation. The metadata object should be in the format of the configuration.copy property of a Jobs resource.

Values in metadata have the following properties:

Name Type Optional Description

jobId

string

Yes

Custom id for the underlying job.

jobPrefix

string

Yes

Prefix to apply to the underlying job id.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

apiResponse

object

 

The full API response.

Throws

Error 

If a source other than a Table object is provided.

Returns

Promise 

create([options][, callback]) → Promise

Create a table.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');

const table = dataset.table('my-table');

table.create((err, table, apiResponse) => {
  if (!err) {
    // The table was created successfully.
  }
});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.create().then((data) => {
  const table = data[0];
  const apiResponse = data[1];
});

Parameters

Name Type Optional Description

options

object

Yes

See Dataset#createTable.

callback

function()

Yes

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

table

Table

 

The new Table.

apiResponse

object

 

The full API response.

Returns

Promise 

createCopyFromJob(sourceTables[, metadata][, callback]) → Promise

Copy data from multiple tables into this table.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

const sourceTables = [
  dataset.table('your-table'),
  dataset.table('your-second-table')
];

const callback = (err, job, apiResponse) => {
  // `job` is a Job object that can be used to check the status of the
  // request.
};

table.createCopyFromJob(sourceTables, callback);

//-
// See the <a href="http://goo.gl/dKWIyS">`configuration.copy`</a> object for
// all available options.
//-
const metadata = {
  createDisposition: 'CREATE_NEVER',
  writeDisposition: 'WRITE_TRUNCATE'
};

table.createCopyFromJob(sourceTables, metadata, callback);

//-
// If the callback is omitted, we'll return a Promise.
//-
table.createCopyFromJob(sourceTables, metadata).then((data) => {
  const job = data[0];
  const apiResponse = data[1];
});

Parameters

Name Type Optional Description

sourceTables

(Table or Array of Table)

 

The source table(s) to copy data from.

metadata

object

Yes

Metadata to set with the copy operation. The metadata object should be in the format of the configuration.copy property of a Jobs resource.

Values in metadata have the following properties:

Name Type Optional Description

jobId

string

Yes

Custom job id.

jobPrefix

string

Yes

Prefix to apply to the job id.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

job

Job

 

The job used to copy your table.

apiResponse

object

 

The full API response.

See also

Jobs: insert API Documentation

Throws

Error 

If a source other than a Table object is provided.

Returns

Promise 

createCopyJob(destination[, metadata][, callback]) → Promise

Copy data from one table to another, optionally creating that table.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

const yourTable = dataset.table('your-table');
table.createCopyJob(yourTable, (err, job, apiResponse) => {
  // `job` is a Job object that can be used to check the status of the
  // request.
});

//-
// See the <a href="http://goo.gl/dKWIyS">`configuration.copy`</a> object for
// all available options.
//-
const metadata = {
  createDisposition: 'CREATE_NEVER',
  writeDisposition: 'WRITE_TRUNCATE'
};

table.createCopyJob(yourTable, metadata, (err, job, apiResponse) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.createCopyJob(yourTable, metadata).then((data) => {
  const job = data[0];
  const apiResponse = data[1];
});

Parameters

Name Type Optional Description

destination

Table

 

The destination table.

metadata

object

Yes

Metadata to set with the copy operation. The metadata object should be in the format of the configuration.copy property of a Jobs resource.

Values in metadata have the following properties:

Name Type Optional Description

jobId

string

Yes

Custom job id.

jobPrefix

string

Yes

Prefix to apply to the job id.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

job

Job

 

The job used to copy your table.

apiResponse

object

 

The full API response.

See also

Jobs: insert API Documentation

Throws

Error 

If a destination other than a Table object is provided.

Returns

Promise 

createExtractJob(destination[, options][, callback]) → Promise

Export table to Cloud Storage.

Example

const {Storage} = require('@google-cloud/storage');
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

const storage = new Storage({
  projectId: 'grape-spaceship-123'
});
const extractedFile = storage.bucket('institutions').file('2014.csv');

function callback(err, job, apiResponse) {
  // `job` is a Job object that can be used to check the status of the
  // request.
}

//-
// To use the default options, just pass a
// {@link https://cloud.google.com/nodejs/docs/reference/storage/latest/File File}
// object.
//
// Note: The exported format type will be inferred by the file's extension.
// If you wish to override this, or provide an array of destination files,
// you must provide an `options` object.
//-
table.createExtractJob(extractedFile, callback);

//-
// If you need more customization, pass an `options` object.
//-
const options = {
  format: 'json',
  gzip: true
};

table.createExtractJob(extractedFile, options, callback);

//-
// You can also specify multiple destination files.
//-
table.createExtractJob([
  storage.bucket('institutions').file('2014.json'),
  storage.bucket('institutions-copy').file('2014.json')
], options, callback);

//-
// If the callback is omitted, we'll return a Promise.
//-
table.createExtractJob(extractedFile, options).then((data) => {
  const job = data[0];
  const apiResponse = data[1];
});

Parameters

Name Type Optional Description

destination

(string or File)

 

Where the file should be exported to. A string or a File object.

options

object

Yes

The configuration object.

Values in options have the following properties:

Name Type Optional Description

format

string

Yes

The format to export the data in. Allowed options are "CSV", "JSON", "AVRO", or "PARQUET". Default: "CSV".

gzip

boolean

Yes

Specify if you would like the file compressed with GZIP. Default: false.

jobId

string

Yes

Custom job id.

jobPrefix

string

Yes

Prefix to apply to the job id.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

job

Job

 

The job used to export the table.

apiResponse

object

 

The full API response.

See also

Jobs: insert API Documentation

Throws

Error 

If destination isn't a File object.

Error 

If destination format isn't recognized.

Returns

Promise

createLoadJob(source[, metadata][, callback]) → Promise

Load data from a local file or Storage File.

By loading data this way, you create a load job that will run your data load asynchronously. If you would like instantaneous access to your data, insert it using Table#insert.

Note: The file type will be inferred by the given file's extension. If you wish to override this, you must provide metadata.format.

Example

const {Storage} = require('@google-cloud/storage');
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

//-
// Load data from a local file.
//-
const callback = (err, job, apiResponse) => {
  // `job` is a Job object that can be used to check the status of the
  // request.
};

table.createLoadJob('./institutions.csv', callback);

//-
// You may also pass in metadata in the format of a Jobs resource. See
// (http://goo.gl/BVcXk4) for a full list of supported values.
//-
const metadata = {
  encoding: 'ISO-8859-1',
  sourceFormat: 'NEWLINE_DELIMITED_JSON'
};

table.createLoadJob('./my-data.csv', metadata, callback);

//-
// Load data from a file in your Cloud Storage bucket.
//-
const storage = new Storage({
  projectId: 'grape-spaceship-123'
});
const data = storage.bucket('institutions').file('data.csv');
table.createLoadJob(data, callback);

//-
// Load data from multiple files in your Cloud Storage bucket(s).
//-
table.createLoadJob([
  storage.bucket('institutions').file('2011.csv'),
  storage.bucket('institutions').file('2012.csv')
], callback);

//-
// If the callback is omitted, we'll return a Promise.
//-
table.createLoadJob(data).then((data) => {
  const job = data[0];
  const apiResponse = data[1];
});

Parameters

Name Type Optional Description

source

(string or File)

 

The source file to load. A string or a File object.

metadata

object

Yes

Metadata to set with the load operation. The metadata object should be in the format of the configuration.load property of a Jobs resource.

Values in metadata have the following properties:

Name Type Optional Description

format

string

Yes

The format the data being loaded is in. Allowed options are "AVRO", "CSV", "JSON", "ORC", or "PARQUET".

jobId

string

Yes

Custom job id.

jobPrefix

string

Yes

Prefix to apply to the job id.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

job

Job

 

The job used to load your data.

apiResponse

object

 

The full API response.

See also

Jobs: insert API Documentation

Throws

Error 

If the source isn't a string file name or a File instance.

Returns

Promise 

createQueryJob()

Run a query as a job. No results are immediately returned. Instead, your callback will be executed with a Job object that you must ping for the results. See the Job documentation for explanations of how to check on the status of the job.

See BigQuery#createQueryJob for full documentation of this method.

createQueryStream(query) → stream

Run a query scoped to your dataset as a readable object stream.

See BigQuery#createQueryStream for full documentation of this method.

Parameter

Name Type Optional Description

query

object

 

See BigQuery#createQueryStream for full documentation of this method.

Returns

stream 

See BigQuery#createQueryStream for full documentation of this method.

createWriteStream([metadata]) → WritableStream

Load data into your table from a readable stream of AVRO, CSV, JSON, ORC, or PARQUET data.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

//-
// Load data from a CSV file.
//-
const request = require('request');

const csvUrl = 'http://goo.gl/kSE7z6';

const metadata = {
  allowJaggedRows: true,
  skipLeadingRows: 1
};

request.get(csvUrl)
  .pipe(table.createWriteStream(metadata))
  .on('job', (job) => {
    // `job` is a Job object that can be used to check the status of the
    // request.
  })
  .on('complete', (job) => {
    // The job has completed successfully.
  });

//-
// Load data from a JSON file.
//-
const fs = require('fs');

fs.createReadStream('./test/testdata/testfile.json')
  .pipe(table.createWriteStream('json'))
  .on('job', (job) => {
    // `job` is a Job object that can be used to check the status of the
    // request.
  })
  .on('complete', (job) => {
    // The job has completed successfully.
  });

Parameters

Name Type Optional Description

metadata

(string or object)

Yes

Metadata to set with the load operation. The metadata object should be in the format of the configuration.load property of a Jobs resource. If a string is given, it will be used as the filetype.

Values in metadata have the following properties:

Name Type Optional Description

jobId

string

Yes

Custom job id.

jobPrefix

string

Yes

Prefix to apply to the job id.

See also

Jobs: insert API Documentation

Throws

Error 

If source format isn't recognized.

Returns

WritableStream 

delete([callback]) → Promise

Delete a table and all its data.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');

const table = dataset.table('my-table');

table.delete((err, apiResponse) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.delete().then((data) => {
  const apiResponse = data[0];
});

Parameters

Name Type Optional Description

callback

function()

Yes

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

apiResponse

object

 

The full API response.

See also

Tables: delete API Documentation

Returns

Promise 

exists([callback]) → Promise

Check if the table exists.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');

const table = dataset.table('my-table');

table.exists((err, exists) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.exists().then((data) => {
  const exists = data[0];
});

Parameters

Name Type Optional Description

callback

function()

Yes

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

exists

boolean

 

Whether the table exists or not.

Returns

Promise 

extract(destination[, options][, callback]) → Promise

Export table to Cloud Storage.

Example

const {Storage} = require('@google-cloud/storage');
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

const storage = new Storage({
  projectId: 'grape-spaceship-123'
});
const extractedFile = storage.bucket('institutions').file('2014.csv');

//-
// To use the default options, just pass a
// {@link https://cloud.google.com/nodejs/docs/reference/storage/latest/File File}
// object.
//
// Note: The exported format type will be inferred by the file's extension.
// If you wish to override this, or provide an array of destination files,
// you must provide an `options` object.
//-
table.extract(extractedFile, (err, apiResponse) => {});

//-
// If you need more customization, pass an `options` object.
//-
const options = {
  format: 'json',
  gzip: true
};

table.extract(extractedFile, options, (err, apiResponse) => {});

//-
// You can also specify multiple destination files.
//-
table.extract([
  storage.bucket('institutions').file('2014.json'),
  storage.bucket('institutions-copy').file('2014.json')
], options, (err, apiResponse) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.extract(extractedFile, options).then((data) => {
  const apiResponse = data[0];
});

Parameters

Name Type Optional Description

destination

(string or File)

 

Where the file should be exported to. A string or a File.

options

object

Yes

The configuration object.

Values in options have the following properties:

Name Type Optional Description

format

string

Yes

The format to export the data in. Allowed options are "AVRO", "CSV", "JSON", "ORC" or "PARQUET".

Defaults to "CSV".

gzip

boolean

Yes

Specify if you would like the file compressed with GZIP. Default: false.

jobId

string

Yes

Custom id for the underlying job.

jobPrefix

string

Yes

Prefix to apply to the underlying job id.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

apiResponse

object

 

The full API response.

Throws

Error 

If destination isn't a File object.

Error 

If destination format isn't recognized.

Returns

Promise 

get([options][, callback]) → Promise

Get a table if it exists.

You may optionally use this to "get or create" an object by providing an object with autoCreate set to true. Any extra configuration that is normally required for the create method must be contained within this object as well.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');

const table = dataset.table('my-table');

table.get((err, table, apiResponse) => {
  // `table.metadata` has been populated.
});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.get().then((data) => {
  const table = data[0];
  const apiResponse = data[1];
});

Parameters

Name Type Optional Description

options

object

Yes

Configuration object.

Values in options have the following properties:

Name Type Optional Description

autoCreate

boolean

Yes

Automatically create the object if it does not exist.

Defaults to false.

callback

function()

Yes

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

table

Table

 

The Table.

apiResponse

object

 

The full API response.

Returns

Promise 

getMetadata([callback]) → Promise

Return the metadata associated with the Table.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');

const table = dataset.table('my-table');

table.getMetadata((err, metadata, apiResponse) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.getMetadata().then((data) => {
  const metadata = data[0];
  const apiResponse = data[1];
});

Parameters

Name Type Optional Description

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

metadata

object

 

The metadata of the Table.

apiResponse

object

 

The full API response.

See also

Tables: get API Documentation

Returns

Promise 

getRows([options][, callback]) → Promise

Retrieves table data from a specified set of rows. The rows are returned to your callback as an array of objects matching your table's schema.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

table.getRows((err, rows) => {
  if (!err) {
    // rows is an array of results.
  }
});

//-
// To control how many API requests are made and page through the results
// manually, set `autoPaginate` to `false`.
//-
function manualPaginationCallback(err, rows, nextQuery, apiResponse) {
  if (nextQuery) {
    // More results exist.
    table.getRows(nextQuery, manualPaginationCallback);
  }
}

table.getRows({
  autoPaginate: false
}, manualPaginationCallback);

//-
// If the callback is omitted, we'll return a Promise.
//-
table.getRows().then((data) => {
  const rows = data[0];
});

Parameters

Name Type Optional Description

options

object

Yes

The configuration object.

Values in options have the following properties:

Name Type Optional Description

autoPaginate

boolean

Yes

Have pagination handled automatically.

Defaults to true.

maxApiCalls

number

Yes

Maximum number of API calls to make.

maxResults

number

Yes

Maximum number of results to return.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

rows

array

 

The table data from specified set of rows.

See also

Tabledata: list API Documentation

Returns

Promise 

insert(rows[, options][, callback]) → Promise

Stream data into BigQuery one record at a time without running a load job.

If you need to create an entire table from a file, consider using Table#load instead.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

//-
// Insert a single row.
//-
table.insert({
  INSTNM: 'Motion Picture Institute of Michigan',
  CITY: 'Troy',
  STABBR: 'MI'
}, insertHandler);

//-
// Insert multiple rows at a time.
//-
const rows = [
  {
    INSTNM: 'Motion Picture Institute of Michigan',
    CITY: 'Troy',
    STABBR: 'MI'
  },
  // ...
];

table.insert(rows, insertHandler);

//-
// Insert a row in accordance with the
// <a href="https://cloud.google.com/bigquery/docs/reference/v2/tabledata/insertAll">specification</a>.
//-
const row = {
  insertId: '1',
  json: {
    INSTNM: 'Motion Picture Institute of Michigan',
    CITY: 'Troy',
    STABBR: 'MI'
  }
};

const options = {
  raw: true
};

table.insert(row, options, insertHandler);

//-
// Handling the response. See
// <a href="https://developers.google.com/bigquery/troubleshooting-errors">Troubleshooting Errors</a>
// for best practices on how to handle errors.
//-
function insertHandler(err, apiResponse) {
  if (err) {
    // An API error or partial failure occurred.

    if (err.name === 'PartialFailureError') {
      // Some rows failed to insert, while others may have succeeded.

      // err.errors (object[]):
      // err.errors[].row (original row object passed to `insert`)
      // err.errors[].errors[].reason
      // err.errors[].errors[].message
    }
  }
}

//-
// If the callback is omitted, we'll return a Promise.
//-
table.insert(rows)
  .then((data) => {
    const apiResponse = data[0];
  })
  .catch((err) => {
    // An API error or partial failure occurred.

    if (err.name === 'PartialFailureError') {
      // Some rows failed to insert, while others may have succeeded.

      // err.errors (object[]):
      // err.errors[].row (original row object passed to `insert`)
      // err.errors[].errors[].reason
      // err.errors[].errors[].message
    }
  });

Parameters

Name Type Optional Description

rows

(object or Array of object)

 

The rows to insert into the table.

options

object

Yes

Configuration object.

Values in options have the following properties:

Name Type Optional Description

ignoreUnknownValues

boolean

Yes

Accept rows that contain values that do not match the schema. The unknown values are ignored.

Defaults to false.

raw

boolean

Yes

If true, the rows argument is expected to be formatted in accordance with the specification.

schema

(string or object)

Yes

If provided, a table will automatically be created if it doesn't already exist. Note that this can take longer than 2 minutes to complete. A comma-separated list of name:type pairs. Valid types are "string", "integer", "float", "boolean", and "timestamp". If the type is omitted, it is assumed to be "string". Example: "name:string, age:integer". Schemas can also be specified as a JSON array of fields, which allows for nested and repeated fields. See a Table resource for more detailed information.

skipInvalidRows

boolean

Yes

Insert all valid rows of a request, even if invalid rows exist.

Defaults to false.

templateSuffix

string

Yes

Treat the destination table as a base template, and insert the rows into an instance table named "{destination}{templateSuffix}". BigQuery will manage creation of the instance table, using the schema of the base template table. See Automatic table creation using template tables for considerations when working with template tables.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

err.errors

Array of object

 

If present, these represent partial failures. It's possible for part of your request to be completed successfully, while the other part was not.

apiResponse

object

 

The full API response.

See also

Tabledata: insertAll API Documentation

Streaming Insert Limits

Troubleshooting Errors

Returns

Promise 

load(source[, metadata][, callback]) → Promise

Load data from a local file or Storage File.

By loading data this way, you create a load job that will run your data load asynchronously. If you would like instantaneous access to your data, insert it using Table#insert.

Note: The file type will be inferred by the given file's extension. If you wish to override this, you must provide metadata.format.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

//-
// Load data from a local file.
//-
table.load('./institutions.csv', (err, apiResponse) => {});

//-
// You may also pass in metadata in the format of a Jobs resource. See
// (http://goo.gl/BVcXk4) for a full list of supported values.
//-
const metadata = {
  encoding: 'ISO-8859-1',
  sourceFormat: 'NEWLINE_DELIMITED_JSON'
};

table.load('./my-data.csv', metadata, (err, apiResponse) => {});

//-
// Load data from a file in your Cloud Storage bucket.
//-
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({
  projectId: 'grape-spaceship-123'
});
const data = storage.bucket('institutions').file('data.csv');
table.load(data, (err, apiResponse) => {});

//-
// Load data from multiple files in your Cloud Storage bucket(s).
//-
table.load([
  storage.bucket('institutions').file('2011.csv'),
  storage.bucket('institutions').file('2012.csv')
], (err, apiResponse) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.load(data).then((data) => {
  const apiResponse = data[0];
});

Parameters

Name Type Optional Description

source

(string or File)

 

The source file to load. A filepath as a string or a File object.

metadata

object

Yes

Metadata to set with the load operation. The metadata object should be in the format of the configuration.load property of a Jobs resource.

Values in metadata have the following properties:

Name Type Optional Description

format

string

Yes

The format the data being loaded is in. Allowed options are "AVRO", "CSV", "JSON", "ORC", or "PARQUET".

jobId

string

Yes

Custom id for the underlying job.

jobPrefix

string

Yes

Prefix to apply to the underlying job id.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

apiResponse

object

 

The full API response.

Throws

Error 

If the source isn't a string file name or a File instance.

Returns

Promise 

query(query[, callback]) → Promise

Run a query scoped to your dataset.

See BigQuery#query for full documentation of this method.

Parameters

Name Type Optional Description

query

object

 

See BigQuery#query for full documentation of this method.

callback

function()

Yes

See BigQuery#query for full documentation of this method.

Returns

Promise 

setMetadata(metadata[, callback]) → Promise

Set the metadata on the table.

Example

const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();
const dataset = bigquery.dataset('my-dataset');
const table = dataset.table('my-table');

const metadata = {
  name: 'My recipes',
  description: 'A table for storing my recipes.',
  schema: 'name:string, servings:integer, cookingTime:float, quick:boolean'
};

table.setMetadata(metadata, (err, metadata, apiResponse) => {});

//-
// If the callback is omitted, we'll return a Promise.
//-
table.setMetadata(metadata).then((data) => {
  const metadata = data[0];
  const apiResponse = data[1];
});

Parameters

Name Type Optional Description

metadata

object

 

The metadata key/value object to set.

Values in metadata have the following properties:

Name Type Optional Description

description

string

 

A user-friendly description of the table.

name

string

 

A descriptive name for the table.

schema

(string or object)

 

A comma-separated list of name:type pairs. Valid types are "string", "integer", "float", "boolean", "bytes", "record", and "timestamp". If the type is omitted, it is assumed to be "string". Example: "name:string, age:integer". Schemas can also be specified as a JSON array of fields, which allows for nested and repeated fields. See a Table resource for more detailed information.

callback

function()

Yes

The callback function.

Values in callback have the following properties:

Name Type Optional Description

err

error

 

An error returned while making this request.

Value can be null.

apiResponse

object

 

The full API response.

See also

Tables: patch API Documentation

Returns

Promise