
Google Cloud Dataproc: Node.js Client


Google Cloud Dataproc API client for Node.js

Read more about the client libraries for Cloud APIs, including the older Google APIs Client Libraries, in Client Libraries Explained.

Before you begin

  1. Select or create a Cloud Platform project.
  2. Enable billing for your project.
  3. Enable the Google Cloud Dataproc API.
  4. Set up authentication with a service account so you can access the API from your local workstation.
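Steps 2–4 can also be done from the command line. The commands below are a sketch using the gcloud CLI; the project ID, service-account name, and key path are placeholders you should replace with your own values:

```shell
# Enable the Dataproc API in your project (placeholder project ID).
gcloud services enable dataproc.googleapis.com --project=my-project-id

# Create a service account and download a key for local authentication.
gcloud iam service-accounts create dataproc-quickstart --project=my-project-id
gcloud iam service-accounts keys create ~/key.json \
  --iam-account=dataproc-quickstart@my-project-id.iam.gserviceaccount.com

# Point the client library at the key.
export GOOGLE_APPLICATION_CREDENTIALS=~/key.json
```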

Installing the client library

npm install @google-cloud/dataproc

Using the client library

// This quickstart sample walks a user through creating a Cloud Dataproc
// cluster, submitting a PySpark job from Google Cloud Storage to the
// cluster, reading the output of the job and deleting the cluster, all
// using the Node.js client library.
// Usage:
//     node quickstart.js <PROJECT_ID> <REGION> <CLUSTER_NAME> <GCS_JOB_FILE_PATH>

'use strict';

function main(projectId, region, clusterName, jobFilePath) {
  const dataproc = require('@google-cloud/dataproc').v1;
  const {Storage} = require('@google-cloud/storage');

  const sleep = require('sleep');

  // Create a cluster client with the endpoint set to the desired cluster region
  const clusterClient = new dataproc.ClusterControllerClient({
    apiEndpoint: `${region}-dataproc.googleapis.com`,
  });

  // Create a job client with the endpoint set to the desired cluster region
  const jobClient = new dataproc.JobControllerClient({
    apiEndpoint: `${region}-dataproc.googleapis.com`,
  });

  async function quickstart() {
    // Create the cluster config
    const cluster = {
      projectId: projectId,
      region: region,
      cluster: {
        clusterName: clusterName,
        config: {
          masterConfig: {
            numInstances: 1,
            machineTypeUri: 'n1-standard-1',
          },
          workerConfig: {
            numInstances: 2,
            machineTypeUri: 'n1-standard-1',
          },
        },
      },
    };

    // Create the cluster
    const [operation] = await clusterClient.createCluster(cluster);
    const [response] = await operation.promise();

    // Output a success message
    console.log(`Cluster created successfully: ${response.clusterName}`);

    const job = {
      projectId: projectId,
      region: region,
      job: {
        placement: {
          clusterName: clusterName,
        },
        pysparkJob: {
          mainPythonFileUri: jobFilePath,
        },
      },
    };

    let [jobResp] = await jobClient.submitJob(job);
    const jobId = jobResp.reference.jobId;

    console.log(`Submitted job "${jobId}".`);

    // Terminal states for a job
    const terminalStates = new Set(['DONE', 'ERROR', 'CANCELLED']);

    // Create a timeout such that the job gets cancelled if not
    // in a terminal state after a fixed period of time.
    const timeout = 600000;
    const start = new Date();

    // Wait for the job to finish.
    const jobReq = {
      projectId: projectId,
      region: region,
      jobId: jobId,
    };

    while (!terminalStates.has(jobResp.status.state)) {
      if (new Date() - timeout > start) {
        await jobClient.cancelJob(jobReq);
        console.log(
          `Job ${jobId} timed out after threshold of ` +
            `${timeout / 60000} minutes.`
        );
        break;
      }
      await sleep.sleep(1);
      [jobResp] = await jobClient.getJob(jobReq);
    }

    const clusterReq = {
      projectId: projectId,
      region: region,
      clusterName: clusterName,
    };

    const [clusterResp] = await clusterClient.getCluster(clusterReq);

    const storage = new Storage();

    // Read the job's driver output from the cluster's staging bucket.
    const output = await storage
      .bucket(clusterResp.config.configBucket)
      .file(
        `google-cloud-dataproc-metainfo/${clusterResp.clusterUuid}/` +
          `jobs/${jobId}/driveroutput.000000000`
      )
      .download();

    // Output a success message.
    console.log(
      `Job ${jobId} finished with state ${jobResp.status.state}:\n${output}`
    );

    // Delete the cluster once the job has terminated.
    const [deleteOperation] = await clusterClient.deleteCluster(clusterReq);
    await deleteOperation.promise();

    // Output a success message
    console.log(`Cluster ${clusterName} successfully deleted.`);
  }

  quickstart();
}

const args = process.argv.slice(2);

if (args.length !== 4) {
  console.error(
    'Insufficient number of parameters provided. Please make sure a ' +
      'PROJECT_ID, REGION, CLUSTER_NAME and JOB_FILE_PATH are provided, in this order.'
  );
  process.exit(1);
}

main(...args);
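The quickstart polls for job completion with the `sleep` package, which pulls in a native dependency. A plain promise-based delay does the same job with no extra install; the sketch below is an illustrative alternative, not part of the published sample:

```javascript
// Promise-based delay: resolves after the given number of milliseconds.
// Drop-in alternative to the `sleep` package used in the quickstart above.
function delay(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

// Hypothetical use inside the quickstart's polling loop:
//   while (!terminalStates.has(jobResp.status.state)) {
//     await delay(1000); // wait one second between getJob calls
//     [jobResp] = await jobClient.getJob(jobReq);
//   }
```

Because `delay` returns a promise, it can be awaited directly inside the `async` quickstart function without blocking the event loop.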

Samples

Samples are in the samples/ directory. The samples' README.md has instructions for running the samples.

Sample | Source code | Try it
Create Cluster | source code | Open in Cloud Shell
Quickstart | source code | Open in Cloud Shell

The Google Cloud Dataproc Node.js Client API Reference documentation also contains samples.


Versioning

This library follows Semantic Versioning.

This library is considered to be General Availability (GA). This means it is stable; the code surface will not change in backwards-incompatible ways unless absolutely necessary (e.g. because of critical security issues) or with an extensive deprecation period. Issues and requests against GA libraries are addressed with the highest priority.

More Information: Google Cloud Platform Launch Stages


Contributing

Contributions welcome! See the Contributing Guide.


License

Apache Version 2.0