Cloud Dataproc API . projects . locations . sessions . sparkApplications

Instance Methods

access(name, parent=None, x__xgafv=None)

Obtain high level information corresponding to a single Spark Application.

accessEnvironmentInfo(name, parent=None, x__xgafv=None)

Obtain environment details for a Spark Application.

accessJob(name, jobId=None, parent=None, x__xgafv=None)

Obtain data corresponding to a spark job for a Spark Application.

accessSqlPlan(name, executionId=None, parent=None, x__xgafv=None)

Obtain Spark Plan Graph for a Spark Application SQL execution. Limits the number of clusters returned as part of the graph to 10000.

accessSqlQuery(name, details=None, executionId=None, parent=None, planDescription=None, x__xgafv=None)

Obtain data corresponding to a particular SQL Query for a Spark Application.

accessStageAttempt(name, parent=None, stageAttemptId=None, stageId=None, summaryMetricsMask=None, x__xgafv=None)

Obtain data corresponding to a spark stage attempt for a Spark Application.

accessStageRddGraph(name, parent=None, stageId=None, x__xgafv=None)

Obtain RDD operation graph for a Spark Application Stage. Limits the number of clusters returned as part of the graph to 10000.

close()

Close httplib2 connections.

search(parent, applicationStatus=None, maxEndTime=None, maxTime=None, minEndTime=None, minTime=None, pageSize=None, pageToken=None, x__xgafv=None)

Obtain high level information and a list of Spark Applications corresponding to a session.

searchExecutorStageSummary(name, pageSize=None, pageToken=None, parent=None, stageAttemptId=None, stageId=None, x__xgafv=None)

Obtain executor summary with respect to a spark stage attempt.

searchExecutorStageSummary_next()

Retrieves the next page of results.

searchExecutors(name, executorStatus=None, pageSize=None, pageToken=None, parent=None, x__xgafv=None)

Obtain data corresponding to executors for a Spark Application.

searchExecutors_next()

Retrieves the next page of results.

searchJobs(name, jobStatus=None, pageSize=None, pageToken=None, parent=None, x__xgafv=None)

Obtain list of spark jobs corresponding to a Spark Application.

searchJobs_next()

Retrieves the next page of results.

searchSqlQueries(name, details=None, pageSize=None, pageToken=None, parent=None, planDescription=None, x__xgafv=None)

Obtain data corresponding to SQL Queries for a Spark Application.

searchSqlQueries_next()

Retrieves the next page of results.

searchStageAttemptTasks(name, pageSize=None, pageToken=None, parent=None, sortRuntime=None, stageAttemptId=None, stageId=None, taskStatus=None, x__xgafv=None)

Obtain data corresponding to tasks for a spark stage attempt for a Spark Application.

searchStageAttemptTasks_next()

Retrieves the next page of results.

searchStageAttempts(name, pageSize=None, pageToken=None, parent=None, stageId=None, summaryMetricsMask=None, x__xgafv=None)

Obtain data corresponding to spark stage attempts for a Spark Application.

searchStageAttempts_next()

Retrieves the next page of results.

searchStages(name, pageSize=None, pageToken=None, parent=None, stageStatus=None, summaryMetricsMask=None, x__xgafv=None)

Obtain data corresponding to stages for a Spark Application.

searchStages_next()

Retrieves the next page of results.

search_next()

Retrieves the next page of results.

summarizeExecutors(name, parent=None, x__xgafv=None)

Obtain a summary of executors for a Spark Application.

summarizeJobs(name, parent=None, x__xgafv=None)

Obtain a summary of jobs for a Spark Application.

summarizeStageAttemptTasks(name, parent=None, stageAttemptId=None, stageId=None, x__xgafv=None)

Obtain a summary of tasks for a Spark Application Stage Attempt.

summarizeStages(name, parent=None, x__xgafv=None)

Obtain a summary of stages for a Spark Application.

write(name, body=None, x__xgafv=None)

Write wrapper objects from the dataplane to Spanner.
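
The methods above are invoked through the generated Python client. Below is a minimal setup sketch, assuming the google-api-python-client library and application-default credentials; the project, region, session, and application IDs are placeholders reused by the usage sketches later on this page.

  from googleapiclient import discovery

  # Build the Dataproc client; this uses application-default credentials
  # unless explicit credentials are passed to build().
  dataproc = discovery.build('dataproc', 'v1')

  # Hypothetical resource names used only for illustration.
  SESSION = 'projects/my-project/locations/us-central1/sessions/my-session'
  APP = SESSION + '/sparkApplications/application_1700000000000_0001'

  # Collection object exposing the instance methods documented on this page.
  spark_apps = dataproc.projects().locations().sessions().sparkApplications()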

Method Details

access(name, parent=None, x__xgafv=None)
Obtain high level information corresponding to a single Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # A summary of Spark Application
  "application": { # High level information corresponding to an application. # Output only. High level information corresponding to an application.
    "applicationContextIngestionStatus": "A String",
    "applicationId": "A String",
    "attempts": [
      { # Specific attempt of an application.
        "appSparkVersion": "A String",
        "attemptId": "A String",
        "completed": True or False,
        "durationMillis": "A String",
        "endTime": "A String",
        "lastUpdated": "A String",
        "sparkUser": "A String",
        "startTime": "A String",
      },
    ],
    "coresGranted": 42,
    "coresPerExecutor": 42,
    "maxCores": 42,
    "memoryPerExecutorMb": 42,
    "name": "A String",
    "quantileDataStatus": "A String",
  },
}
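
A usage sketch for access(), assuming the spark_apps collection object and placeholder names from the setup sketch under Instance Methods; field access is defensive because unset fields may be omitted from the response.

  request = spark_apps.access(name=APP, parent=SESSION)
  summary = request.execute()

  app_info = summary.get('application', {})
  print('Application ID:', app_info.get('applicationId'))
  for attempt in app_info.get('attempts', []):
      print('  attempt', attempt.get('attemptId'),
            'completed:', attempt.get('completed'))
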
accessEnvironmentInfo(name, parent=None, x__xgafv=None)
Obtain environment details for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Environment details of a Spark Application.
  "applicationEnvironmentInfo": { # Details about the Environment that the application is running in. # Details about the Environment that the application is running in.
    "classpathEntries": {
      "a_key": "A String",
    },
    "hadoopProperties": {
      "a_key": "A String",
    },
    "metricsProperties": {
      "a_key": "A String",
    },
    "resourceProfiles": [
      { # Resource profile that contains information about all the resources required by executors and tasks.
        "executorResources": {
          "a_key": { # Resources used per executor used by the application.
            "amount": "A String",
            "discoveryScript": "A String",
            "resourceName": "A String",
            "vendor": "A String",
          },
        },
        "resourceProfileId": 42,
        "taskResources": {
          "a_key": { # Resources used per task created by the application.
            "amount": 3.14,
            "resourceName": "A String",
          },
        },
      },
    ],
    "runtime": {
      "javaHome": "A String",
      "javaVersion": "A String",
      "scalaVersion": "A String",
    },
    "sparkProperties": {
      "a_key": "A String",
    },
    "systemProperties": {
      "a_key": "A String",
    },
  },
}
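
A small sketch for accessEnvironmentInfo(), assuming the same spark_apps object and placeholder names as in the setup sketch above; it prints the runtime block and the Spark properties map from the response.

  env = spark_apps.accessEnvironmentInfo(name=APP, parent=SESSION).execute()
  info = env.get('applicationEnvironmentInfo', {})
  print('Runtime:', info.get('runtime', {}))
  for key, value in sorted(info.get('sparkProperties', {}).items()):
      print(key, '=', value)
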
accessJob(name, jobId=None, parent=None, x__xgafv=None)
Obtain data corresponding to a spark job for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  jobId: string, Required. Job ID to fetch data for.
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Details of a particular job associated with Spark Application
  "jobData": { # Data corresponding to a spark job. # Output only. Data corresponding to a spark job.
    "completionTime": "A String",
    "description": "A String",
    "jobGroup": "A String",
    "jobId": "A String",
    "killTasksSummary": {
      "a_key": 42,
    },
    "name": "A String",
    "numActiveStages": 42,
    "numActiveTasks": 42,
    "numCompletedIndices": 42,
    "numCompletedStages": 42,
    "numCompletedTasks": 42,
    "numFailedStages": 42,
    "numFailedTasks": 42,
    "numKilledTasks": 42,
    "numSkippedStages": 42,
    "numSkippedTasks": 42,
    "numTasks": 42,
    "skippedStages": [
      42,
    ],
    "sqlExecutionId": "A String",
    "stageIds": [
      "A String",
    ],
    "status": "A String",
    "submissionTime": "A String",
  },
}
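
A sketch for accessJob(), assuming the shared setup from the sketch above; the job ID is a placeholder.

  job = spark_apps.accessJob(name=APP, parent=SESSION, jobId='0').execute()
  job_data = job.get('jobData', {})
  print('Job', job_data.get('jobId'), 'status:', job_data.get('status'))
  print('Tasks:', job_data.get('numTasks'),
        'completed:', job_data.get('numCompletedTasks'),
        'failed:', job_data.get('numFailedTasks'))
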
accessSqlPlan(name, executionId=None, parent=None, x__xgafv=None)
Obtain Spark Plan Graph for a Spark Application SQL execution. Limits the number of clusters returned as part of the graph to 10000.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  executionId: string, Required. Execution ID
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # SparkPlanGraph for a Spark Application execution limited to maximum 10000 clusters.
  "sparkPlanGraph": { # A graph used for storing information of an executionPlan of DataFrame. # SparkPlanGraph for a Spark Application execution.
    "edges": [
      { # Represents a directed edge in the spark plan tree from child to parent.
        "fromId": "A String",
        "toId": "A String",
      },
    ],
    "executionId": "A String",
    "nodes": [
      { # Wrapper used to represent either a node or a cluster.
        "cluster": { # Represents a tree of spark plan.
          "desc": "A String",
          "metrics": [
            { # Metrics related to SQL execution.
              "accumulatorId": "A String",
              "metricType": "A String",
              "name": "A String",
            },
          ],
          "name": "A String",
          "nodes": [
            # Object with schema name: SparkPlanGraphNodeWrapper
          ],
          "sparkPlanGraphClusterId": "A String",
        },
        "node": { # Represents a node in the spark plan tree.
          "desc": "A String",
          "metrics": [
            { # Metrics related to SQL execution.
              "accumulatorId": "A String",
              "metricType": "A String",
              "name": "A String",
            },
          ],
          "name": "A String",
          "sparkPlanGraphNodeId": "A String",
        },
      },
    ],
  },
}
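
A sketch for accessSqlPlan(), assuming the shared setup and a placeholder execution ID; it walks the plan graph edges returned in the response.

  plan = spark_apps.accessSqlPlan(
      name=APP, parent=SESSION, executionId='1').execute()
  graph = plan.get('sparkPlanGraph', {})
  print('Plan nodes:', len(graph.get('nodes', [])))
  for edge in graph.get('edges', []):
      print('edge', edge.get('fromId'), '->', edge.get('toId'))
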
accessSqlQuery(name, details=None, executionId=None, parent=None, planDescription=None, x__xgafv=None)
Obtain data corresponding to a particular SQL Query for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  details: boolean, Optional. Lists or hides details of Spark plan nodes. Set to true to list details and false to hide them.
  executionId: string, Required. Execution ID
  parent: string, Required. Parent (Session) resource reference.
  planDescription: boolean, Optional. Enables or disables the physical plan description on demand.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Details of a query for a Spark Application
  "executionData": { # SQL Execution Data # SQL Execution Data
    "completionTime": "A String",
    "description": "A String",
    "details": "A String",
    "errorMessage": "A String",
    "executionId": "A String",
    "jobs": {
      "a_key": "A String",
    },
    "metricValues": {
      "a_key": "A String",
    },
    "metricValuesIsNull": True or False,
    "metrics": [
      { # Metrics related to SQL execution.
        "accumulatorId": "A String",
        "metricType": "A String",
        "name": "A String",
      },
    ],
    "modifiedConfigs": {
      "a_key": "A String",
    },
    "physicalPlanDescription": "A String",
    "rootExecutionId": "A String",
    "stages": [
      "A String",
    ],
    "submissionTime": "A String",
  },
}
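
A sketch for accessSqlQuery(), assuming the shared setup; details and planDescription are the optional booleans documented above.

  query = spark_apps.accessSqlQuery(
      name=APP, parent=SESSION, executionId='1',
      details=True, planDescription=True).execute()
  data = query.get('executionData', {})
  print(data.get('description'))
  print(data.get('physicalPlanDescription'))
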
accessStageAttempt(name, parent=None, stageAttemptId=None, stageId=None, summaryMetricsMask=None, x__xgafv=None)
Obtain data corresponding to a spark stage attempt for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  parent: string, Required. Parent (Session) resource reference.
  stageAttemptId: integer, Required. Stage Attempt ID
  stageId: string, Required. Stage ID
  summaryMetricsMask: string, Optional. The list of summary metrics fields to include. An empty list skips all summary metrics fields. For example, to include TaskQuantileMetrics in the response, add task_quantile_metrics to the summary_metrics_mask field.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Stage Attempt for a Stage of a Spark Application
  "stageData": { # Data corresponding to a stage. # Output only. Data corresponding to a stage.
    "accumulatorUpdates": [
      {
        "accumullableInfoId": "A String",
        "name": "A String",
        "update": "A String",
        "value": "A String",
      },
    ],
    "completionTime": "A String",
    "description": "A String",
    "details": "A String",
    "executorMetricsDistributions": {
      "diskBytesSpilled": [
        3.14,
      ],
      "failedTasks": [
        3.14,
      ],
      "inputBytes": [
        3.14,
      ],
      "inputRecords": [
        3.14,
      ],
      "killedTasks": [
        3.14,
      ],
      "memoryBytesSpilled": [
        3.14,
      ],
      "outputBytes": [
        3.14,
      ],
      "outputRecords": [
        3.14,
      ],
      "peakMemoryMetrics": {
        "executorMetrics": [
          {
            "metrics": {
              "a_key": "A String",
            },
          },
        ],
        "quantiles": [
          3.14,
        ],
      },
      "quantiles": [
        3.14,
      ],
      "shuffleRead": [
        3.14,
      ],
      "shuffleReadRecords": [
        3.14,
      ],
      "shuffleWrite": [
        3.14,
      ],
      "shuffleWriteRecords": [
        3.14,
      ],
      "succeededTasks": [
        3.14,
      ],
      "taskTimeMillis": [
        3.14,
      ],
    },
    "executorSummary": {
      "a_key": { # Executor resources consumed by a stage.
        "diskBytesSpilled": "A String",
        "executorId": "A String",
        "failedTasks": 42,
        "inputBytes": "A String",
        "inputRecords": "A String",
        "isExcludedForStage": True or False,
        "killedTasks": 42,
        "memoryBytesSpilled": "A String",
        "outputBytes": "A String",
        "outputRecords": "A String",
        "peakMemoryMetrics": {
          "metrics": {
            "a_key": "A String",
          },
        },
        "shuffleRead": "A String",
        "shuffleReadRecords": "A String",
        "shuffleWrite": "A String",
        "shuffleWriteRecords": "A String",
        "stageAttemptId": 42,
        "stageId": "A String",
        "succeededTasks": 42,
        "taskTimeMillis": "A String",
      },
    },
    "failureReason": "A String",
    "firstTaskLaunchedTime": "A String",
    "isShufflePushEnabled": True or False,
    "jobIds": [
      "A String",
    ],
    "killedTasksSummary": {
      "a_key": 42,
    },
    "locality": {
      "a_key": "A String",
    },
    "name": "A String",
    "numActiveTasks": 42,
    "numCompleteTasks": 42,
    "numCompletedIndices": 42,
    "numFailedTasks": 42,
    "numKilledTasks": 42,
    "numTasks": 42,
    "parentStageIds": [
      "A String",
    ],
    "peakExecutorMetrics": {
      "metrics": {
        "a_key": "A String",
      },
    },
    "rddIds": [
      "A String",
    ],
    "resourceProfileId": 42,
    "schedulingPool": "A String",
    "shuffleMergersCount": 42,
    "speculationSummary": { # Details of the speculation task when speculative execution is enabled.
      "numActiveTasks": 42,
      "numCompletedTasks": 42,
      "numFailedTasks": 42,
      "numKilledTasks": 42,
      "numTasks": 42,
      "stageAttemptId": 42,
      "stageId": "A String",
    },
    "stageAttemptId": 42,
    "stageId": "A String",
    "stageMetrics": { # Stage Level Aggregated Metrics
      "diskBytesSpilled": "A String",
      "executorCpuTimeNanos": "A String",
      "executorDeserializeCpuTimeNanos": "A String",
      "executorDeserializeTimeMillis": "A String",
      "executorRunTimeMillis": "A String",
      "jvmGcTimeMillis": "A String",
      "memoryBytesSpilled": "A String",
      "peakExecutionMemoryBytes": "A String",
      "resultSerializationTimeMillis": "A String",
      "resultSize": "A String",
      "stageInputMetrics": { # Metrics about the input read by the stage.
        "bytesRead": "A String",
        "recordsRead": "A String",
      },
      "stageOutputMetrics": { # Metrics about the output written by the stage.
        "bytesWritten": "A String",
        "recordsWritten": "A String",
      },
      "stageShuffleReadMetrics": { # Shuffle data read for the stage.
        "bytesRead": "A String",
        "fetchWaitTimeMillis": "A String",
        "localBlocksFetched": "A String",
        "localBytesRead": "A String",
        "recordsRead": "A String",
        "remoteBlocksFetched": "A String",
        "remoteBytesRead": "A String",
        "remoteBytesReadToDisk": "A String",
        "remoteReqsDuration": "A String",
        "stageShufflePushReadMetrics": {
          "corruptMergedBlockChunks": "A String",
          "localMergedBlocksFetched": "A String",
          "localMergedBytesRead": "A String",
          "localMergedChunksFetched": "A String",
          "mergedFetchFallbackCount": "A String",
          "remoteMergedBlocksFetched": "A String",
          "remoteMergedBytesRead": "A String",
          "remoteMergedChunksFetched": "A String",
          "remoteMergedReqsDuration": "A String",
        },
      },
      "stageShuffleWriteMetrics": { # Shuffle data written for the stage.
        "bytesWritten": "A String",
        "recordsWritten": "A String",
        "writeTimeNanos": "A String",
      },
    },
    "status": "A String",
    "submissionTime": "A String",
    "taskQuantileMetrics": { # Summary metrics fields. These are included in response only if present in summary_metrics_mask field in request
      "diskBytesSpilled": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "durationMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "executorCpuTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "executorDeserializeCpuTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "executorDeserializeTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "executorRunTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "gettingResultTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "inputMetrics": {
        "bytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "recordsRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
      },
      "jvmGcTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "memoryBytesSpilled": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "outputMetrics": {
        "bytesWritten": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "recordsWritten": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
      },
      "peakExecutionMemoryBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "resultSerializationTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "resultSize": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "schedulerDelayMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
        "count": "A String",
        "maximum": "A String",
        "minimum": "A String",
        "percentile25": "A String",
        "percentile50": "A String",
        "percentile75": "A String",
        "sum": "A String",
      },
      "shuffleReadMetrics": {
        "fetchWaitTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "localBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "readBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "readRecords": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "remoteBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "remoteBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "remoteBytesReadToDisk": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "remoteReqsDuration": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "shufflePushReadMetrics": {
          "corruptMergedBlockChunks": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "localMergedBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "localMergedBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "localMergedChunksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "mergedFetchFallbackCount": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteMergedBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteMergedBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteMergedChunksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteMergedReqsDuration": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
        "totalBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
      },
      "shuffleWriteMetrics": {
        "writeBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "writeRecords": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "writeTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
      },
    },
    "tasks": {
      "a_key": { # Data corresponding to tasks created by spark.
        "accumulatorUpdates": [
          {
            "accumullableInfoId": "A String",
            "name": "A String",
            "update": "A String",
            "value": "A String",
          },
        ],
        "attempt": 42,
        "durationMillis": "A String",
        "errorMessage": "A String",
        "executorId": "A String",
        "executorLogs": {
          "a_key": "A String",
        },
        "gettingResultTimeMillis": "A String",
        "hasMetrics": True or False,
        "host": "A String",
        "index": 42,
        "launchTime": "A String",
        "partitionId": 42,
        "resultFetchStart": "A String",
        "schedulerDelayMillis": "A String",
        "speculative": True or False,
        "stageAttemptId": 42,
        "stageId": "A String",
        "status": "A String",
        "taskId": "A String",
        "taskLocality": "A String",
        "taskMetrics": { # Executor Task Metrics
          "diskBytesSpilled": "A String",
          "executorCpuTimeNanos": "A String",
          "executorDeserializeCpuTimeNanos": "A String",
          "executorDeserializeTimeMillis": "A String",
          "executorRunTimeMillis": "A String",
          "inputMetrics": { # Metrics about the input data read by the task.
            "bytesRead": "A String",
            "recordsRead": "A String",
          },
          "jvmGcTimeMillis": "A String",
          "memoryBytesSpilled": "A String",
          "outputMetrics": { # Metrics about the data written by the task.
            "bytesWritten": "A String",
            "recordsWritten": "A String",
          },
          "peakExecutionMemoryBytes": "A String",
          "resultSerializationTimeMillis": "A String",
          "resultSize": "A String",
          "shuffleReadMetrics": { # Shuffle data read by the task.
            "fetchWaitTimeMillis": "A String",
            "localBlocksFetched": "A String",
            "localBytesRead": "A String",
            "recordsRead": "A String",
            "remoteBlocksFetched": "A String",
            "remoteBytesRead": "A String",
            "remoteBytesReadToDisk": "A String",
            "remoteReqsDuration": "A String",
            "shufflePushReadMetrics": {
              "corruptMergedBlockChunks": "A String",
              "localMergedBlocksFetched": "A String",
              "localMergedBytesRead": "A String",
              "localMergedChunksFetched": "A String",
              "mergedFetchFallbackCount": "A String",
              "remoteMergedBlocksFetched": "A String",
              "remoteMergedBytesRead": "A String",
              "remoteMergedChunksFetched": "A String",
              "remoteMergedReqsDuration": "A String",
            },
          },
          "shuffleWriteMetrics": { # Shuffle data written by task.
            "bytesWritten": "A String",
            "recordsWritten": "A String",
            "writeTimeNanos": "A String",
          },
        },
      },
    },
  },
}
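
A sketch for accessStageAttempt(), assuming the shared setup; passing task_quantile_metrics in summaryMetricsMask asks the service to include the taskQuantileMetrics block shown above. The stage and attempt IDs are placeholders.

  stage = spark_apps.accessStageAttempt(
      name=APP, parent=SESSION, stageId='3', stageAttemptId=0,
      summaryMetricsMask='task_quantile_metrics').execute()
  stage_data = stage.get('stageData', {})
  print('Stage', stage_data.get('stageId'), 'status:', stage_data.get('status'))
  duration = stage_data.get('taskQuantileMetrics', {}).get('durationMillis', {})
  print('Task duration p50/p75:',
        duration.get('percentile50'), duration.get('percentile75'))
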
accessStageRddGraph(name, parent=None, stageId=None, x__xgafv=None)
Obtain RDD operation graph for a Spark Application Stage. Limits the number of clusters returned as part of the graph to 10000.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  parent: string, Required. Parent (Session) resource reference.
  stageId: string, Required. Stage ID
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # RDD operation graph for a Spark Application Stage limited to maximum 10000 clusters.
  "rddOperationGraph": { # Graph representing RDD dependencies. Consists of edges and a root cluster. # RDD operation graph for a Spark Application Stage.
    "edges": [
      { # A directed edge representing dependency between two RDDs.
        "fromId": 42,
        "toId": 42,
      },
    ],
    "incomingEdges": [
      { # A directed edge representing dependency between two RDDs.
        "fromId": 42,
        "toId": 42,
      },
    ],
    "outgoingEdges": [
      { # A directed edge representing dependency between two RDDs.
        "fromId": 42,
        "toId": 42,
      },
    ],
    "rootCluster": { # A grouping of nodes representing higher level constructs (stage, job etc.).
      "childClusters": [
        # Object with schema name: RddOperationCluster
      ],
      "childNodes": [
        { # A node in the RDD operation graph. Corresponds to a single RDD.
          "barrier": True or False,
          "cached": True or False,
          "callsite": "A String",
          "name": "A String",
          "nodeId": 42,
          "outputDeterministicLevel": "A String",
        },
      ],
      "name": "A String",
      "rddClusterId": "A String",
    },
    "stageId": "A String",
  },
}
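
A sketch for accessStageRddGraph(), assuming the shared setup and a placeholder stage ID.

  rdd = spark_apps.accessStageRddGraph(
      name=APP, parent=SESSION, stageId='3').execute()
  root = rdd.get('rddOperationGraph', {}).get('rootCluster', {})
  print('Root cluster:', root.get('name'),
        'with', len(root.get('childNodes', [])), 'RDD nodes')
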
close()
Close httplib2 connections.
search(parent, applicationStatus=None, maxEndTime=None, maxTime=None, minEndTime=None, minTime=None, pageSize=None, pageToken=None, x__xgafv=None)
Obtain high level information and a list of Spark Applications corresponding to a session.

Args:
  parent: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID" (required)
  applicationStatus: string, Optional. Search only applications in the chosen state.
    Allowed values
      APPLICATION_STATUS_UNSPECIFIED - 
      APPLICATION_STATUS_RUNNING - 
      APPLICATION_STATUS_COMPLETED - 
  maxEndTime: string, Optional. Latest end timestamp to list.
  maxTime: string, Optional. Latest start timestamp to list.
  minEndTime: string, Optional. Earliest end timestamp to list.
  minTime: string, Optional. Earliest start timestamp to list.
  pageSize: integer, Optional. Maximum number of applications to return in each response. The service may return fewer than this. The default page size is 10; the maximum page size is 100.
  pageToken: string, Optional. A page token received from a previous SearchSessionSparkApplications call. Provide this token to retrieve the subsequent page.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # A list of summary of Spark Applications
  "nextPageToken": "A String", # This token is included in the response if there are more results to fetch. To fetch additional results, provide this value as the page_token in a subsequent SearchSessionSparkApplicationsRequest.
  "sparkApplications": [ # Output only. High level information corresponding to an application.
    { # A summary of Spark Application
      "application": { # High level information corresponding to an application. # Output only. High level information corresponding to an application.
        "applicationContextIngestionStatus": "A String",
        "applicationId": "A String",
        "attempts": [
          { # Specific attempt of an application.
            "appSparkVersion": "A String",
            "attemptId": "A String",
            "completed": True or False,
            "durationMillis": "A String",
            "endTime": "A String",
            "lastUpdated": "A String",
            "sparkUser": "A String",
            "startTime": "A String",
          },
        ],
        "coresGranted": 42,
        "coresPerExecutor": 42,
        "maxCores": 42,
        "memoryPerExecutorMb": 42,
        "name": "A String",
        "quantileDataStatus": "A String",
      },
      "name": "A String", # Identifier. Name of the spark application
    },
  ],
}
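
A sketch for search(), assuming the shared setup; it filters to completed applications and pages through results with search_next().

  request = spark_apps.search(
      parent=SESSION,
      applicationStatus='APPLICATION_STATUS_COMPLETED',
      pageSize=50)
  while request is not None:
      response = request.execute()
      for entry in response.get('sparkApplications', []):
          app = entry.get('application', {})
          print(entry.get('name'), app.get('applicationId'))
      request = spark_apps.search_next(request, response)
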
searchExecutorStageSummary(name, pageSize=None, pageToken=None, parent=None, stageAttemptId=None, stageId=None, x__xgafv=None)
Obtain executor summary with respect to a spark stage attempt.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  pageSize: integer, Optional. Maximum number of executors to return in each response. The service may return fewer than this. The default page size is 10; the maximum page size is 100.
  pageToken: string, Optional. A page token received from a previous SearchSessionSparkApplicationExecutorStageSummary call. Provide this token to retrieve the subsequent page.
  parent: string, Required. Parent (Session) resource reference.
  stageAttemptId: integer, Required. Stage Attempt ID
  stageId: string, Required. Stage ID
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # List of Executors associated with a Spark Application Stage.
  "nextPageToken": "A String", # This token is included in the response if there are more results to fetch. To fetch additional results, provide this value as the page_token in a subsequent SearchSessionSparkApplicationExecutorStageSummaryRequest.
  "sparkApplicationStageExecutors": [ # Details about executors used by the application stage.
    { # Executor resources consumed by a stage.
      "diskBytesSpilled": "A String",
      "executorId": "A String",
      "failedTasks": 42,
      "inputBytes": "A String",
      "inputRecords": "A String",
      "isExcludedForStage": True or False,
      "killedTasks": 42,
      "memoryBytesSpilled": "A String",
      "outputBytes": "A String",
      "outputRecords": "A String",
      "peakMemoryMetrics": {
        "metrics": {
          "a_key": "A String",
        },
      },
      "shuffleRead": "A String",
      "shuffleReadRecords": "A String",
      "shuffleWrite": "A String",
      "shuffleWriteRecords": "A String",
      "stageAttemptId": 42,
      "stageId": "A String",
      "succeededTasks": 42,
      "taskTimeMillis": "A String",
    },
  ],
}
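
A sketch for searchExecutorStageSummary(), assuming the shared setup and placeholder stage identifiers; pagination uses the matching _next() helper.

  request = spark_apps.searchExecutorStageSummary(
      name=APP, parent=SESSION, stageId='3', stageAttemptId=0)
  while request is not None:
      response = request.execute()
      for summary in response.get('sparkApplicationStageExecutors', []):
          print(summary.get('executorId'),
                'succeeded:', summary.get('succeededTasks'),
                'failed:', summary.get('failedTasks'))
      request = spark_apps.searchExecutorStageSummary_next(request, response)
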
searchExecutorStageSummary_next()
Retrieves the next page of results.

        Args:
          previous_request: The request for the previous page. (required)
          previous_response: The response from the request for the previous page. (required)

        Returns:
          A request object that you can call 'execute()' on to request the next
          page. Returns None if there are no more items in the collection.
        
searchExecutors(name, executorStatus=None, pageSize=None, pageToken=None, parent=None, x__xgafv=None)
Obtain data corresponding to executors for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  executorStatus: string, Optional. Filter to select active, dead, or all executors.
    Allowed values
      EXECUTOR_STATUS_UNSPECIFIED - 
      EXECUTOR_STATUS_ACTIVE - 
      EXECUTOR_STATUS_DEAD - 
  pageSize: integer, Optional. Maximum number of executors to return in each response. The service may return fewer than this. The default page size is 10; the maximum page size is 100.
  pageToken: string, Optional. A page token received from a previous SearchSessionSparkApplicationExecutors call. Provide this token to retrieve the subsequent page.
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # List of Executors associated with a Spark Application.
  "nextPageToken": "A String", # This token is included in the response if there are more results to fetch. To fetch additional results, provide this value as the page_token in a subsequent SearchSessionSparkApplicationExecutorsRequest.
  "sparkApplicationExecutors": [ # Details about executors used by the application.
    { # Details about executors used by the application.
      "activeTasks": 42,
      "addTime": "A String",
      "attributes": {
        "a_key": "A String",
      },
      "completedTasks": 42,
      "diskUsed": "A String",
      "excludedInStages": [
        "A String",
      ],
      "executorId": "A String",
      "executorLogs": {
        "a_key": "A String",
      },
      "failedTasks": 42,
      "hostPort": "A String",
      "isActive": True or False,
      "isExcluded": True or False,
      "maxMemory": "A String",
      "maxTasks": 42,
      "memoryMetrics": {
        "totalOffHeapStorageMemory": "A String",
        "totalOnHeapStorageMemory": "A String",
        "usedOffHeapStorageMemory": "A String",
        "usedOnHeapStorageMemory": "A String",
      },
      "memoryUsed": "A String",
      "peakMemoryMetrics": {
        "metrics": {
          "a_key": "A String",
        },
      },
      "rddBlocks": 42,
      "removeReason": "A String",
      "removeTime": "A String",
      "resourceProfileId": 42,
      "resources": {
        "a_key": {
          "addresses": [
            "A String",
          ],
          "name": "A String",
        },
      },
      "totalCores": 42,
      "totalDurationMillis": "A String",
      "totalGcTimeMillis": "A String",
      "totalInputBytes": "A String",
      "totalShuffleRead": "A String",
      "totalShuffleWrite": "A String",
      "totalTasks": 42,
    },
  ],
}
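
A sketch for searchExecutors(), assuming the shared setup; it lists only active executors and reads the first page of results.

  response = spark_apps.searchExecutors(
      name=APP, parent=SESSION,
      executorStatus='EXECUTOR_STATUS_ACTIVE').execute()
  for executor in response.get('sparkApplicationExecutors', []):
      print(executor.get('executorId'), executor.get('hostPort'),
            'active tasks:', executor.get('activeTasks'))
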
searchExecutors_next()
Retrieves the next page of results.

        Args:
          previous_request: The request for the previous page. (required)
          previous_response: The response from the request for the previous page. (required)

        Returns:
          A request object that you can call 'execute()' on to request the next
          page. Returns None if there are no more items in the collection.
        
searchJobs(name, jobStatus=None, pageSize=None, pageToken=None, parent=None, x__xgafv=None)
Obtain list of spark jobs corresponding to a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  jobStatus: string, Optional. List only jobs in the specified state.
    Allowed values
      JOB_EXECUTION_STATUS_UNSPECIFIED - 
      JOB_EXECUTION_STATUS_RUNNING - 
      JOB_EXECUTION_STATUS_SUCCEEDED - 
      JOB_EXECUTION_STATUS_FAILED - 
      JOB_EXECUTION_STATUS_UNKNOWN - 
  pageSize: integer, Optional. Maximum number of jobs to return in each response. The service may return fewer than this. The default page size is 10; the maximum page size is 100.
  pageToken: string, Optional. A page token received from a previous SearchSessionSparkApplicationJobs call. Provide this token to retrieve the subsequent page.
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # A list of Jobs associated with a Spark Application.
  "nextPageToken": "A String", # This token is included in the response if there are more results to fetch. To fetch additional results, provide this value as the page_token in a subsequent SearchSessionSparkApplicationJobsRequest.
  "sparkApplicationJobs": [ # Output only. Data corresponding to a spark job.
    { # Data corresponding to a spark job.
      "completionTime": "A String",
      "description": "A String",
      "jobGroup": "A String",
      "jobId": "A String",
      "killTasksSummary": {
        "a_key": 42,
      },
      "name": "A String",
      "numActiveStages": 42,
      "numActiveTasks": 42,
      "numCompletedIndices": 42,
      "numCompletedStages": 42,
      "numCompletedTasks": 42,
      "numFailedStages": 42,
      "numFailedTasks": 42,
      "numKilledTasks": 42,
      "numSkippedStages": 42,
      "numSkippedTasks": 42,
      "numTasks": 42,
      "skippedStages": [
        42,
      ],
      "sqlExecutionId": "A String",
      "stageIds": [
        "A String",
      ],
      "status": "A String",
      "submissionTime": "A String",
    },
  ],
}
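
A sketch for searchJobs(), assuming the shared setup; this variant filters to failed jobs and pages manually with pageToken instead of the _next() helper.

  page_token = None
  while True:
      response = spark_apps.searchJobs(
          name=APP, parent=SESSION,
          jobStatus='JOB_EXECUTION_STATUS_FAILED',
          pageToken=page_token).execute()
      for job in response.get('sparkApplicationJobs', []):
          print('Job', job.get('jobId'), job.get('name'))
      page_token = response.get('nextPageToken')
      if not page_token:
          break
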
searchJobs_next()
Retrieves the next page of results.

        Args:
          previous_request: The request for the previous page. (required)
          previous_response: The response from the request for the previous page. (required)

        Returns:
          A request object that you can call 'execute()' on to request the next
          page. Returns None if there are no more items in the collection.
        
searchSqlQueries(name, details=None, pageSize=None, pageToken=None, parent=None, planDescription=None, x__xgafv=None)
Obtain data corresponding to SQL Queries for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  details: boolean, Optional. Lists or hides details of Spark plan nodes. Set to true to list details and false to hide them.
  pageSize: integer, Optional. Maximum number of queries to return in each response. The service may return fewer than this. The default page size is 10; the maximum page size is 100.
  pageToken: string, Optional. A page token received from a previous SearchSessionSparkApplicationSqlQueries call. Provide this token to retrieve the subsequent page.
  parent: string, Required. Parent (Session) resource reference.
  planDescription: boolean, Optional. Enables or disables the physical plan description on demand.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # List of all queries for a Spark Application.
  "nextPageToken": "A String", # This token is included in the response if there are more results to fetch. To fetch additional results, provide this value as the page_token in a subsequent SearchSessionSparkApplicationSqlQueriesRequest.
  "sparkApplicationSqlQueries": [ # Output only. SQL Execution Data
    { # SQL Execution Data
      "completionTime": "A String",
      "description": "A String",
      "details": "A String",
      "errorMessage": "A String",
      "executionId": "A String",
      "jobs": {
        "a_key": "A String",
      },
      "metricValues": {
        "a_key": "A String",
      },
      "metricValuesIsNull": True or False,
      "metrics": [
        { # Metrics related to SQL execution.
          "accumulatorId": "A String",
          "metricType": "A String",
          "name": "A String",
        },
      ],
      "modifiedConfigs": {
        "a_key": "A String",
      },
      "physicalPlanDescription": "A String",
      "rootExecutionId": "A String",
      "stages": [
        "A String",
      ],
      "submissionTime": "A String",
    },
  ],
}
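
A sketch for searchSqlQueries(), assuming the shared setup; plan node details and physical plan descriptions are skipped to keep responses small.

  response = spark_apps.searchSqlQueries(
      name=APP, parent=SESSION,
      details=False, planDescription=False).execute()
  for query in response.get('sparkApplicationSqlQueries', []):
      print(query.get('executionId'), query.get('description'))
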
searchSqlQueries_next()
Retrieves the next page of results.

        Args:
          previous_request: The request for the previous page. (required)
          previous_response: The response from the request for the previous page. (required)

        Returns:
          A request object that you can call 'execute()' on to request the next
          page. Returns None if there are no more items in the collection.
        
searchStageAttemptTasks(name, pageSize=None, pageToken=None, parent=None, sortRuntime=None, stageAttemptId=None, stageId=None, taskStatus=None, x__xgafv=None)
Obtain data corresponding to tasks for a spark stage attempt for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  pageSize: integer, Optional. Maximum number of tasks to return in each response. The service may return fewer than this. The default page size is 10; the maximum page size is 100.
  pageToken: string, Optional. A page token received from a previous SearchSessionSparkApplicationStageAttemptTasks call. Provide this token to retrieve the subsequent page.
  parent: string, Required. Parent (Session) resource reference.
  sortRuntime: boolean, Optional. Sort the tasks by runtime.
  stageAttemptId: integer, Optional. Stage Attempt ID
  stageId: string, Optional. Stage ID
  taskStatus: string, Optional. List only tasks in the specified state.
    Allowed values
      TASK_STATUS_UNSPECIFIED - 
      TASK_STATUS_RUNNING - 
      TASK_STATUS_SUCCESS - 
      TASK_STATUS_FAILED - 
      TASK_STATUS_KILLED - 
      TASK_STATUS_PENDING - 
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # List of tasks for a stage of a Spark Application
  "nextPageToken": "A String", # This token is included in the response if there are more results to fetch. To fetch additional results, provide this value as the page_token in a subsequent SearchSessionSparkApplicationStageAttemptTasksRequest.
  "sparkApplicationStageAttemptTasks": [ # Output only. Data corresponding to tasks created by spark.
    { # Data corresponding to tasks created by spark.
      "accumulatorUpdates": [
        {
          "accumullableInfoId": "A String",
          "name": "A String",
          "update": "A String",
          "value": "A String",
        },
      ],
      "attempt": 42,
      "durationMillis": "A String",
      "errorMessage": "A String",
      "executorId": "A String",
      "executorLogs": {
        "a_key": "A String",
      },
      "gettingResultTimeMillis": "A String",
      "hasMetrics": True or False,
      "host": "A String",
      "index": 42,
      "launchTime": "A String",
      "partitionId": 42,
      "resultFetchStart": "A String",
      "schedulerDelayMillis": "A String",
      "speculative": True or False,
      "stageAttemptId": 42,
      "stageId": "A String",
      "status": "A String",
      "taskId": "A String",
      "taskLocality": "A String",
      "taskMetrics": { # Executor Task Metrics
        "diskBytesSpilled": "A String",
        "executorCpuTimeNanos": "A String",
        "executorDeserializeCpuTimeNanos": "A String",
        "executorDeserializeTimeMillis": "A String",
        "executorRunTimeMillis": "A String",
        "inputMetrics": { # Metrics about the input data read by the task.
          "bytesRead": "A String",
          "recordsRead": "A String",
        },
        "jvmGcTimeMillis": "A String",
        "memoryBytesSpilled": "A String",
        "outputMetrics": { # Metrics about the data written by the task.
          "bytesWritten": "A String",
          "recordsWritten": "A String",
        },
        "peakExecutionMemoryBytes": "A String",
        "resultSerializationTimeMillis": "A String",
        "resultSize": "A String",
        "shuffleReadMetrics": { # Shuffle data read by the task.
          "fetchWaitTimeMillis": "A String",
          "localBlocksFetched": "A String",
          "localBytesRead": "A String",
          "recordsRead": "A String",
          "remoteBlocksFetched": "A String",
          "remoteBytesRead": "A String",
          "remoteBytesReadToDisk": "A String",
          "remoteReqsDuration": "A String",
          "shufflePushReadMetrics": {
            "corruptMergedBlockChunks": "A String",
            "localMergedBlocksFetched": "A String",
            "localMergedBytesRead": "A String",
            "localMergedChunksFetched": "A String",
            "mergedFetchFallbackCount": "A String",
            "remoteMergedBlocksFetched": "A String",
            "remoteMergedBytesRead": "A String",
            "remoteMergedChunksFetched": "A String",
            "remoteMergedReqsDuration": "A String",
          },
        },
        "shuffleWriteMetrics": { # Shuffle data written by task.
          "bytesWritten": "A String",
          "recordsWritten": "A String",
          "writeTimeNanos": "A String",
        },
      },
    },
  ],
}
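
For illustration, a minimal sketch of paging through the tasks of one stage attempt with this method and searchStageAttemptTasks_next(). It assumes Application Default Credentials and an enabled Dataproc API; the project, region, session, application, and stage identifiers below are placeholders, not real resources.

  from googleapiclient.discovery import build

  # Placeholder resource names; substitute your own project, region, session,
  # application, and stage identifiers.
  name = (
      "projects/my-project/locations/us-central1/sessions/my-session/"
      "sparkApplications/app-1234"
  )
  # Parent (Session) resource reference per the Args above; placeholder value.
  parent = "projects/my-project/locations/us-central1/sessions/my-session"

  dataproc = build("dataproc", "v1")
  spark_apps = dataproc.projects().locations().sessions().sparkApplications()

  # Page through every task of stage 0, attempt 0, up to 100 tasks per page.
  request = spark_apps.searchStageAttemptTasks(
      name=name,
      parent=parent,
      stageId="0",
      stageAttemptId=0,
      pageSize=100,
  )
  while request is not None:
      response = request.execute()
      for task in response.get("sparkApplicationStageAttemptTasks", []):
          print(task.get("taskId"), task.get("status"), task.get("durationMillis"))
      request = spark_apps.searchStageAttemptTasks_next(request, response)
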
searchStageAttemptTasks_next()
Retrieves the next page of results.

        Args:
          previous_request: The request for the previous page. (required)
          previous_response: The response from the request for the previous page. (required)

        Returns:
          A request object that you can call 'execute()' on to request the next
          page. Returns None if there are no more items in the collection.
        
searchStageAttempts(name, pageSize=None, pageToken=None, parent=None, stageId=None, summaryMetricsMask=None, x__xgafv=None)
Obtain data corresponding to spark stage attempts for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  pageSize: integer, Optional. Maximum number of stage attempts (paging based on stage_attempt_id) to return in each response. The service may return fewer than this. The default page size is 10; the maximum page size is 100.
  pageToken: string, Optional. A page token received from a previous SearchSessionSparkApplicationStageAttempts call. Provide this token to retrieve the subsequent page.
  parent: string, Required. Parent (Session) resource reference.
  stageId: string, Required. Stage ID for which attempts are to be fetched
  summaryMetricsMask: string, Optional. The list of summary metrics fields to include. An empty list defaults to skipping all summary metrics fields. For example, if the response should include TaskQuantileMetrics, the request should have task_quantile_metrics in the summary_metrics_mask field.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # A list of Stage Attempts for a Stage of a Spark Application.
  "nextPageToken": "A String", # This token is included in the response if there are more results to fetch. To fetch additional results, provide this value as the page_token in a subsequent SearchSessionSparkApplicationStageAttemptsRequest.
  "sparkApplicationStageAttempts": [ # Output only. Data corresponding to a stage attempts
    { # Data corresponding to a stage.
      "accumulatorUpdates": [
        {
          "accumullableInfoId": "A String",
          "name": "A String",
          "update": "A String",
          "value": "A String",
        },
      ],
      "completionTime": "A String",
      "description": "A String",
      "details": "A String",
      "executorMetricsDistributions": {
        "diskBytesSpilled": [
          3.14,
        ],
        "failedTasks": [
          3.14,
        ],
        "inputBytes": [
          3.14,
        ],
        "inputRecords": [
          3.14,
        ],
        "killedTasks": [
          3.14,
        ],
        "memoryBytesSpilled": [
          3.14,
        ],
        "outputBytes": [
          3.14,
        ],
        "outputRecords": [
          3.14,
        ],
        "peakMemoryMetrics": {
          "executorMetrics": [
            {
              "metrics": {
                "a_key": "A String",
              },
            },
          ],
          "quantiles": [
            3.14,
          ],
        },
        "quantiles": [
          3.14,
        ],
        "shuffleRead": [
          3.14,
        ],
        "shuffleReadRecords": [
          3.14,
        ],
        "shuffleWrite": [
          3.14,
        ],
        "shuffleWriteRecords": [
          3.14,
        ],
        "succeededTasks": [
          3.14,
        ],
        "taskTimeMillis": [
          3.14,
        ],
      },
      "executorSummary": {
        "a_key": { # Executor resources consumed by a stage.
          "diskBytesSpilled": "A String",
          "executorId": "A String",
          "failedTasks": 42,
          "inputBytes": "A String",
          "inputRecords": "A String",
          "isExcludedForStage": True or False,
          "killedTasks": 42,
          "memoryBytesSpilled": "A String",
          "outputBytes": "A String",
          "outputRecords": "A String",
          "peakMemoryMetrics": {
            "metrics": {
              "a_key": "A String",
            },
          },
          "shuffleRead": "A String",
          "shuffleReadRecords": "A String",
          "shuffleWrite": "A String",
          "shuffleWriteRecords": "A String",
          "stageAttemptId": 42,
          "stageId": "A String",
          "succeededTasks": 42,
          "taskTimeMillis": "A String",
        },
      },
      "failureReason": "A String",
      "firstTaskLaunchedTime": "A String",
      "isShufflePushEnabled": True or False,
      "jobIds": [
        "A String",
      ],
      "killedTasksSummary": {
        "a_key": 42,
      },
      "locality": {
        "a_key": "A String",
      },
      "name": "A String",
      "numActiveTasks": 42,
      "numCompleteTasks": 42,
      "numCompletedIndices": 42,
      "numFailedTasks": 42,
      "numKilledTasks": 42,
      "numTasks": 42,
      "parentStageIds": [
        "A String",
      ],
      "peakExecutorMetrics": {
        "metrics": {
          "a_key": "A String",
        },
      },
      "rddIds": [
        "A String",
      ],
      "resourceProfileId": 42,
      "schedulingPool": "A String",
      "shuffleMergersCount": 42,
      "speculationSummary": { # Details of the speculation task when speculative execution is enabled.
        "numActiveTasks": 42,
        "numCompletedTasks": 42,
        "numFailedTasks": 42,
        "numKilledTasks": 42,
        "numTasks": 42,
        "stageAttemptId": 42,
        "stageId": "A String",
      },
      "stageAttemptId": 42,
      "stageId": "A String",
      "stageMetrics": { # Stage Level Aggregated Metrics
        "diskBytesSpilled": "A String",
        "executorCpuTimeNanos": "A String",
        "executorDeserializeCpuTimeNanos": "A String",
        "executorDeserializeTimeMillis": "A String",
        "executorRunTimeMillis": "A String",
        "jvmGcTimeMillis": "A String",
        "memoryBytesSpilled": "A String",
        "peakExecutionMemoryBytes": "A String",
        "resultSerializationTimeMillis": "A String",
        "resultSize": "A String",
        "stageInputMetrics": { # Metrics about the input read by the stage.
          "bytesRead": "A String",
          "recordsRead": "A String",
        },
        "stageOutputMetrics": { # Metrics about the output written by the stage.
          "bytesWritten": "A String",
          "recordsWritten": "A String",
        },
        "stageShuffleReadMetrics": { # Shuffle data read for the stage.
          "bytesRead": "A String",
          "fetchWaitTimeMillis": "A String",
          "localBlocksFetched": "A String",
          "localBytesRead": "A String",
          "recordsRead": "A String",
          "remoteBlocksFetched": "A String",
          "remoteBytesRead": "A String",
          "remoteBytesReadToDisk": "A String",
          "remoteReqsDuration": "A String",
          "stageShufflePushReadMetrics": {
            "corruptMergedBlockChunks": "A String",
            "localMergedBlocksFetched": "A String",
            "localMergedBytesRead": "A String",
            "localMergedChunksFetched": "A String",
            "mergedFetchFallbackCount": "A String",
            "remoteMergedBlocksFetched": "A String",
            "remoteMergedBytesRead": "A String",
            "remoteMergedChunksFetched": "A String",
            "remoteMergedReqsDuration": "A String",
          },
        },
        "stageShuffleWriteMetrics": { # Shuffle data written for the stage.
          "bytesWritten": "A String",
          "recordsWritten": "A String",
          "writeTimeNanos": "A String",
        },
      },
      "status": "A String",
      "submissionTime": "A String",
      "taskQuantileMetrics": { # Summary metrics fields. These are included in response only if present in summary_metrics_mask field in request
        "diskBytesSpilled": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "durationMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "executorCpuTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "executorDeserializeCpuTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "executorDeserializeTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "executorRunTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "gettingResultTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "inputMetrics": {
          "bytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "recordsRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
        "jvmGcTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "memoryBytesSpilled": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "outputMetrics": {
          "bytesWritten": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "recordsWritten": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
        "peakExecutionMemoryBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "resultSerializationTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "resultSize": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "schedulerDelayMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "shuffleReadMetrics": {
          "fetchWaitTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "localBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "readBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "readRecords": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteBytesReadToDisk": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteReqsDuration": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "shufflePushReadMetrics": {
            "corruptMergedBlockChunks": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "localMergedBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "localMergedBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "localMergedChunksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "mergedFetchFallbackCount": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteMergedBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteMergedBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteMergedChunksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteMergedReqsDuration": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
          },
          "totalBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
        "shuffleWriteMetrics": {
          "writeBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "writeRecords": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "writeTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
      },
      "tasks": {
        "a_key": { # Data corresponding to tasks created by spark.
          "accumulatorUpdates": [
            {
              "accumullableInfoId": "A String",
              "name": "A String",
              "update": "A String",
              "value": "A String",
            },
          ],
          "attempt": 42,
          "durationMillis": "A String",
          "errorMessage": "A String",
          "executorId": "A String",
          "executorLogs": {
            "a_key": "A String",
          },
          "gettingResultTimeMillis": "A String",
          "hasMetrics": True or False,
          "host": "A String",
          "index": 42,
          "launchTime": "A String",
          "partitionId": 42,
          "resultFetchStart": "A String",
          "schedulerDelayMillis": "A String",
          "speculative": True or False,
          "stageAttemptId": 42,
          "stageId": "A String",
          "status": "A String",
          "taskId": "A String",
          "taskLocality": "A String",
          "taskMetrics": { # Executor Task Metrics
            "diskBytesSpilled": "A String",
            "executorCpuTimeNanos": "A String",
            "executorDeserializeCpuTimeNanos": "A String",
            "executorDeserializeTimeMillis": "A String",
            "executorRunTimeMillis": "A String",
            "inputMetrics": { # Metrics about the input data read by the task.
              "bytesRead": "A String",
              "recordsRead": "A String",
            },
            "jvmGcTimeMillis": "A String",
            "memoryBytesSpilled": "A String",
            "outputMetrics": { # Metrics about the data written by the task.
              "bytesWritten": "A String",
              "recordsWritten": "A String",
            },
            "peakExecutionMemoryBytes": "A String",
            "resultSerializationTimeMillis": "A String",
            "resultSize": "A String",
            "shuffleReadMetrics": { # Shuffle data read by the task.
              "fetchWaitTimeMillis": "A String",
              "localBlocksFetched": "A String",
              "localBytesRead": "A String",
              "recordsRead": "A String",
              "remoteBlocksFetched": "A String",
              "remoteBytesRead": "A String",
              "remoteBytesReadToDisk": "A String",
              "remoteReqsDuration": "A String",
              "shufflePushReadMetrics": {
                "corruptMergedBlockChunks": "A String",
                "localMergedBlocksFetched": "A String",
                "localMergedBytesRead": "A String",
                "localMergedChunksFetched": "A String",
                "mergedFetchFallbackCount": "A String",
                "remoteMergedBlocksFetched": "A String",
                "remoteMergedBytesRead": "A String",
                "remoteMergedChunksFetched": "A String",
                "remoteMergedReqsDuration": "A String",
              },
            },
            "shuffleWriteMetrics": { # Shuffle data written by task.
              "bytesWritten": "A String",
              "recordsWritten": "A String",
              "writeTimeNanos": "A String",
            },
          },
        },
      },
    },
  ],
}
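
A minimal sketch of fetching the attempts of a single stage and opting in to the task quantile metrics through summaryMetricsMask (placeholder identifiers, same credential assumptions as the earlier sketch):

  from googleapiclient.discovery import build

  # Placeholder identifiers; adjust to your own session and application.
  name = (
      "projects/my-project/locations/us-central1/sessions/my-session/"
      "sparkApplications/app-1234"
  )
  parent = "projects/my-project/locations/us-central1/sessions/my-session"

  spark_apps = (
      build("dataproc", "v1").projects().locations().sessions().sparkApplications()
  )

  # Request all attempts of stage 3 and include TaskQuantileMetrics in the response.
  response = spark_apps.searchStageAttempts(
      name=name,
      parent=parent,
      stageId="3",
      summaryMetricsMask="task_quantile_metrics",
      pageSize=10,
  ).execute()

  for attempt in response.get("sparkApplicationStageAttempts", []):
      duration = attempt.get("taskQuantileMetrics", {}).get("durationMillis", {})
      print(attempt.get("stageAttemptId"), attempt.get("status"),
            duration.get("percentile50"), duration.get("maximum"))
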
searchStageAttempts_next()
Retrieves the next page of results.

        Args:
          previous_request: The request for the previous page. (required)
          previous_response: The response from the request for the previous page. (required)

        Returns:
          A request object that you can call 'execute()' on to request the next
          page. Returns None if there are no more items in the collection.
        
searchStages(name, pageSize=None, pageToken=None, parent=None, stageStatus=None, summaryMetricsMask=None, x__xgafv=None)
Obtain data corresponding to stages for a Spark Application.

Args:
  name: string, Required. The fully qualified name of the session to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  pageSize: integer, Optional. Maximum number of stages (paging based on stage_id) to return in each response. The service may return fewer than this. The default page size is 10; the maximum page size is 100.
  pageToken: string, Optional. A page token received from a previous SearchSessionSparkApplicationStages call. Provide this token to retrieve the subsequent page.
  parent: string, Required. Parent (Session) resource reference.
  stageStatus: string, Optional. List only stages in the given state.
    Allowed values
      STAGE_STATUS_UNSPECIFIED - 
      STAGE_STATUS_ACTIVE - 
      STAGE_STATUS_COMPLETE - 
      STAGE_STATUS_FAILED - 
      STAGE_STATUS_PENDING - 
      STAGE_STATUS_SKIPPED - 
  summaryMetricsMask: string, Optional. The list of summary metrics fields to include. An empty list defaults to skipping all summary metrics fields. For example, if the response should include TaskQuantileMetrics, the request should have task_quantile_metrics in the summary_metrics_mask field.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # A list of stages associated with a Spark Application.
  "nextPageToken": "A String", # This token is included in the response if there are more results to fetch. To fetch additional results, provide this value as the page_token in a subsequent SearchSessionSparkApplicationStages.
  "sparkApplicationStages": [ # Output only. Data corresponding to a stage.
    { # Data corresponding to a stage.
      "accumulatorUpdates": [
        {
          "accumullableInfoId": "A String",
          "name": "A String",
          "update": "A String",
          "value": "A String",
        },
      ],
      "completionTime": "A String",
      "description": "A String",
      "details": "A String",
      "executorMetricsDistributions": {
        "diskBytesSpilled": [
          3.14,
        ],
        "failedTasks": [
          3.14,
        ],
        "inputBytes": [
          3.14,
        ],
        "inputRecords": [
          3.14,
        ],
        "killedTasks": [
          3.14,
        ],
        "memoryBytesSpilled": [
          3.14,
        ],
        "outputBytes": [
          3.14,
        ],
        "outputRecords": [
          3.14,
        ],
        "peakMemoryMetrics": {
          "executorMetrics": [
            {
              "metrics": {
                "a_key": "A String",
              },
            },
          ],
          "quantiles": [
            3.14,
          ],
        },
        "quantiles": [
          3.14,
        ],
        "shuffleRead": [
          3.14,
        ],
        "shuffleReadRecords": [
          3.14,
        ],
        "shuffleWrite": [
          3.14,
        ],
        "shuffleWriteRecords": [
          3.14,
        ],
        "succeededTasks": [
          3.14,
        ],
        "taskTimeMillis": [
          3.14,
        ],
      },
      "executorSummary": {
        "a_key": { # Executor resources consumed by a stage.
          "diskBytesSpilled": "A String",
          "executorId": "A String",
          "failedTasks": 42,
          "inputBytes": "A String",
          "inputRecords": "A String",
          "isExcludedForStage": True or False,
          "killedTasks": 42,
          "memoryBytesSpilled": "A String",
          "outputBytes": "A String",
          "outputRecords": "A String",
          "peakMemoryMetrics": {
            "metrics": {
              "a_key": "A String",
            },
          },
          "shuffleRead": "A String",
          "shuffleReadRecords": "A String",
          "shuffleWrite": "A String",
          "shuffleWriteRecords": "A String",
          "stageAttemptId": 42,
          "stageId": "A String",
          "succeededTasks": 42,
          "taskTimeMillis": "A String",
        },
      },
      "failureReason": "A String",
      "firstTaskLaunchedTime": "A String",
      "isShufflePushEnabled": True or False,
      "jobIds": [
        "A String",
      ],
      "killedTasksSummary": {
        "a_key": 42,
      },
      "locality": {
        "a_key": "A String",
      },
      "name": "A String",
      "numActiveTasks": 42,
      "numCompleteTasks": 42,
      "numCompletedIndices": 42,
      "numFailedTasks": 42,
      "numKilledTasks": 42,
      "numTasks": 42,
      "parentStageIds": [
        "A String",
      ],
      "peakExecutorMetrics": {
        "metrics": {
          "a_key": "A String",
        },
      },
      "rddIds": [
        "A String",
      ],
      "resourceProfileId": 42,
      "schedulingPool": "A String",
      "shuffleMergersCount": 42,
      "speculationSummary": { # Details of the speculation task when speculative execution is enabled.
        "numActiveTasks": 42,
        "numCompletedTasks": 42,
        "numFailedTasks": 42,
        "numKilledTasks": 42,
        "numTasks": 42,
        "stageAttemptId": 42,
        "stageId": "A String",
      },
      "stageAttemptId": 42,
      "stageId": "A String",
      "stageMetrics": { # Stage Level Aggregated Metrics
        "diskBytesSpilled": "A String",
        "executorCpuTimeNanos": "A String",
        "executorDeserializeCpuTimeNanos": "A String",
        "executorDeserializeTimeMillis": "A String",
        "executorRunTimeMillis": "A String",
        "jvmGcTimeMillis": "A String",
        "memoryBytesSpilled": "A String",
        "peakExecutionMemoryBytes": "A String",
        "resultSerializationTimeMillis": "A String",
        "resultSize": "A String",
        "stageInputMetrics": { # Metrics about the input read by the stage.
          "bytesRead": "A String",
          "recordsRead": "A String",
        },
        "stageOutputMetrics": { # Metrics about the output written by the stage.
          "bytesWritten": "A String",
          "recordsWritten": "A String",
        },
        "stageShuffleReadMetrics": { # Shuffle data read for the stage.
          "bytesRead": "A String",
          "fetchWaitTimeMillis": "A String",
          "localBlocksFetched": "A String",
          "localBytesRead": "A String",
          "recordsRead": "A String",
          "remoteBlocksFetched": "A String",
          "remoteBytesRead": "A String",
          "remoteBytesReadToDisk": "A String",
          "remoteReqsDuration": "A String",
          "stageShufflePushReadMetrics": {
            "corruptMergedBlockChunks": "A String",
            "localMergedBlocksFetched": "A String",
            "localMergedBytesRead": "A String",
            "localMergedChunksFetched": "A String",
            "mergedFetchFallbackCount": "A String",
            "remoteMergedBlocksFetched": "A String",
            "remoteMergedBytesRead": "A String",
            "remoteMergedChunksFetched": "A String",
            "remoteMergedReqsDuration": "A String",
          },
        },
        "stageShuffleWriteMetrics": { # Shuffle data written for the stage.
          "bytesWritten": "A String",
          "recordsWritten": "A String",
          "writeTimeNanos": "A String",
        },
      },
      "status": "A String",
      "submissionTime": "A String",
      "taskQuantileMetrics": { # Summary metrics fields. These are included in response only if present in summary_metrics_mask field in request
        "diskBytesSpilled": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "durationMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "executorCpuTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "executorDeserializeCpuTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "executorDeserializeTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "executorRunTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "gettingResultTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "inputMetrics": {
          "bytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "recordsRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
        "jvmGcTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "memoryBytesSpilled": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "outputMetrics": {
          "bytesWritten": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "recordsWritten": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
        "peakExecutionMemoryBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "resultSerializationTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "resultSize": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "schedulerDelayMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
          "count": "A String",
          "maximum": "A String",
          "minimum": "A String",
          "percentile25": "A String",
          "percentile50": "A String",
          "percentile75": "A String",
          "sum": "A String",
        },
        "shuffleReadMetrics": {
          "fetchWaitTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "localBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "readBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "readRecords": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteBytesReadToDisk": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "remoteReqsDuration": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "shufflePushReadMetrics": {
            "corruptMergedBlockChunks": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "localMergedBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "localMergedBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "localMergedChunksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "mergedFetchFallbackCount": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteMergedBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteMergedBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteMergedChunksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteMergedReqsDuration": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
          },
          "totalBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
        "shuffleWriteMetrics": {
          "writeBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "writeRecords": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "writeTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
        },
      },
      "tasks": {
        "a_key": { # Data corresponding to tasks created by spark.
          "accumulatorUpdates": [
            {
              "accumullableInfoId": "A String",
              "name": "A String",
              "update": "A String",
              "value": "A String",
            },
          ],
          "attempt": 42,
          "durationMillis": "A String",
          "errorMessage": "A String",
          "executorId": "A String",
          "executorLogs": {
            "a_key": "A String",
          },
          "gettingResultTimeMillis": "A String",
          "hasMetrics": True or False,
          "host": "A String",
          "index": 42,
          "launchTime": "A String",
          "partitionId": 42,
          "resultFetchStart": "A String",
          "schedulerDelayMillis": "A String",
          "speculative": True or False,
          "stageAttemptId": 42,
          "stageId": "A String",
          "status": "A String",
          "taskId": "A String",
          "taskLocality": "A String",
          "taskMetrics": { # Executor Task Metrics
            "diskBytesSpilled": "A String",
            "executorCpuTimeNanos": "A String",
            "executorDeserializeCpuTimeNanos": "A String",
            "executorDeserializeTimeMillis": "A String",
            "executorRunTimeMillis": "A String",
            "inputMetrics": { # Metrics about the input data read by the task.
              "bytesRead": "A String",
              "recordsRead": "A String",
            },
            "jvmGcTimeMillis": "A String",
            "memoryBytesSpilled": "A String",
            "outputMetrics": { # Metrics about the data written by the task.
              "bytesWritten": "A String",
              "recordsWritten": "A String",
            },
            "peakExecutionMemoryBytes": "A String",
            "resultSerializationTimeMillis": "A String",
            "resultSize": "A String",
            "shuffleReadMetrics": { # Shuffle data read by the task.
              "fetchWaitTimeMillis": "A String",
              "localBlocksFetched": "A String",
              "localBytesRead": "A String",
              "recordsRead": "A String",
              "remoteBlocksFetched": "A String",
              "remoteBytesRead": "A String",
              "remoteBytesReadToDisk": "A String",
              "remoteReqsDuration": "A String",
              "shufflePushReadMetrics": {
                "corruptMergedBlockChunks": "A String",
                "localMergedBlocksFetched": "A String",
                "localMergedBytesRead": "A String",
                "localMergedChunksFetched": "A String",
                "mergedFetchFallbackCount": "A String",
                "remoteMergedBlocksFetched": "A String",
                "remoteMergedBytesRead": "A String",
                "remoteMergedChunksFetched": "A String",
                "remoteMergedReqsDuration": "A String",
              },
            },
            "shuffleWriteMetrics": { # Shuffle data written by task.
              "bytesWritten": "A String",
              "recordsWritten": "A String",
              "writeTimeNanos": "A String",
            },
          },
        },
      },
    },
  ],
}
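
A minimal sketch of listing only failed stages and paging with searchStages_next() (placeholder identifiers, same credential assumptions as above):

  from googleapiclient.discovery import build

  # Placeholder identifiers; adjust to your own session and application.
  name = (
      "projects/my-project/locations/us-central1/sessions/my-session/"
      "sparkApplications/app-1234"
  )
  parent = "projects/my-project/locations/us-central1/sessions/my-session"

  spark_apps = (
      build("dataproc", "v1").projects().locations().sessions().sparkApplications()
  )

  # List only stages that failed, 50 per page.
  request = spark_apps.searchStages(
      name=name,
      parent=parent,
      stageStatus="STAGE_STATUS_FAILED",
      pageSize=50,
  )
  while request is not None:
      response = request.execute()
      for stage in response.get("sparkApplicationStages", []):
          print(stage.get("stageId"), stage.get("name"), stage.get("failureReason"))
      request = spark_apps.searchStages_next(request, response)
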
searchStages_next()
Retrieves the next page of results.

        Args:
          previous_request: The request for the previous page. (required)
          previous_response: The response from the request for the previous page. (required)

        Returns:
          A request object that you can call 'execute()' on to request the next
          page. Returns None if there are no more items in the collection.
        
search_next()
Retrieves the next page of results.

        Args:
          previous_request: The request for the previous page. (required)
          previous_response: The response from the request for the previous page. (required)

        Returns:
          A request object that you can call 'execute()' on to request the next
          page. Returns None if there are no more items in the collection.
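Each search* method has a matching *_next helper (searchStages_next, search_next, and so on) that follows the standard pagination pattern of the discovery-based Python client. The sketch below shows that loop; it assumes the v1 Dataproc surface, and PROJECT_ID, DATAPROC_REGION, SESSION_ID, APPLICATION_ID and the page size are placeholder values.

    from googleapiclient import discovery

    dataproc = discovery.build("dataproc", "v1")
    spark_apps = dataproc.projects().locations().sessions().sparkApplications()

    name = ("projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID"
            "/sparkApplications/APPLICATION_ID")

    # Each *_next helper returns a request for the following page, or None
    # once there are no more items in the collection.
    request = spark_apps.searchStages(name=name, pageSize=100)
    while request is not None:
        response = request.execute()
        # Process the current page of results here.
        request = spark_apps.searchStages_next(request, response)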
        
summarizeExecutors(name, parent=None, x__xgafv=None)
Obtain a consolidated summary of executors for a Spark Application

Args:
  name: string, Required. The fully qualified name of the Spark Application to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Consolidated summary of executors for a Spark Application.
  "activeExecutorSummary": { # Consolidated summary about executors used by the application. # Consolidated summary for active executors.
    "activeTasks": 42,
    "completedTasks": 42,
    "count": 42,
    "diskUsed": "A String",
    "failedTasks": 42,
    "isExcluded": 42,
    "maxMemory": "A String",
    "memoryMetrics": {
      "totalOffHeapStorageMemory": "A String",
      "totalOnHeapStorageMemory": "A String",
      "usedOffHeapStorageMemory": "A String",
      "usedOnHeapStorageMemory": "A String",
    },
    "memoryUsed": "A String",
    "rddBlocks": 42,
    "totalCores": 42,
    "totalDurationMillis": "A String",
    "totalGcTimeMillis": "A String",
    "totalInputBytes": "A String",
    "totalShuffleRead": "A String",
    "totalShuffleWrite": "A String",
    "totalTasks": 42,
  },
  "applicationId": "A String", # Spark Application Id
  "deadExecutorSummary": { # Consolidated summary about executors used by the application. # Consolidated summary for dead executors.
    "activeTasks": 42,
    "completedTasks": 42,
    "count": 42,
    "diskUsed": "A String",
    "failedTasks": 42,
    "isExcluded": 42,
    "maxMemory": "A String",
    "memoryMetrics": {
      "totalOffHeapStorageMemory": "A String",
      "totalOnHeapStorageMemory": "A String",
      "usedOffHeapStorageMemory": "A String",
      "usedOnHeapStorageMemory": "A String",
    },
    "memoryUsed": "A String",
    "rddBlocks": 42,
    "totalCores": 42,
    "totalDurationMillis": "A String",
    "totalGcTimeMillis": "A String",
    "totalInputBytes": "A String",
    "totalShuffleRead": "A String",
    "totalShuffleWrite": "A String",
    "totalTasks": 42,
  },
  "totalExecutorSummary": { # Consolidated summary about executors used by the application. # Overall consolidated summary for all executors.
    "activeTasks": 42,
    "completedTasks": 42,
    "count": 42,
    "diskUsed": "A String",
    "failedTasks": 42,
    "isExcluded": 42,
    "maxMemory": "A String",
    "memoryMetrics": {
      "totalOffHeapStorageMemory": "A String",
      "totalOnHeapStorageMemory": "A String",
      "usedOffHeapStorageMemory": "A String",
      "usedOnHeapStorageMemory": "A String",
    },
    "memoryUsed": "A String",
    "rddBlocks": 42,
    "totalCores": 42,
    "totalDurationMillis": "A String",
    "totalGcTimeMillis": "A String",
    "totalInputBytes": "A String",
    "totalShuffleRead": "A String",
    "totalShuffleWrite": "A String",
    "totalTasks": 42,
  },
}
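For example, a minimal sketch reading the active-executor counts from this response, assuming the spark_apps resource and placeholder names from the pagination sketch above:

    parent = "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID"

    summary = spark_apps.summarizeExecutors(name=name, parent=parent).execute()
    active = summary.get("activeExecutorSummary", {})
    # "count" and "activeTasks" are fields of the consolidated summary above.
    print(active.get("count"), "active executors,",
          active.get("activeTasks"), "active tasks")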
summarizeJobs(name, parent=None, x__xgafv=None)
Obtain summary of Jobs for a Spark Application

Args:
  name: string, Required. The fully qualified name of the Spark Application to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Summary of a Spark Application's jobs.
  "jobsSummary": { # Data related to Jobs page summary # Summary of a Spark Application Jobs
    "activeJobs": 42, # Number of active jobs
    "applicationId": "A String", # Spark Application Id
    "attempts": [ # Attempts info
      { # Specific attempt of an application.
        "appSparkVersion": "A String",
        "attemptId": "A String",
        "completed": True or False,
        "durationMillis": "A String",
        "endTime": "A String",
        "lastUpdated": "A String",
        "sparkUser": "A String",
        "startTime": "A String",
      },
    ],
    "completedJobs": 42, # Number of completed jobs
    "failedJobs": 42, # Number of failed jobs
    "schedulingMode": "A String", # Spark Scheduling mode
  },
}
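A minimal sketch reading the job counts from this summary, under the same assumptions as the earlier sketches:

    jobs = spark_apps.summarizeJobs(name=name, parent=parent).execute()
    jobs_summary = jobs.get("jobsSummary", {})
    print("jobs - active:", jobs_summary.get("activeJobs"),
          "completed:", jobs_summary.get("completedJobs"),
          "failed:", jobs_summary.get("failedJobs"))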
summarizeStageAttemptTasks(name, parent=None, stageAttemptId=None, stageId=None, x__xgafv=None)
Obtain summary of Tasks for a Spark Application Stage Attempt

Args:
  name: string, Required. The fully qualified name of the Spark Application to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  parent: string, Required. Parent (Session) resource reference.
  stageAttemptId: integer, Required. Stage Attempt ID
  stageId: string, Required. Stage ID
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Summary of tasks for a Spark Application stage attempt.
  "stageAttemptTasksSummary": { # Data related to tasks summary for a Spark Stage Attempt # Summary of tasks for a Spark Application Stage Attempt
    "applicationId": "A String",
    "numFailedTasks": 42,
    "numKilledTasks": 42,
    "numPendingTasks": 42,
    "numRunningTasks": 42,
    "numSuccessTasks": 42,
    "numTasks": 42,
    "stageAttemptId": 42,
    "stageId": "A String",
  },
}
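A minimal sketch for a single stage attempt, under the same assumptions as above; the stage and attempt IDs are arbitrary example values, and note that stageId is passed as a string while stageAttemptId is an integer, matching the Args above:

    tasks = spark_apps.summarizeStageAttemptTasks(
        name=name, parent=parent, stageId="0", stageAttemptId=0).execute()
    task_summary = tasks.get("stageAttemptTasksSummary", {})
    print(task_summary.get("numSuccessTasks"), "of",
          task_summary.get("numTasks"), "tasks succeeded")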
summarizeStages(name, parent=None, x__xgafv=None)
Obtain summary of Stages for a Spark Application

Args:
  name: string, Required. The fully qualified name of the Spark Application to retrieve in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  parent: string, Required. Parent (Session) resource reference.
  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Summary of a Spark Application's stages.
  "stagesSummary": { # Data related to Stages page summary # Summary of a Spark Application Stages
    "applicationId": "A String",
    "numActiveStages": 42,
    "numCompletedStages": 42,
    "numFailedStages": 42,
    "numPendingStages": 42,
    "numSkippedStages": 42,
  },
}
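A minimal sketch reading the stage counts, under the same assumptions as above:

    stages = spark_apps.summarizeStages(name=name, parent=parent).execute()
    stages_summary = stages.get("stagesSummary", {})
    print("stages - completed:", stages_summary.get("numCompletedStages"),
          "failed:", stages_summary.get("numFailedStages"),
          "skipped:", stages_summary.get("numSkippedStages"))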
write(name, body=None, x__xgafv=None)
Write wrapper objects from the data plane to Spanner

Args:
  name: string, Required. The fully qualified name of the spark application to write data about in the format "projects/PROJECT_ID/locations/DATAPROC_REGION/sessions/SESSION_ID/sparkApplications/APPLICATION_ID" (required)
  body: object, The request body.
    The object takes the form of:

{ # Write Spark Application data to internal storage systems
  "parent": "A String", # Required. Parent (Batch) resource reference.
  "sparkWrapperObjects": [ # Required. The batch of spark application context objects sent for ingestion.
    { # Outer message that contains the data obtained from spark listener, packaged with information that is required to process it.
      "appSummary": {
        "numCompletedJobs": 42,
        "numCompletedStages": 42,
      },
      "applicationEnvironmentInfo": { # Details about the Environment that the application is running in.
        "classpathEntries": {
          "a_key": "A String",
        },
        "hadoopProperties": {
          "a_key": "A String",
        },
        "metricsProperties": {
          "a_key": "A String",
        },
        "resourceProfiles": [
          { # Resource profile that contains information about all the resources required by executors and tasks.
            "executorResources": {
              "a_key": { # Resources used per executor used by the application.
                "amount": "A String",
                "discoveryScript": "A String",
                "resourceName": "A String",
                "vendor": "A String",
              },
            },
            "resourceProfileId": 42,
            "taskResources": {
              "a_key": { # Resources used per task created by the application.
                "amount": 3.14,
                "resourceName": "A String",
              },
            },
          },
        ],
        "runtime": {
          "javaHome": "A String",
          "javaVersion": "A String",
          "scalaVersion": "A String",
        },
        "sparkProperties": {
          "a_key": "A String",
        },
        "systemProperties": {
          "a_key": "A String",
        },
      },
      "applicationId": "A String", # Application Id created by Spark.
      "applicationInfo": { # High level information corresponding to an application.
        "applicationContextIngestionStatus": "A String",
        "applicationId": "A String",
        "attempts": [
          { # Specific attempt of an application.
            "appSparkVersion": "A String",
            "attemptId": "A String",
            "completed": True or False,
            "durationMillis": "A String",
            "endTime": "A String",
            "lastUpdated": "A String",
            "sparkUser": "A String",
            "startTime": "A String",
          },
        ],
        "coresGranted": 42,
        "coresPerExecutor": 42,
        "maxCores": 42,
        "memoryPerExecutorMb": 42,
        "name": "A String",
        "quantileDataStatus": "A String",
      },
      "eventTimestamp": "A String", # VM Timestamp associated with the data object.
      "executorStageSummary": { # Executor resources consumed by a stage.
        "diskBytesSpilled": "A String",
        "executorId": "A String",
        "failedTasks": 42,
        "inputBytes": "A String",
        "inputRecords": "A String",
        "isExcludedForStage": True or False,
        "killedTasks": 42,
        "memoryBytesSpilled": "A String",
        "outputBytes": "A String",
        "outputRecords": "A String",
        "peakMemoryMetrics": {
          "metrics": {
            "a_key": "A String",
          },
        },
        "shuffleRead": "A String",
        "shuffleReadRecords": "A String",
        "shuffleWrite": "A String",
        "shuffleWriteRecords": "A String",
        "stageAttemptId": 42,
        "stageId": "A String",
        "succeededTasks": 42,
        "taskTimeMillis": "A String",
      },
      "executorSummary": { # Details about executors used by the application.
        "activeTasks": 42,
        "addTime": "A String",
        "attributes": {
          "a_key": "A String",
        },
        "completedTasks": 42,
        "diskUsed": "A String",
        "excludedInStages": [
          "A String",
        ],
        "executorId": "A String",
        "executorLogs": {
          "a_key": "A String",
        },
        "failedTasks": 42,
        "hostPort": "A String",
        "isActive": True or False,
        "isExcluded": True or False,
        "maxMemory": "A String",
        "maxTasks": 42,
        "memoryMetrics": {
          "totalOffHeapStorageMemory": "A String",
          "totalOnHeapStorageMemory": "A String",
          "usedOffHeapStorageMemory": "A String",
          "usedOnHeapStorageMemory": "A String",
        },
        "memoryUsed": "A String",
        "peakMemoryMetrics": {
          "metrics": {
            "a_key": "A String",
          },
        },
        "rddBlocks": 42,
        "removeReason": "A String",
        "removeTime": "A String",
        "resourceProfileId": 42,
        "resources": {
          "a_key": {
            "addresses": [
              "A String",
            ],
            "name": "A String",
          },
        },
        "totalCores": 42,
        "totalDurationMillis": "A String",
        "totalGcTimeMillis": "A String",
        "totalInputBytes": "A String",
        "totalShuffleRead": "A String",
        "totalShuffleWrite": "A String",
        "totalTasks": 42,
      },
      "jobData": { # Data corresponding to a spark job.
        "completionTime": "A String",
        "description": "A String",
        "jobGroup": "A String",
        "jobId": "A String",
        "killTasksSummary": {
          "a_key": 42,
        },
        "name": "A String",
        "numActiveStages": 42,
        "numActiveTasks": 42,
        "numCompletedIndices": 42,
        "numCompletedStages": 42,
        "numCompletedTasks": 42,
        "numFailedStages": 42,
        "numFailedTasks": 42,
        "numKilledTasks": 42,
        "numSkippedStages": 42,
        "numSkippedTasks": 42,
        "numTasks": 42,
        "skippedStages": [
          42,
        ],
        "sqlExecutionId": "A String",
        "stageIds": [
          "A String",
        ],
        "status": "A String",
        "submissionTime": "A String",
      },
      "poolData": { # Pool Data
        "name": "A String",
        "stageIds": [
          "A String",
        ],
      },
      "processSummary": { # Process Summary
        "addTime": "A String",
        "hostPort": "A String",
        "isActive": True or False,
        "processId": "A String",
        "processLogs": {
          "a_key": "A String",
        },
        "removeTime": "A String",
        "totalCores": 42,
      },
      "rddOperationGraph": { # Graph representing RDD dependencies. Consists of edges and a root cluster.
        "edges": [
          { # A directed edge representing dependency between two RDDs.
            "fromId": 42,
            "toId": 42,
          },
        ],
        "incomingEdges": [
          { # A directed edge representing dependency between two RDDs.
            "fromId": 42,
            "toId": 42,
          },
        ],
        "outgoingEdges": [
          { # A directed edge representing dependency between two RDDs.
            "fromId": 42,
            "toId": 42,
          },
        ],
        "rootCluster": { # A grouping of nodes representing higher level constructs (stage, job etc.).
          "childClusters": [
            # Object with schema name: RddOperationCluster
          ],
          "childNodes": [
            { # A node in the RDD operation graph. Corresponds to a single RDD.
              "barrier": True or False,
              "cached": True or False,
              "callsite": "A String",
              "name": "A String",
              "nodeId": 42,
              "outputDeterministicLevel": "A String",
            },
          ],
          "name": "A String",
          "rddClusterId": "A String",
        },
        "stageId": "A String",
      },
      "rddStorageInfo": { # Overall data about RDD storage.
        "dataDistribution": [
          { # Details about RDD usage.
            "address": "A String",
            "diskUsed": "A String",
            "memoryRemaining": "A String",
            "memoryUsed": "A String",
            "offHeapMemoryRemaining": "A String",
            "offHeapMemoryUsed": "A String",
            "onHeapMemoryRemaining": "A String",
            "onHeapMemoryUsed": "A String",
          },
        ],
        "diskUsed": "A String",
        "memoryUsed": "A String",
        "name": "A String",
        "numCachedPartitions": 42,
        "numPartitions": 42,
        "partitions": [
          { # Information about RDD partitions.
            "blockName": "A String",
            "diskUsed": "A String",
            "executors": [
              "A String",
            ],
            "memoryUsed": "A String",
            "storageLevel": "A String",
          },
        ],
        "rddStorageId": 42,
        "storageLevel": "A String",
      },
      "resourceProfileInfo": { # Resource profile that contains information about all the resources required by executors and tasks.
        "executorResources": {
          "a_key": { # Resources used per executor used by the application.
            "amount": "A String",
            "discoveryScript": "A String",
            "resourceName": "A String",
            "vendor": "A String",
          },
        },
        "resourceProfileId": 42,
        "taskResources": {
          "a_key": { # Resources used per task created by the application.
            "amount": 3.14,
            "resourceName": "A String",
          },
        },
      },
      "sparkPlanGraph": { # A graph used for storing information of an executionPlan of DataFrame.
        "edges": [
          { # Represents a directed edge in the spark plan tree from child to parent.
            "fromId": "A String",
            "toId": "A String",
          },
        ],
        "executionId": "A String",
        "nodes": [
          { # Wrapper used to represent either a node or a cluster.
            "cluster": { # Represents a tree of spark plan.
              "desc": "A String",
              "metrics": [
                { # Metrics related to SQL execution.
                  "accumulatorId": "A String",
                  "metricType": "A String",
                  "name": "A String",
                },
              ],
              "name": "A String",
              "nodes": [
                # Object with schema name: SparkPlanGraphNodeWrapper
              ],
              "sparkPlanGraphClusterId": "A String",
            },
            "node": { # Represents a node in the spark plan tree.
              "desc": "A String",
              "metrics": [
                { # Metrics related to SQL execution.
                  "accumulatorId": "A String",
                  "metricType": "A String",
                  "name": "A String",
                },
              ],
              "name": "A String",
              "sparkPlanGraphNodeId": "A String",
            },
          },
        ],
      },
      "speculationStageSummary": { # Details of the speculation task when speculative execution is enabled.
        "numActiveTasks": 42,
        "numCompletedTasks": 42,
        "numFailedTasks": 42,
        "numKilledTasks": 42,
        "numTasks": 42,
        "stageAttemptId": 42,
        "stageId": "A String",
      },
      "sqlExecutionUiData": { # SQL Execution Data
        "completionTime": "A String",
        "description": "A String",
        "details": "A String",
        "errorMessage": "A String",
        "executionId": "A String",
        "jobs": {
          "a_key": "A String",
        },
        "metricValues": {
          "a_key": "A String",
        },
        "metricValuesIsNull": True or False,
        "metrics": [
          { # Metrics related to SQL execution.
            "accumulatorId": "A String",
            "metricType": "A String",
            "name": "A String",
          },
        ],
        "modifiedConfigs": {
          "a_key": "A String",
        },
        "physicalPlanDescription": "A String",
        "rootExecutionId": "A String",
        "stages": [
          "A String",
        ],
        "submissionTime": "A String",
      },
      "stageData": { # Data corresponding to a stage.
        "accumulatorUpdates": [
          {
            "accumullableInfoId": "A String",
            "name": "A String",
            "update": "A String",
            "value": "A String",
          },
        ],
        "completionTime": "A String",
        "description": "A String",
        "details": "A String",
        "executorMetricsDistributions": {
          "diskBytesSpilled": [
            3.14,
          ],
          "failedTasks": [
            3.14,
          ],
          "inputBytes": [
            3.14,
          ],
          "inputRecords": [
            3.14,
          ],
          "killedTasks": [
            3.14,
          ],
          "memoryBytesSpilled": [
            3.14,
          ],
          "outputBytes": [
            3.14,
          ],
          "outputRecords": [
            3.14,
          ],
          "peakMemoryMetrics": {
            "executorMetrics": [
              {
                "metrics": {
                  "a_key": "A String",
                },
              },
            ],
            "quantiles": [
              3.14,
            ],
          },
          "quantiles": [
            3.14,
          ],
          "shuffleRead": [
            3.14,
          ],
          "shuffleReadRecords": [
            3.14,
          ],
          "shuffleWrite": [
            3.14,
          ],
          "shuffleWriteRecords": [
            3.14,
          ],
          "succeededTasks": [
            3.14,
          ],
          "taskTimeMillis": [
            3.14,
          ],
        },
        "executorSummary": {
          "a_key": { # Executor resources consumed by a stage.
            "diskBytesSpilled": "A String",
            "executorId": "A String",
            "failedTasks": 42,
            "inputBytes": "A String",
            "inputRecords": "A String",
            "isExcludedForStage": True or False,
            "killedTasks": 42,
            "memoryBytesSpilled": "A String",
            "outputBytes": "A String",
            "outputRecords": "A String",
            "peakMemoryMetrics": {
              "metrics": {
                "a_key": "A String",
              },
            },
            "shuffleRead": "A String",
            "shuffleReadRecords": "A String",
            "shuffleWrite": "A String",
            "shuffleWriteRecords": "A String",
            "stageAttemptId": 42,
            "stageId": "A String",
            "succeededTasks": 42,
            "taskTimeMillis": "A String",
          },
        },
        "failureReason": "A String",
        "firstTaskLaunchedTime": "A String",
        "isShufflePushEnabled": True or False,
        "jobIds": [
          "A String",
        ],
        "killedTasksSummary": {
          "a_key": 42,
        },
        "locality": {
          "a_key": "A String",
        },
        "name": "A String",
        "numActiveTasks": 42,
        "numCompleteTasks": 42,
        "numCompletedIndices": 42,
        "numFailedTasks": 42,
        "numKilledTasks": 42,
        "numTasks": 42,
        "parentStageIds": [
          "A String",
        ],
        "peakExecutorMetrics": {
          "metrics": {
            "a_key": "A String",
          },
        },
        "rddIds": [
          "A String",
        ],
        "resourceProfileId": 42,
        "schedulingPool": "A String",
        "shuffleMergersCount": 42,
        "speculationSummary": { # Details of the speculation task when speculative execution is enabled.
          "numActiveTasks": 42,
          "numCompletedTasks": 42,
          "numFailedTasks": 42,
          "numKilledTasks": 42,
          "numTasks": 42,
          "stageAttemptId": 42,
          "stageId": "A String",
        },
        "stageAttemptId": 42,
        "stageId": "A String",
        "stageMetrics": { # Stage Level Aggregated Metrics
          "diskBytesSpilled": "A String",
          "executorCpuTimeNanos": "A String",
          "executorDeserializeCpuTimeNanos": "A String",
          "executorDeserializeTimeMillis": "A String",
          "executorRunTimeMillis": "A String",
          "jvmGcTimeMillis": "A String",
          "memoryBytesSpilled": "A String",
          "peakExecutionMemoryBytes": "A String",
          "resultSerializationTimeMillis": "A String",
          "resultSize": "A String",
          "stageInputMetrics": { # Metrics about the input read by the stage.
            "bytesRead": "A String",
            "recordsRead": "A String",
          },
          "stageOutputMetrics": { # Metrics about the output written by the stage.
            "bytesWritten": "A String",
            "recordsWritten": "A String",
          },
          "stageShuffleReadMetrics": { # Shuffle data read for the stage.
            "bytesRead": "A String",
            "fetchWaitTimeMillis": "A String",
            "localBlocksFetched": "A String",
            "localBytesRead": "A String",
            "recordsRead": "A String",
            "remoteBlocksFetched": "A String",
            "remoteBytesRead": "A String",
            "remoteBytesReadToDisk": "A String",
            "remoteReqsDuration": "A String",
            "stageShufflePushReadMetrics": {
              "corruptMergedBlockChunks": "A String",
              "localMergedBlocksFetched": "A String",
              "localMergedBytesRead": "A String",
              "localMergedChunksFetched": "A String",
              "mergedFetchFallbackCount": "A String",
              "remoteMergedBlocksFetched": "A String",
              "remoteMergedBytesRead": "A String",
              "remoteMergedChunksFetched": "A String",
              "remoteMergedReqsDuration": "A String",
            },
          },
          "stageShuffleWriteMetrics": { # Shuffle data written for the stage.
            "bytesWritten": "A String",
            "recordsWritten": "A String",
            "writeTimeNanos": "A String",
          },
        },
        "status": "A String",
        "submissionTime": "A String",
        "taskQuantileMetrics": { # Summary metrics fields. These are included in response only if present in summary_metrics_mask field in request
          "diskBytesSpilled": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "durationMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "executorCpuTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "executorDeserializeCpuTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "executorDeserializeTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "executorRunTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "gettingResultTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "inputMetrics": {
            "bytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "recordsRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
          },
          "jvmGcTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "memoryBytesSpilled": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "outputMetrics": {
            "bytesWritten": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "recordsWritten": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
          },
          "peakExecutionMemoryBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "resultSerializationTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "resultSize": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "schedulerDelayMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
            "count": "A String",
            "maximum": "A String",
            "minimum": "A String",
            "percentile25": "A String",
            "percentile50": "A String",
            "percentile75": "A String",
            "sum": "A String",
          },
          "shuffleReadMetrics": {
            "fetchWaitTimeMillis": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "localBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "readBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "readRecords": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteBytesReadToDisk": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "remoteReqsDuration": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "shufflePushReadMetrics": {
              "corruptMergedBlockChunks": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
              "localMergedBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
              "localMergedBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
              "localMergedChunksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
              "mergedFetchFallbackCount": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
              "remoteMergedBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
              "remoteMergedBytesRead": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
              "remoteMergedChunksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
              "remoteMergedReqsDuration": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
                "count": "A String",
                "maximum": "A String",
                "minimum": "A String",
                "percentile25": "A String",
                "percentile50": "A String",
                "percentile75": "A String",
                "sum": "A String",
              },
            },
            "totalBlocksFetched": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
          },
          "shuffleWriteMetrics": {
            "writeBytes": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "writeRecords": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
            "writeTimeNanos": { # Quantile metrics data related to Tasks. Units can be seconds, bytes, milliseconds, etc depending on the message type.
              "count": "A String",
              "maximum": "A String",
              "minimum": "A String",
              "percentile25": "A String",
              "percentile50": "A String",
              "percentile75": "A String",
              "sum": "A String",
            },
          },
        },
        "tasks": {
          "a_key": { # Data corresponding to tasks created by spark.
            "accumulatorUpdates": [
              {
                "accumullableInfoId": "A String",
                "name": "A String",
                "update": "A String",
                "value": "A String",
              },
            ],
            "attempt": 42,
            "durationMillis": "A String",
            "errorMessage": "A String",
            "executorId": "A String",
            "executorLogs": {
              "a_key": "A String",
            },
            "gettingResultTimeMillis": "A String",
            "hasMetrics": True or False,
            "host": "A String",
            "index": 42,
            "launchTime": "A String",
            "partitionId": 42,
            "resultFetchStart": "A String",
            "schedulerDelayMillis": "A String",
            "speculative": True or False,
            "stageAttemptId": 42,
            "stageId": "A String",
            "status": "A String",
            "taskId": "A String",
            "taskLocality": "A String",
            "taskMetrics": { # Executor Task Metrics
              "diskBytesSpilled": "A String",
              "executorCpuTimeNanos": "A String",
              "executorDeserializeCpuTimeNanos": "A String",
              "executorDeserializeTimeMillis": "A String",
              "executorRunTimeMillis": "A String",
              "inputMetrics": { # Metrics about the input data read by the task.
                "bytesRead": "A String",
                "recordsRead": "A String",
              },
              "jvmGcTimeMillis": "A String",
              "memoryBytesSpilled": "A String",
              "outputMetrics": { # Metrics about the data written by the task.
                "bytesWritten": "A String",
                "recordsWritten": "A String",
              },
              "peakExecutionMemoryBytes": "A String",
              "resultSerializationTimeMillis": "A String",
              "resultSize": "A String",
              "shuffleReadMetrics": { # Shuffle data read by the task.
                "fetchWaitTimeMillis": "A String",
                "localBlocksFetched": "A String",
                "localBytesRead": "A String",
                "recordsRead": "A String",
                "remoteBlocksFetched": "A String",
                "remoteBytesRead": "A String",
                "remoteBytesReadToDisk": "A String",
                "remoteReqsDuration": "A String",
                "shufflePushReadMetrics": {
                  "corruptMergedBlockChunks": "A String",
                  "localMergedBlocksFetched": "A String",
                  "localMergedBytesRead": "A String",
                  "localMergedChunksFetched": "A String",
                  "mergedFetchFallbackCount": "A String",
                  "remoteMergedBlocksFetched": "A String",
                  "remoteMergedBytesRead": "A String",
                  "remoteMergedChunksFetched": "A String",
                  "remoteMergedReqsDuration": "A String",
                },
              },
              "shuffleWriteMetrics": { # Shuffle data written by task.
                "bytesWritten": "A String",
                "recordsWritten": "A String",
                "writeTimeNanos": "A String",
              },
            },
          },
        },
      },
      "streamBlockData": { # Stream Block Data.
        "deserialized": True or False,
        "diskSize": "A String",
        "executorId": "A String",
        "hostPort": "A String",
        "memSize": "A String",
        "name": "A String",
        "storageLevel": "A String",
        "useDisk": True or False,
        "useMemory": True or False,
      },
      "streamingQueryData": { # Streaming
        "endTimestamp": "A String",
        "exception": "A String",
        "isActive": True or False,
        "name": "A String",
        "runId": "A String",
        "startTimestamp": "A String",
        "streamingQueryId": "A String",
      },
      "streamingQueryProgress": {
        "batchDuration": "A String",
        "batchId": "A String",
        "durationMillis": {
          "a_key": "A String",
        },
        "eventTime": {
          "a_key": "A String",
        },
        "name": "A String",
        "observedMetrics": {
          "a_key": "A String",
        },
        "runId": "A String",
        "sink": {
          "description": "A String",
          "metrics": {
            "a_key": "A String",
          },
          "numOutputRows": "A String",
        },
        "sources": [
          {
            "description": "A String",
            "endOffset": "A String",
            "inputRowsPerSecond": 3.14,
            "latestOffset": "A String",
            "metrics": {
              "a_key": "A String",
            },
            "numInputRows": "A String",
            "processedRowsPerSecond": 3.14,
            "startOffset": "A String",
          },
        ],
        "stateOperators": [
          {
            "allRemovalsTimeMs": "A String",
            "allUpdatesTimeMs": "A String",
            "commitTimeMs": "A String",
            "customMetrics": {
              "a_key": "A String",
            },
            "memoryUsedBytes": "A String",
            "numRowsDroppedByWatermark": "A String",
            "numRowsRemoved": "A String",
            "numRowsTotal": "A String",
            "numRowsUpdated": "A String",
            "numShufflePartitions": "A String",
            "numStateStoreInstances": "A String",
            "operatorName": "A String",
          },
        ],
        "streamingQueryProgressId": "A String",
        "timestamp": "A String",
      },
      "taskData": { # Data corresponding to tasks created by spark.
        "accumulatorUpdates": [
          {
            "accumullableInfoId": "A String",
            "name": "A String",
            "update": "A String",
            "value": "A String",
          },
        ],
        "attempt": 42,
        "durationMillis": "A String",
        "errorMessage": "A String",
        "executorId": "A String",
        "executorLogs": {
          "a_key": "A String",
        },
        "gettingResultTimeMillis": "A String",
        "hasMetrics": True or False,
        "host": "A String",
        "index": 42,
        "launchTime": "A String",
        "partitionId": 42,
        "resultFetchStart": "A String",
        "schedulerDelayMillis": "A String",
        "speculative": True or False,
        "stageAttemptId": 42,
        "stageId": "A String",
        "status": "A String",
        "taskId": "A String",
        "taskLocality": "A String",
        "taskMetrics": { # Executor Task Metrics
          "diskBytesSpilled": "A String",
          "executorCpuTimeNanos": "A String",
          "executorDeserializeCpuTimeNanos": "A String",
          "executorDeserializeTimeMillis": "A String",
          "executorRunTimeMillis": "A String",
          "inputMetrics": { # Metrics about the input data read by the task.
            "bytesRead": "A String",
            "recordsRead": "A String",
          },
          "jvmGcTimeMillis": "A String",
          "memoryBytesSpilled": "A String",
          "outputMetrics": { # Metrics about the data written by the task.
            "bytesWritten": "A String",
            "recordsWritten": "A String",
          },
          "peakExecutionMemoryBytes": "A String",
          "resultSerializationTimeMillis": "A String",
          "resultSize": "A String",
          "shuffleReadMetrics": { # Shuffle data read by the task.
            "fetchWaitTimeMillis": "A String",
            "localBlocksFetched": "A String",
            "localBytesRead": "A String",
            "recordsRead": "A String",
            "remoteBlocksFetched": "A String",
            "remoteBytesRead": "A String",
            "remoteBytesReadToDisk": "A String",
            "remoteReqsDuration": "A String",
            "shufflePushReadMetrics": {
              "corruptMergedBlockChunks": "A String",
              "localMergedBlocksFetched": "A String",
              "localMergedBytesRead": "A String",
              "localMergedChunksFetched": "A String",
              "mergedFetchFallbackCount": "A String",
              "remoteMergedBlocksFetched": "A String",
              "remoteMergedBytesRead": "A String",
              "remoteMergedChunksFetched": "A String",
              "remoteMergedReqsDuration": "A String",
            },
          },
          "shuffleWriteMetrics": { # Shuffle data written by task.
            "bytesWritten": "A String",
            "recordsWritten": "A String",
            "writeTimeNanos": "A String",
          },
        },
      },
    },
  ],
}

  x__xgafv: string, V1 error format.
    Allowed values
      1 - v1 error format
      2 - v2 error format

Returns:
  An object of the form:

    { # Response returned as an acknowledgement of receipt of data.
}
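This is an ingestion endpoint: it is normally called by the Dataproc data plane itself rather than by end users. For completeness, a hypothetical minimal call would look like the sketch below; PARENT is a placeholder for the parent resource reference described in the request schema, and the empty sparkWrapperObjects list is a placeholder only:

    body = {
        "parent": PARENT,           # parent resource reference (see request schema above)
        "sparkWrapperObjects": [],  # spark application context objects to ingest
    }
    ack = spark_apps.write(name=name, body=body).execute()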