Getting started with Spark applications
To get started with Spark applications:
- Provision an Analytics Engine powered by Apache Spark instance. You need the Administrator role in the analytics project or in the deployment space to provision an instance. See Provisioning an instance.
- Manage the instance. You need the Administrator role in the analytics project to manage the resource quota and user access.
- Generate an access token to use the Spark jobs API. See Generating an access token.
- Choose how to persist your Spark application job files. See Persisting Spark applications.
- Run your Spark application job. See Submitting Spark jobs.
- View the job status. See Viewing Spark job status.
- View job logs. See Accessing Spark job driver logs.
- You can also run Spark applications interactively. See Running Spark applications interactively.
- Debug your applications using the Spark history server. See Accessing the Spark history server.
- Access data from storage volumes in your Spark application by using the IBM Cloud Pak for Data volume API. See Accessing data from storage.
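The submission and token steps above center on the Spark jobs API. The sketch below shows the general shape of a job-submission request: a bearer token in the `Authorization` header and a JSON body naming the application file and its arguments. The host, instance ID, URL path, and payload field names here are illustrative assumptions, not the documented schema — see Submitting Spark jobs for the exact endpoint and body for your release.

```python
import json

# Assumptions: replace with your Cloud Pak for Data host, your Spark
# instance ID, and the token from "Generating an access token".
CPD_HOST = "https://cpd.example.com"      # hypothetical cluster URL
INSTANCE_ID = "my-spark-instance-id"      # hypothetical instance ID
ACCESS_TOKEN = "<access-token>"           # from the token-generation step

def build_job_request(app_path, app_args=None):
    """Build the URL, headers, and JSON body for a Spark job submission.

    The payload shape (an "application_details" object holding the
    application file and its arguments) is a sketch for illustration,
    not the exact documented schema.
    """
    url = f"{CPD_HOST}/v4/analytics_engines/{INSTANCE_ID}/spark_applications"
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    }
    body = {
        "application_details": {
            "application": app_path,
            "arguments": app_args or [],
        }
    }
    return url, headers, json.dumps(body)

url, headers, body = build_job_request(
    "/myapp/wordcount.py", ["/data/input.txt"]
)
# POST `body` to `url` with `headers` using your HTTP client of choice,
# e.g. requests.post(url, headers=headers, data=body)
```

The application path would point at wherever you chose to persist your job files (see Persisting Spark applications).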
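After submission, the "View the job status" step is usually a poll loop: fetch the state until the job reaches a terminal state, then go read the driver logs if it failed. A minimal, client-agnostic sketch — the state names are assumptions for illustration, and `get_status` stands in for whatever call you make to the job-status endpoint (see Viewing Spark job status for the states your instance actually reports):

```python
import time

# Assumed terminal states -- check the documented job states for your release.
TERMINAL_STATES = {"finished", "failed", "stopped"}

def wait_for_job(get_status, poll_seconds=10, timeout_seconds=600,
                 sleep=time.sleep):
    """Poll get_status() until the job reaches a terminal state.

    get_status: a callable returning the job's current state string,
    e.g. a function that GETs the job-status endpoint with your token.
    Returns the final state, or raises TimeoutError.
    """
    waited = 0
    while waited <= timeout_seconds:
        state = get_status().lower()
        if state in TERMINAL_STATES:
            return state
        sleep(poll_seconds)
        waited += poll_seconds
    raise TimeoutError(f"job still '{state}' after {timeout_seconds}s")
```

Passing the status fetcher in as a callable keeps the loop testable and independent of any particular HTTP client; in practice `get_status` would GET the job's status URL with the same bearer token used for submission.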