Scaling the native Spark engine
Scaling lets you increase or decrease the compute capacity of the native Spark engine. Scale up the engine when your Spark workloads need more compute capacity, and scale down when you want to release surplus capacity.
Scaling up the Spark engine incurs additional charges, so be sure to scale down any unused compute capacity.
To scale the engine, use one of the following methods:
- Scaling an engine in list view
- Scaling an engine in topology view
Scaling an engine in list view
- Click the overflow menu icon at the end of the row for the engine that you want to scale, and click Scale.
- In the Scale engine window, enter the number of worker nodes in the Worker nodes field. To scale up, increase the number of worker nodes; to scale down, reduce it.
- Click Scale.
Scaling an engine in topology view
- Hover over the engine that you want to scale and click the Scale icon.
- In the Scale engine window, enter the number of worker nodes in the Worker nodes field. To scale up, increase the number of worker nodes; to scale down, reduce it.
- Click Scale.
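
If you prefer to automate scaling instead of using the console, the same operation can often be scripted. The following is a minimal, hypothetical sketch only: the endpoint path, payload shape, engine ID, and bearer-token auth are assumptions for illustration, not the documented API. Check the API reference for your deployment before adapting it.

```python
# Hypothetical sketch: endpoint, payload, and auth are assumptions,
# not the documented API. Verify against your deployment's API reference.
import requests

API_BASE = "https://<your-host>/api/v1"   # assumed base URL (placeholder)
ENGINE_ID = "spark-engine-01"             # assumed engine identifier
TOKEN = "<bearer-token>"                  # assumed bearer-token auth


def scale_engine(worker_nodes: int) -> None:
    """Request a new worker-node count for the engine (hypothetical endpoint)."""
    resp = requests.patch(
        f"{API_BASE}/spark_engines/{ENGINE_ID}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"engine_details": {"worker_nodes": worker_nodes}},
        timeout=30,
    )
    resp.raise_for_status()


# Scale up to 5 worker nodes; pass a smaller number to scale down.
scale_engine(5)
```

As in the console, the only input is the target number of worker nodes; the service reconciles the engine to that count.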