
PI93605: NEW FUNCTION SPARK 2.2.0.2


APAR status

  • Closed as new function.

Error description

  • New Function Spark 2.2.0.2
    

Local fix

  • Adjusting the permissions of
    spark-configuration-checker-2.2.0.jar in the directory
    $SPARK_HOME/jars to match the other files will resolve this
    issue. Command: chmod 644 spark-configuration-checker-2.2.0.jar
    
    Adjusting the permissions of
    spark-configuration-checker.sh in the directory
    $SPARK_HOME/sbin to match the other files will resolve this
    issue. Command: chmod 755 spark-configuration-checker.sh
    

Problem summary

  • ****************************************************************
    * USERS AFFECTED:                                              *
    * Users of IBM Open Data Analytics for z/OS - IzODA Spark -    *
    * Release 110                                                  *
    ****************************************************************
    * PROBLEM DESCRIPTION:                                         *
    * Pyspark Refused Connection fix                               *
    *  - Fix the connection refused issue for PySpark              *
    * New Directories Permissions - Adjust the permission for new  *
    * Spark directories to follow umask settings for others        *
    * Environment Verification                                     *
    *  - Verify and fail early if Spark daemon environment is not  *
    * set up properly                                              *
    * Started Task Support - Started task support for Spark master *
    * and worker                                                   *
    * Spark Configuration Checker - New configuration checker to   *
    * aid with Spark configuration                                 *
    ****************************************************************
    * RECOMMENDATION:                                              *
    ****************************************************************
    PySpark fix:
    
    A java.net.ConnectException: EDC8128I Connection refused.
    (errno2=0x76630291) (Connection refused)
    error may appear when PySpark is invoked. This occurs because,
    when the PySpark daemon is launched, it writes its port number
    to STDOUT as raw bytes. The port number, however, is treated as
    ASCII encoded, converted to EBCDIC, and subsequently to UTF-8.
    This inserts a prefix of C2x or C3x before any byte greater
    than 7Fx, making the port number invalid. The fix converts the
    port number back to ASCII when a C2x or C3x is detected.
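    The decoding step can be sketched in Python. This is only an
    illustrative reconstruction of the problem and its remedy, not
    the shipped fix (which resides in module AZKSP220); the helper
    name decode_port and the latin-1 round-trip are assumptions of
    the sketch.

```python
import struct

def decode_port(raw: bytes) -> int:
    # Hypothetical helper, not IzODA code. Bytes above 7Fx arrive
    # UTF-8-expanded with a C2x/C3x prefix byte; decoding as UTF-8
    # and re-encoding each code point as latin-1 restores the raw
    # bytes before the big-endian port number is unpacked.
    if b"\xc2" in raw or b"\xc3" in raw:
        raw = raw.decode("utf-8").encode("latin-1")
    return struct.unpack(">i", raw)[0]

# Port 33000 (0000 80E8x) gains C2x/C3x prefixes in transit:
corrupted = b"\x00\x00\x80\xe8".decode("latin-1").encode("utf-8")
print(decode_port(corrupted))  # restores 33000
```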
    
    
    
    Umask fix:
    
    Prior to this APAR, new Spark directories were created with
    permission 775. This APAR changes the permission for new
    directories to 77x, where x is determined by the umask.
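    The new permission rule can be illustrated with a short Python
    sketch; new_dir_mode is an illustrative helper, not IzODA code.
    Owner and group keep rwx (the "77"), and the others digit is
    whatever the process umask allows.

```python
def new_dir_mode(umask: int) -> int:
    # Illustrative helper, not the shipped code: owner and group
    # keep rwx; only the "others" digit is masked by the umask,
    # matching the post-APAR 77x behaviour.
    return 0o770 | (0o007 & ~umask)

print(oct(new_dir_mode(0o022)))  # 0o775 -- same as the old default
print(oct(new_dir_mode(0o027)))  # 0o770 -- others get no access
```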
    
    
    
    Env verification:
     With this new feature, the Spark daemons (master and worker)
    verify parts of the environment setup during initialization. If
    the environment is not set up properly, the daemon terminates
    rather than failing later during application execution; the
    reason for the termination is recorded in the daemon's log.
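    A fail-early check of this kind can be sketched as follows. The
    variable names and the exact checks are assumptions made for
    illustration; the APAR does not document which settings the
    daemons actually verify.

```python
import os
import sys

# Illustrative subset of settings a daemon might verify; the actual
# checks performed by the Spark daemons are not listed in this APAR.
REQUIRED_VARS = ("JAVA_HOME", "SPARK_HOME")

def verify_env(environ=os.environ) -> None:
    # Terminate at startup when the environment is incomplete,
    # mirroring the fail-early behaviour; the message would land in
    # the daemon's log rather than surfacing as a later app failure.
    missing = [v for v in REQUIRED_VARS if not environ.get(v)]
    if missing:
        sys.exit("environment not set up: missing " + ", ".join(missing))

# Passes silently when every required setting is present:
verify_env({"JAVA_HOME": "/usr/java", "SPARK_HOME": "/usr/spark"})
```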
    
    
    
    Spark Configuration checker:
     This APAR introduces the Spark configuration checker tool. The
    tool verifies selected Spark settings, reports any errors and
    warnings it finds, and suggests possible solutions.
    
    
    
    Started tasks:
     Adds support for starting the Spark daemons (master and worker)
    as started tasks.
    

Problem conclusion

  • PySpark fix:
    This fix changes how the port number is read to avoid Connection
    Refused error when using PySpark.
    
    
    Umask fix:
     This fix adjusts the permissions for new Spark directories to
    follow the umask settings for others.
    
    Both fixes are contained in module AZKSP220.
    

Temporary fix

Comments

  • The following publication is updated:
    
    1. IBM Open Data Analytics for z/OS Installation and
    Customization Guide (SC27-9033)
    
    Refer to the latest (March 2018) IBM Open Data Analytics for
    z/OS Installation and Customization Guide to see the
    documentation updates.
    **** PE18/03/23 FIX IN ERROR. SEE APAR PI95680 FOR DESCRIPTION
    

APAR Information

  • APAR number

    PI93605

  • Reported component name

    Z/OS SPARK

  • Reported component ID

    5655AAB01

  • Reported release

    120

  • Status

    CLOSED UR1

  • PE

    NoPE

  • HIPER

    NoHIPER

  • Special Attention

    YesSpecatt / New Function / Xsystem

  • Submitted date

    2018-02-09

  • Closed date

    2018-03-07

  • Last modified date

    2018-04-03

  • APAR is sysrouted FROM one or more of the following:

  • APAR is sysrouted TO one or more of the following:

    UI54316

Modules/Macros

  • AZKSPUPX AZKSP220
    

Publications Referenced
SC279033    

Fix information

  • Fixed component name

    Z/OS SPARK

  • Fixed component ID

    5655AAB01

Applicable component levels

  • R120 PSY UI54316

       UP18/03/14 P F803  

Fix is available

  • Select the PTF appropriate for your component level. You will be required to sign in. Distribution on physical media is not available in all countries.

