Security and deployment best practices for InfoSphere Information Server Pack for SAP applications, Part 2: Deployment

Part 2 in this article series discusses how to effectively manage the different components required by IBM® InfoSphere® Information Server Pack for SAP applications, and how to distribute them across the SAP and Information Server landscape. It explains the various deployment components and strategies, and the advantages and disadvantages of using each.

Christian Gaege (gaege@de.ibm.com), Software Engineer, IBM

Christian Gaege is a software engineer working in the Information Platform and Solutions team. He started at IBM in 2008 in the IBM Silicon Valley Labs in the United States and is currently based in the IBM Research and Development Lab in Germany. Christian is developing applications and tools for the integration of SAP into the IBM Information Server product portfolio. He holds a master's degree in computer science from Furtwangen University in Germany.



Martin Oberhofer, Senior IT Architect, IBM

Martin Oberhofer works as Senior IT Architect in the area of Enterprise Information Architecture with large clients world-wide. He helps customers to define their Enterprise Information Strategy and Architecture, solving information-intense business problems. His areas of expertise include master data management based on an SOA, data warehousing, information integration, and database technologies. He especially likes to work with enterprises running SAP applications, solving the SAP-specific information management challenges. Martin provides, in a lab advocate role, expert advice for Information Management to large IBM clients. He started his career at IBM in the IBM Silicon Valley Labs in the United States at the beginning of 2002 and is currently based in the IBM Research and Development Lab in Germany. Martin co-authored the books Enterprise Master Data Management: An SOA Approach to Managing Core Information and The Art of Enterprise Information Architecture: A Systems-Based Approach for Unlocking Business Insight, as well as numerous research articles and developerWorks articles. As inventor, he contributed to over 30 patent applications for IBM. Martin is also an Open Group Master Certified IT Architect and holds a master's degree in mathematics from the University of Constance, Germany.



Nicole Schoen (nrausch@de.ibm.com), Software Engineer, IBM

Nicole Schoen is a software engineer working in the Information Platform and Solutions team at the IBM Research and Development Lab in Boeblingen. She started working for IBM as a developer for OMEGAMON XE for DB2 Performance Expert on z/OS in 2006. Now she is developing applications and tools for the integration of SAP into the IBM Information Server product portfolio.



16 February 2012


Introduction

This article introduces deployment best practices for the IBM InfoSphere Information Server Pack for SAP Applications V7 (hereafter referred to as SAP Packs). The first article in this series (see the Resources section) gave a general introduction to the SAP Packs and provided details on the security concept and the SAP-side installation steps. Building on this foundation, this second article discusses best practices for rolling out the components required by the SAP Packs.

This article begins with a general overview of different environment strategies, followed by the information that you should gather before deploying the SAP Packs across an SAP and InfoSphere Information Server landscape. It then discusses the different deployment options for the SAP Packs components that must be distributed. The last section outlines some considerations on security and on how to set up authorization profiles for the different environments.

Solution environment

IBM InfoSphere Information Server is used as an information integration platform for all kinds of data integration needs within an enterprise. The SAP Packs provide the connectivity to SAP applications in data integration projects, as well as for data migration in SAP application consolidation projects with a solution such as Ready-to-Launch for SAP. See the Resources section for more information.

The environment setup that you typically find for SAP applications consists of at least a development environment, a quality assurance environment, and a production environment. For the InfoSphere Information Server (IIS) installation, it is usually a good approach to follow the SAP environment structure, which means that for each SAP environment there should be a dedicated IIS environment of the same type.

This article does not discuss how to choose an environment strategy, including decisions such as which environments are needed and which topology to use for an InfoSphere Information Server installation. See the Resources section for further information on selecting an installation topology and designing a topology for InfoSphere Information Server.

For the deployment of the SAP Packs you need to know which environment strategy is implemented. The following overview shows the involved environments and their major use cases in the landscape.

  1. Sandbox environment:
    • Test patches
    • Test configuration changes
    • Prototyping
  2. Development environment:
    • Code development
    • Unit testing
  3. Quality Assurance (QA) environment:
    • Functional Verification Testing (FVT)
    • System Integration Testing (SIT), if no pre-production environment exists
    • Performance Testing (PT), if no pre-production environment exists
  4. Pre-production environment:
    • System Integration Test (SIT)
    • Performance Testing (PT)
  5. Production environment:
    • Production use

The following sections briefly introduce three commonly used environment strategies that you might come across.

Basic environment strategy

In this environment strategy, as shown in Figure 1, there is one environment each for development, quality assurance, and production. Developers have full access to the development environment, but no or very limited access to the quality assurance and production environments.

Figure 1. Basic environment strategy
diagram shows SAP DEV connected to IIS DEV, SAP QA connected to IIS QA, SAP PROD connected to IIS PROD

The advantages of this environment strategy are:

  • Separation of concerns between testing and production, which means that FVT, SIT, and PT do not impact the production system.
  • Ability to test configuration changes and software changes (software patches or upgrades) with the full suites of test cases before applying them to the production system. This substantially reduces the risk of impacting the production system.

This setup avoids the costs of the sandbox and pre-production environments. However, because a software upgrade, software patch, or configuration change cannot be tested before it is applied to the development environment, the development system remains exposed to the risk of unavailability, which incurs high costs if the system is used by a large number of developers. Also, there is no dedicated system available for complex SIT and PT requirements.

Basic environment strategy with a sandbox

This environment strategy is similar to the previous one, but it adds the sandbox environment shown in Figure 2. The sandbox allows you to test software patches, upgrades, and configuration changes before they reach the development environment, thus reducing the risk of impacting it. This benefit should not be underestimated if the IIS DEV system is used by a larger number of developers, possibly distributed across several geographies, which demands at least 24x5 availability of the development system. However, this comes at increased infrastructure costs because an additional environment is needed.

Figure 2. Basic environment strategy with sandbox
diagram shows SAP sandbox connected to IIS Sandbox

Advanced environment strategy

This environment strategy adds a dedicated environment for SIT and PT which is the pre-production environment shown in Figure 3. This environment is identical to the production environment from a hardware perspective regarding capacity and configuration. Any SIT and PT executed here produce results accurately predicting what will happen in the production system. This environment strategy is used for large projects with large data volumes requiring in-depth SIT and PT. This is also the most costly infrastructure setup.

Figure 3. Advanced environment strategy
Like the previous diagram, with the addition of SAP Pre-PROD connected to IIS Pre-PROD

Security and deployment considerations

To decide on the deployment strategy it is important to consider the security and code management policies of a company. Relevant questions in this regard are:

  • Which security policies are in place with regard to loading ABAP reports to your SAP landscape?
  • Which review processes are established for ABAP reports across the SAP landscape?
  • Are there naming conventions defined for ABAP reports or RFC destinations?
  • Who has privileges to create SAP-side components like RFC destinations, ports, partner profiles for the different environments?
  • Does your SAP administrator take care of setting up and distributing these SAP-side components?

Deployment strategies

Depending on the environment strategy, the different deployment artifacts must be distributed across the environments. This article uses the basic environment strategy as the basis for explaining the different deployment options for moving the components from the development to the quality assurance, and finally to the production system. The approach can easily be applied to other environment strategies: the sandbox environment can be handled similarly to the development environment, and the pre-production environment can be handled similarly to a quality assurance environment.

DataStage jobs using the SAP Packs have up to four categories of artifacts depending on the stage type, as shown in Table 1.

Table 1. Deployment artifacts by stage type
Artifact category             ABAP stage                         BAPI stage                         IDoc stages
SAP-side product components   RFC service, RFC service (CTS),    optional: authorization profiles   optional: authorization profiles
                              optional: authorization profiles
SAP-side artifacts            RFC destination                    n/a                                RFC destination, partner profile
Generated ABAP program        applicable                         n/a                                n/a
DataStage jobs                applicable                         applicable                         applicable

The artifacts fall into the following four categories, which need to be deployed and managed across the environments:

  1. SAP-side product components, such as the RFC service required for the ABAP stage.
  2. SAP-side artifacts enabling the communication between the SAP system and the DataStage jobs, such as RFC destinations or partner profiles.
  3. Generated ABAP programs.
  4. Information Server artifacts, such as DataStage jobs.

The following sections discuss alternatives for deploying these components to the different environments.

Deployment of SAP-side product components

The SAP Packs require the deployment of certain product components on your SAP systems. The product components include a set of pre-configured authorization profiles and a set of remote function modules that are required by the ABAP stage. These product components are delivered as transports, which you can import into your SAP landscape with the SAP Transport Management System. There are two options for deploying these product components to an SAP landscape:

  1. Install the delivered transports to all involved SAP systems.
  2. Install the delivered transports only to your development environment and then distribute them with the SAP Change and Transport System (CTS).

Before you distribute the components, you should consider whether you want to customize the pre-configured authorization profiles. See the section Customization of the authorization profiles for the key points to consider. You can then create a new transport request for the customized authorization profiles for further distribution.

For the first option, you can find detailed instructions in the first article, located in the Resources section. Figure 4 shows the import of the transports into the DEV, QA, and PROD SAP systems; an illustrative command-line sketch of such an import follows the figure.

Figure 4. Import transports in all involved SAP systems
SAP transports connected to SAP DEV, SAP QA, and SAP PROD
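
The import itself is typically performed through the STMS import queue in the SAP GUI. As a purely illustrative sketch (not taken from the product documentation), the following commands show how an operating-system-level import of the delivered transport files could look using the standard tp utility. The transport request number, system IDs, client number, file names, and profile path are placeholders that you would replace with the values for your own landscape.

Listing 1. Illustrative command-line import of a delivered transport (placeholders only)

    # Copy the delivered transport files into the transport directory of the target system
    # (K... is the control file, R... is the data file; file names are placeholders)
    cp K900123.DSP /usr/sap/trans/cofiles/
    cp R900123.DSP /usr/sap/trans/data/

    # Add the transport request to the import buffer of the target system (here: DEV)
    tp addtobuffer DSPK900123 DEV pf=/usr/sap/trans/bin/TP_DOMAIN_DEV.PFL

    # Import the transport request into the target client; with option 1, repeat for QA and PROD
    tp import DSPK900123 DEV client=100 pf=/usr/sap/trans/bin/TP_DOMAIN_DEV.PFL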

For the second option, follow the same instructions provided for the first option to import the product components to your development environment. From there you use the transport routes established for your SAP system landscape to distribute the SAP Packs product components as shown in Figure 5:

Figure 5. Use CTS to redistribute transports into QA and PROD systems
SAP transports connected only to SAP DEV

If you follow this approach, you have to create the packages (formerly known as development classes) ZETL and ZETL_CTS before you import the product components into your development system. When you create a package, you assign a transport layer. All objects in the package are transported according to the transport routes defined for that transport layer. Work with your SAP Basis administrator to get the correct transport layer for your environment. You can use the ABAP Workbench (transaction SE80) to create new packages. In order to transport a package to another SAP system, you need to create a new transport request in SAP transaction SE03, include all objects that you want to transport, and save the new request. Then release the transport in transaction SE01. After the transport request has been released, it can be found in the transport import queue (STMS) of the target SAP system.

Figure 6 shows an example transport request that contains an ABAP function group and two ABAP tables. The transport request is distributed along a CTS transport route that connects the DEV system with the QA system.

Figure 6. Transport request from SAP DEV to SAP QA
illustrates transport request moving from SAP DEV to SAP QA

Distribution of SAP-side components

In order to use the IDoc Load or IDoc Extract stage to load IDocs into or extract IDocs from SAP, you need to configure various settings on the SAP side. Those settings include the maintenance of logical systems, partner profiles, ports, and RFC destinations. When you distribute a DataStage job that processes IDocs within your SAP landscape, it is important that the involved partner profiles, ports, and RFC destinations are maintained on all involved SAP systems.

The ABAP stage also requires SAP-side components to be set up. However, only an RFC destination is required for the ABAP stage.

For the SAP-side components, it is important to have a clear picture of how they should be distributed across your SAP landscape. These components enable the communication between SAP and the DataStage jobs. If they are missing or misconfigured (for example, an RFC destination that is used for communication with more than one job), the result is a job failure in the best case, or unexpected behavior that requires cumbersome investigation in the worst case. Therefore, be aware of these additional components that need to be set up in every environment where the respective DataStage jobs are supposed to run, and make sure you have a process in place to distribute both the components themselves and any changes to the SAP-side setup.

SAP provides support for distributing some of the required SAP-side components. The required partner profiles and ports can be generated from a distribution model. A distribution model describes which IDoc types may be sent from a specific source system to specific target systems. The distribution model itself can be distributed by means of a transport request through STMS. Details can be found in the SAP documentation listed in the Resources section.

RFC destinations on the other hand must be set up on each system. There is no mechanism in place that helps with the distribution of RFC destinations.

However, for the ABAP stage, a function is provided that, if enabled, creates an RFC destination at runtime before each job run and cleans it up after the job finishes successfully. This way you can make sure that the required unique RFC destination is available in every environment without having to worry about distribution. This is a very convenient way for the developer to ensure that the SAP-side components are in place. Depending on the company's security policies, however, using this function might not be desirable, because the user running the DataStage job must be assigned the privileges required to create and delete RFC destinations. Furthermore, the SAP Basis administration team might want more control over which SAP-side components are created on an SAP system.

Distribution of generated ABAP programs

Data extraction with the ABAP stage requires an ABAP report to be loaded to the SAP system; this report accesses the logical SAP tables and provides the data to the DataStage job. The ABAP stage provides an ABAP code generator to create the required ABAP reports. There are various ways to make the ABAP reports available on the SAP system. In general, the ABAP stage provides two upload mechanisms at design time. One adds the ABAP reports directly to the SAP object repository. The other uses the SAP Change and Transport System (CTS), which lets you integrate with the code management that you have established for SAP by creating a transport request for the ABAP report. As an alternative, the SAP Basis administrator can also manually add the ABAP reports to the SAP system, which gives the Basis administrator additional control over the code that is added to the SAP system.

The following section discusses how you can apply these options to distribute the ABAP reports across your SAP landscape using the following approaches:

  • Distribution by means of the SAP transport management system (STMS). This is the recommended approach.
  • Manual distribution by SAP Basis Administrator. This approach might be applicable if strong security requirements exist.
  • Distribution by means of the ABAP stage upload. This approach should not be used unless exceptional conditions demand it.

Distribution by means of the SAP transport management system (STMS)

For this approach, the generated ABAP programs are uploaded to the development SAP system by means of the ABAP stage upload using CTS, as shown in Figure 7.

Figure 7. CTS Upload to DEV and redistribution to QA and PROD
shows upload to SAP DEV, QA, and PROD from IIS, and communication across SAP systems

It is recommended that you use the CTS upload for this task. If you do not use the CTS option for uploading the generated ABAP program, it is not assigned to a package and, as a result, is not connected to the SAP transport management system for further distribution. In this case, you can still assign a package to an uploaded ABAP program, but you have to do that manually in the ABAP Workbench (transaction SE80) after the upload. Furthermore, by using the CTS upload, you can integrate the ABAP report upload with your SAP-side code management, because the CTS upload creates a transport request for the uploaded ABAP program.

An ABAP report that is uploaded using CTS can be redistributed to the QA and PROD SAP systems using CTS. As shown previously in Figure 7, the design-time upload of ABAP reports is only necessary between IIS DEV and SAP DEV. From there, the ABAP reports are redistributed to QA and PROD using CTS.

The obvious advantage of this approach is that no direct upload of generated ABAP reports to QA and PROD systems is necessary. In order to upload ABAP reports using CTS, the technical SAP user needs to fulfill the following prerequisites:

  • A development key has to be assigned to the technical SAP user.
  • The Z-DS_LOAD and S_TMW_CREATE authorization profiles have to be assigned to the technical SAP user.

By revoking the development key and the Z-DS_LOAD and S_TMW_CREATE authorization profiles from the technical SAP user on the QA and PROD systems (and assigning only the Z_DS_PROFILE authorization profile), the CTS upload can be prevented there. Using this mechanism, you can enforce that generated ABAP reports may only be loaded to the DEV system and have to be redistributed to QA and PROD using CTS.

The CTS upload is the recommended approach to upload ABAP reports to SAP.

Manual distribution by the SAP Basis Administrator

In this scenario, the developer uses the ABAP code generator to create the required ABAP programs. The generated ABAP code is exported into a file and handed over to the SAP Basis administration team. After a review, the team manually loads the ABAP reports to all SAP systems.

Figure 8 shows the manual upload of generated ABAP reports by the SAP basis administrator. The dashed lines from the IIS systems to the respective SAP systems indicate a manual upload. The solid lines indicate the access to the SAP system to start the ABAP report at runtime.

Figure 8. Manual distribution by the SAP Basis Administrator
line shows upload from IIS DEV to ABAP report

In order to reduce the amount of manual work for the SAP Basis Administration team, it is also possible to do the manual import of the ABAP code only on the DEV SAP system and redistribute it to QA and PROD using CTS.

The advantages of this approach are as follows:

  • ABAP reports can be thoroughly reviewed before being loaded into the SAP system to address concerns regarding security, naming, and so on.
  • The SAP Basis administrator has control over all ABAP code that is added to the system.
  • Privileges required for any form of automatic upload are not granted in QA and PROD environments.

The disadvantage of this approach is that it requires manual work by the members of the SAP Basis administration team. If there are many jobs using the ABAP stage, this approach requires a lot of manual effort.

Distribution by means of the ABAP stage upload

While technically this is an option, it is not a recommended distribution approach. In this case, the ABAP reports are uploaded to the SAP systems by means of the ABAP stage upload. This requires the Designer Client to be installed in all environments, including testing and production, and the job designs (not just the compiled jobs) to be propagated to all environments. In addition to the fact that you have to manage many copies of the same job design, one in each environment, this setup allows the direct upload of changed or even new and possibly untested ABAP code to testing and production systems.

As shown in Figure 9, the left arrow on each system, pointing from the IIS system to the SAP system, indicates that the SAP system is accessed at job design time. The right arrow on each system, pointing from the IIS system to the SAP system, indicates that the SAP system is also accessed at job runtime.

Figure 9. Upload of generated ABAP reports to SAP
Upload of generated ABAP reports from IIS DEV to SAP DEV, from IIS QA to SAP QA, and from IIS PROD to SAP PROD

Distribution of DataStage jobs

For the distribution of Information Server artifacts, there are established best practices that are not discussed in detail in this article. For DataStage jobs, it is a best practice to install the Designer Client only in the sandbox and development environments and to propagate only the compiled DataStage jobs, not the job designs, from there to the test and production environments. The advantages of this approach are the following:

  • There is only one current version of the job design, instead of multiple copies in the different environments.
  • The DataStage Designer Client does not need to be installed in the QA and PROD environments; it is only needed in the DEV environment, which prevents subsequent changes to the DataStage job designs in those environments.
  • Each change to a DataStage job can only be made in the DEV environment, ensuring that all changes pass through the full development cycle, including testing, before being deployed to the PROD environment.

InfoSphere Information Server offers tooling for managing assets such as DataStage jobs. You can use the InfoSphere Information Server Manager and the istool commands (see the Resources section) to package DataStage jobs and deploy them across your environments. Once a package is defined, it can be rebuilt and redeployed to distribute the changes to the DataStage jobs assigned to the package. Version tags help with the management of different package versions. When configuring the package you can specify that only the job executables should be added to the package, not the job designs. This way you can make sure that only the job executables are distributed, but the job designs are kept in the development environment.
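
As a hedged illustration of this packaging approach, the following commands sketch how an istool export of job executables from a development project and an import into a QA project might look from the command line. The host names, project names, credentials, and archive path are placeholders, and the exact asset option names (such as -nodesign and -includeexecutable) can vary between Information Server releases, so verify them against the istool documentation for your version.

Listing 2. Illustrative istool export and import of compiled DataStage jobs (placeholders only)

    # Export the compiled job executables (without the job designs) from the DEV project
    istool export -domain devservices.example.com:9080 -username isadmin -password secret \
        -archive /tmp/sap_jobs.isx \
        -datastage ' -includedependent -nodesign -includeexecutable "devengine.example.com/SAP_DEV/Jobs/*/*.pjb" '

    # Import the archive into the QA project on the QA Information Server domain
    istool import -domain qaservices.example.com:9080 -username isadmin -password secret \
        -archive /tmp/sap_jobs.isx \
        -datastage '"qaengine.example.com/SAP_QA"'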

Customization of the authorization profiles

When thinking about the customization of the authorization profiles you should consider the following:

  • Which stage types do you actually want to use?
  • Which functionality do you want to exploit, such as background processing for the ABAP stage?
  • Which security policies do you have to fulfill with regard to ABAP program upload and RFC destination creation?
  • Do you want to set up different authorization profiles for different environments in your SAP landscape? For example, do you want to allow the automatic ABAP program upload in the development environment for convenience, but deny it for testing and production systems for security reasons?

Review the first article, located in the Resources section, for detailed information on which authorizations are needed for each function, stage, and phase of the DataStage job life cycle.

In general, it is recommended that you use Z-DS_LOAD (the design-time privileges) and Z_DS_PROFILE (the runtime privileges) for the sandbox and development environments, but only Z_DS_PROFILE for the testing and production environments. The design-time privileges are required in the environment where the Designer Client is set up.

Conclusion

Part 1 of this article series discussed the authorization requirements for each of the SAP Packs stages, as well as for the Rapid Modeler and Generator for SAP. Part 2 introduced deployment best practices for the IBM InfoSphere Information Server Pack for SAP Applications V7, helping you make the right choices among the available deployment strategies for the SAP Packs components and the DataStage jobs that use them.

Acknowledgement

The authors would like to thank their colleagues Albert Maier, Oliver Suhre, and Stevan Antic for their valuable feedback and comments, which improved this article.

Resources

Learn

