Troubleshooting
Find solutions to common issues that you might encounter as you work with webMethods Integration.
Projects
- Sharing projects with other users
- This feature is not available.
- Unable to create a project
- Either you are assigned a custom role or you do not have permission to create projects. A user with a custom role cannot create, update, or delete a project.
- Storing project files
- By default, project files are stored in an internal Git server that is managed by IBM®. For tenants enabled with Develop Anywhere, Deploy Anywhere capability, you can also store them in GitHub.
- Storing Develop Anywhere, Deploy Anywhere project files in another external Git provider
- You can store your assets only in GitHub.
- Using an existing GitHub repository that already contains other files to store the Develop Anywhere, Deploy Anywhere projects
- Always associate your projects with an empty GitHub repository.
- A personal access token has expired for the GitHub account where the assets are stored
- Go to the account alias in Settings > Version control and edit the GitHub account with the expired token.
- Setting up Git packages or project name
- The repository name must be in the format RepoNameProject, where RepoName is the name of the repository and must start with an uppercase letter. The repository name must be the same as the name that you are going to use for your new project.
- Access permissions that are provided to a user for a project
- Currently, project access permissions are not supported for the Develop Anywhere, Deploy Anywhere capability. Only administrators can restrict project access for users. Hence, users who do not have access to a project cannot add packages to or remove packages from that project.
Workflows
- Workflow gets timed out after a certain period
- The default timeout for both sync and async workflows is 3 minutes. The maximum timeout is 6 minutes for sync workflows and 90 minutes for async workflows.
- Submit workflows to a recipe
- See Using recipes to learn how to do it.
- Delete existing triggers and accounts
- You can delete accounts that are created under a specific project by going to the Configurations > Workflow > Connections page. You can also delete the triggers set up under a specific project by going to the Configurations > Workflow > Triggers page.
- Find a connector, action, or trigger
- If a specific connector, action, or trigger is not available, you can create it yourself by using either the Creating Custom Actions with Node.js block (for creating actions) or the Connector Builder - webMethods CloudStreams Connectors (for creating connectors, actions, and triggers).
- Restore a deleted workflow
- Deleted workflows cannot be restored. You can export business-critical workflows to your local computer and then import them later in case you accidentally delete a workflow from your tenant.
- SMTP connection works in flow services but fails in workflows
- When creating an SMTP connection in a workflow, on the Add Account window: for a secure connection, set the Port field to 465 and the Secure field to true; for a nonsecure connection, set the Port field to 25 or 587 and the Secure field to false.
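The port/secure pairings above can be sketched as a small helper. This is an illustrative sketch, not a platform API; the smtpSettings function is hypothetical, and the field names simply mirror the Port and Secure fields on the Add Account window.

```javascript
// Sketch of the SMTP port/secure pairings described above.
// smtpSettings is a hypothetical helper, not part of webMethods Integration.
function smtpSettings(secure) {
  // Secure (implicit TLS) connections use port 465;
  // nonsecure connections typically use port 25 or 587.
  return secure
    ? { port: 465, secure: true }
    : { port: 587, secure: false }; // port 25 is also valid for nonsecure
}

console.log(smtpSettings(true));  // { port: 465, secure: true }
console.log(smtpSettings(false)); // { port: 587, secure: false }
```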
- Setting a custom context ID for a workflow
- In a workflow, drag the Set Context ID action that is listed in the Connectors window onto the canvas. Then double-click the Set Context ID icon. A window appears where you can optionally change the name of the action. Click Next. The Set Context ID configuration window appears. Configure the action by defining a unique identifier for the workflow, and then click Next.
Flow services
- When to use Flow services
- Use Flow services to create complex integrations that require advanced data transformation and custom logic implementation. You can build a Flow service by adding steps and selecting constructs, including Connectors, Controls, Flow services, and Services, from within the steps. The Flow services editor is visually enhanced for ease of use and is interactive. A Flow service has two main parts: the Flow service signature and the Flow service steps. Although workflows and Flow services help you accomplish the same goal, there are significant differences between the two features.
- Creating a Flow service
- See Working with flow services for information on how to create a Flow service.
- Using smart mapping
- Smart mapping provides you with recommendations while mapping the pipeline data and uses a set of algorithms to determine the likely accuracy of a suggestion, allowing you to switch between different levels of mapping confidence. A machine learning (ML) algorithm is applied to provide the suggestions. The ML algorithm learns from the mappings that you create and provides suggestions automatically to map similar fields. For more information, see Using smart mapping.
- Using conditional constructs
- Conditional steps perform different actions based on the result of evaluated conditions or expressions. Types of conditional constructs include If, Else, ElseIf, Switch, Branch, and so on.
- Using transformers
- Transformers accomplish multiple data transformations on the pipeline data in a single step, as compared to using normal steps one after another. Transformers are services that are inserted into and run within a Transform Pipeline step, acting as a collection of normal steps embedded in a single transform pipeline step.
- Triggering the execution of a Flow service from an external system
- You can trigger the execution of a Flow service from an external system, for example, a REST client, apart from manual and scheduled executions from the user interface. For more information, see Working with flow services.
- Debugging a Flow service
- See Debug flow services for information on how to debug a Flow service.
- Copying text into a Flow service from an external source, in the Firefox browser
- Open a new Firefox browser window, type about:config in the address bar and press Enter. In the search bar, enter dom.events.testing.asyncClipboard to locate the specific configuration setting. Double-click the dom.events.testing.asyncClipboard entry to modify its value and set the value to true. Close the about:config tab and return to the Flow service interface.
- SFTP connection setup fails in a Flow service with a host public key error
- The host public keys from Azure are available at the following link: https://learn.microsoft.com/en-us/azure/storage/blobs/secure-file-transfer-protocol-host-keys. These keys come in various types, including ecdsa-sha2-nistp384, ecdsa-sha2-nistp256, rsa-sha2-512, and rsa-sha2-256. However, while creating the SFTP connection within a Flow service, none of these keys appear in the Preferred Key Exchange Algorithms section. As a workaround, use the ecdsa-sha2-nistp256 host key that is provided by Azure and modify the key type to ssh-rsa:
ssh-rsa AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBmVDE0INhtrKI83oB4r8eU1tXq7bRbzrtIhZdkgiy3lrsvNTEzsEExtWae2uy8zFHdkpyTbBlcUYCZEtNr9w3U=
Save the modified key to a file and use it as a custom host public key.
Connectors and developer tools
- List of supported connectors and information on the connection configuration fields
- You can find the list of predefined connectors that webMethods Integration supports under the Connectors section. For details about the connection configuration fields, see Account configuration fields.
- Maximum amount of data that can be stored in a single object in Flow Store, Memory Store, and Account Store
- A single object in the Flow Store, Memory Store, or Account Store can hold up to 16 MB of data.
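One practical consequence of the 16 MB limit is that it pays to measure the serialized size of an object before storing it. The following sketch shows one way to do that; fitsInStore is a hypothetical helper and not a webMethods Integration API.

```javascript
// Sketch: guarding a store write against the 16 MB per-object limit.
// fitsInStore is a hypothetical helper, not a platform API.
const MAX_OBJECT_BYTES = 16 * 1024 * 1024; // 16 MB per single object

function fitsInStore(value) {
  // Serialize the object the way a JSON-based store would, then count bytes.
  const bytes = Buffer.byteLength(JSON.stringify(value), 'utf8');
  return bytes <= MAX_OBJECT_BYTES;
}

console.log(fitsInStore({ id: 1, note: 'small payload' })); // true
```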
- Calling one workflow from another workflow
- You can call one workflow from another workflow by using the Run Flow action.
- Building own actions
- You can build your own actions for a particular service by using the Creating Custom Actions with Node.js block or the webMethods CLI Connector Builder.
- SOAP connector shows invalid hostname error despite IP allow listing
- This issue occurs when the WSDL URL is associated with a private host that webMethods Integration cannot access from the AWS (Amazon Web Services) development environment. To address this issue, expose the WSDL through a public IP address or host to help ensure accessibility from AWS. You can then use the Test action window to check whether the action works as expected before running the workflow.
- File stream that is sent through HTTP is reported as corrupted in Azure Blob Storage
- webMethods Integration supports two distinct integration platforms that are named Workflow and Flow service, and file streaming between these platforms is not supported. Your current integration uses the SFTP action, which is part of the workflow, and an HTTP operation, which is part of the Flow service. This incompatibility is the reason that you are encountering the problem. Use the SFTP and HTTP operations within the Flow service to resolve this issue.
- 403 Forbidden error observed when passing a folder path with / delimiter as the folder separator for the Amazon S3 connector
- The 403 Forbidden error occurs when a folder path that uses the / delimiter as the folder separator is passed to the Amazon S3 connector. This issue arises from how the S3 application handles double slashes in the resource path. When a folder path begins with /, the resulting resource URI contains double slashes, which is not allowed and leads to the 403 Forbidden error. This behavior aligns with the RFC specification (RFC 3986 Section 3.3) and cannot be handled generically. Avoid a leading / as the folder separator in the input, or follow the guidelines provided in the documentation, to help ensure compatibility with the Amazon S3 connector.
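The double-slash problem above can be avoided by normalizing the folder path before passing it to the connector. This is a minimal sketch; normalizeS3Folder is a hypothetical helper, not part of the Amazon S3 connector.

```javascript
// Sketch: normalizing a folder path so the resulting S3 resource URI
// never contains "//". normalizeS3Folder is a hypothetical helper.
function normalizeS3Folder(folder) {
  return folder
    .replace(/^\/+/, '')      // drop leading slashes ("/inbox" -> "inbox")
    .replace(/\/{2,}/g, '/'); // collapse accidental double slashes
}

console.log(normalizeS3Folder('/inbox/reports')); // "inbox/reports"
console.log(normalizeS3Folder('inbox//reports')); // "inbox/reports"
```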
- Changing the GUID for a custom connector
- Replace the GUID key in your configuration with the correct value found in the index.json file. Log in with the owner credentials and run the command wmio deploy to update the existing connector with the new GUID values. If the GUID for the actions is not updated initially, redeploy the connector.
- File that is created by the Write File action in a workflow
- The file that is created by the Write File action is temporary. It exists only during the workflow execution and is automatically deleted after the execution is complete.
- Delete File action
- A separate Delete File action is not available, as the temporary file that is created is automatically deleted at the end of the workflow execution.
- Updating an existing file in a workflow
- There is no dedicated Update File action. Instead, use the File Append action to add content to an existing file.
- File storage in workflows
- The container space that is assigned to the workflow determines the storage capacity for file actions.
- Retrieving the temporary file created during the workflow execution
- After the workflow execution is complete, the temporary file is deleted and cannot be retrieved.
- Read file action fails with a File not found error after a workflow that has a Write file action resumes
- When a workflow with a Write file action runs successfully, but an action in between fails and is resumed, the subsequent Read file action might fail with a File not found error. The Read file action fails because the file path generated by the Write file action becomes invalid after the workflow is resumed from the failure point. The file that is created during the initial execution is no longer available when the workflow resumes, as the file path might not be preserved or accessible after resumption. This behavior is by design, as file-related actions must be completed within the same execution cycle to help ensure that the file path remains valid for the Read file action.
APIs
- Invalid XML error when you import a WSDL to create a SOAP API
- To address this issue, help ensure that the WSDL file you are using contains the XML declaration at the beginning. The XML declaration must be in the following format: <?xml version="1.0" encoding="UTF-8"?>
- Starting a REST API by using the internal URL in webMethods API Gateway
- Develop a REST or SOAP API. Use the Internal URL under API Endpoints or obtain the Swagger JSON or YAML for the created or designated REST API. Click Create Alias to generate an alias in webMethods API Gateway. Provide an alias name, and then select the Technical information option. In the Default value field, paste the Internal URL. Next, save the alias. Create an API in webMethods API Gateway by using the downloaded Swagger JSON or YAML. Go to the Policies menu, click Edit, and modify the Routing rule. Edit the Endpoint URI to use the newly established alias to direct the traffic. Next, click Transport under the Policies menu and enable the HTTP and HTTPS protocols. Click Save, click Activate, and then confirm the activation action. Go to the API details menu, scroll down to the Gateway endpoints section, and copy the endpoint to start the API. Use the webMethods API Gateway endpoint to start the API from a REST client.
Triggers
- Editing an existing trigger
- You can edit an existing trigger either from the workflow where it is used or from the project’s Configurations > Workflow > Triggers tab.
- Adding conditions in triggers
- While configuring a trigger, you can add conditions in triggers by using the Filters option that is available at the Trigger Output Data window. For more information, see Adding multiple condition blocks.
- Triggers taking a longer time to run
- Some triggers constantly check for updates in the external services and run in real time, while others poll periodically, for example, every five minutes, resulting in slower response times. The clock icon next to the trigger service identifies the second type, known as polling triggers.
- Updating trigger settings for one workflow without affecting others that use the same trigger
- If a trigger is used in multiple workflows, any changes that are made to it are reflected in all those workflows. To avoid affecting other workflows, create a new trigger instead.
- Deleting an existing trigger
- You can delete an existing trigger from a particular workflow in which it is used, by using the Workflow Settings window. If you want to delete the trigger permanently from the project, go to Configurations > Workflow > Triggers. For more information, see Managing triggers.
Conditions
- Use the AND operators to set up a condition
- Use the AND operator when you want the workflow to proceed only if all the specified conditions are met. For more information, see Adding multiple condition blocks.
- Using the OR operator to set up a condition
- You can use the OR operator when you want the workflow execution to proceed if any of the specified conditions is met. For more information, see Adding multiple condition blocks.
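The AND and OR condition blocks described above can be modeled as arrays of predicates, where AND requires every condition to hold and OR requires at least one. This is an illustrative sketch; allMet and anyMet are hypothetical helpers, not platform APIs.

```javascript
// Sketch: AND vs OR condition blocks, modeled as arrays of predicates.
const conditions = [
  (order) => order.total > 100,     // condition 1 (illustrative)
  (order) => order.region === 'EU', // condition 2 (illustrative)
];

const allMet = (conds, input) => conds.every((c) => c(input)); // AND operator
const anyMet = (conds, input) => conds.some((c) => c(input));  // OR operator

const order = { total: 250, region: 'US' };
console.log(allMet(conditions, order)); // false: the region check fails
console.log(anyMet(conditions, order)); // true: the total check passes
```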
- Defining multiple conditions that work like the If-Else statement
- You can use the Switch action to define multiple possible execution paths for your workflow. With the Switch action, you can add one or more cases. Each case contains one or more conditions and a next step that will be performed if the defined conditions are met. The action also includes a Default case that specifies the action to be run if none of the conditions are met.
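The Switch action's case and Default semantics resemble a first-match rule over a list of cases. The sketch below models that behavior as plain data; evaluateSwitch and the step names are illustrative, not part of webMethods Integration.

```javascript
// Sketch: Switch-action semantics. Each case has a condition and a next
// step; the first matching case wins, and the Default case runs when no
// condition is met. All names are illustrative.
function evaluateSwitch(cases, defaultStep, input) {
  for (const c of cases) {
    if (c.condition(input)) return c.nextStep;
  }
  return defaultStep; // the Default case
}

const cases = [
  { condition: (n) => n > 100, nextStep: 'sendAlert' },
  { condition: (n) => n > 10, nextStep: 'logWarning' },
];

console.log(evaluateSwitch(cases, 'ignore', 500)); // "sendAlert"
console.log(evaluateSwitch(cases, 'ignore', 5));   // "ignore" (Default case)
```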
Deploy anywhere flow services
- Calling a deploy anywhere flow service from a flow service
- You can call deploy anywhere flow services from a workflow. However, you can call either a flow service or a deploy anywhere flow service at a time, not both together.
- Calling a flow service or a workflow from a deploy anywhere flow service
- It is not possible to call a flow service or workflow from a deploy anywhere flow service. However, you can invoke them if they are available as APIs.
- Sharing deploy anywhere flow service assets across projects
- You can share deploy anywhere flow service assets across projects, if you are in the same tenant. Further, if the deploy anywhere flow services use imported package services or database connectors, then these assets can be shared. However, the database connector account must be configured in the target project.
- Invoking a deploy anywhere flow service on a different integration runtime
- You can invoke a deploy anywhere flow service on a different integration runtime. However, if the deploy anywhere flow service uses a database connector, then the account must be configured in the target runtime.
- Exporting and importing deploy anywhere flow service assets across projects
- You can export and import deploy anywhere flow service assets across projects if you are in the same tenant.
- Publishing and deploying projects, which contain deploy anywhere flow service assets
- You cannot publish and deploy projects that contain deploy anywhere flow service assets.
- Calling deploy anywhere flow services from other deploy anywhere flow services
- You can call deploy anywhere flow services from other deploy anywhere flow services within the same project and not across projects.
- Selecting the self-managed runtime dynamically at run time
- You must select the self-managed runtime at design time; it cannot be selected dynamically at run time. However, you can run a deploy anywhere flow service on multiple self-managed runtimes.
- Setting the self-managed runtime when cloning a deploy anywhere flow service
- When you clone a deploy anywhere flow service, the self-managed runtime that is linked to it does not get cloned. It is reset to the cloud runtime. Set the self-managed runtime and save the deploy anywhere flow service.
Packages
- Related packages are removed when a self-managed runtime is no longer assigned to any deploy anywhere flow services
- No options are available to remove packages from a project.
- Naming packages
- There are no restrictions on package names.
- Availability of on-premises Designer features in the Cloud
- Yes, the self-managed runtime is a fully functional Microservices Runtime (Integration Server).
- Modifying a deploy anywhere flow service in the Cloud, such as pipeline changes
- If you import those services as a package from Git, they function the same way in the Cloud.
- Add a variable to a document when mapping the input of a transformer
- This capability is not available in webMethods Integration.
- Overwrite pipeline value
- Yes, by using the Override pipeline value field in the pipeline's Set Value dialog box.
- Perform global variable substitution
- Currently supported for flow services only and not deploy anywhere flow services.
- Copy condition entered manually in Designer, and in the Cloud by using the user interface
- Yes, by using the Enable condition during execution field in the pipeline.
- Setting array indices
- Yes, you can map array indices.
- Sharing a package between projects
- The same package can be imported into multiple projects, but because all projects currently reside in a single design time environment, they reference the same underlying package. This means that you cannot have projects that reference different versions of the same package.
- Linking multiple packages to a single Git repository
- You can link a single package to a single Git repository.
Audit logs
- Availability of the Audit log window
- The Audit log window is available only to the Admin and Owner of the project. If the tenant admin or owner assigns you a custom role, you cannot see the audit logs. Logs are kept in the cloud. If cloud or internet connectivity is unavailable, logs are retrieved from the self-managed runtime after connectivity is restored.
Security
- OPTIONS method in webMethods Integration
- The OPTIONS method is open for webMethods Integration as it is required to run the site. For a REST API implementation, the application requires the OPTIONS method to be available for Ajax calls to work.
- Uploading customer certificates
- webMethods Integration accepts customer certificates signed only by a trusted root CA. All other certificates are rejected.
- CORS handling
- webMethods Integration currently does not have a CORS handling mechanism. You can use webMethods API Gateway to process your requests.
- Resolving security compliance issues when starting a service that uses basic authentication through webMethods API Gateway
- For webMethods Integration service authentication, you can use certificates instead of basic authentication. If you are processing native requests, you must also configure these certificates on webMethods API Gateway.