The service accepts Non-XML traffic, attaches the binary input to a dummy SOAP document, and thereby creates the SwA (SOAP with Attachments) file needed for further processing.
The binary input of Non-XML transformations needs to be "consumed"; otherwise the default behavior is that it gets copied to the output. Attaching the input to the dummy SOAP file does not consume the input.
Therefore stylesheet "bin.xsl" needs to invoke a <dp:input-mapping> whose output is just discarded. In the posting above a separate FFD file is provided and used. Because its output is not used at all, bin.xsl's line
<dp:input-mapping href="bin.ffd" type="ffd"/>
can be simplified to refer to an FFD file that is officially shipped as part of the DataPower product:
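A sketch of that simplified line, assuming store:///pkcs7-convert-input.ffd (a raw-binary FFD present in the firmware's store: folder) is a suitable candidate; since the mapping's output is discarded anyway, any FFD that accepts arbitrary binary input will do:

```xml
<dp:input-mapping href="store:///pkcs7-convert-input.ffd" type="ffd"/>
```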
From my initial XML-->JSONX-->JSON transformation rdp87.2.xsl:
I created an "all-in-one" stylesheet for doing the XML-->JSON transformation, rdp87.3.xsl:
Now I created two XML Firewall services on a box with a firmware level containing the (performance) fix for APAR IC90781 from the May 2013 fixpack.
Both services used an XML Manager with "Profile Rule" in XML Manager's Compile Options policy.
Finally I did:
clear stylesheet cache for XML manager "profile"
send 1000 requests to the service by:
$ time for((f=1;f<=1000;++f)); do curl --data-binary @rdp87.1000.xml http://dp7-l3:port ; echo; done
switch to "Status->XML Processing->Stylesheet Profiles" in WebGUI and take screenshot
Here is the Stylesheet Profiles view for XML->JSONX->JSON:
And here is the Stylesheet Profiles view for XML->JSON:
Yes, the total time spent in the templates for rdp87.2.xsl (1961ms) is a factor of 2.66 greater than that of rdp87.3.xsl (735ms).
But that only matters if the whole operation of a service is just the XML->JSON conversion.
Typically, in real-life services other operations like AAA, SLM actions, or others will happen as well.
If this is the case, then the (small) runtime overhead added is outweighed by the maintainability and simplicity of the stylesheet, in my eyes.
<UPDATE> rdp87.2a.xsl reduces runtime by 15% compared to rdp87.2.xsl by making use of <xsl:import> and <xsl:apply-templates> instead of dp:transform().
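A minimal sketch of the <xsl:import>/<xsl:apply-templates> approach; the assumption here is that the JSONX->JSON step reuses the jsonx2json.xsl stylesheet shipped in store:, and that exsl:node-set() is available to turn the intermediate result-tree fragment into a node-set:

```xml
<!-- Sketch: import the shipped JSONX->JSON stylesheet and apply its
     templates directly instead of a dp:transform() round-trip. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:exsl="http://exslt.org/common"
    extension-element-prefixes="exsl">
  <xsl:import href="store:///jsonx2json.xsl"/>
  <xsl:output method="text"/>
  <xsl:template match="/">
    <xsl:variable name="jsonx">
      <!-- ... XML -> JSONX conversion of the input document ... -->
    </xsl:variable>
    <!-- the imported templates perform JSONX -> JSON -->
    <xsl:apply-templates select="exsl:node-set($jsonx)/*"/>
  </xsl:template>
</xsl:stylesheet>
```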
backup-b4.zip is a backup of the 4 domains "coproc2", "demo", "empty" (domain created, no config added) and "zip2html".
"demo" domain contains files of different types which your browser's application/zip helper should be able to correctly classify.
zip2html.zip (428.1 KB) is the XML FW service export of zip2html tool.
Fixed links above and below that got broken by developerWorks DataPower forum migration last month.
And this is a new export that can be used on all appliances including XB6 and XM70. zip2html-mpgw.zip (492.9 KB) is the MPGW service export of the zip2html tool.
Use this URL in your web browser: http://yourBox:2228/form
zip2html.xsl (22.0 KB) is the stylesheet discussed in detail under "Technical discussion" at the bottom.
With this tool even a full device backup is processed in seconds when "files=none" is selected.
Be careful processing big backups with option "files=all" as that might take a few minutes to complete.
Just having the "default" domain as part of a backup takes time when "files=all" is selected -- all files in the "store:///" folder belong to the default domain and get added.
HTML form input
This is what the HTML form looks like.
It allows you to select the backup (or export) from your filesystem.
Then you have to select whether you want the files to be included in the generated HTML page as "data" links. This allows inspecting the files with the "application/zip" helper application registered with your browser.
And you have to select whether the attached or the onboard clixform.xsl should be used.
The clixform.xsl converts the "export.xml"s in the .zip archive to a sequence of CLI commands (Command Line Interface) during the initial phase of a DataPower import. This output is what you will see in the yellow background sections.
Find more explanations on which clixform.xsl will be used by DataPower by clicking on the screenshot or the real HTML form opened in your browser.
You open the HTML form with the "/form" endpoint of your DataPower service, e.g.: http://dp3-l3:2229/form
HTTP GET to installed service
You can retrieve the HTML form with correct "post" link to the DataPower box by a simple HTTP GET.
You can then use the stored form.html instead of the HTML form from DataPower box itself.
$ curl -s http://dp3-l3:2229/form >form.html
HTTP POST to installed service
Selection of the clixform mode is done by the endpoint, the files selection is done in the query part of the URL:
As above, the selection of the clixform mode is done by the endpoint and the files selection is done in the query part of the URL.
But this time stylesheet clixform.xsl is sent together with the .zip archive to the Non-XML endpoint of coproc2 service (find details here).
Here you can see the difference in generated output for a DataPower backup (left) and a DataPower export (right):
For backups the "domains" section contains the list of domains contained in the backup.
Behind each line the "go" link allows you to quickly jump to the start of that domain's information.
This is what the domain output looks like.
Above the table you find the domain name as well as a link to the top of the document (e.g. for selecting another domain via a "go" link).
The table contains a configuration and a files section.
You can easily jump between both via the "files" and "configuration" links.
The "(no)" before "files" indicates that this output was generated with the "files=none" selection.
For "files=all" it looks like this: "(all)"
If your browser allows you to open "data:..." links, you can inspect the files by clicking on the "all" link.
Your browser's "application/zip" helper application will then open the base64-encoded .zip archive behind the "all" link.
With the description above you have everything you need to use the zip2html tool.
If you are interested in some of the technical details of zip2html.xsl, you are in the right place.
Stylesheet zip2html first creates a dummy context named "swa".
It then attaches the received backup.zip under Content-ID "sys" (cid:sys).
The next step is the extraction of export.xml from the top-level directory of the archive.
Then the general information HTML table at the top gets created.
Now the iteration over the domains found in export.xml is done.
Each domain is contained as a .zip in the top-level directory of the received archive.
E.g., domain "dom1" will be read from cid:sys and attached as cid:dom1.zip.
The attaching allows reading export.xml from dom1.zip as well as accessing
the files information for the domain by querying the attachment manifest.
In addition the files can be extracted and copied over to a newly created archive when "files=all" is selected.
Finally the domain attachment gets removed by dp:strip-attachments before the next domain is processed.
There is a directory dp-aux in the received archive (attachment cid:sys).
Above you see the inclusion hierarchy of the files in that directory.
Using clixform.xsl in dp:transform(_,_) requires a URL as the reference to it.
"attachment://" is the protocol, "swa" is the context, "Archive=zip" says we have a zip-archive (DataPower has support for tar, too).
Finally "Filename=" points to a file in the archive, allowing to access all files at all directory levels.
The output of dp:transform() applying clixform.xsl to export.xml is used to generate the CLI configuration output of each domain (with the yellow background color).
See the use of "localZip" if you need to create an empty archive yourself.
For each domain a new empty zip-archive gets attached to receive all files selected by "files=all" as "cid:aux".
Finally this zip-archive is read to generate the "data:application/zip;base64,..." links by just using dp:binary-decode().
At the bottom of stylesheet zip2html.xsl you find a section allowing to process "multipart/form-data" directly in the stylesheet.
The swaform tool or func:filename() is needed in case you have to process binary file uploads; the convert-http action cannot deal with those.
zip2html is realized as a Non-XML loopback XML firewall with "Process Messages Whose Body Is Empty" set to "on" on the Advanced tab.
The endpoint "/form" is for an HTTP GET request.
The endpoints "/attached", "/onboard" and "/mime" are for HTTP POST requests.
Now comes a tricky part: the dp:input-mapping needed to process the POST request input data requires data on its input context.
But there is no input data for the "/form" GET request, which uses the same stylesheet.
On the right you see the simple solution I found for this mixed GET/POST processing problem in Non-XML input processing stylesheets:
Have two rules, one for "/form" endpoint, and another for everything else.
In "/form" rule have input context NULL for "Transform Binary" action.
In the other rule have input context INPUT -- that's it.
A customer wanted to know how to store files (retrieved by a scheduled rule) locally on the DataPower box.
The solution consists of:
* a scheduled rule
* dp:url-open to retrieve the file
* base64 encoding it (either its serialization for XML, or the binaryNode response for Non-XML)
* the set-file XML management operation to store the file
* accessing the XML management interface from within a stylesheet to execute the request
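The set-file request itself is a standard SOMA call; a minimal sketch (domain, file name, and payload are illustrative -- the element content is the base64-encoded file data):

```xml
<env:Envelope xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
  <env:Body>
    <man:request xmlns:man="http://www.datapower.com/schemas/management"
                 domain="default">
      <!-- content is the base64 encoding of the retrieved file -->
      <man:set-file name="temporary:///retrieved.bin">dGVzdA==</man:set-file>
    </man:request>
  </env:Body>
</env:Envelope>
```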
This posting is a complete walk-through of the steps needed to issue calls against the XML management interface from within DataPower stylesheets.
As a precondition we have already enabled the XML Management interface on dp3-l3:5550. Also we have created user "soma" in group "developer" with password "secret", not restricted to a domain.
We make use of the dp:soap-call() extension function. Because the XML Management interface is normally accessed over SSL, we have to provide an SSL Proxy Profile in that call. Therefore steps A-D show how to create an SSL Proxy Profile from scratch via the WebGUI.
A) Administration-->Miscellaneous-->Crypto Tools, "Generate Key" (this is just an example)
enter CN "xslt"
click "Generate Key"
Confirm generation of key
This results in the following message:
Action completed successfully.
Generated private key on the HSM and exported a copy in "temporary:///xslt-privkey.pem"
Generated Certificate Signing Request in "temporary:///xslt.csr"
Generated Self-Signed Certificate in "cert:///xslt-sscert.pem" and exported a copy in "temporary:///xslt-sscert.pem"
Generated a Crypto Key object named "xslt" and a Crypto Certificate object named "xslt"
B) Objects-->Crypto Configuration-->Crypto Identification Credentials
set "Name" to "xslt-cred"
select "xslt" as "Crypto Key"
select "xslt" as "Certificate"
C) Objects-->Crypto Configuration-->Crypto Profile
set "Name" to "xslt-profile"
select "xslt-cred" as "Identification Credentials"
D) Objects-->Crypto Configuration-->SSL Proxy Profile
set "Name" to "xslt-ssl"
select "xslt-profile" for "Forward (Client) Crypto Profile"
Now we are prepared for the stylesheet work.
This is the syntax of dp:soap-call(): dp:soap-call(URL, message, SSL-Proxy-Profile, flags, SOAP-action, HTTP-headers, process-faults, timeout)
We take the default for process-faults (false()) and no timeout. The remaining parameters are:
URL: since the stylesheet is executed on the box, this is just 'https://127.0.0.1:5550/service/mgmt/current'.
message: the SOMA call we pass in -- in the example here the "FirmwareVersion" SOMA call.
SSL-Proxy-Profile: we pass "xslt-ssl", which is what we created in the steps above.
flags: a "0" is passed, as described in the documentation.
SOAP-action: not needed; passing the empty string suppresses the addition of this header.
Last we have to pass the HTTP headers
application/soap+xml as Content-Type
the basicAuth string with user name and password for Authorization
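The basicAuth string is just "Basic " followed by the base64 encoding of "user:password". For the user created above, the value can be checked on any shell with the standard base64 tool:

```shell
# Authorization header value for user "soma", password "secret"
printf 'soma:secret' | base64
```

In the stylesheet this value would be built with something like concat('Basic ', dp:encode('soma:secret', 'base-64')) -- the exact extension-function name and encoding token are as in the DataPower documentation.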
So find here the complete stylesheet (it is attached to this developerWorks posting):
There is only one way of writing files to the DataPower filesystem that does not go through an XML Management set-file request:
<dp:dump-nodes> allows storing a nodeSet under a specified filename in the "temporary:" folder.
This has one limitation: you cannot store arbitrary (binary) files that way.
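A minimal sketch of its use; per the extension-element documentation both attributes are XPath expressions, hence the inner quotes around the file name (name and node-set variable are illustrative):

```xml
<!-- store the nodes selected by $nodes under temporary:///dump.xml -->
<dp:dump-nodes file="'dump.xml'" nodes="$nodes"/>
```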
Adding this functionality (write binary files) to dump-nodes was what RFE 71699 was about.
The customer knew that storing base64-encoded binary data is possible, but wanted to be able to write raw binary data.
OK, an RFE (Request For Enhancement) asks to add new functionality to a next major release. RFE 71699 was rejected because writing binary data to the temporary: folder is already possible with today's firmware. Yes, you need the latest 7.2.0.x firmware (for dp:gatewayscript() and GatewayScript's readAsXML()), but 7.2.0 is definitely available earlier than a next major release. The RFE developer update describes in short how to do it with v7.2.0 firmware; this blog posting will give you the details.
OK, first we have to write (binary) data to a file in the "temporary:" folder with GatewayScript. This can be done via the "fs" (filesystem) API by writing e.g. a Buffer. The tricky part is how an XSLT can pass a binaryNode to GatewayScript. A binaryNode is handled in DataPower as "special" XML. Therefore you have to read the input via readAsXML(); the nodelist returned will contain just a single element for a binaryNode passed from XSLT, and that can easily be ".toBuffer()"ed. So this is the complete GatewayScript fs.write.js (click for download):
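In case the download link no longer works, the core of fs.write.js looks roughly like this -- a sketch reconstructed from the description above; the target file name is illustrative and the exact session/fs API details are as in the GatewayScript documentation:

```javascript
// fs.write.js (sketch): write the binaryNode passed from XSLT to temporary:
var fs = require('fs');

session.input.readAsXML(function (error, xml) {
  if (error) { session.reject('readAsXML failed'); return; }
  // the nodelist contains a single element wrapping the binary payload
  var buf = xml.item(0).toBuffer();
  fs.writeFile('temporary:///test.dat', buf, function (err) {
    if (err) { session.reject('write failed'); }
  });
});
```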
Now we only need to pass a binary node as "input" in the dp:gatewayscript() call in XSLT; stylesheet fs.write.xsl (click for download) shows that there is no magic at all [you have to store "fs.write.js" above in the "local:" folder].
Here you can see the coproc2 call that executes the XSLT and passes Non-XML data (0x03 is not a valid XML character):
$ coproc2 fs.write.xsl <(echo -en 'te\x3t') http://dp6-l3:2224
$
And here we see in DataPower CLI what is going on:
xi50(config)# dir temporary:
  File Name      Last Modified              Size
  ---------      -------------              ----
  log/           Jan 27, 2016 1:24:05 PM    4096
  datapowerjs/   Jan 11, 2016 11:56:03 AM   4096
  export/        Jan 11, 2016 11:56:03 AM   4096
  dpmon/         Jan 28, 2016 5:16:00 AM    4096
  ftp-response/  Jan 11, 2016 11:56:03 AM   4096
  image/         Jan 7, 2016 8:37:04 AM     4096
221792.9 MB available to temporary:
xi50(config)# show file temporary:test.dat
% Unable to display 'temporary:test.dat' - file is not printable
xi50(config)# dir temporary:
  File Name      Last Modified              Size
  ---------      -------------              ----
  log/           Jan 27, 2016 1:24:05 PM    4096
  datapowerjs/   Jan 11, 2016 11:56:03 AM   4096
  test.dat       Jan 28, 2016 5:25:01 AM    4
  export/        Jan 11, 2016 11:56:03 AM   4096
  dpmon/         Jan 28, 2016 5:16:00 AM    4096
  ftp-response/  Jan 11, 2016 11:56:03 AM   4096
  image/         Jan 7, 2016 8:37:04 AM     4096
Processing JSON in XSLT has been possible for years, either "as string" or via conversion to JSONX.
XQuery (with JSONiq extensions) can natively process XML as well as JSON.
GatewayScript can natively process JSON, and with 7.2.0 or later firmware natively XML as well.
The customer was on an earlier firmware level, and therefore GatewayScript was no option (it requires 7.2.0 or later).
The solution I provided used XSLT to do the backend calls via <dp:url-open> and store the results in context variables, and then an XQuery script to nicely combine the responses in XQuery (see the above forum posting for details):
Here is pbmtobraille online to try out (without the need to install anything), especially "echo -e ... | pbmtext | pnmcrop | pbmtobraille". Here is an input form to try out, with a short listing of "echo -e" options: https://stamm-wilbrandt.de/echo-e.to.braille.html