Found this in my GatewayScript folder, written some time ago and never posted.
I wanted to execute it with GatewayScript, but doing so directly was too easy, so I decided to execute it from DataPower XSLT via "dp:gatewayscript()".
The trick is to write to the only filesystem accessible to XSLT output without going through the XML Management Interface: DataPower XSLT can write to the "temporary:" folder via <dp:dump-nodes>.
There is another reason why doing this might not be what you want to do in production:
The Node.js "eval()" function is disabled in GatewayScript for good (security) reasons.
You can use the method below to get something like Node.js "eval()" inside DataPower (but you should not, unless you really know what you are doing).
There is only one way to write files to the DataPower filesystem without going through an XML Management set-file request:
<dp:dump-nodes> allows storing a nodeset under a specified filename in the "temporary:" folder.
A nodeset has one limitation: you cannot store arbitrary (binary) files that way.
Adding this functionality (writing binary files) to dump-nodes was what RFE 71699 was about.
The customer knew that storing base64-encoded binary data is possible, but wanted to be able to write raw binary data.
OK, an RFE (Request For Enhancement) asks for new functionality in a next major release. RFE 71699 was rejected because writing binary data to the temporary: folder is already possible with today's firmware. Yes, you need the latest 7.2.0.x firmware (for "dp:gatewayscript()" and GatewayScript "readAsXML()"), but 7.2.0 is definitely available earlier than a next major release. The RFE developer update describes in short how to do it with v7.2.0 firmware; this blog posting gives you the details.
OK, first we have to write (binary) data to a file in the "temporary:" folder with GatewayScript. This can be done via the "fs" (filesystem) API, e.g. by writing a Buffer. The tricky part is how XSLT can pass a binaryNode to GatewayScript. A binaryNode is handled in DataPower as "special" XML, and therefore you have to read the input via readAsXML(); the nodelist returned will contain just a single element for a binaryNode passed from XSLT, and that can easily be ".toBuffer()"ed. So this is the complete GatewayScript fs.write.js (click for download):
Now we only need to pass a binary node to the dp:gatewayscript() call in XSLT. Stylesheet fs.write.xsl (click for download) shows that there is no magic at all: just pass the binaryNode as "input" to dp:gatewayscript() [you have to store "fs.write.js" above in the "local:" folder].
Here you can see a coproc2 call that executes the XSLT and passes Non-XML data (0x03 is not a valid XML character):

$ coproc2 fs.write.xsl <(echo -en 'te\x3t') http://dp6-l3:2224
$
And here we see in DataPower CLI what is going on:
xi50(config)# dir temporary:
  File Name      Last Modified              Size
  ---------      -------------              ----
  log/           Jan 27, 2016 1:24:05 PM    4096
  datapowerjs/   Jan 11, 2016 11:56:03 AM   4096
  export/        Jan 11, 2016 11:56:03 AM   4096
  dpmon/         Jan 28, 2016 5:16:00 AM    4096
  ftp-response/  Jan 11, 2016 11:56:03 AM   4096
  image/         Jan 7, 2016 8:37:04 AM     4096
221792.9 MB available to temporary:
xi50(config)# show file temporary:test.dat
% Unable to display 'temporary:test.dat' - file is not printable
xi50(config)# dir temporary:
  File Name      Last Modified              Size
  ---------      -------------              ----
  log/           Jan 27, 2016 1:24:05 PM    4096
  datapowerjs/   Jan 11, 2016 11:56:03 AM   4096
  test.dat       Jan 28, 2016 5:25:01 AM    4
  export/        Jan 11, 2016 11:56:03 AM   4096
  dpmon/         Jan 28, 2016 5:16:00 AM    4096
  ftp-response/  Jan 11, 2016 11:56:03 AM   4096
  image/         Jan 7, 2016 8:37:04 AM     4096
Processing JSON in XSLT has been possible for years, either "as string" or via conversion to JSONX.
XQuery (with JSONiq extensions) can natively process XML as well as JSON.
GatewayScript can natively process JSON, and with ≥7.2.0 firmware natively XML as well.
The customer was on older firmware, and therefore GatewayScript (which requires ≥7.0.0) was no option.
The solution I provided used XSLT to do the backend calls via <dp:url-open> and store the results in context variables, and then an XQuery script to nicely combine the responses (see the above forum posting for details):
Here is pbmtobraille online to try out (without the need to install anything), especially "echo -e ... | pbmtext | pnmcrop | pbmtobraille". Here is an input form to try out, with a short listing of "echo -e" options: https://stamm-wilbrandt.de/echo-e.to.braille.html
I wanted to be able to do the same or similar [of course storing as a file and viewing via gimp or a browser is possible].
Back in 2011 I gave two WSTE webcasts on "Non-XML Data Processing in WebSphere DataPower SOA Appliances Stylesheets". The 2nd webcast shows on slide 28 how I converted a bitmap image into textual output making use of the Braille Patterns.
This is the conversion for snowman.pbm (.pbm is the portable bitmap format from the netpbm tools):
Typically only the top 2x3 dots of the 2x4 cell get used; as you can see above, I used all 2x4.
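The 2x4-cell-to-character mapping itself is mechanical. Here is a minimal sketch in JavaScript (my own helper for illustration, not pbmtobraille.c) using the Unicode Braille Patterns dot numbering:

```javascript
// Map one 2x4 pixel cell to a Braille Patterns character (U+2800..U+28FF).
// Unicode dot numbering: left column top->bottom = dots 1,2,3,7,
// right column top->bottom = dots 4,5,6,8; the bit value of dot n is 2^(n-1).
function cellToBraille(cell) { // cell[y][x], y = 0..3, x = 0..1, values 0/1
  const dot = [[1, 4], [2, 5], [3, 6], [7, 8]]; // dot number per (y, x)
  let bits = 0;
  for (let y = 0; y < 4; y++)
    for (let x = 0; x < 2; x++)
      if (cell[y][x]) bits |= 1 << (dot[y][x] - 1);
  return String.fromCharCode(0x2800 + bits);
}

console.log(cellToBraille([[1, 1], [1, 1], [1, 1], [1, 1]])); // all 8 dots set
```

Sliding a 2x4 window over the bitmap and emitting one such character per cell is all a pbmtobraille-style converter fundamentally needs.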
samples.txt[.pre.html] contains various sample outputs (shown below), which are part of pbmtobraille.c's comments too.
9x9.pbm is a really crazy parsing sample according to the .pbm spec, following this statement:
"Mr. Poskanzer cautions that programs that read this format should be as
lenient as possible, accepting anything that looks remotely like a pixmap."
This is the header section demonstrating basic use with pbmtext output, including negation of the generated output, as well as the help line describing the tool's features:
This is the top of the tail comment section, showing graphviz output done by pbmtobraille:
And finally this is the bottom section, showing a bigger layout in the vertical direction (layout=TB, Top to Bottom, is the default):
Not surprisingly the stylesheet did not compile: it was missing "version", "encoding" and other stuff required by the XSLT 1.0 spec, and did not know what "𝖛𝖊𝖗𝖘𝖎𝖔𝖓" and "𝖊𝖓𝖈𝖔𝖉𝖎𝖓𝖌" should be. So I repeated compile-and-fix until the stylesheet finally compiled again. This is the result -- vi's syntax highlighting does not like black-letter ;-)
The whole thread discussed that handcrafted FFDs are not supported and referred back to this 2012 posting on the options. It also listed the only Enhancement Request for FFD processing since 2007 (FFD PMRs were fixed, of course).
Further below I will show how easily binary data processing can be done with GatewayScript (available since 7.0.0 firmware). But before that, let's summarize all DataPower Non-XML data processing options here in one place:
Contivo FFDs: you need Contivo Analyst product,
no handcrafted FFDs are supported (you cannot raise PMRs against handcrafted FFDs)
"Binary data processing without DataGlue license!" technique, with
... <hexbin>...</hexbin> --(XSLT)-- base64 ...
... <hexbin>...</hexbin> --(XQuery)-- base64 ... (with 6.0 or higher firmware, allows for XPath 2.0)
One comment on option 4: while this works without a DataGlue license (these days an XG45 without the DIM option), you have to "pay" the price in the form of added latency and memory consumption from the attachment processing that technique needs.
The simple GatewayScript data structure for processing binary data is the Buffer object.
For reading binary input we use the readAsBuffer() method; its documentation tries to move people to use the Buffers object instead:
When contexts are small, use the readAsBuffer() function. Use the readAsBuffers() function when a context is large. The readAsBuffer() function requires a contiguous memory allocation. The readAsBuffer() function is faster but is more demanding on the memory system. The readAsBuffers() function does not require contiguous memory to populate the Buffers object.
Use of Buffers might be valid for some Non-XML processing, but when the application needs access to the whole input I prefer Buffer.
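The contiguous-vs-chunked distinction can be illustrated in plain Node.js (DataPower's Buffers object is its own API; a plain array of chunks stands in for it here): a chunked representation avoids one large contiguous allocation, but producing the single contiguous view that readAsBuffer() hands you directly requires concatenating, i.e. copying, the chunks.

```javascript
// Chunked payload, standing in for what readAsBuffers() would return.
const chunks = [Buffer.from('to be'), Buffer.from(' or'), Buffer.from(' not to be')];

// One contiguous allocation plus a copy -- what readAsBuffer() needs up front.
const whole = Buffer.concat(chunks);

console.log(whole.toString()); // "to be or not to be"
```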
The good news is that the first (workable) Non-XML sample program can be found in the readAsBuffer() documentation itself. It is a binary identity operation with error handling. Here you can see rAB.js:
Since binary identity is not that interesting, let's now see the binary reverse operation from the "... without DataGlue license" posting. Adding 5 lines to rAB.js does the job; here is reverse.js:
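(Not reverse.js itself, which uses the DataPower session APIs, but the essence of the operation sketched in plain Node.js:)

```javascript
// Read the whole input as one buffer and emit it byte-reversed.
// TypedArray reverse() works in place, so reverse a copy of the input.
function reverseBytes(buf) {
  return Buffer.from(buf).reverse();
}

const input = Buffer.from('2b|~2b - that is the question', 'latin1');
console.log(reverseBytes(input).toString('latin1'));
```

Applying the operation twice yields the original input again, which is exactly the out/out.3 round-trip shown below.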
Now let's see what both do on sample input from the "... without DataGlue license" posting:
$ od -tcx1 0in0put0
0000000  \0   2   b   |   ~   2   b       -  \0   t   h   a   t       i
         00  32  62  7c  7e  32  62  20  2d  00  74  68  61  74  20  69
0000020   s       t   h   e       q   u   e   s   t   i   o   n  \0
         73  20  74  68  65  20  71  75  65  73  74  69  6f  6e  00
0000037
$
$ coproc2 rAB.js 0in0put0 http://dp2-l3:2227 -s | od -tcx1
0000000  \0   2   b   |   ~   2   b       -  \0   t   h   a   t       i
         00  32  62  7c  7e  32  62  20  2d  00  74  68  61  74  20  69
0000020   s       t   h   e       q   u   e   s   t   i   o   n  \0
         73  20  74  68  65  20  71  75  65  73  74  69  6f  6e  00
0000037
$
$ coproc2 reverse.js 0in0put0 http://dp2-l3:2227 -s | od -tcx1
0000000  \0   n   o   i   t   s   e   u   q       e   h   t       s   i
         00  6e  6f  69  74  73  65  75  71  20  65  68  74  20  73  69
0000020       t   a   h   t  \0   -       b   2   ~   |   b   2  \0
         20  74  61  68  74  00  2d  20  62  32  7e  7c  62  32  00
0000037
$
OK, that was small input; let's process 10MB.
$ head --bytes 10M /dev/urandom > out
$ du -sb out
10485760        out
$
$ time ( coproc2 reverse.js out http://dp2-l3:2227 -s > out.2 )

real    0m1.048s
user    0m0.012s
sys     0m0.080s
$ time ( coproc2 reverse.js out.2 http://dp2-l3:2227 -s > out.3 )

real    0m1.064s
user    0m0.008s
sys     0m0.075s
$ time ( coproc2 rAB.js out.2 http://dp2-l3:2227 -s > out.4 )

real    0m0.294s
user    0m0.014s
sys     0m0.074s
$
$ diff out out.3
$ diff out.2 out.4
$
$ sha1sum out out.2
8fe128844bc9a19aac275272e243a1c4ce6adc7b  out
66590beec5c0e226ba9efc8436d119589a67e9d8  out.2
$
The last question to be answered is about the runtime of rAB.js and reverse.js on the 10MB input. That can be answered easily based on the ExtLatency logging target of the coproc2gatewayscript blog posting again:
So the reverse operation on 10MB of data (read binary data, reverse, output result to context) took (908-137)=771msec.
The binary identity operation on 10MB of data (read binary data, output input to context) took (147-134)=13msec.
<?xml version='1.0'?>
<disclaimer>
  <p>The opinions represented herein represent those of the individual and should not be interpreted as official policy endorsed by this organization.</p>
</disclaimer>
Now it is clear what should happen when opening document.xml.
$ echo "<foobar/>" | coproc2 xinclude.demo.xsl - http://dp2-l3:2223 ; echo
<?xml version="1.0" encoding="UTF-8"?>
<document xmlns:xi="http://www.w3.org/2001/XInclude">
  <p>120 Mz is adequate for an average home user.</p>
  <disclaimer>
    <p>The opinions represented herein represent those of the individual and should not be interpreted as official policy endorsed by this organization.</p>
  </disclaimer>
</document>
$
Just use "func:document(_)" instead of "document(_)"; that is all that is needed:
And this is xinclude.xsl, implementing only "Basic Inclusion" from the XInclude spec:
Some words of caution:
xinclude.xsl does not copy comments; you will have to modify it if you want to keep them
xinclude.xsl does not detect infinite xi:include loops
absolute URIs work as expected
relative URIs get interpreted relative to the storage location of "xinclude.xsl" (!)
[that is where the document() function gets executed, inside func:document()],
so you may want to copy it to your local filesystem
"/.." is not allowed for document() function in local: filesystem
DataPower has had an XSLT 1.0 processor since day 1 long ago (1999), and that provides XPath 1.0.
(There is a wealth of EXSLT as well as DataPower proprietary extension functions/elements)
With v6.0 firmware DataPower shipped an XQuery 1.0 processor as well, and that provides XPath 2.0.
Any valid XPath 2.0 statement is a valid XQuery script by definition.
And there is a HUGE number of functions and operators in XPath 2.0 (more than 250): http://www.w3.org/TR/xpath-functions/
With the new firmware released last Friday we got much new exciting stuff (I blogged on GatewayScript before).
In developerWorks DataPower forum thread "Date Conversion from Gregorain to Hijri", Asif asked how to convert dates to Hijri. Wikipedia told me that Hijri is the Islamic calendar. The External links section showed a Calendar Converter link (http://www.fourmilab.ch/documents/calendar/), and that site is pretty cool. It converts a date entered in one format to many others (Gregorian, Julian day, Julian, Hebrew, Islamic, Persian, Mayan, Bahá'í, Indian Civil, French Republican, ISO-8601, Unix time(), Excel Serial Day Number).
First let's see both in action, against the XSLT and GatewayScript coproc2 endpoints:
Since the complete work is done in the modules, gregorian_to_islamic.js is minimal:
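(As an aside, unrelated to the modules used above: today's Node.js can format Gregorian dates in the Islamic calendar natively via Intl, which makes for a quick plausibility check of any conversion result:)

```javascript
// Cross-check a Gregorian->Islamic conversion using Intl
// (requires full ICU, the Node.js default since v13).
const fmt = new Intl.DateTimeFormat('en-u-ca-islamic', {
  day: 'numeric', month: 'long', year: 'numeric'
});

// Eid al-Fitr 2015 -- expect a date in Shawwal 1436 (give or take a day,
// depending on which Islamic calendar variant ICU applies).
console.log(fmt.format(new Date(Date.UTC(2015, 6, 17))));
```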
I have been responsible for the DataPower Probe since I joined the DataPower (Compiler) team 7 years ago. I just looked at the 18 Probe-related APARs from the last 4 years and saw that I fixed half of them.
The Probe is a very useful debugging tool in DataPower development. I have answered many Probe-related questions in the developerWorks DataPower forum over the last years. Last Thursday's posting https://www.ibm.com/developerworks/community/forums/html/topic?id=a044516f-b2ee-4684-958d-d247687586e2
was the first time I have seen a request for determining whether the Probe is enabled or not INSIDE a stylesheet being executed. Ted's use case is about different amounts of logging he wants depending on whether the Probe is enabled or not. This could alternatively be solved via log level, but the question of how to determine Probe status in a stylesheet was interesting.
The status provider which helps in determining Probe status is "Stylesheet Status" [dp:variable('var://service/system/status/StylesheetStatusSimpleIndex')]. I executed a stylesheet via the coproc2 service two times, and "Stylesheet Status" shows "coproc2.xsl" needed by the coproc2 service plus two temporary stylesheets (a new one gets created each time a coproc2 execution happens).
As you can see, there is an indication of whether a stylesheet is streamable (xslt-s-f-...) or not streamable (xslt-f-...). Then I enabled the Probe for the coproc2 service and did another coproc2 request:
As you can see, besides the (new) temporary stylesheet, "webgui:///msdebug-client.xsl" is displayed. This stylesheet is used internally by DataPower to capture all the data for each context of a transaction with Probe enabled. And you can see that the (same) stylesheet that was streamable before is now marked as not streamable (xslt-f-...). The reason is that enabling the Probe automatically converts all "PIPE" contexts to "real" contexts and disables all streaming processing.
So for stylesheets that are streamable you might just look for the streamability indicator in the stylesheet cache, but the vast majority of stylesheets are NOT streamable. For these we can look for the presence of "webgui:///msdebug-client.xsl" in the XML Manager of the stylesheet's service.
Then I turned off the Probe for the coproc2 service again and did another coproc2 transaction:
As you can see, the new temporary stylesheet has the streamability indicator again (xslt-s-f-...), but "webgui:///msdebug-client.xsl" is still present. The reason is that disabling the Probe does not remove that stylesheet from the stylesheet cache.
So in order to use "webgui:///msdebug-client.xsl" as a Probe status indicator we need to do:
1) on "enable" Probe: nothing
2) on "disable" Probe: flush the service's XML Manager's stylesheet cache

xi52(config)# clear xsl
Usage: clear xsl cache <xml-mgr>
xi52(config)# clear xsl cache coproc2xform
Cleared cache of XML manager 'coproc2xform'

3) in order not to get fooled by another service's Probe being enabled with the same XML Manager, use an XML Manager specific to each service
In order to look for the presence of "webgui:///msdebug-client.xsl" in the stylesheet cache of the XSLT's service's XML Manager, we need to determine its name. This can easily be done via "dp:variable('var://service/xmlmgr-name')". This stylesheet now puts all the pieces together and returns a simple true/false response. In addition it measures the time needed for the test, in milliseconds. Here you see that it takes 1msec on an XI52 with Probe enabled:

$ echo "<foobar/>" | coproc2 ssstat.xsl - http://firestar:2223 ; echo
A customer raised a question in the developerWorks DataPower forum on how to convert a client SOAP With Attachment (SWA) request to REST on DataPower. Yesterday's posting specifically raised the question of how to convert SWA to the JSON-WSP attachment service request format.
The response attachment .zip contains a sample service export, sample files, and quite some useful tools for dealing with SWA files and DataPower. Besides the swa2json-wsp.xsl stylesheet, these tools are useful for dealing with SWA files:
a bash script for creating a SWA file with dummy root part and all passed files as attachments
auxiliary program for determining the ContentType of an arbitrary file
auxiliary bash script that determines the boundary of a given SWA file
tool that determines SWA boundary and populates the header variable before sending request by curl
With these tools, sending some attachments to a DataPower SWA service is just two commands (see the posting above for sample executions):
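The boundary-determining helper's job is simple; here is a sketch of the idea in JavaScript (an illustration, not the bash script itself: in a full MIME message the boundary is declared in the Content-Type header, while this sketch peeks at the body's opening delimiter line instead):

```javascript
// Extract the MIME boundary from a SWA (multipart) body: the first line
// is "--" followed by the boundary string.
function swaBoundary(body) {
  const firstLine = body.slice(0, body.indexOf('\r\n'));
  return firstLine.startsWith('--') ? firstLine.slice(2) : null;
}

const sample =
  '--MIME_boundary\r\n' +
  'Content-Type: text/xml\r\n\r\n<root/>\r\n' +
  '--MIME_boundary--\r\n';

console.log(swaBoundary(sample)); // MIME_boundary
```

With the boundary in hand, populating curl's Content-Type header ("multipart/related; boundary=...") is a one-liner.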
At the end of 2012 I did blog posting BMP.xsl.html (Basic Multilingual Plane), in which I described stylesheet BMP.xsl that creates a nice 16x16 table, each cell containing a 16x16 table holding a single Unicode character, spanning the complete Basic Multilingual Plane (BMP, Unicodes 0-65535).
In this xml-dev posting John Cowan pointed to two BMP videos and an A0 .pdf poster.
I replied to that posting and criticized that the A0 poster does not show any BMP structure.
And in that posting I provided the XQib (XQuery in the browser) script BMP.xq.html that created the complete BMP table from my previous posting on the fly (the XSLT 1.0 solution needed a two-step approach). While only 2.2KB in size, it takes more than a minute to render in my browser, though.
I wanted to have a BMP.xq.html A0 poster, but the biggest paper format our printers provide is A3.
So I modified the XQuery script, resulting in 8 scripts together spanning the whole BMP.
And for each I created the A3 PDF file which you can access below.
Since today this A0 = 8 x A3 BMP poster covers the outside wall of my Boeblingen lab office ;-)
And clicking each of the 8 XQuery links below now takes at most 13 seconds to render in my (Firefox) browser.
Recently I got a stylesheet with a really HUGE number of arguments in a concat() function.
My first approach to determining the arity was not nice and needed handwork, but gave the result I needed.
Later I learned from our XQuery team that XPath 2 has a much simpler solution for that:
Just replace concat(...) by count( (...) ), making the argument list an XPath 2 sequence and applying the count() function.
This approach is really easy, and can be evaluated by any XQuery processor.
With v6.0 firmware DataPower has an XQuery 1.0 compiler, so you can evaluate with DataPower.
You can also evaluate on any other XQuery 1.0 processor like saxon.
Or you can do it online at http://try.zorba.io.
From my initial XML-->JSONX-->JSON transformation rdp87.2.xsl:
I created an "all-in-one" stylesheet doing the XML-->JSON transformation directly, rdp87.3.xsl:
Now I created two XML Firewalls on a box with firmware containing the (performance) fix for APAR IC90781 from the May 2013 fixpack.
Both services used an XML Manager with "Profile Rule" in XML Manager's Compile Options policy.
Finally I did:
clear stylesheet cache for XML manager "profile"
send 1000 requests to service by
$ time for((f=1;f<=1000;++f)); do curl --data-binary @rdp87.1000.xml http://dp7-l3:port ; echo; done
switch to "Status->XML Processing->Stylesheet Profiles" in WebGUI and take screenshot
Here is Stylesheet Profiles for XML->JSONX->JSON:
And here is Stylesheet Profiles for XML->JSON:
Yes, the total time taken in the templates for rdp87.2.xsl (1961ms) is a factor of 2.66 greater than that of rdp87.3.xsl (735ms).
But that only has an effect if the whole operation of a service is just XML->JSON conversion.
Typically, in real-life services, other operations like AAA or SLM actions will happen as well.
If this is the case, then the (small) runtime overhead added is outweighed by the maintainability and simplicity of the stylesheet, in my eyes.
<UPDATE> rdp87.2a.xsl runtime is reduced by 15% compared to rdp87.2.xsl by making use of <xsl:import> and <xsl:apply-templates> instead of dp:transform().