We have an old program with a sequential tape file that is being pushed, by growth in its data elements, to need more than 32K in each record. The programmer asked if we had any workaround for IGYGR1224-E.
My initial suggestion was that they could migrate the active data into a database such as DB2 or IMS, but this is an old system on which they would prefer to avoid significant development. The files are very large, with multiple copies.
The file is an ordinary sequential file with old/new master-file-type processing, i.e.
LABEL RECORDS ARE STANDARD
BLOCK CONTAINS 0 RECORDS
RECORDING MODE IS V
DATA RECORD IS NEW1-RDDB-REC.
We are at the most current releases: Enterprise COBOL 4.2 on z/OS 1.13.
I am not aware of any relief that would help them here. We would look at LBI (the Large Block Interface) if it would help, but it appears it would not in this case, since LBI raises the block-size limit rather than the record-length limit. They need to process this in a simple COBOL program. Any suggestions that have worked well for other customers under similar constraints?
Sam Knutson, GEICO
System z Team Leader
"Think big, act bold, start simple, grow fast..."
Topic: maximum record size > 32767 IGYGR1224-E
Updated on 2012-11-19T23:18:16Z by BillWoodger
BillWoodger
Re: maximum record size > 32767 IGYGR1224-E (accepted answer)

32767 is a QSAM limit.
Although not with those record sizes, or the kind of volume you are indicating, I've done something similar by having "split" records on the physical file and "stitching" them back together in a subroutine which does the actual reading. The main program "reads" a logical record via the subroutine; the subroutine does one or more READs of physical records and assembles the bigger logical record. On the "way out" the stitching is reversed, writing one or more physical records per logical record.
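A minimal sketch of the stitching idea, with illustrative names and sizes (none of these come from the original post): each physical record carries a last-segment flag and the length of the data it holds, and the read subroutine loops until it has collected a whole logical record.

      * Physical record: one segment of a logical record
      * (field names and sizes are illustrative).
       01  SEGMENT-REC.
           05  SEG-LAST-FLAG          PIC X.
               88  SEG-IS-LAST        VALUE 'Y'.
           05  SEG-DATA-LEN           PIC 9(5).
           05  SEG-DATA               PIC X(30000).

      * Main program treats the subroutine as the "file":
           CALL 'RDLOGIC' USING LOGICAL-REC, LOGICAL-LEN, EOF-FLAG

      * Inside RDLOGIC: stitch segments into one logical record.
           MOVE ZERO TO LOGICAL-LEN
           PERFORM WITH TEST AFTER
                   UNTIL SEG-IS-LAST OR END-OF-TAPE
               READ TAPE-FILE INTO SEGMENT-REC
                   AT END SET END-OF-TAPE TO TRUE
               END-READ
               IF NOT END-OF-TAPE
                   MOVE SEG-DATA (1:SEG-DATA-LEN)
                     TO LOGICAL-DATA (LOGICAL-LEN + 1:SEG-DATA-LEN)
                   ADD SEG-DATA-LEN TO LOGICAL-LEN
               END-IF
           END-PERFORM

The write side is the mirror image: slice the logical record into SEG-DATA-sized pieces and set SEG-LAST-FLAG on the final one.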
You might look at VBS (variable blocked spanned) records, which is the system behaving similarly, but there may be more flexibility in doing it yourself.
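For reference, spanned records are requested at the DD level rather than in the program; a sketch (dataset name illustrative), where LRECL=X is how JCL declares a logical record length over 32760:

//TAPEOUT  DD  DSN=HLQ.MASTER.FILE,DISP=(NEW,CATLG),
//             UNIT=TAPE,
//             DCB=(RECFM=VBS,LRECL=X,BLKSIZE=32760)

The catch is that the Enterprise COBOL FD is still limited to 32767 bytes per record, so VBS alone may not get a plain COBOL program past IGYGR1224-E, which is part of why doing the splitting yourself can be more flexible.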
I'd try to make the physical records well below 32K. If you have to split anyway, you may as well get well away from the limit and allow easier ad-hoc processing of the file.
If you go this route, take care to minimise the tossing-around of the data.
If it runs like a dog with no legs, there are probably ways around that.