December 10, 2016, 7:42 am
Author:
don.leahy
Posted: Sat Dec 10, 2016 9:12 pm (GMT 5.5)
Another guess: the abend could be coming from the tape management system that might be installed at your shop.
_________________
"Let's work the problem, people. Let's not make things worse by *guessing*".
http://donleahy.blogspot.com/
December 11, 2016, 10:19 pm
Author:
aswinir
Subject: Reply to: High CPU consumption Job using IAM files as input
Posted: Mon Dec 12, 2016 11:49 am (GMT 5.5)
Thanks all.
This was an existing program. I have sorted the input file, filtered it down to the records that are needed, and the program now reads the file sequentially.
December 12, 2016, 12:42 am
Author:
Bill Woodger
Subject: Reply to: High CPU consumption Job using IAM files as input
Posted: Mon Dec 12, 2016 2:12 pm (GMT 5.5)
That's the way to do it. I'd expect around a 90% reduction in resources if you are able to apply the whole thing as a two-file match.
December 12, 2016, 12:46 am
Author:
aswinir
Posted: Mon Dec 12, 2016 2:16 pm (GMT 5.5)
Yes, that's what I am doing: joining on the keys that I have to process and including only the record types that are needed.
December 12, 2016, 1:48 am
Author:
Bill Woodger
Subject: Reply to: High CPU consumption Job using IAM files as input
Posted: Mon Dec 12, 2016 3:18 pm (GMT 5.5)
To a KSDS and leaving the other program unchanged? Or you've changed the other program as well? What reductions did you get in CPU, IO, and elapsed time?
December 12, 2016, 2:09 am
Author:
aswinir
Posted: Mon Dec 12, 2016 3:39 pm (GMT 5.5)
It's only one program.
The 10M-record file is the driver file, keyed on ID. Each ID record in the parent file has 6 sub-records in the 90M file, of which the program picks only 3, and those are not contiguous. The driver also reads a 70K-record file to pick up a few fields from it.
I joined the 70K file and the 10M file on their keys and removed duplicates from the 70K file (call the result file A).
I joined the 10M file and the 90M file on their keys and included only the records that are needed (call the result file B).
I changed the program to drive with the new file A, read the 10M file sequentially, and then read file B sequentially based on the records in the 10M file.
Note: I have not tested with the full set of records. For POC purposes, only 8M records out of 90M were considered, i.e. 8 million both before the program changes and with the new code. This gave more than a 40% improvement in CPU.
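In case it helps anyone finding this later, the build of "file A" can be sketched as a DFSORT JOINKEYS step like the one below. The dataset names, the key position/length (1,10) and the record length (80) are placeholders for illustration, not our real layout.
Code: |
//JOINA    EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTJNF1 DD DISP=SHR,DSN=PLACEHOLDER.DRIVER.FILE10M
//SORTJNF2 DD DISP=SHR,DSN=PLACEHOLDER.LOOKUP.FILE70K
//SORTOUT  DD DSN=PLACEHOLDER.FILEA,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(5,5),RLSE)
//SYSIN    DD *
* Inner join of the 10M driver (F1) and the 70K file (F2) on the key
  JOINKEYS FILE=F1,FIELDS=(1,10,A)
  JOINKEYS FILE=F2,FIELDS=(1,10,A)
* Keep only the 70K-side record for each matched key
  REFORMAT FIELDS=(F2:1,80)
* Sort on the key and drop duplicates, leaving the ~3K driver records
  SORT FIELDS=(1,10,CH,A)
  SUM FIELDS=NONE
/*
|
File B is the same idea against the 90M file, with an INCLUDE statement added to keep only the 3 needed record types.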
December 12, 2016, 2:26 am
Author:
Bill Woodger
Subject: Reply to: High CPU consumption Job using IAM files as input
Posted: Mon Dec 12, 2016 3:56 pm (GMT 5.5)
That sounds about right, without knowing fuller details. Expect larger savings on full-sized files. Huge reduction in IO.
Looking back through the topic, I see the post I thought I'd made isn't there, so you weren't even taking my advice :-) Good work.
I'd also consider whether keeping the 70K file in a COBOL table might save more, but that depends entirely on your situation.
December 12, 2016, 2:33 am
Author:
aswinir
Posted: Mon Dec 12, 2016 4:03 pm (GMT 5.5)
I am sorry about that. The existing program logic was driven by the 10M-record file, reading the 70K and 90M files in random mode based on keys.
Since I joined the 10M file with the 70K file (also removing duplicates), I get fewer than 3K records (out of 70K). I changed the program logic to read the 3K file and, based on its keys, read the 10M file and the 90M file sequentially; hence I did not load it into a COBOL table.
Thanks
December 12, 2016, 11:23 pm
Author:
amitc23
Subject: session stuck in TPX with status ACL
Posted: Tue Dec 13, 2016 12:53 pm (GMT 5.5)
Hi
One of my TPX sessions (TSO) is stuck with a status of ACL. I am not able to select the session or kill it; the other sessions are OK. Any clues, please?
Thanks
December 13, 2016, 6:47 am
Author:
enrico-sorichetti
Subject: Reply to: session stuck in TPX with status ACL
Posted: Tue Dec 13, 2016 8:17 pm (GMT 5.5)
topic locked, does not belong to ANY forum...
waiting for a forum reply to solve a PUZZLING APPLICATION STATUS is pretty dumb
_________________
cheers
enrico
When I tell somebody to RTFM or STFW I usually have the page open in another tab/window of my browser,
so that I am sure that the information requested can be reached with a very small effort
December 13, 2016, 8:22 pm
Author:
Ni3-db2
Subject: SQL query to run through list of values in table
Posted: Wed Dec 14, 2016 9:52 am (GMT 5.5)
I have the below requirement. I tried, but I was not able to come up with the exact solution.
I have two tables.
Table 1 has a column ID and table 2 has a column CODE.
Now, for every ID in table 1 there can be multiple rows in table 2 with different values of CODE.
I have to write a query such that, for every ID in table 1, it searches (runs through) table 2 to select a CODE that is valid (it is valid if it has any of the values A, B, C, or D). If it is not valid, make the CODE 'X'.
Finally, sort the output in a specific order of CODE, like X, B, A, C, D.
Any suggestions/ help will be greatly appreciated.
_________________
Nitin gandhi
December 13, 2016, 10:20 pm
Author:
RahulG31
Subject: Reply to: SQL query to run through list of values in table
Posted: Wed Dec 14, 2016 11:50 am (GMT 5.5)
I think you can do something as below. The first 2 columns should give you what you want, and then you may get rid of the last column if needed. This is not tested, as I don't have access to a mainframe right now.
Code: |
SELECT
A.ID,
CASE B.CODE
WHEN 'A' THEN 'A'
WHEN 'B' THEN 'B'
WHEN 'C' THEN 'C'
WHEN 'D' THEN 'D'
ELSE 'X'
END,
CASE B.CODE
WHEN 'A' THEN '3A'
WHEN 'B' THEN '2B'
WHEN 'C' THEN '4C'
WHEN 'D' THEN '5D'
ELSE '1X'
END AS MY_ORDER
FROM TABLE1 A, TABLE2 B
ORDER BY MY_ORDER;
|
December 14, 2016, 5:25 am
Author:
Kyle Carroll
Subject: INDEPENDENT CICS TS 4.1 MRO REGION UPGRADES TO CICS TS V5.3
Posted: Wed Dec 14, 2016 6:55 pm (GMT 5.5)
Hi,
I am new to a company that has CICS TS V4.1 MRO regions that I am upgrading to CICS TS V5.3.
These MRO regions include a TOR, 2 AOR and a FOR.
They all share the same SDFHAUTH, SDFJAUTH, SDFHLOAD, SDFHWSLD, SEYUAUTH, and SEYULOAD.
I don't know why the previous systems programmers shared these files, and I am considering creating separate files for each region so that I can upgrade them to CICS TS V5.3 one at a time rather than all at once.
My question: is it standard operating procedure, or required, for all CICS regions in an MRO environment to use those same files?
Are there any issues with each CICS region in an MRO having these files as separate files per region?
I would like to have separate files for each region's SDFHAUTH, SDFJAUTH, SDFHLOAD, SDFHWSLD, SEYUAUTH, and SEYULOAD so that I can not only upgrade the regions one at a time (on different days/weeks), but also, if one region has issues after the upgrade, back out that region only and not all of them.
Thanks in advance for your help and all have a Merry Christmas!
Kyle
December 14, 2016, 7:12 am
Author:
Arun Raj
Posted: Wed Dec 14, 2016 8:42 pm (GMT 5.5)
I have not touched DB2 in a while, but shouldn't we be checking for something like this? This is not tested either and doesn't take care of the order.
Code: |
SELECT T1.ID, VALUE(T2.CODE,'X') FROM TABLE1 T1
LEFT JOIN (SELECT ID, CODE FROM TABLE2 WHERE CODE IN ('A','B','C','D')) T2
ON T1.ID = T2.ID
|
It might help if the OP can post the structure of both the tables (relevant columns) and some sample data too.
_________________
Arun
----------------------------------------------------------------------------------------------------
Love is like an hourglass, with the heart filling up as the brain empties. -Jules Renard
December 14, 2016, 7:15 am
Author:
mbenaud
Subject: TWS EQQYCAIN - not sure how to get application description
Posted: Wed Dec 14, 2016 8:45 pm (GMT 5.5)
OK, I have a JOB that lists applications in groups (in TWS) and outputs this for information purposes.
I use the following EQQYCAIN sysin to generate my list.
What I need is the text description of the application as well (if this is indeed possible).
Code: |
queue 'ACTION=OPTIONS,BL=Y,BLPRT=N,LTP=N;'
queue 'ACTION=LIST,RESOURCE=ADCOM,GROUPDEF='||grp||',TYPE=A,STATUS=A;'
|
I know this is REXX / ISPF, but this is how I am choosing to execute the API / PIF. I have trawled the API / PIF manuals without success; all I get is the application and then an expiry date.
Any ideas? Cheers
_________________
Martin B
December 14, 2016, 9:25 am
Author:
Debb.Brant
Subject: Viewing executing process in NDM .. questions about compress
Posted: Wed Dec 14, 2016 10:55 pm (GMT 5.5)
This is just a "why" question .. no specific problem, I am just wondering if others have seen this.
We are using NDM to transfer large files that are densely populated with data (i.e., no spaces, no regularly occurring patterns of text). For this reason, I turned off compression in the process. A quick side-note, turning off compression made a 70% difference in transfer times for these large files.
The problem is, when we view the executing process via the TSO interface, the compression factor is usually indicated to be between .3% and 2.2% as the process executes. When I look into the statistics of the completed process, no compression was used, and the compression factor is shown properly as 0.0%.
I have confirmed that the executing process does not have the COMPRESS keyword in it, and the relevant netmap entries also have nothing indicating a default compression scheme. Our NDM initialization setup is not forcing compression on any process, either.
So .. has anyone else seen this? Does anyone have a guess as to why it's showing any compression factor at all? After discussing with my colleagues, the best guess is that it's estimating what could be done with compression .. but that makes no sense to me.
I'm attaching a screenshot of the executing process, and a screenshot of the completed process. Please note the compression factor shown in the very right bottom corner of each screenshot. I am also pasting in the view of the executing process here ..
Code: |
********************************************************** Top of Data ***
========================================================== VIEW PROCESS PROCESS NAME: xxxxxxx PROCESS NUMBER: 23,234
==========================================================
label    PROCESS SNODE=xxxxxxxxxxxxx -
                 PNODE=xxxxxxxxxxxx -
                 PNODEID=(xxxxxxxx,XXXXXXX) -
                 SNODEID=(xxxxxxx,XXXXXXXX) -
                 RETAIN=NO -
                 MAXDELAY=UNLIMITED -
                 CLASS=2 -
                 PRTY=10
TEP01    COPY FROM -
              (PNODE -
               DSN='xxxxxxxxT.ENH2461.G0082V00' -
               SPACE=(27920,(00000419,00000900,),,,ROUND) -
               DCB=( -
                BLKSIZE=0000027920 -
                DSORG=PS -
                LRECL=00080 -
                RECFM=FB -
               ) -
               DISP=(OLD,DELETE,KEEP) -
               UNIT=(3390,,) -
              ) -
              TO -
              (SNODE -
               DSN='xxxxxxxEDST.ENH2461.G0001V00' -
               DCB=( -
                BLKSIZE=0000027920 -
                DSORG=PS -
                LRECL=00080 -
                RECFM=FB -
               ) -
               DISP=(NEW,CATLG,DELETE) -
               UNIT=(3390,,) -
              )
********************************************************* Bottom of Data *
|
December 14, 2016, 9:45 am
Author:
Rohit Umarjikar
Posted: Wed Dec 14, 2016 11:15 pm (GMT 5.5)
RahulG31, I think you are missing the relation between the two tables.
Arun, I'd like that approach, but what if TABLE2 doesn't have a matching row? In your case it will still be marked as 'X', but the OP wants to mark X only for entries other than (A,B,C,D).
Nitin,
Welcome!
Before you expect a solution to the problem, please state the problem correctly along with the table structures, sample data, and the desired output, because right now it is guesswork.
However, based on what is stated, try this.
Code: |
select T3.ID1, T3.code2
from (SELECT T1.ID as ID1, T2.code1 as code2,
             case when T2.code1 = 'X' then 1
                  when T2.code1 = 'B' then 2
                  when T2.code1 = 'A' then 3
                  when T2.code1 = 'C' then 4
                  when T2.code1 = 'D' then 5
             end
      FROM TABLE1 T1,
           (SELECT ID,
                   case when CODE NOT IN ('A','B','C','D') then 'X' else CODE end as code1
            FROM TABLE2) T2
      where T1.ID = T2.ID
      order by 3) as T3
|
_________________
Regards,
Rohit Umarjikar
"Knowledge is knowing that a tomato is a fruit, but Wisdom is knowing not to put it in a fruit salad."
December 14, 2016, 11:42 am
Author:
Arun Raj
Posted: Thu Dec 15, 2016 1:12 am (GMT 5.5)
Rohit,
Yes, that was an assumption I made: that all TABLE1 IDs are available in the second table, unless the OP states otherwise or says what he wants in such a scenario.
Apart from that, the interpretations are different here. My understanding was as follows; the OP can clarify if it is incorrect.
Let's say an ID=ID1 has the CODEs A, B, C, E, F. Since it has at least one of the 'valid' CODEs, my output will have only these:
ID1 A
ID1 B
ID1 C
Let's say another ID=ID2 has the CODEs E, F, G, H. Since none of them falls in the list of 'valid' CODEs, the output will have this:
ID2 X
I see your code will give this for Case1: Code: |
ID1 A
ID1 B
ID1 C
ID1 X
ID1 X |
I'd wait for the OP to clarify the requirement before we proceed any further.
_________________
Arun
----------------------------------------------------------------------------------------------------
Love is like an hourglass, with the heart filling up as the brain empties. -Jules Renard
December 14, 2016, 12:53 pm
Author:
Rohit Umarjikar
Posted: Thu Dec 15, 2016 2:23 am (GMT 5.5)
Quote: |
I see your code will give this for Case1: |
I think you missed looking at my ORDER BY.
Quote: |
Let's say an ID=ID1 has the CODEs A,B,C,E,F. Since it has ANY one of the 'valid' IDs, my output will have only these: |
Your left outer join will give this, and not just A, B, C.
Code: |
ID1 A
ID1 B
ID1 C
ID1 X
ID1 X |
Quote: |
Let's say another ID=ID2 has the CODEs E,F,G,H since none of them falls in the list of 'valid' CODEs, so the output will have these: |
This will come 4 times unless we use DISTINCT.
_________________
Regards,
Rohit Umarjikar
"Knowledge is knowing that a tomato is a fruit, but Wisdom is knowing not to put it in a fruit salad."
December 14, 2016, 1:17 pm
Author:
Arun Raj
Posted: Thu Dec 15, 2016 2:47 am (GMT 5.5)
I was talking about the contents, not the ORDER. Let's keep the order aside for now.
Rohit Umarjikar wrote: |
Your left outer join will give this and not just ABC |
No. My right table does not even select entries other than A, B, C, D, so how do you expect it to return the 2 extra Xs in case 1 or the 4 Xs in case 2?
_________________
Arun
----------------------------------------------------------------------------------------------------
Love is like an hourglass, with the heart filling up as the brain empties. -Jules Renard