Wim Van den Wyngaert
Honored Contributor

DCL to program

I have a big DCL script and I would like to convert it to a program. Is there any free tool available to migrate it automatically, without manual corrections afterwards?
Reason: performance.

Wim
23 REPLIES
Karl Rohwedder
Honored Contributor

Re: DCL to program

Wim,

as far as I know, there is no such tool.

In former days there was a tool to convert DCL to FORTRAN, but I don't know what happened to it.

Since DCL may perform a lot of substitution or image activation during runtime, the performance benefit of a high-level language is questionable.


mfg Kalle
Ian Miller.
Honored Contributor

Re: DCL to program

I have not seen one. The performance gain depends on what the DCL does, but it can be significant, especially if you program what is intended rather than doing a literal translation.
____________________
Purely Personal Opinion
Wim Van den Wyngaert
Honored Contributor

Re: DCL to program

Any hints on how to improve performance are also welcome. Are there logicals that can influence DCL?

I already
1) optimized the working set size
2) minimized the number of characters per command
3) installed frequently used commands

The script is 4000 lines and uses a lot of GOSUBs. CALLs are used less because they result in extra load.

Wim
Ian Miller.
Honored Contributor

Re: DCL to program

It depends on the DCL, but a lot of DCL runs lots of images, so it helps if the images are installed (at least open/header). Often the frequent image activation overhead is what takes the most time, so if you can run fewer images, or run them less often, that's a good thing.
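For example, a minimal sketch of installing a frequently activated image (MYTOOL.EXE is only a placeholder name here):
$ INSTALL ADD SYS$SYSTEM:MYTOOL.EXE /OPEN /HEADER_RESIDENT /SHARED
$ INSTALL LIST SYS$SYSTEM:MYTOOL.EXE    ! verify the entry
Use INSTALL REPLACE instead of ADD if the image is already known to INSTALL.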
____________________
Purely Personal Opinion
Karl Rohwedder
Honored Contributor

Re: DCL to program

Wim,

I'm sure you did this already:
- squeeze the procedure using DCL_DIET or an equivalent tool
- maybe put it on a RAM disk
- tweak the RMS defaults so the procedure is read faster (see the sketch below)
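A minimal sketch of that RMS tweak (the values are only illustrative; measure before and after):
$! raise the process RMS defaults for sequential files
$ SET RMS_DEFAULT /SEQUENTIAL /BUFFER_COUNT=8 /BLOCK_COUNT=32
$ SHOW RMS_DEFAULT    ! check the new settings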

mfg Kalle
Wim Van den Wyngaert
Honored Contributor

Re: DCL to program

Karl,

Is a RAM disk still needed with the file cache?

The procedure is executed once and stays active for days or weeks. Is DCL reading it again and again, or how does this work in detail?

I did a manual diet but want to keep it readable.

Wim
Karl Rohwedder
Honored Contributor

Re: DCL to program

I suppose a RAM disk is then not needed, if it stays online for days.

Maybe it is possible to reduce the number of image activations. E.g. I have seen examples where a routine does:
$ image par1
$ image par2
$ image par3
which could be replaced by:
$ image
par1
par2
par3
(simple example, I know :-])

mfg Kalle
Wim Van den Wyngaert
Honored Contributor

Re: DCL to program

Karl,

Thx for the suggestion. I already did it but forgot to mention it.

Wim
Hein van den Heuvel
Honored Contributor

Re: DCL to program


Did the DCL_DIET version run significantly faster?

Have you done the basic performance analysis where you, for example, identify which chunk is slow? For example:
- maybe the procedure has an init, read, sort, update, delete, report, and exit phase, and only a single stage really defines the time.
- maybe there are a few loops. What is the order of magnitude of the loop count? Thousands? Millions? The millions loop should probably be a program.

How about a partial solution? Maybe a particular file analysis phase is more readily done in... perl?

Show us / describe the part that you think hurts, and maybe we can see a speedier solution
(provide sample input/output data if appropriate).

Groetjes,
Hein.
Wim Van den Wyngaert
Honored Contributor

Re: DCL to program

Hein,

There is no problem with the script. I just want to find an easy way to make it faster.

Most of its functions execute a command and parse the output with lexicals (f$el, f$extr, f$len, f$loc).
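As an illustration (not a line from the actual script; the sample symbols are hypothetical), a typical parsing pattern looks like:
$ line = F$EDIT(captured_line, "TRIM,COMPRESS")      ! output of a command, captured earlier
$ node = F$ELEMENT(0, " ", line)                     ! first blank-separated field
$ IF F$LOCATE("error", line) .NE. F$LENGTH(line) THEN GOSUB handle_error
$ short = F$EXTRACT(0, 6, node)                      ! first six characters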

Wim
Wim
Hein van den Heuvel
Honored Contributor

Re: DCL to program

>> There is no problem with the script. I just want to find an easy way to make it faster.

So there is a problem... it's not fast enough :-).

Seriously, did you try the DCL_DIET version to learn whether the DCL text size is a factor?

>> Most of its functions execute a command and parse the output with lexicals (f$el, f$extr, f$len,f$loc).

I suspected that, and those are often readily replaced with much better performing perl constructs, often involving regular expressions. Even if that involves extra image activations for perl, it may be a big win.

Hein.
Travis Craig
Frequent Advisor

Re: DCL to program

Wim,

My only data point on DCL script performance comes from one in which, as soon as I added a header section of comments, performance became very bad. The basic script runs through a loop over and over. The comments were all before the loop, but just adding them made the loop take many times as much CPU as it did without the comments. The loop, by the way, had a short wait inside it, so it was not normally CPU-bound.

--Travis
My head is cold.
Wim Van den Wyngaert
Honored Contributor

Re: DCL to program

Hein,

No perl for me. My goal is that any VMS system manager can read/understand the script. And perl is not basic knowledge, just as awk isn't.

A second goal was that nothing had to be installed to use the script. So DCL was the only thing that's left.

If a program could replace it, I could go on in DCL and generate the program after each modification.

Is there any doc on how command scripts are processed?

Wim
Ian Miller.
Honored Contributor

Re: DCL to program

Travis, the classic workaround for your problem is to add a GOTO that jumps over the block of comments, which is faster, or to move the history comments to the end of the file.
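A minimal sketch of that workaround:
$ GOTO past_history
$! ==========================================================
$!  Revision history and other long comment blocks live here;
$!  the GOTO above keeps DCL from stepping through them.
$! ==========================================================
$ past_history: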
____________________
Purely Personal Opinion
Hein van den Heuvel
Honored Contributor

Re: DCL to program


>> No perl for me. My goal is that any VMS system manager can read/understand the script. And perl is not basic knowledge.

I understand and appreciate the sentiment, but respectfully disagree.
A VMS system manager NOT knowing perl (or python, or awk) is holding himself, and possibly his company, back.
Over time you may find it easier to find someone with perl skills than someone with intricate DCL skills.

String manipulation with F$EXTR, F$LOC and so on on system data is perfectly reasonable and the best way to write (maintenance) scripts. However, IMHO, it is often NOT appropriate, or no longer appropriate, to use DCL for basic data processing like rearranging a generic contact list or summarizing an order file, nor for post-processing a VMS audit/error/operator/accounting log.
But hey, that's just one opinion. And perl is just an example; it will come and go like everything else.

>> A second goal was that nothing had to be installed for using the script. So, dcl was the only thing thats left.

Ah. Fully understood. That's why I still code in MACRO at times. Sometimes I even embed a MACRO program in a DCL procedure. That is: the procedure tests for an image. If the image is not there, it branches out, compiles a source ($DECK), links, deletes the object, and goes back to use the freshly created image. This gives a single, ASCII distribution kit, with no compilers needed.
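A minimal sketch of that pattern (the image name and the trivial MACRO source are placeholders, not Hein's actual code):
$ IF F$SEARCH("MYTOOL.EXE") .NES. "" THEN GOTO run_tool
$ MACRO /OBJECT=MYTOOL SYS$INPUT
$ DECK
        .TITLE  MYTOOL          ; placeholder: just exits with SS$_NORMAL
        .ENTRY  START, ^M<>
        MOVL    #1, R0
        RET
        .END    START
$ EOD
$ LINK MYTOOL
$ DELETE MYTOOL.OBJ;*
$ run_tool:
$ RUN MYTOOL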

>> If a program could replace it, I could go on in dcl and generate the program after each modif.

Ah, I misunderstood you all along. I somehow decided you wanted to convert the DCL to a C or Fortran program, and in that case perl should be evaluated, as it can be easier to read/maintain.
But you were really talking about a 'DCL compiler', right? Ah... then perl has no relevance.

>> Is there any doc on how command scripts are processed ?

No. But it simply works a line at a time.
No compiling, no P-code, no memory (of how to do a line).
The only optimization, if you want to call it that, is that it remembers the labels by RFA address in the file. Which reminds me to pick up a discussion I was having with Guy about doing private block IO instead of record IO when possible. That could speed up DCL.

I'd be interested to know the RMS component for a DCL string-manipulating procedure. How much EXEC mode (RMS) vs SUPER (DCL) when your procedure runs?

I'm still interested to know how much DCL_DIET helped, or another procedure which removes all comments and excess spaces, changes the first 26 symbols to single letters and the next 26*38 to a letter plus [letter|number|$|_], reduces lexicals to minimal string length (f$ex...) and so on.

Re-arranging code slightly to make critical, high-execution-count loops fit in a (512 byte!) internal RMS buffer has helped others in the past. Does the procedure do significant IO to the DCL scripts, as far as you can tell? (ask XFC)

If you attempted this repacking for a single important loop in your script, could you see the difference in SUPER:EXEC:KERNEL CPU? Would there be an IO count reduction (if there was IO)?

Cheers,
Hein.

Wim Van den Wyngaert
Honored Contributor

Re: DCL to program

Hein,

It uses very little exec mode. But with my version of PA I cannot extract the data for 1 process. And it is difficult to say how performance evolves, because its work is variable and depends on the number of files, processes, etc.

It would be nice if some hints could be passed to DCL, such as an indication that the script must be cached (without comments) or that an auto-diet must be done (once). Interpreter code would be even better.

Concerning Perl: in a team of 8 persons over here there was 1 perl expert and 1 beginner (me). The other 6 would have had a problem...

Wim
Cass Witkowski
Trusted Contributor

Re: DCL to program

If your command procedure does a lot of executing a command and then parsing the output, it may be that a more recent lexical can get the information quicker. This is especially the case if the output of the command is sent to a temp file which then has to be read in and parsed.

We had a program that checked the number of CPUs that each node had and sent an email alert if the number varied from the norm. The code was dumping the SHOW CPU command output to a file and then reading the file back in and parsing it. I was able to replace this with the F$GETSYI lexical. This was much faster and also less prone to the issue of the output format changing when a new version of OpenVMS comes out, which is the case when upgrading from OpenVMS V7.3 to V7.3-2.
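A minimal sketch of that replacement (ACTIVECPU_CNT and AVAILCPU_CNT are standard F$GETSYI item codes; the expected count is only a placeholder symbol):
$ active_cpus = F$GETSYI("ACTIVECPU_CNT")     ! CPUs currently active on this node
$ avail_cpus  = F$GETSYI("AVAILCPU_CNT")      ! CPUs physically available
$ IF active_cpus .LT. expected_cpus THEN GOSUB send_alert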

Cass
John Gillings
Honored Contributor

Re: DCL to program

Wim,

In addition to DCL_DIET, a very long-running DCL procedure may benefit from running from a RAM disk. You can code it so that the procedure checks where it is executing from and, if not a RAM disk, creates the disk, copies itself, and invokes the RAM disk copy. On completion, the procedure can either tear down the RAM disk or leave it around for the next time it's invoked. This will only be a win if the procedure runs for a long time, and XFC might do a better job of reducing the I/O.
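A minimal sketch of that self-relocation check (MDA0: is assumed to already exist and be mounted; creating the RAM disk and passing along P1-P8 are left out):
$ this = F$ENVIRONMENT("PROCEDURE")              ! full file spec of this procedure
$ IF F$PARSE(this,,,"DEVICE") .EQS. "MDA0:" THEN GOTO on_ramdisk
$ COPY 'this' MDA0:[000000]*.*                   ! copy myself to the RAM disk
$ @MDA0:[000000]'F$PARSE(this,,,"NAME")'.COM     ! and invoke the copy
$ EXIT
$ on_ramdisk:
$! ... the real work starts here ...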

As with any performance analysis, to improve things you need to work out what part of your procedure is costing the most and concentrate your efforts there.

Beware "automatic" conversion from DCL to compiled code. Some of these will do as much as they can to convert DCL to whatever their output language is, but code anything that's left as SPAWNed commands. As you might imagine, this can be detrimental to performance.
A crucible of informative mistakes
Wim Van den Wyngaert
Honored Contributor

Re: DCL to program

I'm going to do some testing and compare the performance of perl with DCL.
IMHO it could be that having perl start programs (e.g. ncl) in a subprocess all the time is more expensive than the overhead of DCL.

I'll post the results.

Wim
Jess Goodman
Esteemed Contributor

Re: DCL to program

I had some very long DCL command procedures that ran very slowly. They were doing GOSUBs in loops, and all the GOSUB routines were at the very bottom of the procedure.

When I moved the GOSUB routines up close to where they were being called (even though I had to branch around them in the main code), performance improved many times over.
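A minimal sketch of that layout (routine and symbol names are only placeholders):
$ GOTO main_loop                 ! branch around the routine in the main flow
$ do_one_item:
$! ... work on one item ...
$ RETURN
$ main_loop:
$ GOSUB do_one_item              ! the routine is now only a few lines away
$ IF more_work THEN GOTO main_loop
$! ... the rest of the (long) procedure follows ...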
I have one, but it's personal.
Garry Fruth
Trusted Contributor

Re: DCL to program

You could turn on image accounting briefly, and see how many images are being activated. You may be surprised at what you find. Use acc/summary=image/report=(record, process, direct) /ident=PID to summarize how much time and resources are being used by the different images.
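A minimal sketch of switching it on and off around the run (the report qualifiers are as given above; pid is the process of interest):
$ SET ACCOUNTING /ENABLE=IMAGE        ! start recording image activations
$! ... run the procedure ...
$ SET ACCOUNTING /DISABLE=IMAGE       ! stop recording again
$ ACCOUNTING /SUMMARY=IMAGE /REPORT=(RECORD,PROCESS,DIRECT) /IDENT=pid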
Robert Gezelter
Honored Contributor

Re: DCL to program

Wim,

A thought after reflecting on some of the earlier exchanges. Consider breaking the 4000-line file into different pieces. Also consider partially unrolling some of the loops, and make sure that the RMS buffering and blocking factors are large enough.

Breaking the file into pieces means a shorter scan when DCL has to search the file. Even this small overhead can multiply dramatically within a loop. Consider:
$ sss:
$ .....
$ goto sss

at the bottom of a 4000-line file, rather than as the entire contents of a far smaller file. The overhead of a GOSUB (rather than a call) is higher, but may be dwarfed by the re-scan overhead. It is also easier to maintain.

Second, unrolling loops in a straight interpreter (such as DCL) is a large win. If you have code which is simple, for example code that reads a file looking for a condition, then five reads in a row, rather than looping five times, can make a difference.
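A minimal sketch of such a partial unroll (file, symbol, and label names are placeholders):
$! rolled-up: one READ and one branch back per record
$ read_loop:
$ READ /END_OF_FILE=done infile rec
$ IF F$LOCATE("ERROR", rec) .NE. F$LENGTH(rec) THEN GOTO found
$ GOTO read_loop
$!
$! partially unrolled: three READs per pass, so DCL interprets fewer loop-control lines
$ read_loop3:
$ READ /END_OF_FILE=done infile rec
$ IF F$LOCATE("ERROR", rec) .NE. F$LENGTH(rec) THEN GOTO found
$ READ /END_OF_FILE=done infile rec
$ IF F$LOCATE("ERROR", rec) .NE. F$LENGTH(rec) THEN GOTO found
$ READ /END_OF_FILE=done infile rec
$ IF F$LOCATE("ERROR", rec) .NE. F$LENGTH(rec) THEN GOTO found
$ GOTO read_loop3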

Admittedly, these are a bit hackish, in that the impact cannot be generally predicted, but is highly dependent on the way that your code is structured.

I hope that the above is helpful.

- Bob Gezelter, http://www.rlgsc.com
Wim Van den Wyngaert
Honored Contributor

Re: DCL to program

Did some testing.

If there are about 4000 lines between the GOSUB and the called routine, the elapsed time for doing 100000 GOSUBs increases by 50%.

Replacing GOSUB by CALL increases elapsed time by about 50% too, but with them placed 4000 lines apart the increase is only about 20% compared with GOSUB (also 4000 lines apart).
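For reference, a minimal sketch of the kind of timing loop used for such a test (not the actual test; 100000 matches the iteration count above):
$ before = F$TIME()
$ i = 0
$ test_loop:
$ GOSUB target
$ i = i + 1
$ IF i .LT. 100000 THEN GOTO test_loop
$ WRITE SYS$OUTPUT "Started:  ", before
$ WRITE SYS$OUTPUT "Finished: ", F$TIME()
$ EXIT
$!  (in the 4000-lines-apart variant, the filler lines go here)
$ target:
$ RETURN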

Wim