
leni puccio
New Member

ftp & packed data

We are about to start a project that entails transferring large amounts of data via ftp. Some of the data is coming from a mainframe in "packed" format. I have heard that this may cause ftp to interpret some of the data as an EOF character.

Has anyone encountered this, and if so, is there a workaround?

Thanks
Leni

9 REPLIES
Frank Slootweg
Honored Contributor

Re: ftp & packed data

Just use a binary FTP transfer (i.e. the FTP "binary" command) and all should be well.

Binary mode can transfer *any* file, including a "packed data" file.
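For example, a scripted pull in binary mode might look like the following minimal sketch. It uses Python's ftplib; the host name, credentials, and file names are placeholders, not anything from your site:

    import ftplib

    # Hypothetical host, credentials, and file names -- adjust for your setup.
    HOST = "mainframe.example.com"
    REMOTE_FILE = "PACKED.DATA"
    LOCAL_FILE = "packed.data"

    ftp = ftplib.FTP(HOST)
    ftp.login("user", "password")

    # retrbinary() switches to image (binary) mode before the RETR, so the
    # packed/EBCDIC bytes are copied verbatim -- nothing is treated as EOF.
    with open(LOCAL_FILE, "wb") as out:
        ftp.retrbinary("RETR " + REMOTE_FILE, out.write)

    ftp.quit()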
leni puccio
New Member

Re: ftp & packed data

Thank you. I have always found binary to work, but I have not worked a great deal with mainframe data.

Leni
leni puccio
New Member

Re: ftp & packed data

Would you know if there is a maximum size for an ftp file transfer?

Leni
Frank Slootweg
Honored Contributor

Re: ftp & packed data

I assume the maximum size is 2 gigabytes: the manual page does not mention a limit, and all commands not specifically mentioned in /usr/share/doc/lg_files.txt are limited to 2 GB at most (the general UNIX API limit).

However, it would probably be wise to use a much smaller size if possible.
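If you do want to keep individual transfers well under 2 GB, one option is to split the export before sending it. Here is a minimal Python sketch, assuming made-up file names and an arbitrary 1 GB piece size (any splitting tool would do just as well):

    CHUNK_SIZE = 1024 * 1024 * 1024   # 1 GB per piece (illustrative only)
    BLOCK = 64 * 1024                 # copy in 64 KB blocks to limit memory use

    def split_file(path, chunk_size=CHUNK_SIZE):
        """Write path.part000, path.part001, ... each at most chunk_size bytes."""
        part = 0
        with open(path, "rb") as src:
            while True:
                written = 0
                data = src.read(min(BLOCK, chunk_size))
                if not data:
                    break                       # nothing left to copy
                with open("%s.part%03d" % (path, part), "wb") as dst:
                    while data:
                        dst.write(data)
                        written += len(data)
                        if written >= chunk_size:
                            break               # this piece is full
                        data = src.read(min(BLOCK, chunk_size - written))
                part += 1
        return part

The pieces can then be transferred in binary mode and concatenated back together on the receiving side.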
Rodney Hills
Honored Contributor

Re: ftp & packed data

Leni,

When you say "packed", I read that to be a COBOL COMP-3 (packed decimal) type or something.

If you have the following situation-
1) Transferring to an HP-UX system
2) You DON'T have a COBOL compiler on that system
3) You plan on doing additional processing on that data.

Then I recommend the following be done on the mainframe side, before creating the file-
1) Use ASCII instead of EBCDIC.
2) Don't use "packed" format for numerics.
3) Be sure to have sign separate (not overpunched).

Although it is possible to deal with these files in "C" or "perl", unless you are prepared to develop some code you would be better off getting those files in standard ASCII text form.
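For what it's worth, the unpacking itself is not much code. Here is a minimal Python sketch, assuming the usual COMP-3 layout (two BCD digits per byte, sign in the final nibble); check it against the actual copybook before trusting it:

    def unpack_comp3(raw, scale=0):
        """Decode an IBM packed-decimal (COMP-3) field.

        raw   -- the field's bytes, copied verbatim from the mainframe file
        scale -- implied decimal places, taken from the copybook PICTURE
        """
        nibbles = []
        for byte in raw:                        # Python 3: iterating bytes gives ints
            nibbles.append((byte >> 4) & 0x0F)  # high nibble
            nibbles.append(byte & 0x0F)         # low nibble
        sign = nibbles.pop()                    # last nibble holds the sign
        value = 0
        for digit in nibbles:
            value = value * 10 + digit
        if sign == 0x0D:                        # 0xD = negative, 0xC/0xF = positive
            value = -value
        return value / (10 ** scale) if scale else value

    # b'\x12\x34\x5D' (PIC S9(5) COMP-3) decodes to -12345;
    # with scale=2 (PIC S9(3)V99) the same bytes give -123.45.

For currency fields you would probably want to build a decimal.Decimal from the digits instead of dividing floats, but the nibble handling is the same either way.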

My 2 cents...

-- Rod Hills
There be dragons...
harry d brown jr
Honored Contributor

Re: ftp & packed data

Leni,

When you are transferring files in "binary" mode you won't encounter any issues.

DO NOT ATTEMPT to convert EBCDIC or PACKED data using FTP. Write a program to do that conversion.

"Packed" usually means that dollar amounts (and other numbers) are stored in a smaller space, and it also affects where the sign of the number goes. Make sure you test the crap out of things!

live free or die
harry
Live Free or Die
rick jones
Honored Contributor
Solution

Re: ftp & packed data

in and of itself, the File Transfer Protocol (FTP) has no limit on the size of the transferred file.

Any limit on the size of a file transferable via FTP would be in the implementation - for example, was the ftp client/daemon compiled for large files (if not, it certainly ought to be), or was the destination file specified as residing on a filesystem that was not large-file enabled.

even if FTP can transfer files larger than 2GB, it may still be desirable to do the transfers in smaller chunks. this is getting a bit esoteric, but if you assume that there is a packet loss probability of p, and a given TCP stack will retransmit a given segment about, oh, 16 times, then the chance of any one segment being retransmitted those 16 times and the connection aborting would be (roughly) p^16. That means that the probability of any one segment making it through without the connection aborting is (1-(p^16)).

the number of segments will be the file size divided by the MSS (Maximum Segment Size)

for the file to transfer successfully (in one shot) all the segments have to get across the network.

that means that we can express the probability of a successful file transfer as (1-(p^maxretrans))^(filesize/mss)

now, most of the time, p will be very very very small, which means that the probability of a very large file getting across is quite large.
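to make that concrete, here is a quick back-of-the-envelope calculation in Python; the loss rate, retransmission limit, and file size below are just assumed example values:

    # Rough success probability for a single-shot transfer, using the
    # formula above: (1 - p**maxretrans) ** (filesize / mss).
    p          = 0.01          # assumed packet loss probability (1%)
    maxretrans = 16            # assumed per-segment retransmission limit
    mss        = 1460          # typical Ethernet MSS in bytes
    filesize   = 2 * 1024**3   # a 2 GB file

    segments = filesize / mss                   # ~1.47 million segments
    prob = (1 - p ** maxretrans) ** segments
    print("segments: %.0f  success probability: %.12f" % (segments, prob))
    # p**16 is 1e-32, far below double-precision resolution, so this prints
    # as 1.000000000000 -- the connection-abort risk is negligible.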

like i said though, this is getting a bit esoteric :)
there is no rest for the wicked yet the virtuous have no pillows
Sanjay_6
Honored Contributor

Re: ftp & packed data

Hi,

Don't think there is a limit on the size of file that you can transfer through FTP. However, it is limited to 2GB if the filesystem where the data is copied does not have "largefiles" enabled. 2GB would be the limit for any file (irrespective of application/command) in that filesystem if "largefiles" is not enabled.

Hope this helps.

Regds
Arnold Hausmann
Occasional Advisor

Re: ftp & packed data

Sorry I didn't see this till today. I did a lot of this for a data warehouse project a while back. We were loading EBCDIC data into an Oracle database, and the data was chock full of negative numbers, dollar amounts (decimals) and such.

I found the best way to get this done effectively was to use mainframe SyncSort (or a similar utility) to unpack any packed data and to place leading sign indicators and decimals. SyncSort has VERY good utilities for reformatting/transforming data, and this didn't add very much to the entire ETT process at all.

Check out the SyncSort EDIT command with the SIGNS option; both are used in the OUTREC statement.

Good luck.
Stuff happens...that's why Jesus saves.