Community Home > Servers and Operating Systems > Operating Systems > Operating System - OpenVMS > Buffer error when using FTP
10-28-2008 11:38 AM
Buffer error when using FTP
ftp> put IQI_INV_20081021_092000000.XML
200 PORT command successful.
150 Opening data connection for DSA17:[MISCXML_FTP.IQMETRIX.IN]IQI_INV_20081021_092000000.XML; (xx.xx.xx.xx,4113)
550-RMS WRITE RTB record too large.
550 !UL byte record too large for user's buffer
ftp: 117200 bytes sent in 0.03Seconds 3780.65Kbytes/sec.
If I send the file to another VMS machine (Alpha 7.3-2, TCPWare 5.7), I don't have a problem.
The receiving user accounts are identical on both VMS nodes. Is there any system parameter, user quota, or TCPIP logical/quota I can set to allow a normal transfer to the node which is currently giving the error?
I tried transferring the file in binary, however it wrecked the formatting.
thanks
Dave.
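For later readers: before blaming the receiving side, it can be worth confirming whether the file really does contain an oversized record. A rough sketch in Python (run on the sending side; the filename is the one from the transcript, and 32767 bytes is the usual RMS maximum record size):

```python
# Sketch: find the longest "record" (line) in the file before sending it,
# to see whether any record exceeds what the receiver can buffer.
# Not VMS-specific; run it wherever the file originates.

def longest_record(path, terminator=b"\n"):
    """Return (length, 1-based record number) of the longest record."""
    with open(path, "rb") as f:
        data = f.read()
    longest, longest_at = 0, 0
    for i, rec in enumerate(data.split(terminator), start=1):
        rec = rec.rstrip(b"\r")  # tolerate CR-LF line endings
        if len(rec) > longest:
            longest, longest_at = len(rec), i
    return longest, longest_at

# usage (filename from the transcript above):
#   length, recno = longest_record("IQI_INV_20081021_092000000.XML")
#   a length over 32767 can't be stored as an ordinary RMS record
```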
10-28-2008 12:11 PM
Re: Buffer error when using FTP
As an alternative here, zip the input file, then toss it over via ftp binary, then (obviously) unzip it.
The one thing here that gives me pause around the "standard" explanation is the number of bytes that did get transferred over. Is it possible that the file isn't correctly terminated, has some sort of corruption, or has an unusually long record buried in there somewhere?
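Scripted on the sending side, the zip workaround might look like this sketch (paths and names are placeholders; on a VMS endpoint you would want Zip's "-V" option to preserve file attributes, but from a non-VMS sender plain deflate is fine):

```python
# Sketch of the zip-then-binary-FTP workaround: the file travels as an
# opaque compressed blob, so no record-mode translation can mangle it.
import os
import zipfile

def zip_for_transfer(src, dest_zip):
    """Compress src into dest_zip, ready for `ftp> binary` + `ftp> put`."""
    with zipfile.ZipFile(dest_zip, "w", zipfile.ZIP_DEFLATED) as z:
        z.write(src, arcname=os.path.basename(src))
    return dest_zip
```

After the transfer, unzip on the receiving node and the original byte stream comes back intact.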
10-28-2008 12:46 PM
Re: Buffer error when using FTP
8.3 + TCPIP 5.6 is radically different from 7.3 + TCPware 5.7. Different defaults may well be in place.
As you hint yourself, you may want to check out some TCPIP$FTP.... logicals to influence those defaults as needed. See: http://h71000.www7.hp.com/doc/83final/6526/6526pro_041.html
I tend to DUMP/BLOC=COUNT=1 the resulting file to 'see' whether it makes sense and what the line terminator might be (LF, CR-LF). If you need further help, and the data allows it, then maybe you can attach a text file with a dump. Don't bother with a DUMP/RECORD. That assumes a structure which might not be there.
>> I tried transferring the file in binary, however it wrecked the formatting.
Typically an XML file should be transferred in ASCII mode, as it sounds like you tried. Make sure you explicitly specify that as a trial.
Failing that, use BINARY mode, and dump as per above. Then proceed to use $ SET FILE/ATTRIBUTE=(RFM=xxx) to make it match what you see. xxx would be STMLF or just STM based on what you see. You may want to set MRS and/or LRL, or just set those to 0.
Good luck,
Hein.
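For anyone without easy access to DUMP, the terminator check Hein describes can be sketched in Python (a rough stand-in for eyeballing DUMP/BLOCK=COUNT=1, not a substitute for it):

```python
# Sketch: peek at the first bytes of a file and report whether lines
# appear to end in CR-LF, bare LF, or bare CR.
def detect_terminator(path, sample=512):
    with open(path, "rb") as f:
        blob = f.read(sample)
    if b"\r\n" in blob:
        return "CR-LF"
    if b"\n" in blob:
        return "LF"
    if b"\r" in blob:
        return "CR"
    return "no terminator in sample"
```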
10-28-2008 12:56 PM
Re: Buffer error when using FTP
Check the attributes of the file(s). My guess is your original file (the one getting errors) has an invalid "Longest Record Length" (LRL). It could be 0, or at least less than the actual longest record. The FTP client is therefore working with false information, which breaks assumptions.
When you send the file to another node, RMS is creating the remote file. Since it sees all the records, it knows the real LRL and sets it correctly in the new copy. Now that FTP has correct information, the transfer is successful.
If you know the correct LRL, you can set it with:
$ SET FILE/ATTRIBUTE=(LRL:value) file
or create a new file with:
$ CONVERT old-file new-file
This should ensure the LRL is correct.
One other possibility (unlikely since I'm guessing the file is effectively human readable text), it's a stream_lf file, and there's a section that has more than 32767 bytes (the RMS maximum record length) between LF characters. Since this is an architectural issue, there's no simple fix. You'd need to find the extra long "record" and somehow break it into smaller pieces.
There may be alternatives involving setting the file attributes to (say) fixed length, then sending the file in binary. This may then need the other end to fiddle with the attributes to "fix" the file (but then it's not clear what the other end is, or what it's expecting).
As ever when transferring data between different systems, you sometimes need a deeper understanding of the exact format of your data in order to reconcile differences in expectations and assumptions.
[FWIW - opinion follows:] LRL is arguably an obsolete concept - its original purpose was to give code a hint as to what size buffer is needed to process a file. This meant that the application would minimize resource consumption by not having to allocate buffers "big enough for all possibilities".
The idea has a flaw in that a file open for shared write may have longer records written to it after it was first opened by a reader, thus the LRL which was adequate when the file was first opened may be too small to hold newer records. Furthermore, on modern systems where memory is orders of magnitude cheaper than it was when RMS was architected, allocating a 32K buffer is nowhere near as extravagant as it once would have been considered.
From this, you might conclude that we should set the LRL of all files to 32768 and be done with it (as the DECC RTL did circa V6). On the other hand, this is not necessarily a good idea. The classic example where this can hurt is SORT, which uses the LRL to allocate a sort table - one LRL-sized entry for each record in the file. When LRL is set higher than the real LRL, SORT performance can be significantly degraded (though perhaps not as significantly these days, with faster CPUs and more available memory).
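If an over-long stretch between LFs really is the culprit, one crude repair for XML-ish text is to insert extra line breaks until every record fits. A sketch (the 32767-byte limit is the RMS ceiling mentioned above; splitting just after a '>' is my assumption about where a break is least likely to hurt, and whether an inserted LF is harmless at all depends on how the consumer treats whitespace):

```python
# Sketch: break any "record" longer than `limit` into shorter pieces,
# preferring to split just after a '>' so tags stay intact.
def split_long_records(data: bytes, limit: int = 32767) -> bytes:
    out = []
    for rec in data.split(b"\n"):
        while len(rec) > limit:
            cut = rec.rfind(b">", 0, limit) + 1  # split after a tag
            if cut == 0:
                cut = limit  # no tag boundary in range; hard split
            out.append(rec[:cut])
            rec = rec[cut:]
        out.append(rec)
    return b"\n".join(out)
```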
10-29-2008 12:58 AM
Re: Buffer error when using FTP
Maybe try this logical. What is the longest record in your Windows file? What is used as a line separator?
Wim
10-29-2008 03:08 AM
Re: Buffer error when using FTP
In the case of this file, I resolved the issue by remembering that I had an old, unused Share out there that was created when testing Advanced Server, about 6 months ago.
If the problem crops up again before I have pinned down a resolution, I also have Hoff's "zip it up" suggestion, which I hadn't thought of.
I will be investigating all of the suggestions you made, and I will return with any solution I come across.
Thanks again.
Dave.
10-29-2008 06:10 AM
Re: Buffer error when using FTP
XML usually doesn't have gigantic text records. (I don't know off-hand if there's an architected or recommended longest record, though.)
I have encountered some environments that seemingly go out of their way to generate ill-formed XML.
Fire up one of the various XML verification tools that are around the 'net, or an XML pretty printer, and see if that helps identify the problem. This would be on the source platform, prior to the transfer.
With Mac OS X, Linux or various Unix distros, there are tools baked into the distributions; for this case, tools akin to xmlwf and xmllint are part of most any distribution and would be obvious choices.
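On platforms without xmlwf or xmllint handy, the same well-formedness check can be sketched with a stock Python install (the standard-library parser raises on the first ill-formed construct it meets):

```python
# Sketch: well-formedness check using Python's standard-library XML
# parser, as a stand-in for xmlwf/xmllint where those aren't installed.
import xml.etree.ElementTree as ET

def is_well_formed(path):
    """Return (True, None) if path parses as XML, else (False, reason)."""
    try:
        ET.parse(path)
        return True, None
    except ET.ParseError as err:
        return False, str(err)
```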
10-29-2008 01:19 PM
Re: Buffer error when using FTP
Zip and UnZip also have options which can be used to get inappropriate CR and/or LF line endings translated, too.
> When trying to FTP an XML file [...]
ASCII or binary? _You_ may know that it's text, but binary may avoid the record-length trouble. You'd probably need to do some SET FILE /ATTRIBUTES stuff to make it look like text again on the VMS side, however.
11-06-2008 07:28 AM
Re: Buffer error when using FTP
I guess my final question is:
Why would the transfer work when transferring from
Windows --> VMS (7.3-2) host running TCPWARE V5.7
but fail when transferring from
Windows --> VMS (8.3) host running TCPIP Services V5.6?
Dave.
11-06-2008 08:45 AM
Re: Buffer error when using FTP
Well, duh. Maybe because the software's different?
> ASCII or binary?
Still wondering...