
A. Daniel King_1
Super Advisor

lp alternative

Is there a GNU or other alternative to HP-UX lp that will allow printing of files over 2 GB?
Command-Line Junkie
Rick Garland
Honored Contributor

Re: lp alternative

There is enscript available from the software porting archive - gatekeep.cs.utah.edu. Available in depot format as well as source.

I'm not aware that lp has a size restriction, unless you are constrained by the lp spool directory, where the request files are stored until the print job completes.

Over 2 GB may be an issue of the lp spool directory not being set up for large files. You may also want to look at increasing the size of the lp spool directory.

Hope you have a high speed line printer as well.
Peter Godron
Honored Contributor

Re: lp alternative

Hi,
are you sure it's a problem with lp and not with the filesystem used for the temporary storage?
Are you getting any error messages?

As a quick fix can you split the file?
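A minimal sketch of that quick fix, assuming a POSIX-style split(1); the helper name, chunk size, and paths are placeholders, not anything from the thread:

```shell
# Sketch of the quick fix: break a big file into pieces under the 2 GB
# mark so each piece can be submitted to lp on its own.
# split_for_lp is a hypothetical helper; the 1000m chunk size is an assumption.
split_for_lp() {
    file=$1
    # pieces are named FILE.part.aa, FILE.part.ab, ...
    split -b 1000m "$file" "$file.part."
}
```

Each resulting piece can then be handed to lp as a separate job.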

Regards
A. Daniel King_1
Super Advisor

Re: lp alternative

The error follows. BIGFILE is 3GB, and I'm running as root.

[09:28:09] adk@myhost [/tmpdir]
# lp -dxMYPRINTER /tmpdir/BIGFILE
lp: can't access file "/tmpdir/BIGFILE"
lp: request not accepted

The spooler area is largefile enabled.
Command-Line Junkie
Rick Garland
Honored Contributor

Re: lp alternative

Are you sure BIGFILE is in the /tmpdir/BIGFILE location you specified? That error says the file cannot be found; as far as the lp system is concerned, it doesn't exist.

A. Daniel King_1
Super Advisor

Re: lp alternative

[10:00:51] adk@myhost [/tmpdir]
# /usr/bin/ll -d /tmpdir /tmpdir/BIGFILE
drwxrwxrwx 24 adk sys 12288 Feb 10 08:42 /sybkup
-rw-rw-r-- 1 adk sys 3506168400 Feb 10 08:41 /sybkup/BIGFILE
Command-Line Junkie
A. Daniel King_1
Super Advisor

Re: lp alternative

Sorry, /sybkup should read /tmpdir!!!
Command-Line Junkie
Rick Garland
Honored Contributor

Re: lp alternative

So are you issuing the command 'lp -dMYPRINTER /sybkup/BIGFILE'?

A. Daniel King_1
Super Advisor

Re: lp alternative

I am printing from multiple locations, and got my output munged trying to make it generic for public consumption.

There are no path issues. The printer exists and accepts small files from the same locations. The file exists and is readable by the current shell.

enscript is close, but it does not look like a spooler subsystem. Does anyone have experience with LPRng?
Command-Line Junkie
Rick Garland
Honored Contributor

Re: lp alternative

How about CUPS?
Peter Godron
Honored Contributor

Re: lp alternative

Hi,
could you check the status of the lp system with: lpstat -s
Anything unusual?
Regards
A. Daniel King_1
Super Advisor

Re: lp alternative

Nothing unusual about lpstat -s. As I've stated, small files print just fine ...
Command-Line Junkie
A. Daniel King_1
Super Advisor

Re: lp alternative

CUPS, okay. This looks like what I was seeking. Do we know if it takes large files?
Command-Line Junkie
Rick Garland
Honored Contributor

Re: lp alternative

An lp spooler will just stream the file to wherever the spooler tmpdir is. If the spooler tmpdir can handle large files, then there is no problem.

CUPS will handle whatever print job you throw at it. Make sure you have enough trees and enough time and a fast lineprinter.
Pete Randall
Outstanding Contributor

Re: lp alternative

To expand on what Rick is saying:

Your file is going to be copied to the /var/spool/lp directory for printing. Do you have 2GB of free space in /var?


Pete
Rick Garland
Honored Contributor

Re: lp alternative

And is /var/spool/lp enabled for large files? (Whatever LV that is.)
A. Daniel King_1
Super Advisor

Re: lp alternative

We have about 20GB available in the spool area.

Other programs, such as GNU tar, GNU zip, etc., have trouble with large files unless they are compiled with large-file support enabled. I'll take a look myself, but I was hoping someone had specific experience with LARGE print jobs using CUPS (or LPRng, etc.).

It seems that HP has an enhancement request in for "lp" to be able to handle large files, but the fix is not yet available.

BTW: We have several fast printers and a forest of paper just waiting for these large files!!!
Command-Line Junkie
A. Daniel King_1
Super Advisor

Re: lp alternative

Largefiles are enabled on both the source filesystem and the spool filesystems. fsadm says so:

# fsadm /spool
fsadm: /etc/default/fs is used for determining the file system type
largefiles

(/var/spool is a link to this file system.)
Command-Line Junkie
Thierry Poels_1
Honored Contributor

Re: lp alternative

hi,

did you try : "cat BIGFILE | lp -d printer"
That way lp does not have to open the large input file itself.
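The same idea as a hedged sketch: let the shell open the file and have lp read standard input, so lp never has to open() the oversized file. The wrapper name and queue name below are placeholders:

```shell
# Sketch of the stdin approach: the shell open()s the file and the print
# command just reads standard input, so lp itself never opens the >2 GB file.
# lp_stdin is a hypothetical wrapper; the queue name is a placeholder.
lp_stdin() {
    queue=$1; file=$2
    lp -d"$queue" < "$file"
}
# usage (placeholder names): lp_stdin MYPRINTER /tmpdir/BIGFILE
```

Using the redirect instead of `cat file | lp` also saves one extra process.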

regards,
Thierry.
All unix flavours are exactly the same . . . . . . . . . . for end users anyway.
Cheryl Griffin
Honored Contributor

Re: lp alternative

Rick & Pete -
FYI: The SR to add largefile support to lp is SR 8606291139. The two error messages, "lp: can't access file" and "lp: request not accepted", are the typical symptoms of this issue.

The advised workaround is to split the print job into jobs smaller than 2 GB.
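Putting that workaround together end to end, a sketch might look like the following; the helper name, chunk size, and queue name are assumptions, not anything HP documented:

```shell
# Sketch of the advised workaround: split the job into pieces safely under
# 2 GB, submit each piece to the queue, and remove each piece once lp has
# accepted it. print_in_pieces is a hypothetical helper.
print_in_pieces() {
    queue=$1; file=$2
    split -b 1900m "$file" "$file.lpjob."   # stay safely under the 2 GB limit
    for part in "$file".lpjob.*; do
        lp -d"$queue" "$part" && rm -f "$part"
    done
}
```

The pieces print in alphabetical suffix order (.aa, .ab, ...), which preserves the original byte order of the file.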

"Downtime is a Crime."
Pete Randall
Outstanding Contributor

Re: lp alternative

Cheryl,

What's an "SR"? I assume, since you mention a workaround, that it must be an acknowledged problem that HP intends to do something about?


Pete
Peter Godron
Honored Contributor

Re: lp alternative

Cheryl,
many thanks for this. Just proves you never stop learning.

So my initial quick fix was right!!

Regards
Robert-Jan Goossens_1
Honored Contributor

Re: lp alternative

Pete,

SR is a Service Request.

Regards,
Robert-Jan
Cheryl Griffin
Honored Contributor

Re: lp alternative

SR is a problem report. This might be considered an ER (enhancement request), since the spooler is behaving the way it was designed; adding largefiles support could be considered an enhancement.

And Peter, you were right on the money with your original answer.
"Downtime is a Crime."
A. Daniel King_1
Super Advisor

Re: lp alternative

Splitting the file is the right answer for now.
Command-Line Junkie