Operating System - HP-UX

The "at" command and oracle export

 
SOLVED
Dennis Flinn
Advisor

The "at" command and oracle export

I was wondering if someone might be able to explain the following situation. I have a script file called export_cmd, which I have attached. If I enter the following at the command line:

at -f export_cmd now

The export dies, and the export log contains the following:

. . exporting table D000 0 rows exported
. . exporting table D010INC 1720140 rows exported
. . exporting table D010L
EXP-00015: error on row 30586 of table D010L, column BLOCK, datatype 24
EXP-00002: error in writing to export file
EXP-00002: error in writing to export fileerror closing export file
EXP-00000: Export terminated unsuccessfully

I checked; there is no problem with space on the drive.

However, if I run the following from the command line, using the same parameter file as above:

./export_cmd

The result simply goes past the table shown above without a problem:

. . exporting table CYOPR_MOVE 0 rows exported
. . exporting table CYTSTR_TAB 0 rows exported
. . exporting table D000 0 rows exported
. . exporting table D010INC 1720140 rows exported
. . exporting table D010L 85417 rows exported
. . exporting table D010LINF 54036 rows exported
. . exporting table D010Q 64710 rows exported
.

I'm just not sure why there is a difference.

Thanks,
Dennis



7 REPLIES
A. Clay Stephenson
Acclaimed Contributor

Re: The "at" command and oracle export

Best guess:

Put a ulimit -a command in your script. I suspect that at is running with a smaller ulimit value.
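A quick way to test this guess is a short diagnostic at the top of the script (a sketch; the log path is arbitrary):

```shell
# Hypothetical diagnostic: record the limits the job actually runs under.
# Put these lines near the top of export_cmd, run it once interactively and
# once via "at", then compare the two logs. A smaller file-size limit under
# "at" would explain a write failure partway through a large export.
ulimit -a >> /tmp/export_cmd_limits.log 2>&1
echo "file size limit: $(ulimit -f)" >> /tmp/export_cmd_limits.log
```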
If it ain't broke, I can fix that.
RAC_1
Honored Contributor

Re: The "at" command and oracle export

When the script is executed with at, the environment variables should be set within the script itself.

You can check the environment variables that are set for the script with the command at -d job_id.

Check whether you get all the environment variables the script requires. Other things to check:

Specify the full path of the script in the at command.
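The two checks above can be sketched like this (the paths are assumptions; adjust to your layout):

```shell
# 1. Inside export_cmd, dump the environment the job actually sees, so the
#    "at" run can be diffed against an interactive run of the same script.
env | sort > /tmp/export_cmd_env.log

# 2. Submit the script by its full path so the job does not depend on the
#    submitting shell's PATH or working directory (path is hypothetical):
# at -f /home/oracle/export_cmd now
```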
There is no substitute to HARDWORK
A. Clay Stephenson
Acclaimed Contributor

Re: The "at" command and oracle export

Because your export seemed to start actually exporting, I assumed that only a fraction of your script was actually attached. You would, of course, need to set and export ORACLE_SID and PATH at a very minimum. The environment of cron and at is intentionally very sparse.
If it ain't broke, I can fix that.
twang
Honored Contributor

Re: The "at" command and oracle export

I agree with Stephenson: you must export all the environment variables in your at/cron scripts. In your case, I would suggest creating another script that exports the required environment variables and then calls the export_cmd script.
I have had problems running cron scripts myself; I resolved them by sourcing .profile. Later I created a dedicated .cronprofile for running cron jobs.
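A minimal sketch of such a wrapper (both file paths are assumptions):

```shell
#!/bin/sh
# Wrapper to submit to at/cron: first pull in the login environment,
# then call the real export script.
. /home/oracle/.profile        # or a trimmed-down .cronprofile
/home/oracle/export_cmd
```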
Massimo Bianchi
Honored Contributor
Solution

Re: The "at" command and oracle export

Hi,
there is a problem with at: it sources its own parameter file,

/var/adm/cron/.proto


If you take a look at this file, you can see that a ulimit value (some fixed number) is hard-coded in it.

Change it to

ulimit unlimited

and enjoy bigger files!

Remember also that it is good practice not to use ksh, because it has a hard-coded 2 GB limit that cannot be worked around.
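In other words, the fix is a one-line change in that file. Shown as a fragment only; leave the rest of .proto untouched, and note that the exact contents vary by HP-UX release:

```
# in /var/adm/cron/.proto, replace the existing line of the form
#   ulimit <some fixed number>
# with:
ulimit unlimited
```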


HTH,
Massimo
Rory R Hammond
Trusted Contributor

Re: The "at" command and oracle export

Another thing to check: make sure the dump file uses a full path name.

If you are executing the script from a filesystem with enough space, the export would work, whereas at may be executing from your home directory.
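For example, a hypothetical exp parameter file that pins the dump and log files to absolute paths on a filesystem with enough free space (the file names and mount point are made up):

```
file=/oradata/exports/full_db.dmp
log=/oradata/exports/full_db.log
full=y
```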

Rory
There are a 100 ways to do things and 97 of them are right
Dennis Flinn
Advisor

Re: The "at" command and oracle export

Thanks for the help. It turned out to be a ulimit problem. I should have included the header of the export log; I didn't provide the full log because the database is an SAP system with about 22k tables. I have worked on the ulimit issue and will hopefully test it again this afternoon.

Thanks, everyone, for your help.

Dennis