<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: how to purge /etc/hpC2400/HPARRAY.INFO in Operating System - HP-UX</title>
    <link>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507073#M892186</link>
    <description>Kholikt,&lt;BR /&gt;This file seems to be a log file, but 100 MB is far too big for it.&lt;BR /&gt;I would suggest you tail the file to see the latest entries and check whether it is reporting any errors.&lt;BR /&gt;This is how mine looks:&lt;BR /&gt;arrayscan:  No HP SCSI array devices identified.  Check SCSI connections.&lt;BR /&gt;SCAN FOR DISK ARRAY'S DID NOT DETECT DISK ARRAYS.&lt;BR /&gt;To disable set HPARRAY_START_STOP=0 in /etc/rc.config.d/hparray.&lt;BR /&gt;&lt;BR /&gt;But mine is nowhere near the size you mentioned.&lt;BR /&gt;&lt;BR /&gt;You can copy this file to tape or to another directory and then trim it:&lt;BR /&gt;&amp;gt; /etc/hpC2400/HPARRAY.INFO&lt;BR /&gt;&lt;BR /&gt;Cheers,&lt;BR /&gt;Karthik...</description>
    <pubDate>Tue, 20 Mar 2001 04:50:07 GMT</pubDate>
    <dc:creator>Karthik_2</dc:creator>
    <dc:date>2001-03-20T04:50:07Z</dc:date>
    <item>
      <title>how to purge /etc/hpC2400/HPARRAY.INFO</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507072#M892185</link>
      <description>Hi,&lt;BR /&gt;&lt;BR /&gt;I have a file named HPARRAY.INFO consuming around 100 MB of space.  I gzipped the file down to only 352 KB, but my disk usage is still unchanged.&lt;BR /&gt;&lt;BR /&gt;This HPARRAY.INFO file seems to be common on all HP-UX servers and is located in the /etc/hpC2400 directory.&lt;BR /&gt;&lt;BR /&gt;I suspect this is some kind of log file for HPARRAY.  Is there a command to purge the file and release all the space?</description>
      <pubDate>Tue, 20 Mar 2001 03:54:40 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507072#M892185</guid>
      <dc:creator>kholikt</dc:creator>
      <dc:date>2001-03-20T03:54:40Z</dc:date>
    </item>
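The unchanged disk usage the question describes is the classic "file removed but still held open" situation: the kernel keeps a file's blocks allocated while any process still has it open, and frees them only when the last descriptor closes. A minimal sketch of that behavior, run on a scratch file in /tmp rather than the real log:

```shell
#!/bin/sh
# Demonstrates why disk space may not be released after a file's name is
# gone: an open descriptor keeps the blocks allocated until it is closed.
F=/tmp/hparray_demo1
yes 'arrayscan: sample log line' | head -n 1000 > "$F"

# Hold the file open on fd 3, remove its name, then close the fd.
( exec 3<"$F"; rm -f "$F"; sleep 1; exec 3<&- ) &
wait $!

# Only now, with no open descriptors left, is the space really released.
[ -e "$F" ] || echo "file gone, space freed"
```

On a real system, `lsof` or `fuser` would list the process still holding the file during the window between `rm` and the close.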
    <item>
      <title>Re: how to purge /etc/hpC2400/HPARRAY.INFO</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507073#M892186</link>
      <description>Kholikt,&lt;BR /&gt;This file seems to be a log file, but 100 MB is far too big for it.&lt;BR /&gt;I would suggest you tail the file to see the latest entries and check whether it is reporting any errors.&lt;BR /&gt;This is how mine looks:&lt;BR /&gt;arrayscan:  No HP SCSI array devices identified.  Check SCSI connections.&lt;BR /&gt;SCAN FOR DISK ARRAY'S DID NOT DETECT DISK ARRAYS.&lt;BR /&gt;To disable set HPARRAY_START_STOP=0 in /etc/rc.config.d/hparray.&lt;BR /&gt;&lt;BR /&gt;But mine is nowhere near the size you mentioned.&lt;BR /&gt;&lt;BR /&gt;You can copy this file to tape or to another directory and then trim it:&lt;BR /&gt;&amp;gt; /etc/hpC2400/HPARRAY.INFO&lt;BR /&gt;&lt;BR /&gt;Cheers,&lt;BR /&gt;Karthik...</description>
      <pubDate>Tue, 20 Mar 2001 04:50:07 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507073#M892186</guid>
      <dc:creator>Karthik_2</dc:creator>
      <dc:date>2001-03-20T04:50:07Z</dc:date>
    </item>
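Karthik's "copy, then trim" advice can be sketched as a short script. The paths below are scratch-file stand-ins; on the real system the file is /etc/hpC2400/HPARRAY.INFO, and the backup would go to tape or another directory:

```shell
#!/bin/sh
# Sketch of "take a copy, then truncate in place" on a demo file.
LOG=/tmp/hparray_demo2.log
BAK=/tmp/hparray_demo2.bak

printf 'arrayscan: sample line 1\narrayscan: sample line 2\n' > "$LOG"

cp "$LOG" "$BAK"   # keep a copy before trimming
: > "$LOG"         # truncate in place: same inode, size drops to zero,
                   # so a process that already has it open keeps writing
                   # to the same file afterwards
```

Truncating with `>` (rather than `rm` plus a new file) is the key design choice: the inode stays the same, so any process that already has the log open continues logging to it and the freed space shows up immediately in `df`.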
    <item>
      <title>Re: how to purge /etc/hpC2400/HPARRAY.INFO</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507074#M892187</link>
      <description>Thanks for the reply.&lt;BR /&gt;&lt;BR /&gt;I have already gzipped the file.  I can't gunzip it back and truncate it to zero, because the space cannot be released and I do not have enough space to decompress it.&lt;BR /&gt;&lt;BR /&gt;I am thinking of doing it this way: I create a new file called HPARRAY.INFO and truncate it.  Will this release the space in my / file system?</description>
      <pubDate>Wed, 21 Mar 2001 01:24:09 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507074#M892187</guid>
      <dc:creator>kholikt</dc:creator>
      <dc:date>2001-03-21T01:24:09Z</dc:date>
    </item>
    <item>
      <title>Re: how to purge /etc/hpC2400/HPARRAY.INFO</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507075#M892188</link>
      <description>Kholikt,&lt;BR /&gt;You can use lsof to identify which process is keeping this file open.&lt;BR /&gt;/sbin/init.d/hparray stop&lt;BR /&gt;Now you can touch a new HPARRAY.INFO file.&lt;BR /&gt;/sbin/init.d/hparray start&lt;BR /&gt;You should then see logging happening to this file.&lt;BR /&gt;Did you check whether there are any error messages in this file?&lt;BR /&gt;&lt;BR /&gt;Karthik...</description>
      <pubDate>Wed, 21 Mar 2001 02:30:30 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507075#M892188</guid>
      <dc:creator>Karthik_2</dc:creator>
      <dc:date>2001-03-21T02:30:30Z</dc:date>
    </item>
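The stop/recreate/start sequence from the post can be laid out as a script. The HP-UX-specific commands (`lsof`, `/sbin/init.d/hparray`) exist only on the real system, so they appear here as comments while a scratch file stands in for the log:

```shell
#!/bin/sh
# Sketch of the stop/recreate/start log rotation described in the post.
LOG=/tmp/hparray_demo3.log
printf 'old contents\n' > "$LOG"

# 1. See which process holds the log open:  lsof /etc/hpC2400/HPARRAY.INFO
# 2. Stop the subsystem:                    /sbin/init.d/hparray stop
# 3. Recreate an empty log file:
rm -f "$LOG"
touch "$LOG"
# 4. Restart, and logging resumes:          /sbin/init.d/hparray start

[ -f "$LOG" ] && [ ! -s "$LOG" ] && echo "empty log recreated"
```

Stopping the subsystem first matters: if the writer is still running when the file is removed, it keeps writing to the old (now nameless) inode and the space is not released.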
    <item>
      <title>Re: how to purge /etc/hpC2400/HPARRAY.INFO</title>
      <link>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507076#M892189</link>
      <description>Thanks, Karthik.&lt;BR /&gt;&lt;BR /&gt;If I restart hparray during office hours, will it cause any downtime?</description>
      <pubDate>Wed, 21 Mar 2001 03:06:19 GMT</pubDate>
      <guid>https://community.hpe.com/t5/operating-system-hp-ux/how-to-purge-etc-hpc2400-hparray-info/m-p/2507076#M892189</guid>
      <dc:creator>kholikt</dc:creator>
      <dc:date>2001-03-21T03:06:19Z</dc:date>
    </item>
  </channel>
</rss>

