August 3, 2017

Left unattended, log files will eventually get out of hand.  Oracle's Lifecycle Management guide describes ways to configure the diagnostic logging; however, the ".out" files aren't covered.

Sure, you can just go in and delete them occasionally, but why can't this be done automatically?

It can!

*Of note: this guide is for servers running Linux!

Enter Logrotate!

"logrotate is designed to ease administration of systems that generate large numbers of log files. It allows automatic rotation, compression, removal, and mailing of log files. Each log file may be handled daily, weekly, monthly, or when it grows too large."

Following the logrotate documentation was fairly easy, but I'll post what I did here!

First, create a configuration file (name it whatever you'd like; I went with logrotate.conf):


# compress rotated copies (global option)
compress

/opt/oracle/config/domains/*/servers/*/logs/*.out {
        # keep 5 rotated copies; rotate once a file exceeds 5M
        rotate 5
        missingok
        notifempty
        size=5M
        # copy then truncate in place so the running server keeps writing to the same file
        copytruncate
        sharedscripts
        postrotate
                find /opt/oracle/config/domains/*/servers/*/logs -name "*.out0*" -exec rm {} \;
        endscript
}

Make sure it points to your managed server log directories.
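A quick way to confirm the glob actually matches your managed server logs (the domain root here is just the one from my example; use your own) is to list it first:

ls -d /opt/oracle/config/domains/*/servers/*/logs/*.out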

I added a few additional options in the configuration (compress, copytruncate, etc.) to help keep things under control.  See the logrotate documentation for more details.

I also put in a postrotate script to remove the old numbered .out files that get generated on restarts.  You can also add -mtime +7 to the find command to only remove files older than 7 days, as shown below.
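That variation would look like this (same paths as the configuration above):

postrotate
        find /opt/oracle/config/domains/*/servers/*/logs -name "*.out0*" -mtime +7 -exec rm {} \;
endscript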

Now that the script is configured, let's schedule it to run automatically!

Enter Cron!

We are going to use crontab to install a crontab file, which holds the cron jobs for our user.

Let's insert a record into the crontab ("crontab -e"):


*/30 * * * * /usr/sbin/logrotate -s /home/oracle/scripts/logrotate.status /home/oracle/scripts/logrotate.conf

Now logrotate will automatically run every 30 minutes!
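If you don't want to wait for the schedule to prove itself, you can verify the crontab entry and kick off a rotation by hand, using the same status and config file paths as the cron entry:

# list the current user's cron entries
crontab -l

# debug/dry run: shows what logrotate would do without changing any files
/usr/sbin/logrotate -d /home/oracle/scripts/logrotate.conf

# force one rotation now, writing state to the same status file cron uses
/usr/sbin/logrotate -f -s /home/oracle/scripts/logrotate.status /home/oracle/scripts/logrotate.conf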

 

Check out the log directory after a few days!

[Screenshot: log directory]
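From the command line, a listing of the log directories should show each live .out file with up to five compressed, numbered copies next to it:

ls -lh /opt/oracle/config/domains/*/servers/*/logs/*.out*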

 

Also, check out my blog post Automatically cleanup archive files!

About the Author

Kevin has over 9 years of experience in enterprise-scale implementations. He is very experienced in architecting, modeling, and developing BPM processes, particularly those requiring advanced ADF UI screens, as well as setting up the infrastructure for BPM solutions that integrate with multiple external systems.
