
log4perl

A log4j implementation for Perl

Making self-logging modules with Log::Log4perl

Is there a way to use Log::Log4perl to make a smart self-logging module that logs its operations to a file even if the calling script never initializes Log4perl? As far as I can tell from the documentation, the only way to use Log4perl is to initialize it in the running script from a configuration; modules making Log4perl calls then log themselves based on the caller's Log4perl config.

Instead, I'd like the modules to provide a default initialization config for Log4perl. This would provide the default file appender for the module's category. Then, I could override this behavior by initing Log4perl in the caller with a different config if needed, and everything would hopefully just work.

Is this sort of defensive logging behavior possible or am I going to need to rely on initing Log4perl in every .pl script that calls the module I want logged?
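One common shape for this kind of defensive logging is to have the module check Log::Log4perl->initialized() and load a fallback config only when the caller hasn't provided one. A minimal sketch (the module name My::Module and the file name my_module.log are made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

package My::Module;
use Log::Log4perl qw(get_logger);

sub _init_logging {
    # Respect the caller's config if one is already loaded.
    return if Log::Log4perl->initialized();
    my $conf = q(
        log4perl.logger.My.Module          = INFO, FileApp
        log4perl.appender.FileApp          = Log::Log4perl::Appender::File
        log4perl.appender.FileApp.filename = my_module.log
        log4perl.appender.FileApp.layout   = Log::Log4perl::Layout::SimpleLayout
    );
    Log::Log4perl->init(\$conf);
}

sub new {
    my $class = shift;
    _init_logging();
    get_logger(__PACKAGE__)->info("constructed");
    return bless {}, $class;
}

package main;
My::Module->new;   # no init() in the caller: the fallback config kicks in
```

If the caller does call Log::Log4perl->init() first, the initialized() guard makes the module's default a no-op, which is exactly the override behavior described above.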


Source: (StackOverflow)

How can I log only the INFO level in Log4perl?

Log4perl has a threshold option in the configuration file that will log all calls at that level or higher. Is there an option to log only one level? I want to log only calls at level INFO.

Thanks.
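Log4perl ships a filter class for exactly this: Log::Log4perl::Filter::LevelMatch, attached to an appender, accepts (or rejects) events of a single level. A minimal sketch with an inline config (the appender, filter, and file names are made up):

```perl
use strict;
use warnings;
use Log::Log4perl;

# A LevelMatch filter on the appender accepts only INFO events;
# everything else is rejected before it reaches the file.
my $conf = q(
    log4perl.logger                        = DEBUG, FileApp
    log4perl.appender.FileApp              = Log::Log4perl::Appender::File
    log4perl.appender.FileApp.filename     = info_only.log
    log4perl.appender.FileApp.layout       = Log::Log4perl::Layout::SimpleLayout
    log4perl.appender.FileApp.Filter       = InfoOnly
    log4perl.filter.InfoOnly               = Log::Log4perl::Filter::LevelMatch
    log4perl.filter.InfoOnly.LevelToMatch  = INFO
    log4perl.filter.InfoOnly.AcceptOnMatch = true
);
Log::Log4perl->init(\$conf);

my $log = Log::Log4perl->get_logger();
$log->debug("dropped: below INFO");
$log->info("kept: exactly INFO");
$log->warn("dropped: above INFO");
```

Note the logger level still has to admit INFO (here DEBUG does); the filter then narrows what the appender actually writes.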


Source: (StackOverflow)


How can I disable Log4perl output for a particular class?

I would like to use Log4perl in a project but disable it for a certain class (in this case, Net::Amazon). I thought this would be an easy one, but somehow I failed.

I tried using

use Log::Log4perl qw(:easy);
use Net::Amazon;

my $amz = Net::Amazon->new( ... );
my $log = Log::Log4perl->easy_init($DEBUG);
$log = $log->get_logger("Net::Amazon");
$log->level($OFF);

$log = $log->get_logger(__PACKAGE__);
$log->info("Hello World.");

Unfortunately, debugging messages of Net::Amazon are still printed to the terminal. Why is that? And what am I doing wrong here?
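Two things stand out in the snippet above: easy_init() doesn't return a logger (get_logger is a class method), and the Net::Amazon object is constructed before logging is initialized and switched off. A sketch of the intended order, with logging sent to a file so the effect is easy to inspect (Net::Amazon itself is commented out so the example stands alone):

```perl
use strict;
use warnings;
use Log::Log4perl qw(:easy);
use Log::Log4perl::Level;   # for $OFF

# Init first, then silence the Net::Amazon category *before* the
# object starts logging.
Log::Log4perl->easy_init({ level => $DEBUG, file => ">> app.log" });
Log::Log4perl->get_logger("Net::Amazon")->level($OFF);

# my $amz = Net::Amazon->new( ... );   # would now stay quiet

Log::Log4perl->get_logger("Net::Amazon")->debug("suppressed");
Log::Log4perl->get_logger(__PACKAGE__)->info("Hello World.");
```

Because Net::Amazon logs under its own package name as the category, setting that category's level to $OFF silences it while leaving every other category alone.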


Source: (StackOverflow)

How can I catch Perl warnings into Log4perl logs?

Log4perl is a great tool for logging.

The warnings pragma is also an essential tool.

However, when Perl scripts run as daemons, Perl warnings are printed to STDERR, where nobody sees them, rather than to the program's Log4perl log file.

Is there a way to catch Perl warnings into the Log4perl log?

For example, this code will log just fine to the log file, but if it is run as a daemon, the Perl warnings will not be included in the log:

#!/usr/bin/env perl
use strict;
use warnings;

use Log::Log4perl qw(get_logger);

# Define configuration
my $conf = q(
                log4perl.logger                    = DEBUG, FileApp
                log4perl.appender.FileApp          = Log::Log4perl::Appender::File
                log4perl.appender.FileApp.filename = test.log
                log4perl.appender.FileApp.layout   = PatternLayout
);

# Initialize logging behaviour
Log::Log4perl->init( \$conf );

# Obtain a logger instance
my $logger = get_logger("Foo::Bar");
$logger->error("Oh my, an error!");

$SIG{__WARN__} = sub {
    #local $Log::Log4perl::caller_depth = $Log::Log4perl::caller_depth + 1;
    $logger->warn("WARN @_");
};

my $foo = 100;
my $foo = 44;

This still prints out to STDERR:

"my" variable $foo masks earlier declaration in same scope at log.pl line 27.

And the log file does not catch this warning.
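That particular warning ("my" variable masks earlier declaration) is emitted at compile time, before the $SIG{__WARN__} assignment at the bottom of the script ever runs. One sketch of a workaround is to install the hook inside a BEGIN block at the very top of the file, with a fallback for warnings that arrive before init():

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use Log::Log4perl qw(get_logger);

# Install the hook while the rest of the file is still being compiled;
# compile-time warnings fire long before run time.
BEGIN {
    $SIG{__WARN__} = sub {
        if (Log::Log4perl->initialized()) {
            local $Log::Log4perl::caller_depth = $Log::Log4perl::caller_depth + 1;
            get_logger(__PACKAGE__)->warn(@_);
        }
        else {
            warn @_;   # before init() runs, fall back to STDERR
        }
    };
}

my $conf = q(
    log4perl.logger                    = DEBUG, FileApp
    log4perl.appender.FileApp          = Log::Log4perl::Appender::File
    log4perl.appender.FileApp.filename = warn.log
    log4perl.appender.FileApp.layout   = Log::Log4perl::Layout::SimpleLayout
);
Log::Log4perl->init( \$conf );

warn "a runtime warning";   # ends up in warn.log
```

The remaining caveat: warnings emitted before init() runs (including compile-time warnings in the same file) can only fall back to STDERR, so init as early as possible.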


Source: (StackOverflow)

Can Perl's Log::Log4perl's log levels be changed dynamically without updating config?

I have a Mason template running under mod_perl, which is using Log::Log4perl.

I want to change the log level of a particular appender, but changing the config is too awkward, as it would have to pass through our deployment process to go live.

Is there a way to change the log level of an appender at run-time, after Apache has started, without changing the config file, and then have that change affect any new Apache threads?
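Logger levels can be changed at run time without touching the config, via the level() method on a logger object. A minimal sketch (the category name My::App is an assumption):

```perl
use strict;
use warnings;
use Log::Log4perl qw(get_logger :levels);

# Assume Log4perl was initialized at server startup; any config works here.
Log::Log4perl->easy_init($INFO);

my $logger = get_logger("My::App");

$logger->level($DEBUG);   # more verbose from now on, no config change
$logger->debug("now visible");

$logger->level($ERROR);   # and back down again
$logger->debug("suppressed again");
```

One mod_perl caveat: each child process has its own interpreter, so a run-time change affects only the process that executes it. To affect every worker, each child has to apply the change itself (for instance from a request handler that checks a shared flag), or you restart Apache after changing the startup code.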


Source: (StackOverflow)

How can I rotate and compress Log4perl log files?

From what I can tell, neither Log4perl nor any of its related modules on CPAN supports rotation and compression of log files.

Rotation can be accomplished by using:

  1. Log::Log4perl::Appender::File
  2. Log::Dispatch::FileRotate.

But neither module supports rotation together with compression. (Log::Dispatch::FileRotate has it on its todo list, but it's not currently implemented.)

It is possible to do this using the standard Logrotate facility in Linux, by using either Log::Log4perl::Appender::File's recreate_check_interval or recreate_check_signal.

From initial tests, it looks like using logrotate with the delaycompress option will do the trick, even on a machine with high load, since once the file is moved, Log4perl continues logging to the same filehandle until the signal is caught.

However, if delaycompress is not used and there is even a slight delay between the compression of the log file and the Perl program catching the signal, some logging data might be lost.

What do you think? Are there other options we did not think of?
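For reference, the pairing described above might look like this: the File appender is told to reopen its file on a signal, and logrotate sends that signal after rotating, with delaycompress deferring compression by one cycle so the reopen always happens against an uncompressed file. All paths, the signal choice, and the pid-file location are assumptions:

```
# log4perl side: let the File appender reopen its file on SIGHUP
log4perl.appender.FileApp                       = Log::Log4perl::Appender::File
log4perl.appender.FileApp.filename              = /var/log/myapp.log
log4perl.appender.FileApp.recreate              = 1
log4perl.appender.FileApp.recreate_check_signal = HUP

# /etc/logrotate.d/myapp
/var/log/myapp.log {
    daily
    rotate 7
    compress
    delaycompress          # compress one cycle later, after the reopen
    postrotate
        kill -HUP `cat /var/run/myapp.pid`
    endscript
}
```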


Source: (StackOverflow)

Log4perl: How do I dynamically load appenders at runtime?

I'd like to have modules managing their logging at runtime, but without having everything referring to a single monolithic config file. When dealing with processes running under different permissions, I really don't want to deal with each process needing to be able to access every log on the system when they're only writing to a subset of them.

However, I'm not finding much documentation in the Log4perl manual on how to initialize additional appenders from a configuration file at runtime. http://metacpan.org/pod/Log::Log4perl::Appender references an add_appender method, but that works on instantiated appender objects instead of conf files. It also doesn't define the logger objects and the logger->appender relations.

I tried having each package init from its own conf, but that simply clobbers the existing config each time it's initialized. What I'd like to do is something along the lines of:

my $foo = Foo->new(); ## Checks Log::Log4perl::initialized(), sees that it
                      ## hasn't been initialized yet, inits Log4perl from foo.conf
my $bar = Bar->new(); ## Checks Log::Log4perl::initialized(), sees that it
                      ## has been initialized. Adds appenders and loggers defined
                      ## in bar.conf to the initialized configuration

How can I parse and add the configuration into the current config?
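There is no public API for merging a second conf file into a live configuration, but appenders and logger-appender relations can be built in code after the first init(), which is one way Bar->new() could avoid clobbering anything. A sketch (the appender name BarLog, file names, and categories are made up):

```perl
use strict;
use warnings;
use Log::Log4perl qw(get_logger);
use Log::Log4perl::Layout::SimpleLayout;

# First init wins; later packages must *add* to it instead of re-init()ing.
Log::Log4perl->init(\q(
    log4perl.logger                 = INFO, Screen
    log4perl.appender.Screen        = Log::Log4perl::Appender::Screen
    log4perl.appender.Screen.layout = Log::Log4perl::Layout::SimpleLayout
)) unless Log::Log4perl->initialized();

# What Bar->new() could do instead of init()ing from bar.conf:
# instantiate the appender and wire it to Bar's own category.
my $appender = Log::Log4perl::Appender->new(
    "Log::Log4perl::Appender::File",
    name     => "BarLog",
    filename => "bar.log",
);
$appender->layout(Log::Log4perl::Layout::SimpleLayout->new());
get_logger("Bar")->add_appender($appender);

get_logger("Bar")->info("goes to bar.log (and Screen, via the root logger)");
```

Each consuming class only needs enough permissions for its own appender's file; the add_appender call defines the logger-to-appender relation that the conf-file syntax would otherwise express.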

Edit: Problem with using a package variable is that this is just a Moose role being consumed by various classes, pretty much just a MooseX::Role::Parameterized version of Ether's answer in Making self-logging modules with Log::Log4perl. Thus, my logger is getting composed into the library consuming it, and I don't have a global variable I can work on each time I use it.

Though..

If I declare a global variable outside of the MooseX::Role::Parameterized role block, would each and every class that consumes the role be using that same conf variable?


Source: (StackOverflow)

Log4perl: How can I call Log4perl in case of an unhandled error?

Is there a way in Log4perl to force my programs to log a fatal message even outside an eval clause? I want Log4perl to also be called in case of any unhandled program termination. Preferably I would like to add the related error handler inside my standard module, which is loaded by all my Perl programs.
The Perl version is currently 5.8, but I am upgrading soon.
This is the test code for the given answer. I see neither "DIE" on the screen nor a die.txt being created.

use Log::Log4perl qw(get_logger);
$a->test();
$SIG{__DIE__} = sub {
    warn "DIE";
    open DIE, ">die.txt";
    print DIE "died\n";
    close DIE;
};
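Note that in the test snippet above, $a->test() blows up before the handler assignment ever runs, so the hook is never in place. A sketch of the usual arrangement: install a $SIG{__DIE__} hook first, use $^S to leave dies inside eval {} alone, log as FATAL, then re-die so termination semantics are unchanged:

```perl
use strict;
use warnings;
use Log::Log4perl qw(get_logger :levels);

Log::Log4perl->easy_init($ERROR);

# Install the hook *before* any code that can die.
$SIG{__DIE__} = sub {
    die @_ if $^S;            # inside eval {}: rethrow, don't log
    $Log::Log4perl::caller_depth++;
    get_logger(__PACKAGE__)->fatal(@_);
    die @_;                   # keep the normal termination behavior
};

# $a->test();   # an unhandled error here would now be logged as FATAL
```

Putting this block in a standard module that every program loads (as suggested above) works, as long as that module is loaded before anything that can die.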

Source: (StackOverflow)

How can I make log4perl create log directory if it doesn't exist?

If the directory for my log file (/home/hss/Data/log/DataImport.log) does not exist when Log4perl is initializing, then I get this error:

Cannot write to '/home/hss/Data/log/DataImport.log': No such file or directory

Is there a way to make it create the directory by itself without me having to specify the directory anywhere except in my log.conf file?
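As far as I know, the stock File appender won't create missing directories, so the straightforward workaround is to create the path before init(). A sketch with a shortened, made-up path and an inline config standing in for log.conf:

```perl
use strict;
use warnings;
use File::Basename qw(dirname);
use File::Path qw(make_path);
use Log::Log4perl;

# Create the directory up front; Log4perl's File appender won't.
my $logfile = "logs/import/DataImport.log";
make_path(dirname($logfile));

my $conf = qq(
    log4perl.logger                    = INFO, FileApp
    log4perl.appender.FileApp          = Log::Log4perl::Appender::File
    log4perl.appender.FileApp.filename = $logfile
    log4perl.appender.FileApp.layout   = Log::Log4perl::Layout::SimpleLayout
);
Log::Log4perl::init(\$conf);
Log::Log4perl->get_logger()->info("directory exists now");
```

This does duplicate the path outside log.conf. To keep the path in one place, one option would be a thin subclass of Log::Log4perl::Appender::File whose new() calls make_path on the filename's directory before handing off to the parent class, and naming that subclass in log.conf instead.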


Source: (StackOverflow)

How can I tell if Log4perl emitted any warnings during a run?

I've been using Log4perl extensively in a number of scripts. I'd like to augment those scripts to set an error code if any WARN or ERROR messages have been logged. I couldn't find any obvious way to do this based on existing documentation.

I'd like to avoid a brute-force rewrite of my existing scripts to add a check on every WARN or ERROR log message; I'd prefer to handle it prior to script exit if possible like this pseudocode:

if $log->has_warnings_or_errors then
  exit 1
else
  exit 0

Is there any easy way to ask Log4perl whether the current logger has issued messages of certain levels?
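Log4perl doesn't track this itself, but a tiny custom appender attached to the root logger with a WARN threshold can count offending messages without touching any existing log statements. A sketch (the CountingAppender package is a made-up helper, following the pluggable-appender interface):

```perl
use strict;
use warnings;

# Hypothetical helper: an appender that only counts what it sees.
package CountingAppender;
our $count = 0;
sub new { my ($class, %args) = @_; return bless {%args}, $class }
sub log { $count++ }

package main;
use Log::Log4perl qw(get_logger :levels);

Log::Log4perl->easy_init($DEBUG);

my $counter = Log::Log4perl::Appender->new("CountingAppender", name => "counter");
$counter->threshold($WARN);              # only WARN and above are counted
get_logger("")->add_appender($counter);  # attach to the root logger

my $log = get_logger("Foo::Bar");
$log->info("harmless");
$log->warn("this one counts");

# At the very end of the script:
# exit($CountingAppender::count ? 1 : 0);
```

Because the counter hangs off the root logger, every category's WARN/ERROR/FATAL messages flow through it; no per-call changes are needed. (For test code, Log::Log4perl::Appender::TestBuffer does something similar and also records the messages.)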


Source: (StackOverflow)

How can I suppress output to stdout and stderr in Log4perl?

I have this sub to initialize my logger:

sub initLogfiles {
    Log::Log4perl->easy_init($INFO); # We log all info, warn, error and fatal messages.
    my $userlogappender = Log::Log4perl::Appender->new(
        "Log::Log4perl::Appender::File",
        filename => USERLOGFILE,
        mode     => "append",
        recreate => 1
    );
    my $userloglayout = Log::Log4perl::Layout::PatternLayout->new("%d;%m%n");
    $userlogappender->layout($userloglayout);
    $userlogger->add_appender($userlogappender);
}

I only want to have the log info in my logfile. How do I prevent this from logging to stdout?
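The terminal output comes from easy_init($INFO) itself, which installs a screen appender. Passing easy_init() a hash reference instead sets up only a file appender, which may be all this sub needs. A sketch (the file name stands in for USERLOGFILE):

```perl
use strict;
use warnings;
use Log::Log4perl qw(:easy);

# easy_init() with a hashref builds just a file appender; no screen
# appender means nothing on stdout/stderr.
sub initLogfiles {
    Log::Log4perl->easy_init({
        level  => $INFO,
        file   => ">> user.log",   # ">>" means append
        layout => "%d;%m%n",
    });
}

initLogfiles();
Log::Log4perl->get_logger()->info("file only, nothing on the terminal");
```

If the manually built appender is still wanted (for recreate => 1 and friends), the alternative is to skip easy_init() entirely and wire the logger, level, and file appender by hand, since easy_init() is precisely what drags the screen appender in.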


Source: (StackOverflow)

How can I mock Log::Log4perl::INFO?

I'm writing new unit tests for an existing module that uses Log::Log4perl like:

use Log::Log4perl qw(:easy);

The module calls INFO( "important message" );. I'd like to mock this to verify from my test code that INFO is called in certain circumstances.

When I run the test module it doesn't capture the calls to INFO from the module. What's the right way to mock these calls to INFO?

Here's a complete example:

Mut.pm

#!/usr/bin/perl -w
# Mut : Module under test

use strict;
use warnings;

package Mut;

use Log::Log4perl qw(:easy);

sub new {
   my $class = shift;
   my $self = {};
   bless $self, $class;

   INFO( "Mut::new" );

   return $self;
}

1;

Mut.t

#!/usr/bin/perl -w

use strict;
use warnings;

package Mut_Test;

use Test::More tests => 1;
use Test::MockModule;
use Test::MockObject;

my @mock_info_output = ();

my $log4perl = Test::MockModule->new('Log::Log4perl');
$log4perl->mock(
   'INFO' => sub {
      print STDERR $_[0];
      push @mock_info_output, @_;
      return;
   }
    );

BEGIN {
  use_ok('Mut');
}

{
   my $mut = Mut->new;
   ## Do something here to verify INFO...
}
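The mock above never fires because use Log::Log4perl qw(:easy) copies INFO() into Mut's own symbol table at compile time; the calls in Mut resolve to Mut::INFO, not Log::Log4perl::INFO. Replacing the copy in Mut works. A self-contained sketch with the module inlined (in real tests, Test::MockModule->new('Mut')->mock(INFO => ...) does the same thing):

```perl
use strict;
use warnings;

# Inline copy of the module under test, as in the question.
package Mut;
use Log::Log4perl qw(:easy);

sub new {
    my $class = shift;
    my $self = bless {}, $class;
    INFO("Mut::new");
    return $self;
}

package main;

# ":easy" exported INFO() into Mut's symbol table when Mut compiled,
# so *that* copy is what must be replaced.
my @mock_info_output;
{
    no warnings 'redefine';
    *Mut::INFO = sub { push @mock_info_output, @_ };
}

my $mut = Mut->new;
print "captured: $mock_info_output[0]\n";   # captured: Mut::new
```

An alternative that avoids mocking altogether is Log::Log4perl::Appender::TestBuffer: initialize Log4perl with a TestBuffer appender in the test and assert on its buffer() contents after calling Mut->new.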

Source: (StackOverflow)

How can I get %x in Log4perl to not show "[undef]"?

When I haven't pushed anything to the Log::Log4perl::NDC stack, %x returns [undef]. I would like it to return an empty string when the stack is empty.

For example, take this code:

use strict;
use Log::Log4perl qw(:easy);
Log::Log4perl->easy_init({ level => $INFO, layout => "%x %m%n" });
Log::Log4perl->get_logger()->info("first message");
Log::Log4perl::NDC->push("prefix");
Log::Log4perl->get_logger()->info("second message");

This prints:

[undef] first message
prefix second message

But I want it to print:

first message
prefix second message

How can I do this?
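One way is to sidestep %x with a custom cspec that renders the NDC only when something was actually pushed, emitting an empty string (and no stray space) otherwise. The placeholder letter "Z" below is an arbitrary unused one:

```perl
use strict;
use warnings;
use Log::Log4perl;
use Log::Log4perl::Layout::PatternLayout;

# Custom placeholder: the NDC plus a trailing space when non-empty,
# nothing at all when the stack is empty.
Log::Log4perl::Layout::PatternLayout::add_global_cspec('Z', sub {
    my $ndc = Log::Log4perl::NDC->get();
    return '' if !defined $ndc || $ndc eq '' || $ndc eq '[undef]';
    return "$ndc ";
});

my $conf = q(
    log4perl.logger                    = INFO, FileApp
    log4perl.appender.FileApp          = Log::Log4perl::Appender::File
    log4perl.appender.FileApp.filename = ndc.log
    log4perl.appender.FileApp.layout   = Log::Log4perl::Layout::PatternLayout
    log4perl.appender.FileApp.layout.ConversionPattern = %Z%m%n
);
Log::Log4perl->init(\$conf);

Log::Log4perl->get_logger()->info("first message");
Log::Log4perl::NDC->push("prefix");
Log::Log4perl->get_logger()->info("second message");
```

Note the pattern is "%Z%m%n" with no space between the placeholders; the cspec supplies its own separator only when it has something to say.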


Source: (StackOverflow)

How to create separate log files?

I'm using Log::Log4perl to create log files, but it is appending to a single log file; instead, I want to create a separate log file for each execution of my script.

How can I create separate log files?

Here is my code:

fetch.pl

#Opening Log configuration file
Log::Log4perl::init('./logs/log4perl.conf');
my $logger = Log::Log4perl->get_logger('./logs_$$.logs');

logs.conf

log4perl.logger = TRACE,  FileAppndr1
log4perl.logger.logs = DEBUG, FileAppndr1
log4perl.appender.FileAppndr1 = Log::Log4perl::Appender::File
log4perl.appender.FileAppndr1.filename = logs.log 
log4perl.appender.FileAppndr1.layout = Log::Log4perl::Layout::SimpleLayout
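Two things to untangle here: get_logger() takes a category name (matching the "logs" logger in the config), not a file name, so the './logs_$$.logs' argument does nothing. For a fresh file per run, one option is that Log4perl config values can be Perl subs evaluated at init() time, so the filename can be computed (here from the process ID). A sketch with the config inlined:

```perl
use strict;
use warnings;
use Log::Log4perl;

# A config value written as "sub { ... }" is evaluated at init() time,
# so each run of the script picks its own file name (the PID here).
my $conf = q(
    log4perl.logger.logs                   = DEBUG, FileAppndr1
    log4perl.appender.FileAppndr1          = Log::Log4perl::Appender::File
    log4perl.appender.FileAppndr1.filename = sub { "logs_$$.log" }
    log4perl.appender.FileAppndr1.layout   = Log::Log4perl::Layout::SimpleLayout
);
Log::Log4perl::init(\$conf);

# get_logger() takes a *category* ("logs" above), not a file name.
Log::Log4perl->get_logger('logs')->info("one file per run");
```

A timestamp works just as well as the PID if runs should sort chronologically. (Code in config files is allowed by default; it can be disabled for untrusted configs, in which case this trick won't fire.)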

Source: (StackOverflow)

Log4Perl: How do I change the logger file used from running code? (After a fork)

I have an ETL process set up in perl to process a number of files, and load them to a database.

Recently, for performance reasons I set up the code to be multi-threaded, through use of a fork() call and a call to system("perl someOtherPerlProcess.pl $arg1 $arg2").

I end up with about 12 instances of someOtherPerlProcess.pl running with different arguments, and these processes each work through one directory's worth of files (corresponding to a single table in our database).

The applications main functions work, but I am having issues with figuring out how to configure my logging.

Ideally, I would like all the someOtherPerlProcess.pl instances to share the same $log_config value to initialize their loggers, but have each of them create a log file in the directory it is working on.

I haven't been able to figure out how to do that. I also noticed that in the directory I am calling these perl scripts from I see several files (ARRAY(0x260eec), ARRAY(0x313f8), etc) that contain all my logging messages!

Is there a simple way to change the log4perl.appender.A1.filename value from running code? Or to otherwise dynamically configure the file name we use, but use all other values from a config file?
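The File appender has a file_switch() method for exactly this: each child can keep the shared config and just repoint its own appender after startup. A sketch (the appender name A1, the inline config, and the file names are assumptions standing in for the shared $log_config):

```perl
use strict;
use warnings;
use Log::Log4perl;

my $conf = q(
    log4perl.logger               = INFO, A1
    log4perl.appender.A1          = Log::Log4perl::Appender::File
    log4perl.appender.A1.filename = etl.log
    log4perl.appender.A1.layout   = Log::Log4perl::Layout::SimpleLayout
);
Log::Log4perl::init(\$conf);

Log::Log4perl->get_logger()->info("before the switch");

# In each child, right after startup: repoint the appender at a
# per-worker file, keeping every other setting from the shared config.
Log::Log4perl->appender_by_name("A1")->file_switch("etl_worker1.log");
Log::Log4perl->get_logger()->info("after the switch");
```

In the child this would be file_switch("$work_dir/etl.log") or similar. As for the stray ARRAY(0x260eec) files: that pattern usually means a reference got stringified into a filename somewhere (a guess, but worth checking what value ends up in the filename setting).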


Source: (StackOverflow)