libwww-perl
The libwww-perl collection is a set of Perl modules that provides a simple and consistent application programming interface to the World-Wide Web. The main focus of the library is to provide classes and functions that allow you to write WWW clients. The library also contains modules that are of more general use, and even classes that help you imp…
Karen Etheridge / libwww-perl - search.cpan.org
I'm creating a script that uses threads, so I had to rebuild perl (perl 5.20) with thread support.
Since rebuilding perl, I get this error:
Can't locate object method "query_form" via package "LWP::UserAgent"
I've tried re-installing LWP::UserAgent, LWP::Simple, and URI, but they are up to date (according to cpan).
The faulty code :
#!/usr/bin/env perl
package get_xml;

use strict;
use warnings;
use Curses;
use LWP::Simple;
use LWP::UserAgent;
use MIME::Base64;
use URI;
use URI::http;
use HTTP::Request::Common;
use parse_xml;

# ...

sub write_conv_thread {
    my ($window, $rows, $username, $url, $ua) = @_;
    while (1) {
        $$url->query_form(    # line 43
            "heartbeat" => '0',
            "conv"      => 0,
            "username"  => "$username",
            "active"    => 0,
        );
        my $xml  = $$ua->get($url);
        my @conv = get_conv($xml);
        print_all_lines($window, $rows, @conv);
        $$window->refresh();
        sleep(5);
    }
}

1;
And the exact error message : Thread 1 terminated abnormally: Can't locate object method "query_form" via package "LWP::UserAgent" at get_xml.pm line 43.
Code that calls the function:
#!/usr/bin/env perl
use strict;
use warnings;
use Curses;
use LWP::Simple;
use LWP::UserAgent;
use MIME::Base64;
use URI;
use threads;
use get_xml;
use post_xml;

# ... initialization of Curses windows ...
# $chat_win is a Curses window, $row is a number
my $server_endpoint = "...";
my $ua  = LWP::UserAgent->new;
my $url = URI->new("$server_endpoint/index.php");
my $thread = threads->new(\&get_xml::write_conv_thread,
                          \$chat_win, $row - 4, "...", \$url, \$ua);
$thread->detach();
What can I do to make perl find the object method?
Thank you for your answer.
Source: (StackOverflow)
How can I check whether libwww-perl is installed on my server?
Also, what is libwww-perl used for? If I remove it, what will be affected?
Is libwww-perl related to Perl modules or to lighttpd?
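As a general note (my sketch, not from the question itself): one way to check from Perl is to try loading LWP, the top-level module that the libwww-perl package provides.

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Try to load LWP; if it loads, libwww-perl is installed.
if (eval { require LWP; 1 }) {
    print "libwww-perl is installed (LWP version $LWP::VERSION)\n";
} else {
    print "libwww-perl is not installed\n";
}
```

From a shell, the equivalent one-liner is `perl -MLWP -e 'print "$LWP::VERSION\n"'`, which dies with "Can't locate LWP.pm" when the package is absent.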
Source: (StackOverflow)
I'm trying to receive a POST from a web service at a URL on IIS 7.5. It looks like the request is being rejected before it even gets to .NET. I've tried it with an ASP.NET MVC action and with a plain .aspx file.
This is the request coming through:
2012-11-29 01:52:40 W3SVC9 Server-Name xxx.xxx.xxx.xxx POST /posttest.aspx - 443 - xxx.xxx.xxx.xxx HTTP/1.1 libwww-perl/5.837 - - site.com 500 0 0 6357 1226 343
Now 6357 is an sc-win32-status code, but it isn't listed on this page:
IIS sc-win32-status codes
Is there a problem of requests coming from libwww-perl to IIS?
Is there a header missing or something that IIS is being fussy about?
Source: (StackOverflow)
I'm getting a Can't use an undefined value as a HASH reference error when calling HTTP::Message::decodable() on Perl 5.10 / libwww installed on Debian Lenny via the aptitude package manager. I'm really stuck, so I would appreciate some help.
Here's the error:
Can't use an undefined value as a HASH reference at (eval 2) line 1.
at test.pl line 4
main::__ANON__('Can\'t use an undefined value as a HASH reference at (eval 2)...') called at (eval 2) line 1
HTTP::Message::__ANON__() called at test.pl line 6
Here's the code:
use strict;
use HTTP::Request::Common;
use Carp;
$SIG{ __DIE__ } = sub { Carp::confess( @_ ) };
print HTTP::Message::decodable();
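For context (my illustration, not from the question): decodable() reports which Content-Encodings the local installation can decode, and is normally used to build an Accept-Encoding request header, along these lines:

```perl
use strict;
use warnings;
use HTTP::Request::Common qw(GET);
use HTTP::Message ();

# On a healthy install, decodable() in scalar context returns something
# like "gzip, x-gzip, deflate", depending on which decompression
# modules are available.
my $req = GET 'http://www.example.com/',
    'Accept-Encoding' => scalar HTTP::Message::decodable();
print $req->as_string;
```

No request is actually sent here; the snippet only builds and prints the request object, so it also serves as a quick sanity check that decodable() works on a given installation.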
Source: (StackOverflow)
Summary:
I am using html2ps to convert an HTML document with inline images. (I have both ImageMagick and libwww-perl installed.) If the images are local, this works fine; however, when the images are given through a URL, I just see [IMAGE] instead of the image.
I also tried using wget instead of libwww-perl, with exactly the same result. Any help is greatly appreciated.
Code:
I always compile with
html2ps -d example.html > output.ps
I have the same image file in two places: ./local.png and http://www.example.com/remote.png.
The following html inserts the image into the ps document:
<img src='local.png' />
but this line just inserts the word [IMAGE]:
<img src='http://www.example.com/remote.png' />
The output I'm getting is
html2ps version 1.0 beta7
Reading example.html
Image: local.png
convert /var/tmp/aaaVtaOy5 /var/tmp/aaaVtaOy5.ppm
Size: 8*10
Image: http://www.example.com/remote.png
Retrieving http://www.example.com/remote.png
and a local copy of the image is created.
Source: (StackOverflow)
I have built a web crawler in Perl.
I am using
HTML::ContentExtractor
LWP::UserAgent
HTML::LinkExtor
to extract text from webpages.
Reference link for sample code: web crawler perl
Issue:
The issue is that it does not get text from web pages that have the .aspx extension. It works perfectly for other webpages. I could not figure out why this crawler fails for .aspx pages.
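One possibility worth checking (an assumption on my part, not something the question confirms): some servers refuse LWP's default "libwww-perl/..." User-Agent string, and .aspx pages tend to sit on IIS hosts that are sometimes configured that way. A minimal sketch of overriding the agent; the agent name and URL are made up:

```perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;
# Replace the default "libwww-perl/<version>" agent string, which some
# servers reject outright.
$ua->agent('Mozilla/5.0 (compatible; MyCrawler/1.0)');

my $resp = $ua->get('http://www.example.com/page.aspx');  # placeholder URL
print $resp->is_success ? $resp->decoded_content : $resp->status_line, "\n";
```

If the crawler then succeeds on the .aspx pages, the User-Agent was the problem; if not, the next things to inspect would be the response status line and headers.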
Source: (StackOverflow)
I have the following code:
...
sub setImage {
my $self=shift;
my $filename=shift;
unless(-r $filename) {
warn "File $filename not found";
return;
}
my $imgn=shift;
my $operation=&URI::Escape::uri_escape_utf8(
(shift) ? "Удалить! (Delete)" : "Сохранить! (Store)");
my $FH=&::File::open($filename, 0, 0);
my $image;
# &utf8::downgrade($image);
sysread($FH, $image, 102400, 0);
close $FH;
my $imginfo=eval{&Image::Info::image_info(\$image)};
if($@ or $imginfo->{"error"}) {
warn "Invalid image: ".($@ || $imginfo->{"error"});
return undef;
}
my $fields=[
DIR => $self->url("fl"),
OPERATION => $operation,
FILE_NAME => ".photo$imgn",
# FILE => [$filename],
FILE => [undef, "image.".$imginfo->{"file_ext"},
# Content_Type => $imginfo->{"file_media_type"},
# Content_Type => 'application/octet-stream',
Content => $image,
],
];
my $response=&ZLR::UA::post(
&ZLR::UA::absURL("/cgi-bin/file_manager")."",
$fields,
Content_Type => "form-data",
);
print $response->decoded_content;
}
...
When I try to use the setImage function, it fails with the error HTTP::Message content must be bytes at /usr/lib64/perl5/vendor_perl/5.8.8/HTTP/Request/Common.pm line 91.
Worse, I can't reproduce this error without running all of my code, and upgrading libwww-perl changes nothing. What can cause it?
Versions of libwww-perl: dev-perl/libwww-perl-5.836. HTTP::Request and HTTP::Request::Common come from the libwww-perl package, versions 5.827 and 5.824.
Trace:
HTTP::Message content must be bytes at /usr/lib64/perl5/vendor_perl/5.8.8/HTTP/Request/Common.pm line 91
at Carp::croak(unknown source)
at HTTP::Message::__ANON__(/usr/lib64/perl5/vendor_perl/5.8.8/HTTP/Message.pm:16)
at HTTP::Message::_set_content(/usr/lib64/perl5/vendor_perl/5.8.8/HTTP/Message.pm:136)
at HTTP::Message::content(/usr/lib64/perl5/vendor_perl/5.8.8/HTTP/Message.pm:125)
at HTTP::Request::Common::POST(/usr/lib64/perl5/vendor_perl/5.8.8/HTTP/Request/Common.pm:91)
at LWP::UserAgent::post(/usr/lib64/perl5/vendor_perl/5.8.8/LWP/UserAgent.pm:397)
at ZLR::UA::post(./zlrchecker.pl:71)
at ZLR::Info::setImage(./zlrchecker.pl:1754)
at main::main(./zlrchecker.pl:3893)
at main::(./zlrchecker.pl:4148)
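For background (my illustration, not from the question): the croak means the body handed to HTTP::Message carried Perl's internal UTF-8 flag, i.e. it was a character string rather than raw bytes. A minimal sketch of the distinction using the core Encode module:

```perl
use strict;
use warnings;
use Encode ();

# A character string: decoding sets the internal UTF-8 flag.
my $chars = Encode::decode('UTF-8', "caf\xc3\xa9");

# A byte string: encoding turns characters back into raw octets,
# which is what HTTP::Message expects as content.
my $bytes = Encode::encode('UTF-8', $chars);

printf "chars flagged: %d\n", utf8::is_utf8($chars) ? 1 : 0;  # 1
printf "bytes flagged: %d\n", utf8::is_utf8($bytes) ? 1 : 0;  # 0
```

This is also presumably why the original code has the commented-out `&utf8::downgrade($image);` line: downgrading (or encoding) forces the content into byte form before it reaches HTTP::Request::Common::POST.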
Source: (StackOverflow)
I want to know the difference between these rules, and which is more effective at blocking libwww-perl in a .htaccess file:
SetEnvIfNoCase User-Agent "libwww-perl" bad_bot
Order Deny,Allow
Deny from env=bad_bot
or
RewriteCond %{HTTP_USER_AGENT} libwww-perl.*
RewriteRule .* - [F,L]
Thank you!
Source: (StackOverflow)
I'm trying to streamline some of our tasks at my place of work, and it seems that quite a lot of our developers' time is spent doing semi-mechanical tasks on the web (specifically, editing online stores that use web-based interfaces). As such, I've been looking into some solutions that will allow these tasks to be done by scripts since I figure that could save us quite a bit of time per task. So before I really started digging into any of these, I was just wondering if the Stack Overflow community had any recommendations about which web scripting/macro solution would be the best.
Here are the requirements:
- Must be able to interact with web forms (not just downloading a page and scraping the file - the script must edit controls within a web form and then submit that form)
- The forms we have to edit are secure forms, so the scripting solution must be able to handle that (i.e., it's of no use for us to have an incredibly powerful scripting solution if a human being has to sit there watching it and manually re-log in every few minutes)
- It would be really, really, really preferable if it could read local files and do some basic string replacement/manipulation on them (e.g. it would be nice to have a list of variables or some HTML code in a text file and then have the script replace the token "STORENAME" with that particular store's name before it inserts the code into the form)
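For what it's worth, the three requirements above can be sketched with WWW::Mechanize, a higher-level module built on top of libwww-perl: it submits real forms, keeps a cookie jar so a login session persists across requests, and the token replacement is plain Perl. All URLs, form field names, and credentials below are made up:

```perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new( autocheck => 1 );

# Log in once; the built-in cookie jar keeps the session alive
# between subsequent requests.
$mech->get('https://store.example.com/login');    # hypothetical URL
$mech->submit_form(
    with_fields => { username => 'editor', password => 'secret' },
);

# Read a local template and substitute a per-store token.
open my $fh, '<', 'template.html' or die "template.html: $!";
my $html = do { local $/; <$fh> };
close $fh;
$html =~ s/STORENAME/Example Store/g;

# Fill in and submit the edit form on the store's admin page.
$mech->get('https://store.example.com/edit?item=42');    # hypothetical URL
$mech->submit_form( with_fields => { description => $html } );
```

This is only a sketch under the assumption that the store's admin interface is plain HTML forms; if the pages rely heavily on JavaScript, a browser-driving tool from the list below would be a better fit.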
Here are the scripting solutions that are on my radar so far (I haven't really looked into any of these, although I have played around with Chickenfoot):
- Chickenfoot
- iMacros for Firefox
- libwww-perl
- libwww for unix and C (while searching for libwww for perl I came across this, which I did not know existed until now)
- a more "general" macro solution like AutoHotKey
Has anyone here on Stack Overflow tried any of these solutions? If so, what did you like or dislike about them? Can anyone recommend one that is not on the list? (This is by no means an exclusive or exhaustive list). I would really love to automate a lot of our mechanical processes, and I hope the Stack Overflow community can help us out so we can hopefully avoid that much of the mind-numbing part of the work :). Thanks in advance!
edit: Re: platform - We have primarily Windows XP terminals at work, but 1) we do have a few Mac test PCs, so OS X is a viable option, and 2) if it would mean automating a lot of these tasks, I'll build a Linux box if necessary. So platform is pretty much a non-issue.
Source: (StackOverflow)