zackiv31 — Posted January 25, 2009

I have a website that makes a POST to a specific URL. I tried to simply do a POST and get the response back, but it doesn't seem to be working. I'm writing a crawler, so I need to simulate many POSTs automatically and be able to parse their responses. I think I need to simulate the headers... how can I do this using Perl? I use Firebug, if that helps in solving my problem.
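A minimal sketch of what the asker describes, using LWP::UserAgent to send a POST with browser-like headers. The URL, header values, and form fields here are placeholders; in practice you would copy the real ones from Firebug's Net panel:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::UserAgent;

my $ua = LWP::UserAgent->new;
$ua->agent('Mozilla/5.0');    # mimic a browser User-Agent string
$ua->default_header( 'Referer' => 'http://example.com/form' );

# Hypothetical target URL and form fields -- replace with the
# values the real page submits.
my $res = $ua->post(
    'http://example.com/search',
    {
        query => 'perl',
        page  => 1,
    },
);

if ( $res->is_success ) {
    print $res->decoded_content;    # the response body, ready to parse
}
else {
    die 'POST failed: ' . $res->status_line;
}
```

For a crawler, you would wrap the `post` call in a loop over your URL/field list and parse each `decoded_content` as it comes back.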
Antaris (Veteran) — Posted January 26, 2009

What have you got so far?
fatboyuk — Posted January 27, 2009

Your best bet would be to look at WWW::Mechanize. It will fetch URLs, save their contents, etc., so you can do what you want with them afterwards. I've been writing a script using it over the last few days and it rocks!
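A short sketch of the WWW::Mechanize approach suggested above. The URL and field name are hypothetical; Mechanize loads the page, fills in the form it finds there, and submits it, handling cookies and headers for you:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new( agent => 'Mozilla/5.0' );

# Hypothetical form page -- Mechanize parses the forms it contains.
$mech->get('http://example.com/form');

# Fill in and POST the first form on the page; 'query' stands in
# for whatever the real form's input is named.
$mech->submit_form(
    form_number => 1,
    fields      => { query => 'perl' },
);

print $mech->content if $mech->success;
```

This is usually less fragile than hand-building the POST, since Mechanize reads the form's action URL and hidden fields from the page itself.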