Basically, an AJAX POST or an ASP.NET postback is just an HTTP POST, which Web Scraper performs using what is called a Form Submit task. The tool to use for this is under Tools menu -> Web Form Explorer. Use this in combination with the following tool:

The tool produces a SQL statement which you can use to insert search variables or other variables needed in your request. Some POSTs can be tricky; the general methodology is to use an HTTP sniffer and mimic the captured HTTP requests as closely as possible. The most promising part of the request to mimic is the POST body, which you can edit using the tool above.
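To make the idea concrete, here is a minimal Python sketch of mimicking a sniffed form submit. The URL, field names, and values are hypothetical stand-ins for whatever your sniffer actually captures:

```python
from urllib import parse, request

# Field names and URL below are hypothetical; copy the real ones
# from your HTTP sniffer's capture of the browser's form submit.
post_fields = {
    "searchTerm": "widgets",          # the search variable to insert
    "__VIEWSTATE": "scraped-value",   # ASP.NET hidden input, if present
}
body = parse.urlencode(post_fields).encode("ascii")

req = request.Request(
    "https://example.com/search.aspx",  # placeholder endpoint
    data=body,                          # supplying data makes this a POST
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
# req now mirrors what the browser would send;
# request.urlopen(req) would actually submit it.
```

The point is that the body you build must match the sniffed body field-for-field, including any hidden inputs the page injects.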

Next is the cookie; usually you just need to visit the site's home page in a previous package step to acquire it. In some rare cases you can't get the cookie that way, in which case you can try copying the cookie from the sniffer into Package properties->Steps->Task properties->Advanced->Http Client->Edit. You may want to omit any obvious session variables from the cookie if possible, so the cookie doesn't effectively expire. Some sites may require scraping a session ID, viewstate, or other hidden input to continue. If that value is required in the cookie, you can edit the cookie directly in the ProvidusDB database, tasks table, description column, where it is embedded in a large XML string along with other data that must be preserved. This is a very advanced tactic and is only needed in roughly 1% of packages.
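As a sketch of the "omit obvious session variables" advice, here is a small Python helper that filters a copied cookie string. The session key names are just common examples (ASP.NET, PHP, Java defaults), not an exhaustive list:

```python
# Session-ish cookie keys that commonly expire; extend as needed.
SESSION_KEYS = {"ASP.NET_SessionId", "PHPSESSID", "JSESSIONID"}

def strip_session_vars(cookie_header: str) -> str:
    """Drop obvious session variables from a 'k=v; k2=v2' cookie string,
    so the remaining cookie does not effectively expire."""
    kept = []
    for pair in cookie_header.split(";"):
        name = pair.split("=", 1)[0].strip()
        if name not in SESSION_KEYS:
            kept.append(pair.strip())
    return "; ".join(kept)

cleaned = strip_session_vars("lang=en; ASP.NET_SessionId=abc123; region=us")
# cleaned -> "lang=en; region=us"
```

You would run the sniffer's cookie through something like this before pasting it into the Http Client dialog.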

The next most promising header is the Referer URL (the HTTP header is literally spelled "Referer"), again in the Http Client->Edit dialog.

The last header to try is the User-Agent, in the same dialog. This can help when sites block the Web Scraper agent specifically or block unknown agents in general. You can paste in any user agent string; you should see the User-Agent of your favorite browser in your sniffer capture as well.
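The two header tweaks above can be sketched together in Python. The Referer and User-Agent values below are placeholders for whatever your sniffer shows a normal browser sending to the target site:

```python
from urllib import request

# Placeholder values; copy the real ones from your sniffer capture
# of an ordinary browser visit to the target site.
headers = {
    "Referer": "https://example.com/search",  # page the request "came from"
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
    ),
}

req = request.Request("https://example.com/results", headers=headers)
# Sites that block unknown agents now see a familiar browser signature;
# request.urlopen(req) would send the request with these headers.
```

Matching both headers to what the browser sends is usually enough to get past naive agent filtering.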

Hopefully this gives a good overview of things to try when making tricky requests. Sometimes none of the above works, as with an eBay login, in which case a macro program may be a slower albeit more functional solution.