Angry Jew Cat - Banned for Life
04-25-2010 02:27 PM
Quote: Originally Posted by TheDoc (Post 17072203)
Best way to do it is to scrape from what I know, then log updates. Rumor has it some people somehow pay for a commercial key to the API, but it doesn't sound any more reliable - maybe people are using a mixture.
A commercial key would be fine (depending on the cost, I suppose) if it worked. Scraping is fine too, but I'm looking at tracking a large number of keywords, and in that case you're limited by IP restrictions. So I'd have to assume that anyone tracking a large number of terms is doing so across a large number of IPs.
I'm just starting to piece together ideas for some custom research software I want to build. I know a lot of what I want to do can be handled via APIs, but it seems like scraping is the only method that will really work for this.
If that's the case, are private proxies the way to go, or is it possible to get ARIN justification for this type of use? Is there any hosting out there where you can easily get a large block of IPs? I'd need international IPs too, so I'd have to assume private proxies are the best bet.
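For reference, here's the kind of thing I'm picturing on the scraping side - just a minimal Python sketch that rotates requests through a small pool of private proxies. The proxy addresses, search URL, and keywords below are placeholders, not anything I'm actually using:

[code]
import random
import time
import requests

# Placeholder private proxies (host:port) - swap in real endpoints.
PROXIES = [
    "http://203.0.113.10:8080",
    "http://203.0.113.11:8080",
    "http://198.51.100.20:3128",
]

def fetch(url, params):
    """Fetch a results page through a randomly chosen proxy."""
    proxy = random.choice(PROXIES)
    resp = requests.get(
        url,
        params=params,
        proxies={"http": proxy, "https": proxy},
        timeout=15,
    )
    resp.raise_for_status()
    return resp.text

# Spread keyword checks across the proxy pool with a polite delay.
for keyword in ["example keyword one", "example keyword two"]:
    html = fetch("https://www.example.com/search", {"q": keyword})
    # ... parse rankings out of `html` here ...
    time.sleep(random.uniform(5, 15))
[/code]

Multiply that across hundreds or thousands of terms and the IP question becomes the main bottleneck, which is why I'm asking about proxies vs. getting my own block.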