I'm trying to download a web page:
string remoteUri = "http://whois.domaintools.com/94.100.179.159";
WebClient myWebClient = new WebClient();
byte[] myDataBuffer = myWebClient.DownloadData(remoteUri);
string download = Encoding.ASCII.GetString(myDataBuffer);
HtmlDocument doc = new HtmlDocument();
doc.LoadHtml(download);
doc.Save("file1.htm");
I'm getting an error:
WebException was unhandled: (403) Forbidden.
Are there any other ways to download the page? I've tried the HtmlDocument class, but as far as I can see it needs the page already loaded in a browser.
HtmlWeb hwObject = new HtmlWeb();
string ip = "http://whois.domaintools.com/";
HtmlAgilityPack.HtmlDocument htmldocObject = hwObject.Load(ip);
foreach (HtmlNode link in htmldocObject.DocumentNode.SelectNodes("//meta[@name = 'description']"))
{
...
}
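For completeness, inside the loop I'd eventually want to pull out the description text, something like this (a sketch; GetAttributeValue is the HtmlAgilityPack helper that takes a default value):

```csharp
foreach (HtmlNode link in htmldocObject.DocumentNode.SelectNodes("//meta[@name = 'description']"))
{
    // Read the meta tag's content attribute, falling back to an empty string.
    string description = link.GetAttributeValue("content", string.Empty);
    Console.WriteLine(description);
}
```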
using (var myWebClient = new WebClient())
{
myWebClient.Headers["User-Agent"] = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.75 Safari/537.1";
string page = myWebClient.DownloadString("http://whois.domaintools.com/94.100.179.159");
HtmlDocument doc = new HtmlDocument();
doc.LoadHtml(page);
}
The site simply returns an error when it finds no user agent in the request; here's working code:
string remoteUri = "http://whois.domaintools.com/94.100.179.159";
HtmlDocument doc = new HtmlDocument();
using (WebClient myWebClient = new WebClient())
{
myWebClient.Headers.Add(HttpRequestHeader.UserAgent, "some browser user agent");
doc.Load(myWebClient.OpenRead(remoteUri));
}
doc.Save("file1.htm");
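If you'd rather avoid WebClient entirely, HttpClient (available since .NET 4.5) accepts the same header. A sketch, assuming the same URL and output file; it blocks on the async call for brevity:

```csharp
using System.Net.Http;
using HtmlAgilityPack;

class Program
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            // The header is what matters; any realistic browser string will do.
            client.DefaultRequestHeaders.Add("User-Agent", "some browser user agent");

            // Blocking with .Result for brevity; prefer await in real async code.
            string page = client.GetStringAsync("http://whois.domaintools.com/94.100.179.159").Result;

            var doc = new HtmlDocument();
            doc.LoadHtml(page);
            doc.Save("file1.htm");
        }
    }
}
```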
Or, if you want to use HtmlWeb:
HtmlWeb hwObject = new HtmlWeb();
hwObject.UserAgent = "some browser user agent";
//more code...
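A fuller sketch of the HtmlWeb route, assuming the same URL and output file as above; HtmlWeb fetches and parses in one step, so the manual WebClient plumbing disappears:

```csharp
using HtmlAgilityPack;

class Program
{
    static void Main()
    {
        var hwObject = new HtmlWeb();

        // HtmlWeb sends this value as the User-Agent request header.
        hwObject.UserAgent = "some browser user agent";

        // Load downloads and parses the page in one call.
        HtmlDocument doc = hwObject.Load("http://whois.domaintools.com/94.100.179.159");
        doc.Save("file1.htm");
    }
}
```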