The Operation Has Timed Out at System.Net.HttpWebRequest.GetResponse() While Sending a Large Number of Requests to a Host

HttpWebRequest.GetResponse() keeps getting timed out

I had the very same issue.
For me, the fix was as simple as wrapping the HttpWebResponse code in a using block.

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    // Do your processing here...
}

Details: This issue usually happens when several requests are made to the same host and the WebResponse is not disposed of properly. The using block ensures that the WebResponse object is disposed, which resolves the issue.
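
For context, here is a minimal, hypothetical sketch of the pattern inside a loop that hits the same host many times (the URL is a placeholder, not from the question); without the using blocks, each undisposed response keeps one of the pooled connections busy until the garbage collector eventually finalizes it:

// Minimal sketch, not the original poster's code: many GETs against the same host,
// disposing each response (and its stream) so the pooled connection is released.
for (int i = 0; i < 100; i++)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://example.com/items/" + i); // placeholder URL

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (Stream stream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(stream))
    {
        string body = reader.ReadToEnd();
        // Do your processing here...
    }
}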

C# HttpWebRequest Timed Out

It may look like a lot of stuff, but you don't need to be concerned with what it all does right now.

Just create a Button on a Form, then make its Click event handler async as shown here (the main methods here all use async HTTP/IO .NET methods).

The remaining code just needs to be pasted (in the same Form, to make it quick).

The StreamObject, used to pass data back and forth, will contain some information about the website specified in its ResourceURI property once the connection has completed (as below).

StreamObject.Payload is the complete HTML page, decoded using its internal code page or the code page reported by the server.

If you want to see it, just pass it to a WebBrowser.
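
For example (a quick sketch; the WebBrowser control name webBrowser1 is an assumption, not part of the original answer):

// Hypothetical usage inside the Click handler: render the decoded HTML.
sObject = await HTTP_GetStream(sObject);
webBrowser1.DocumentText = sObject.Payload;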

Note:
I might have left something out while adapting this to be posted here. If so, it will be immediately obvious; tell me what it is and I'll update the code.

Also:

Disable Fiddler!

Visual Studio Version: VS Pro 15.7.5
.NET Framework: 4.7.1

private async void TestConnection_Click(object sender, EventArgs e)
{
    StreamObject sObject = new StreamObject()
    {
        ResourceURI = new Uri(@"https://www.bestbuy.com/?intl=nosplash"),
        ProcessStream = true
    };

    sObject = await HTTP_GetStream(sObject);
    Console.WriteLine(sObject.Payload.Length);
}

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.IO.Compression;
using System.Linq;
using System.Net;
using System.Net.NetworkInformation;
using System.Net.Sockets;
using System.Net.Security;
using System.Reflection;
using System.Runtime.InteropServices;
using System.Security;
using System.Security.Authentication;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using System.Security.Permissions;
using System.Security.Principal;
using System.Text;
using System.Text.RegularExpressions;
using System.Threading;
using System.Threading.Tasks;

public class StreamObject
{
    public StreamObject()
    {
        this.Cookies = new CookieContainer();
    }

    public Stream ContentStream { get; set; }
    public bool ProcessStream { get; set; }
    public Uri ResourceURI { get; set; }
    public Uri ResponseURI { get; set; }
    public string Referer { get; set; }
    public string Payload { get; set; }
    public string ServerType { get; set; }
    public string ServerName { get; set; }
    public IPAddress[] ServerIP { get; set; }
    public string ContentName { get; set; }
    public string ContentType { get; set; }
    public string ContentCharSet { get; set; }
    public string ContentLanguage { get; set; }
    public long ContentLength { get; set; }
    public HttpStatusCode StatusCode { get; set; }
    public string StatusDescription { get; set; }
    public WebExceptionStatus WebException { get; set; }
    public string WebExceptionDescription { get; set; }
    public CookieContainer Cookies { get; set; }
}
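
Note: HTTP_RequestHeadersInit below also reads a UseProxy flag and a ProxyParameters object that are not defined in the StreamObject class as posted (presumably part of what was left out while adapting the code). Here is a minimal sketch of what those members could look like, reconstructed purely from how they are used below; every name beyond that usage is an assumption:

// Hypothetical members that StreamObject would need for the proxy branch below to compile.
public bool UseProxy { get; set; }
public ProxySettings ProxyParameters { get; set; }

// Assumed shape of the proxy settings object (names beyond the usage below are guesses).
public class ProxySettings
{
    public string Host { get; set; }
    public int Port { get; set; }
    public string Uri { get; set; }
    public bool BypassLocal { get; set; }
    public ProxyCredentialInfo Credentials { get; set; }
}

public class ProxyCredentialInfo
{
    public string UserID { get; set; }
    public string Password { get; set; }
}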

const uint COR_E_INVALIDOPERATION = 0x80131509;

public async Task<StreamObject> HTTP_GetStream(StreamObject RequestObject)
{
    if (string.IsNullOrEmpty(RequestObject.ResourceURI.ToString().Trim()))
        return null;

    MemoryStream memstream = new MemoryStream();
    HttpWebRequest httpRequest;
    CookieContainer CookieJar = new CookieContainer();
    HttpStatusCode StatusCode = HttpStatusCode.OK;

    ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3 |
                                           SecurityProtocolType.Tls |
                                           SecurityProtocolType.Tls11 |
                                           SecurityProtocolType.Tls12;

    ServicePointManager.Expect100Continue = false;
    ServicePointManager.DefaultConnectionLimit = 10;
    ServicePointManager.ServerCertificateValidationCallback += TlsValidationCallback;

    httpRequest = WebRequest.CreateHttp(RequestObject.ResourceURI);

    try
    {
        HTTP_RequestHeadersInit(ref httpRequest, CookieJar, RequestObject);
        httpRequest.Method = "GET";
        using (HttpWebResponse httpResponse = (HttpWebResponse)await httpRequest.GetResponseAsync())
        {
            Stream ResponseStream = httpResponse.GetResponseStream();
            //SslProtocols Protocol = ExtractSslProtocol(ResponseStream);

            StatusCode = httpResponse.StatusCode;
            if (StatusCode == HttpStatusCode.OK)
            {
                // Buffer the response and collect the information exposed by StreamObject
                await ResponseStream.CopyToAsync(memstream);
                RequestObject.ContentStream = memstream;
                RequestObject.ResponseURI = httpResponse.ResponseUri;
                RequestObject.ContentLength = memstream.Length;
                RequestObject.ContentCharSet = httpResponse.CharacterSet ?? string.Empty;
                RequestObject.ContentLanguage = httpResponse.Headers["Content-Language"] ?? string.Empty;
                RequestObject.ContentType = httpResponse.ContentType.ToLower();
                if (RequestObject.ContentType.IndexOf(@"/") > -1)
                {
                    // Keep only the sub-type (the part after the last '/'), dropping any ";charset=..." suffix
                    RequestObject.ContentType = RequestObject.ContentType.Substring(RequestObject.ContentType.LastIndexOf(@"/") + 1);
                    if (RequestObject.ContentType.IndexOf(@";") > -1)
                        RequestObject.ContentType = RequestObject.ContentType.Substring(0, RequestObject.ContentType.IndexOf(@";"));
                    RequestObject.ContentType = "." + RequestObject.ContentType;
                }
                RequestObject.ContentName = httpResponse.Headers["Content-Disposition"] ?? string.Empty;
                if (RequestObject.ContentName.Length == 0)
                    RequestObject.ContentName = RequestObject.ResourceURI.Segments.Last();
                RequestObject.ServerType = httpResponse.Server;
                RequestObject.ServerName = RequestObject.ResponseURI.DnsSafeHost;
                RequestObject.ServerIP = await Dns.GetHostAddressesAsync(RequestObject.ServerName);
                RequestObject.StatusCode = StatusCode;
                RequestObject.StatusDescription = httpResponse.StatusDescription;
                if (RequestObject.ProcessStream)
                    RequestObject.Payload = ProcessResponse(RequestObject.ContentStream,
                                                            Encoding.GetEncoding(RequestObject.ContentCharSet),
                                                            httpResponse.ContentEncoding);
            }
        }
    }
    catch (WebException exW)
    {
        if (exW.Response != null)
        {
            RequestObject.StatusCode = ((HttpWebResponse)exW.Response).StatusCode;
            RequestObject.StatusDescription = ((HttpWebResponse)exW.Response).StatusDescription;
        }
        RequestObject.WebException = exW.Status;
        RequestObject.WebExceptionDescription = exW.Message;
    }
    catch (Exception exS)
    {
        if ((uint)exS.HResult == COR_E_INVALIDOPERATION)
        {
            //RequestObject.WebException = PingHostAddress("8.8.8.8", 500) > 0
            //    ? WebExceptionStatus.NameResolutionFailure
            //    : WebExceptionStatus.ConnectFailure;
            RequestObject.WebException = WebExceptionStatus.ConnectFailure;
            RequestObject.WebExceptionDescription = RequestObject.WebException.ToString();
        }
        else
        {
            RequestObject.WebException = WebExceptionStatus.RequestCanceled;
            RequestObject.WebExceptionDescription = RequestObject.WebException.ToString();
        }
    }
    finally
    {
        ServicePointManager.ServerCertificateValidationCallback -= TlsValidationCallback;
    }

    RequestObject.Cookies = httpRequest.CookieContainer;
    return RequestObject;

} //HTTP_GetStream

private bool TlsValidationCallback(object sender, X509Certificate CACert, X509Chain CAChain, SslPolicyErrors sslPolicyErrors)
{
    //if (sslPolicyErrors == SslPolicyErrors.None)
    //    return true;

    X509Certificate2 _Certificate = new X509Certificate2(CACert);
    //X509Certificate2 _CACert = new X509Certificate2(@"[localstorage]/ca.cert");
    //CAChain.ChainPolicy.ExtraStore.Add(_CACert);

    //X509Certificate2 cert = GetCertificateFromStore(thumbprint);
    X509Certificate2 cert = (X509Certificate2)CACert;

    //CspKeyContainerInfo cpsKey = (CspKeyContainerInfo)((RSACryptoServiceProvider)cert.PublicKey.Key).CspKeyContainerInfo;
    //if (cert.HasPrivateKey) { RSA rsaKey = (RSA)cert.GetRSAPrivateKey(); }
    //if (cpsKey.Accessible) { Console.WriteLine("Exportable: {0}", cpsKey.Exportable); }

    // next line generates exception "Key does not exist"
    //bool isexportable = provider.CspKeyContainerInfo.Exportable;

    CAChain.Build(_Certificate);
    foreach (X509ChainStatus CACStatus in CAChain.ChainStatus)
    {
        if ((CACStatus.Status != X509ChainStatusFlags.NoError) &&
            (CACStatus.Status != X509ChainStatusFlags.UntrustedRoot))
            return false;
    }
    return true;
}

private void HTTP_RequestHeadersInit(ref HttpWebRequest httpreq, CookieContainer cookiecontainer, StreamObject postdata)
{
    httpreq.Date = DateTime.Now;
    httpreq.Timeout = 30000;
    httpreq.ReadWriteTimeout = 30000;
    httpreq.CookieContainer = cookiecontainer;
    httpreq.KeepAlive = true;
    httpreq.AllowAutoRedirect = true;
    httpreq.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
    httpreq.ServicePoint.MaxIdleTime = 30000;
    httpreq.Referer = postdata.Referer;
    httpreq.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko";
    //httpreq.UserAgent = "Mozilla/5.0 (Windows NT 10; Win64; x64; rv:56.0) Gecko/20100101 Firefox/61.0";
    httpreq.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
    httpreq.Headers.Add(HttpRequestHeader.AcceptLanguage, "en-US;q=0.8,en-GB;q=0.5,en;q=0.3");
    httpreq.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip, deflate;q=0.8");
    httpreq.Headers.Add(HttpRequestHeader.CacheControl, "no-cache");
    httpreq.Headers.Add("DNT", "1");

    if (postdata != null && postdata.UseProxy)
    {
        if (postdata.ProxyParameters != null)
        {
            if (postdata.ProxyParameters.Host.Length > 0)
            {
                httpreq.Proxy = new WebProxy(postdata.ProxyParameters.Host, postdata.ProxyParameters.Port);
            }
            else
            {
                httpreq.Proxy = new WebProxy(postdata.ProxyParameters.Uri, postdata.ProxyParameters.BypassLocal);
            }
            httpreq.Proxy.Credentials = new NetworkCredential(postdata.ProxyParameters.Credentials.UserID,
                                                              postdata.ProxyParameters.Credentials.Password);
        }
        else
        {
            httpreq.Proxy = WebRequest.GetSystemWebProxy();
        }
    }
}

private string ProcessResponse(Stream stream, Encoding encoding, string ContentEncoding)
{
    string html = string.Empty;
    stream.Position = 0;
    try
    {
        using (MemoryStream memStream = new MemoryStream())
        {
            if (ContentEncoding.Contains("gzip"))
            {
                using (GZipStream gzipStream = new GZipStream(stream, CompressionMode.Decompress))
                {
                    gzipStream.CopyTo(memStream);
                }
            }
            else if (ContentEncoding.Contains("deflate"))
            {
                using (DeflateStream deflStream = new DeflateStream(stream, CompressionMode.Decompress))
                {
                    deflStream.CopyTo(memStream);
                }
            }
            else
            {
                stream.CopyTo(memStream);
            }

            memStream.Position = 0;
            using (StreamReader reader = new StreamReader(memStream, encoding))
            {
                html = reader.ReadToEnd();
                html = DecodeMetaCharSetEncoding(memStream, html, encoding);
            }
        }
    }
    catch (Exception)
    {
        return string.Empty;
    }
    return html;
}

private string DecodeMetaCharSetEncoding(Stream memStream, string _html, Encoding _encode)
{
    Match _match = new Regex("<meta\\s+.*?charset\\s*=\\s*\"?(?<charset>[A-Za-z0-9_-]+)\"?",
                             RegexOptions.Singleline |
                             RegexOptions.IgnoreCase).Match(_html);
    if (_match.Success)
    {
        string charset = _match.Groups["charset"].Value.ToLower() ?? "utf-8";
        if ((charset == "unicode") || (charset == "utf-7") || (charset == "utf-16"))
            charset = "utf-8";

        try
        {
            Encoding metaEncoding = Encoding.GetEncoding(charset);
            if (_encode.WebName != metaEncoding.WebName)
            {
                memStream.Position = 0L;
                using (StreamReader recodeReader = new StreamReader(memStream, metaEncoding))
                { _html = recodeReader.ReadToEnd().Trim(); }
            }
        }
        catch (ArgumentException)
        {
            _html = string.Empty;
        }
        catch (Exception)
        {
            _html = string.Empty;
        }
    }
    return _html;
}

A timed out error on GetResponse() on the third run

You might want to consider using the using statement:

string sURL = "http://www.something.com";

// Note: WebRequest itself does not implement IDisposable; it is the response
// (and its stream) that needs to be disposed.
WebRequest wrGETURL = WebRequest.Create(sURL);

using (HttpWebResponse http = (HttpWebResponse)wrGETURL.GetResponse())
using (Stream objStream = http.GetResponseStream())
{
    //etc.
}

It guarantees that the Dispose method is called even if an exception occurs (https://msdn.microsoft.com/en-us/library/yh598w02.aspx).
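
For illustration (a rough sketch, not part of the original answer), the using block above expands to roughly this try/finally pattern, which is why disposal still happens when an exception is thrown:

// Approximate expansion of the using block: Dispose runs in finally,
// so it executes even when the body throws.
HttpWebResponse http = (HttpWebResponse)wrGETURL.GetResponse();
try
{
    Stream objStream = http.GetResponseStream();
    //etc.
}
finally
{
    if (http != null)
        ((IDisposable)http).Dispose();
}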

The reason for the timeout is probably that your server has a limit of x simultaneous requests. Due to the improper disposal, the connection stays open longer than needed, and although the garbage collector will eventually clean it up for you, that often happens too late.

That's why I always recommend calling Dispose, via using, for all objects that implement IDisposable. This is especially true when you use these objects in loops or on low-memory (low-resource) systems.

Be careful with streams, though: they tend to use a decorator pattern and may call Dispose on their wrapped "child" objects (see the sketch after the list below).

Typically applies to:

  • Graphics objects
  • Database connections
  • TCP/IP (http etc.) connections
  • File system access
  • Code with native components, such as driver for usb, webcam's etc.
  • Stream objects

HttpWebRequest times out on second call

On the heels of the previous answers, I wanted to add a couple more things. By default, HttpWebRequest allows only 2 connections to the same host (this is HTTP 1.1 "niceness").

Yes, it can be overridden; no, I won't tell you how in this question, you'll have to ask another one :)
I think you ought to look at this post.

I think that you are still not quite disposing of all the resources connected with the HttpWebRequest, so connection pooling comes into play and that's the problem. I wouldn't try to fight the two-connections-per-server rule unless you really have to.

As one of the posters above noted, Fiddler is doing you a bit of a disservice in this case.

I'd add a finally {} clause after your catch and make sure that, as the post above notes, all streams are flushed and closed and references to the request object are set to null.
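
Something along these lines (a hedged sketch; the URL and names are placeholders, not taken from the question):

// Hypothetical cleanup pattern: release everything tied to the request,
// even when an exception has been thrown.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://example.com/"); // placeholder URL
HttpWebResponse response = null;
Stream responseStream = null;
try
{
    response = (HttpWebResponse)request.GetResponse();
    responseStream = response.GetResponseStream();
    // read the stream...
}
catch (WebException)
{
    // handle / log the timeout or protocol error
}
finally
{
    if (responseStream != null) responseStream.Close();
    if (response != null) response.Close();
    request = null; // drop the reference so the connection can be reclaimed
}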

Please let us know if this helps.

HttpWebRequest receives WebException: The request timed out

I figured out why it was not working, though I still don't know why it behaves like this (maybe a UnityEditor thing).

I added

webRequest.ProtocolVersion = HttpVersion.Version10;

and everything worked; no more timeout errors. And yes, webRequest.ProtocolVersion = HttpVersion.Version11; results in the timeout error.

However, making an HTTP request from the web succeeds with any of these: HTTP/1.1, HTTP/1.0 (with a Host header), HTTP/1.0 (without a Host header).
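
In context, the change amounts to something like this (the URL and timeout are placeholders, not from the original question):

// Hypothetical request showing where the protocol-version workaround goes.
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create("https://example.com/api"); // placeholder
webRequest.Method = "GET";
webRequest.Timeout = 10000;                         // placeholder timeout (ms)
webRequest.ProtocolVersion = HttpVersion.Version10; // HTTP/1.0 avoided the timeout here

using (HttpWebResponse response = (HttpWebResponse)webRequest.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string body = reader.ReadToEnd();
}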

HttpWebRequest.GetResponse Timeout issue when running from scheduled task on a server

As per my previous comment, it would appear there is a proxy involved that requires the user to be logged in for the request to succeed.

Your response:

We use an internet security web proxy that works via single sign-on for users. Basically, the tool pulls the user's network credentials behind the scenes any time they try to access a site, before either allowing or blocking the traffic. The issue was that once the admin account was disconnected, this web proxy would attempt to gather credentials from a logged-in user but would find nothing, and the request would then be denied. This is why it worked whenever any user was logged in but failed every other time.


