Transferring large files using WCF

Recently I wanted to send large files (several GB) from a rich desktop client to a web service. The client and service communicate using WCF, and I thought this would be quite easy. As it turns out, it is, but there are a few gotchas along the way.


The first gotcha is that you do not want to buffer the message as you send it. Depending on the amount of RAM and the size of the file, this may or may not be a problem on the client, but it is almost certainly going to be a problem on the server when multiple clients send files at the same time.

To solve this you can either write a service contract that accepts the file in chunks or use WCF streaming. In this instance I decided to use streaming.

Streaming in WCF does have some drawbacks, which I'm not going to go into here, so it's not suitable for everything, but it was fine in this situation.

To use streaming you simply pass parameters of type ‘Stream’ and change the config to use streaming. This is a per-endpoint setting, so I set up a separate endpoint for the upload service.
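As a minimal sketch, a streamed upload operation can look like this (the interface and operation names here are illustrative, not from the actual service):

    [ServiceContract]
    public interface IFileUpload
    {
        // With a streamed transfer mode the body is read from the wire as
        // the service consumes it, rather than being buffered up front.
        [OperationContract]
        void UploadFile(Stream fileData);
    }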

Passing Metadata

The next problem I found was how to pass metadata with streams. Since I'm uploading via HTTP I want to make sure the file gets there intact, so I'd like to transfer a checksum with the file (along with other details). WCF will not let you have any other parameters when using streams, so how do you pass it?

Since using sessions with streaming can result in unpredictable behaviour, there are a couple of options open. The first is to return a GUID (or other unique ID) after uploading the file, and then allow a separate operation to set the metadata for that GUID, linking it to the previous file upload. The other option, and the one I chose, is to explicitly specify the message and send the metadata in the message header.
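The first option could be sketched like this (a hypothetical contract; this is not the one I used):

    [ServiceContract]
    public interface IFileUploadWithMetadata
    {
        // Streamed upload; returns an ID identifying the stored file.
        [OperationContract]
        Guid UploadFile(Stream fileData);

        // Separate buffered call that attaches the metadata to that upload.
        [OperationContract]
        void SetMetadata(Guid fileId, FileTransferInfo metadata);
    }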

Max Message Sizes

The next issue I faced was the size of the messages: you need to ensure that maxReceivedMessageSize is set large enough for your largest file. Since headers are always buffered, even when streaming, you want to ensure that this doesn't open you up to DoS attacks, by setting maxBufferSize to something reasonable such as 64K. This allows large streamed bodies but limits the size of the headers.


So that’s it, all set up and working now? Well, not quite. As I mentioned at the start, there are a few gotchas with this.

First, the VS webdev server (Cassini) cannot handle streaming over HTTP. This is simple to fix: use IIS, or self-host in a console app or Windows service.

Second, IIS uses the ASP.NET maxRequestLength setting for the maximum length, not the WCF setting. You need to add this to your web.config.

Third, IIS cannot transfer more than 2GB of data. You’ll need to self-host to get around this if you need to send more than that.

Fourth, timeouts can occur, so you need to increase the send/receive timeouts in WCF.
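For example, the timeouts can be raised on the binding (the values here are illustrative; pick ones that suit your file sizes and bandwidth):

    <binding name="FileSenderService.StreamedBinding"
             transferMode="StreamedRequest"
             sendTimeout="00:30:00" receiveTimeout="00:30:00" />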


So here are samples of the code bits I used.

web.config – server

    <httpRuntime maxRequestLength="2097151" />

    <binding name="FileSenderService.StreamedBinding"
             transferMode="StreamedRequest" maxBufferSize="65536"
             maxReceivedMessageSize="2000000000" messageEncoding="Mtom" />


app.config – client

    <binding name="BasicHttpBinding_IFileTransfer" sendTimeout="00:10:00"
             messageEncoding="Mtom" transferMode="StreamedRequest" />


[MessageContract]
public class SendFileRequestMessage
{
    [MessageHeader(MustUnderstand = true)]
    public FileTransferInfo FileInfo;

    [MessageBodyMember(Order = 1)]
    public Stream FileData;
}

[ServiceContract]
public interface IFileTransfer
{
    [OperationContract]
    void SendFile(SendFileRequestMessage request);
}

[DataContract]
public class FileTransferInfo
{
    [DataMember(Order = 1, IsRequired = true)]
    public string Name { get; set; }

    [DataMember(Order = 2, IsRequired = true)]
    public byte[] Checksum { get; set; }
}
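For completeness, here is a sketch of what the service implementation might look like. The class name, target directory, hash algorithm, and error handling are all illustrative assumptions, not taken from the original service:

    public class FileTransferService : IFileTransfer
    {
        public void SendFile(SendFileRequestMessage request)
        {
            var path = Path.Combine(@"C:\Uploads", request.FileInfo.Name);

            using (var sha = SHA256.Create())
            using (var file = File.Create(path))
            {
                // Copy the streamed body to disk in chunks while hashing it,
                // so the whole file is never buffered in memory.
                var buffer = new byte[64 * 1024];
                int read;
                while ((read = request.FileData.Read(buffer, 0, buffer.Length)) > 0)
                {
                    sha.TransformBlock(buffer, 0, read, null, 0);
                    file.Write(buffer, 0, read);
                }
                sha.TransformFinalBlock(buffer, 0, 0);

                // Compare the computed hash with the checksum from the header
                // (SequenceEqual needs a using directive for System.Linq).
                if (!sha.Hash.SequenceEqual(request.FileInfo.Checksum))
                    throw new FaultException("Checksum mismatch");
            }
        }
    }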

The following blogs / articles were useful in sorting this out.