Creating a High-Performance WCF Service
I had a WCF service where I wanted to be able to support over a hundred concurrent users, and while most of the service methods had small payloads that returned quickly, the startup sequence needed to pull down 200,000 records. The out-of-the-box WCF service had no ability to support this scenario, but with some effort I was able to squeeze orders-of-magnitude performance increases out of the service and hit the performance goal.
Initially, performance was abysmal and there was talk of ditching WCF entirely (and as the one pushing WCF technology on the project, this didn't seem like a career-enhancing change).
Here's how performance was optimized. These are listed in the order they were implemented. Some are fairly obvious, others took some time to discover. Each item represents a significant improvement in latency or scalability over the one before it - and although I have internal measurement numbers, I'm not comfortable publishing them, as the size of the data increased and the testing approach changed between measurements.
- Use NetTCP binding
This helps both throughput and the time it takes to open and close connections.
- Use the DataContractSerializer instead of the XmlSerializer
I started out using DataTables; switching to POCO objects via LINQ to SQL yielded a 6x increase.
slow: [OperationContract] MyDataTable GetData(...);
fast: [OperationContract] IEnumerable<MyData> GetData(...);
- Unthrottle your service
It's quite understandable that WCF is resistant to denial-of-service attacks out of the box, but it's too bad that it is such a manual operation to hit the "turbo button". It would be nice if the Visual Studio tooling did this for you, or at least offered some guidance (MS - hint, hint).
The items to look at here are:
- <serviceBehaviors><serviceThrottling ...>: set the max values high
- <dataContractSerializer maxItemsInObjectGraph="2147483647" />
- under <netTcpBinding>: set the listenBacklog, maxConnections, and maxBuffer* values high
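Put together, a minimal sketch of these settings (element and attribute names are from the standard WCF configuration schema; the specific values here are illustrative and should be tuned for your own load):

```xml
<system.serviceModel>
  <behaviors>
    <serviceBehaviors>
      <behavior name="HighThroughput">
        <!-- raise the throttles well above the default limits -->
        <serviceThrottling maxConcurrentCalls="512"
                           maxConcurrentSessions="512"
                           maxConcurrentInstances="512" />
        <!-- allow large object graphs through the serializer -->
        <dataContractSerializer maxItemsInObjectGraph="2147483647" />
      </behavior>
    </serviceBehaviors>
  </behaviors>
  <bindings>
    <netTcpBinding>
      <binding name="BigPayload"
               listenBacklog="200"
               maxConnections="200"
               maxBufferSize="2147483647"
               maxReceivedMessageSize="2147483647" />
    </netTcpBinding>
  </bindings>
</system.serviceModel>
```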
- Cache your data
WCF, unlike ASP.NET, has no built-in facility to cache service responses, so you need to do it by hand. Any cache class will do.
- Normalize/compress your data
This doesn't necessarily have to be done in the database; the LINQ GroupBy operator makes it easy to do in code. To clarify, say your data is kept in a denormalized table:
Key1 (string)   Key2 (string)   Key3 (string)   Val1 (int)   Val2 (int)

The bulk of the result set ends up being duplicate data:

LongKeyVal1   LongKeyVal2   LongKeyVal3   10   12
LongKeyVal1   LongKeyVal2   LongKeyVal3   11   122
LongKeyVal1   LongKeyVal2   LongKeyVal3   12   212

so normalize this into:

LongKeyVal1   LongKeyVal2   LongKeyVal3   (10, 12), (11, 122), (12, 212)
In code, given the following classes
public class MyDataDenormalized
{
public string Key1 { get; set; }
public string Key2 { get; set; }
public string Key3 { get; set; }
public int Val1 { get; set; }
public int Val2 { get; set; }
}
public class MyDataGroup
{
public string Key1 { get; set; }
public string Key2 { get; set; }
public string Key3 { get; set; }
public MyDataItem[] Values { get; set; }
}
public class MyDataItem
{
public int Val1 { get; set; }
public int Val2 { get; set; }
}

You can transform an IEnumerable<MyDataDenormalized> into an IEnumerable<MyDataGroup> via the following:
var keyed = from sourceItem in source
group sourceItem by new
{
sourceItem.Key1,
sourceItem.Key2,
sourceItem.Key3,
} into g
select g;
var groupedList = from kItems in keyed
let newValues = (from sourceItem in kItems select new MyDataItem() { Val1 = sourceItem.Val1, Val2= sourceItem.Val2 }).ToArray()
select new MyDataGroup()
{
Key1 = kItems.Key.Key1,
Key2 = kItems.Key.Key2,
Key3 = kItems.Key.Key3,
Values = newValues,
};
- Use the BinaryFormatter, and cache your serializations
If you're willing to forgo over-the-wire type safety, the binary formatter is the way to go for scalability. Data caching has only a limited impact if a significant amount of CPU time is spent serializing it - which is exactly what happens with the DataContract serializer.
The operation contract changes to
[OperationContract]
Byte[] GetData(...);
and the implementation to
var bf = new BinaryFormatter();
using (var ms = new MemoryStream())
{
    bf.Serialize(ms, groupedList);
    // ToArray() copies only the bytes written; GetBuffer() can return
    // trailing unused buffer space. And best to cache the result too.
    return ms.ToArray();
}
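Items 4 and 6 combine naturally: serialize once, cache the bytes, and hand every caller the same array. A minimal sketch under those assumptions (GetGroupedData() is a hypothetical data-access helper standing in for the LINQ grouping above; the cache is a plain double-checked static field, though System.Runtime.Caching.MemoryCache would work just as well):

```csharp
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

public class DataService
{
    // cached serialized payload; rebuilt only when the field is reset
    private static byte[] _cachedPayload;
    private static readonly object _lock = new object();

    public byte[] GetData()
    {
        if (_cachedPayload == null)
        {
            lock (_lock)
            {
                if (_cachedPayload == null)
                {
                    var bf = new BinaryFormatter();
                    using (var ms = new MemoryStream())
                    {
                        // GetGroupedData() is a hypothetical helper returning MyDataGroup[]
                        // (the type must be marked [Serializable] for BinaryFormatter)
                        bf.Serialize(ms, GetGroupedData());
                        _cachedPayload = ms.ToArray();
                    }
                }
            }
        }
        return _cachedPayload; // every caller shares the same cached bytes
    }
}
```

On the client side, the byte array comes back out with bf.Deserialize(new MemoryStream(bytes)), cast to MyDataGroup[].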
Before items 4, 5, and 6, the service would max out at about 50 clients (response times would go way up and CPU usage would hit 80% - on an 8-core box). After these changes were made, the service could handle 100+ clients and CPU usage flattened out at 30%.
Update: Shay Jacoby has reasonably suggested I show some code.
Update2: Brett asks about relative impact. Here's a summary
item                         latency   scalability
2) DataContract Serializer   large     large
3) unthrottle                small     large
4) cache data                small
5) normalize data            medium
6) cache serializations      small     large