Recent Comments

re: Using FontAwesome Fonts for HTML Radio Buttons and Checkboxes
Thursday @ 10:49am | by Rick Strahl

@twomm - did you get this resolved? I have not seen the issue unless there's a problem with CSS maps not matching the actual elements. If you use MVC make sure you use the adjusted CSS as described in the post so that you get around the injected validation element.
re: Using FontAwesome Fonts for HTML Radio Buttons and Checkboxes
Thursday @ 3:09am | by twomm

Nice, however, I first had the same issue as Allen (keyboard/mouse).
I think it was an issue with the element IDs.
re: Passing multiple simple POST Values to ASP.NET Web API
Wednesday @ 1:24pm | by Sagar Mummidivarapu

I used your code, but my POST method still receives null values.
I found that the code below supports only form-encoded values. What if I send a JSON request like this: { itemId: "10626217", lookupType: "0" }? (multipleBodyparameters is always null.)

// only read if there's content and it's form data
if (contentType == null || contentType.MediaType != "application/x-www-form-urlencoded")
{
    // Nope no data
    result = null;
}
else
{
    // parsing the string like firstname=Hongmei&lastname=ASDASD
    result = request.Content.ReadAsFormDataAsync().Result;
}
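For a JSON body like the one in the comment, one possible approach (a sketch only, not code from the original post; the variable names mirror the commenter's payload and the surrounding snippet) is to read the raw content as a string and parse it with Json.NET:

```csharp
// Hypothetical sketch: reading a JSON request body instead of form data.
// Assumes Newtonsoft.Json is referenced; names are illustrative.
if (contentType != null && contentType.MediaType == "application/json")
{
    string json = request.Content.ReadAsStringAsync().Result;

    // Parse into a dynamic JObject so individual values can be picked out
    dynamic data = Newtonsoft.Json.Linq.JObject.Parse(json);
    string itemId = data.itemId;         // e.g. "10626217"
    string lookupType = data.lookupType; // e.g. "0"
}
```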
re: Routing to a Controller with no View in Angular
April 21, 2015 @ 9:11am | by Brian

Don't use ng-view. Set up routes and listen for the $routeChangeStart and $routeChangeSuccess events, then evaluate the updated $routeParams.
Great information
April 21, 2015 @ 7:05am | by Ivo Botusharov

Thank you very much for your investigation and the provided information. I was just wondering why in ASP.NET MVC 5 scaffold templates the data annotation on the Delete action methods is: [HttpPost] instead of [HttpDelete] which would be more appropriate in terms of http standard. Thanks again!
re: Azure VM Blues: Fighting a losing Performance Battle
April 21, 2015 @ 6:26am | by Patrick

I am with you 100% on Azure performance issues.

I had a 2008R2 VM at GoDaddy for years, but they will not move to 2012.

I now have a 2012R2 VM. The uptime has been great, and so has the performance. I know this sounds like a plug, but it is not. When I moved from GoDaddy I went to a few other hosts and it was a nightmare. It is on my blog.
re: Hosting the Razor Engine for Templating in Non-Web Applications
April 20, 2015 @ 11:56pm | by Renato

Hi Rick,

thank you for the great post. Is there any way you can think of to get "Razor Intellisense" working at runtime in an editor? Something like your WinForm example with integrated Razor Intellisense in the "Template to Render" textbox.
re: Back to Basics: UTC and TimeZones in .NET Web Apps
April 19, 2015 @ 8:48pm | by Matt Johnson

I've blogged in detail in response to one particularly problematic part of this blog post.

re: Back to Basics: UTC and TimeZones in .NET Web Apps
April 19, 2015 @ 4:06pm | by Matt Johnson

Great post. Thanks for bringing awareness to my favorite subject! Though if you don't mind, I'm going to tear it apart. :) Allow me to clarify a few points:

TimeZoneInfo.ConvertTimeFromUtc will throw an exception if passed a DateTime with Local kind. If it's unspecified, it will assume UTC. So the ToTimeZoneTime extension method will not behave as it says in the comments, accepting a Utc or local time. The input time parameter must be in terms of UTC. The way you used it, coming from DateTime.UtcNow, is just fine.
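A minimal sketch of the Kind behavior described above (BCL calls only; the Windows zone ID and dates are illustrative examples):

```csharp
var tz = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");

var utc = new DateTime(2015, 4, 19, 12, 0, 0, DateTimeKind.Utc);
var fromUtc = TimeZoneInfo.ConvertTimeFromUtc(utc, tz);          // fine: input is UTC

var unspecified = new DateTime(2015, 4, 19, 12, 0, 0);           // Kind = Unspecified
var assumed = TimeZoneInfo.ConvertTimeFromUtc(unspecified, tz);  // silently treated as UTC

var local = DateTime.Now;                                        // Kind = Local
// TimeZoneInfo.ConvertTimeFromUtc(local, tz);                   // throws ArgumentException
```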

Regarding caching - It's good to hold on to a TimeZoneInfo object for as long as you need it within the local scope. However, I don't recommend building your own cache, as the FindSystemTimeZoneById method already uses a lookup cache internally.

Next, with regard to displaying the time zone names in a drop-down - While there's nothing inherently wrong with your approach, it's important to note that the DisplayName property will be localized by the OS language of the server (which might not be English). The usual localization methods in .NET don't apply to the TimeZoneInfo object. In a multilingual app, one may need to use the TimeZoneInfo.Id to look up a display name from a resource file, rather than use the DisplayName property. Alternatively, one could use my TimeZoneNames library.

With regard to capturing the user's default time zone. Sorry, but the approach you describe is flawed. The problem is that every implementation of JavaScript is free to present time zones in whatever manner they wish. The spec is not refined enough to rely on it having any particular set of values. In particular, the example you show of "Hawaiian Standard Time" mapping back to the TimeZoneInfo ID is just a coincidence. Worldwide, typically they will NOT match up. Also, the JavaScript names will indeed switch out for DST, while the TimeZoneInfo.Id will not. So "Eastern Standard Time" will always be the ID for the US Eastern time zone, even when JavaScript shows "Eastern Daylight Time". Additionally, some browsers will just show an abbreviation (HST, EST, EDT, etc.), or may localize the time zone string to the user's language.

Time zone detection is actually quite difficult, which is why it makes sense to ask your user in a drop-down list. There are libraries like jsTimeZoneDetect which try to guess your time zone, but they are not 100% accurate. There is the JavaScript getTimezoneOffset method, but that only returns the offset associated with the date you called it on - which is not enough to determine the actual time zone. There's also a newer way, supported by Chrome and Opera, but not IE, FF, or Safari: Intl.DateTimeFormat().resolvedOptions().timeZone. Hopefully that will catch on in other browsers, but today there's no universal way to detect the time zone.

Moving along - The GetUserTime method looks fine, but in your GetUtcUserTime method, there's a bug. You call ToUniversalTime on the result of the time zone conversion. So that's DateTime.ToUniversalTime, which will still use the time zone of the server - not the destination time zone. You would need to use TimeZoneInfo.ConvertTimeToUtc to avoid that problem, rather than ToUniversalTime. However, I'm a bit confused why you would need this method to begin with, or what it's actually doing. If the time is in the user's time zone, then you would just have one call to TimeZoneInfo.ConvertTimeToUtc. If the time is in the server's time zone, then you would just have one call to DateTime.ToUniversalTime. The mix here doesn't make a lot of sense.
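The difference can be illustrated with a hypothetical sketch (userTime and userTz stand in for the post's variables; they are assumptions, not names from the post):

```csharp
// userTime: a DateTime expressed in the user's time zone (Kind = Unspecified)
// userTz:   the user's TimeZoneInfo

// Flawed: ToUniversalTime interprets the value in the *server's* local zone
DateTime wrong = userTime.ToUniversalTime();

// Correct: convert explicitly from the user's zone to UTC
DateTime right = TimeZoneInfo.ConvertTimeToUtc(userTime, userTz);
```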

Moving further down your post, in the section on Date range queries - again, calling ToUniversalTime on a DateTime will use the server's local time zone, not the user's time zone. So the query boundaries are not correctly aligned to the user. Again, you'd need to use TimeZoneInfo.ConvertTimeToUtc.

In the User Captured Time section, the DateMath function could just be a single call to TimeZoneInfo.ConvertTimeToUtc, passing in the user's time zone. The manual manipulation of offsets is not only unnecessary, but also slightly flawed: you expect the "offset" and "offsetLocal" values to reflect the offset for the same "start" time, but that time is defined by the user's time zone - so you're not really referring to the same instant. This will show up if either time zone is near a DST transition. It's possible that your local time zone could be on one side of the transition, and the user's time zone could be on the other side.

The same easy-to-make mistake is in your AdjustTimeZoneOffset class. It will work if the "localTime" parameter has Kind=Local or Kind=Utc, but it will not necessarily be correct in all cases when Kind=Unspecified. Think of it this way - TimeZoneInfo.GetUtcOffset(DateTime) *prefers* to work with Unspecified kinds, as those mean that the time is relative to the time zone of that particular TimeZoneInfo object. When it gets a Local or Utc kind, it converts to that time zone first. (This is inverted from most of the other methods, which assume Unspecified means Local or Utc.)
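That preference can be sketched as follows (a hedged illustration; the Windows zone ID and date are examples for a zone that observes DST):

```csharp
var eastern = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");

// Unspecified: interpreted as a time *in* the Eastern zone
var unspecified = new DateTime(2015, 7, 1, 12, 0, 0);
TimeSpan a = eastern.GetUtcOffset(unspecified);   // -4:00 (EDT in July)

// Utc: converted to Eastern time first, then the offset is taken
var utc = new DateTime(2015, 7, 1, 12, 0, 0, DateTimeKind.Utc);
TimeSpan b = eastern.GetUtcOffset(utc);           // also -4:00, but for a different instant
```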

The theme throughout all of this feedback is that you should be working directly with the user's time zone and UTC - avoiding your own server's local time zone whenever possible. Really, the time zone of the server should be considered irrelevant. Avoid any use of "local" time (DateTime.Now, DateTime.ToUniversalTime, DateTime.ToLocalTime, DateTimeKind.Local, etc.) and everything will be much simpler.

Nitpick: "DayLight Savings Time" => "Daylight Saving Time". Daylight is one word, and the second word should have no "s" at the end. Think "I'm saving daylight during this time" - which is arguable, but nonetheless, that's the term's origin.

With regard to DateTimeOffset - I'll disagree with you completely. DateTimeOffset is essential in a web application, and it addresses much more than just a single scenario. You're also not tied to a single time zone, as the offset can be adjusted to any time zone you might be in. The offset doesn't track the time zone, just the offset for the time zone that happens to be in effect. Remember, there's much more to a time zone than just an offset, as many time zones switch between more than one offset due to daylight saving time.

The essential part you're missing about DateTimeOffset is that it's the only way to effectively disambiguate between ambiguous local values. For example, if you have 2014-11-02 01:00 as a DateTime, and you're in the US Pacific time zone, you have a problem because you don't know if that's PDT (-07:00), or PST (-08:00). The DateTimeOffset keeps the offset intact, so you affirmatively know which of the two possibilities you're using. If you store a UTC time, or if the time zone doesn't use DST (like Hawaii), then you're OK. But in cases where the time zone uses DST, not storing the offset could mean that you are losing data - you'd potentially be using a value as an hour before or after the intended moment.
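The ambiguity can be shown with a short sketch (BCL types only; the Windows zone ID is an example):

```csharp
var pacific = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");

// 2014-11-02 01:00 occurs twice in US Pacific (clocks fall back at 2:00 PDT)
var ambiguousLocal = new DateTime(2014, 11, 2, 1, 0, 0);
bool isAmbiguous = pacific.IsAmbiguousTime(ambiguousLocal);  // true

// A DateTimeOffset pins down which of the two occurrences was meant:
var pdt = new DateTimeOffset(2014, 11, 2, 1, 0, 0, TimeSpan.FromHours(-7)); // first pass (PDT)
var pst = new DateTimeOffset(2014, 11, 2, 1, 0, 0, TimeSpan.FromHours(-8)); // second pass (PST)
// pdt and pst are one hour apart in UTC even though the local digits match.
```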

You're right about the performance of DateTime.Now and TimeZoneInfo. In fact, Noda Time is one of the ways you can improve performance, as it is thoroughly optimized for perf.

Speaking of Noda Time, I'll disagree with you that you have to replace everything throughout your system. Sure, if you do, you'll have a lot less opportunity to make mistakes, but you certainly can just use Noda Time where it makes sense. I've personally worked on systems that needed to do time zone conversions using IANA time zones (e.g. "America/Los_Angeles"), but tracked everything else in DateTime and DateTimeOffset types. It's actually quite common to see Noda Time used extensively in application logic, but left completely out of the DTOs and persistence layers. In some technologies, like Entity Framework, you couldn't use Noda Time directly if you wanted to - because there's nowhere to hook it up. Others, such as Json.Net, Dapper, and RavenDB, have extension libraries for Noda Time so you can use it there if you want to, but there's still no hard requirement that you do.

In general, I think your post is great in that you cover most of the areas of concern where attention needs to be paid when converting between time zones. But like many things, the devil is in the details. Even you, who are more familiar with these functions than most, were still able to make mistakes like calling ToUniversalTime on a non-local DateTime, or calling GetUtcOffset with a value from a different time zone. Noda Time won't let you get into trouble, because the API prevents you from calling methods that have confusing or ambiguous behavior.

Hope this feedback was helpful.
re: Azure VM Blues: Fighting a losing Performance Battle
April 19, 2015 @ 5:55am | by Randall Tomes

Been using Azure for over a year now.
I am on my 6th VM and this one is a small Windows Server 2008 VM.
Using MVC 5 with one simple call to a one-table SQL database. Nothing else. After a few hours with zero traffic, the machine locks up and has to be reset via the Azure portal.

I have also tried webroles and had issues with file content (the entire website) being overwritten with old versions of the website from weeks or months prior. Basically like some auto rollback feature is broken and acts on its own. It consistently has done this 10+ times until I gave up using web roles.

Their support is clueless and they don't even understand Azure themselves. It's like the blind leading the blind.

Are there any other more serious, "legitimate" and affordable cloud computing companies that I can use to host .net applications?
re: Setting up and using Bing Translate API Service for Machine Translation
April 17, 2015 @ 5:56pm | by Rick Strahl

@Mario - the data is JSON and UTF-8 encoded. The delta sign is part of a UTF-8 sequence to represent an extended character. Not sure how you're capturing the data, but if you're using an HTTP client that doesn't automatically decode data you'll need to do the UTF-8 decoding explicitly.
re: Setting up and using Bing Translate API Service for Machine Translation
April 17, 2015 @ 12:18am | by Mario Vernari

Rick, ask MS to thank you, because I was about to throw all the code in the can.
I really couldn't agree more about the blurry, barely documented APIs. I needed a trivial C# console example, but the only one I found looks obsolete.
You gave me (us) some very useful tips on how to use those APIs.

Maybe it's worthwhile to mention some trouble I found related to "strange" characters in the string. Specifically, I tried to send a Greek delta sign in the string, but it is always treated as a "?".

Thank you anyway and good luck!
re: ASP.NET Frameworks and Raw Throughput Performance
April 16, 2015 @ 1:25pm | by Spencer

The services we are building are async and have DB calls within them, but the DBs run separately on dedicated servers, so there is always a bit of waiting in them. I was basically just doing some simple test runs to get a basic idea of how many servers we were going to need to support N users simultaneously. When I hit the snag with the real services, I set up some cookie-cutter ones, including using your test solution, to see what the raw performance would be with minimal dependencies.

I thought I had turned off the anti-virus, but I guess it had turned itself back on on my personal box. :-) The numbers are now hitting around 6k for the raw service calls on my personal box. The interesting thing is I don't remember seeing anti-virus software running on our work server, but there must be something running that is not obvious and was not showing itself in the task manager. The firewall was definitely off, but I'll have to do a bit more digging on that. Now at least I know I'm not crazy, since I knew I'd seen higher numbers in the past. Thanks again for your input.
re: Adding Files to the Windows MRU/Recent Document List
April 15, 2015 @ 3:04pm | by Rick Strahl

@Jerome, thanks for the heads up on the Charset.Ansi flag.

As to the scroll-resizing, that's on purpose. Once you scroll past the sidebar content the screen widens to give you more space for text and images. I find that useful for my own reading - I didn't think it annoying, but I'm curious if others think the same thing. Not such a big deal on big screens, but certainly nice when you're on a tablet.
re: Adding Files to the Windows MRU/Recent Document List
April 15, 2015 @ 5:01am | by Jerome Viveiros

Very nice. I've used SHAddToRecentDocs before, since my app was a tool capable of viewing other apps' file types, but this only worked for the global Windows Recent Files list in Windows 7. I had no idea of all the steps required to do this with my own file types, so this will come in very handy.

It may be worth adding, for the sake of anybody who takes MS code analysis warnings seriously, that the character set passed *must* be CharSet.Ansi; otherwise the function will fail. My pinvoke declaration looked like this:
    "CA2101:SpecifyMarshalingForPInvokeStringArguments", MessageId = "1"),
DllImport("shell32.dll", CharSet = CharSet.Ansi)]
public static extern void 
    SHAddToRecentDocs(ShellAddToRecentDocsFlags flag, 
    [MarshalAs(UnmanagedType.LPStr)] string path);
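For context, a hypothetical call site might look like this (the ShellAddToRecentDocsFlags type comes from the declaration above, but the member name and path shown here are illustrative assumptions, not values from the post):

```csharp
// Hypothetical usage: add a file to the Windows recent documents list.
// "PathA" is an assumed enum member for the ANSI path overload.
SHAddToRecentDocs(ShellAddToRecentDocsFlags.PathA, @"C:\temp\MyDocument.txt");
```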

Also, since I'm a new reader here, I don't know if anyone else has mentioned, but when you scroll down this blog in Firefox and Chrome (but not IE), when you scroll past the bottom of the content populated in the left pane, the right one animates and scrolls to the left to take up the whole width. It's cool, but <em>really</em> distracting.
re: Updated DeleteFiles Utility now on Chocolatey
April 14, 2015 @ 3:55am | by Uwe

Thanks for using (and mentioning) my long path library!
re: ASP.NET Frameworks and Raw Throughput Performance
April 12, 2015 @ 9:49pm | by Rick Strahl

@Spencer, it sounds like you're hitting IO limitations (CPU not maxed). What are you doing in your own tests? If you hit a database as part of testing your request/sec is going to drop drastically.

So I would first set up basic helloworld requests and test those just to see the max throughput you could possibly expect.

Also make sure to turn off anti-virus/firewall software while running these tests. They can drastically slow down and throttle requests, especially if you run on custom non-80 ports.
re: ASP.NET Frameworks and Raw Throughput Performance
April 12, 2015 @ 3:10pm | by Spencer

Great article. I pulled down this code to do some comparisons on Web API performance. I was seeing an insanely slow RPS of around 170 on my work laptop using my code with the service hosted locally. So I downloaded this and didn't see much of an improvement. I then put this same code on my desktop at home, which is running an i7-2600K 3.4GHz quad core with 24 gigs of RAM. The CPU never gets over 60%, but the RPS only hits around 600 using anywhere from 20 to 100 concurrent users, and the hosting type never seemed to make much difference. I know I've seen multi-thousand RPS numbers when building old ASMX services, but I'm scratching my head on what's causing the issue now. It seems like there is some unusual thread contention going on, but I haven't dug too deep yet. Any ideas?
re: Hosting the Razor Engine for Templating in Non-Web Applications
April 12, 2015 @ 9:04am | by Iaacov

Hi Rick,

First, thanks for this great tool.

We successfully use the RazorHosting wrapper inside a Windows Service that delivers template-based emails.

The host container is started in the service OnStart event and stopped in OnStop.
The template runs in a separate AppDomain.

The problem arises when the service stays idle for more than 5 minutes. In that case the template host's lifetime expires and it no longer compiles any new code.

The RazorBaseHostContainer class has an InitializeLifetimeService method which, according to the documentation, appears to exist for this purpose, but I couldn't figure out how to use it.

Can you please explain to me how to solve this problem?
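For what it's worth, the standard .NET remoting pattern for keeping a MarshalByRefObject alive past the default 5-minute lease (a hedged sketch; the class name is hypothetical and this is not necessarily how RazorHosting wires it up) is to override InitializeLifetimeService and return null:

```csharp
public class MyTemplateHost : MarshalByRefObject
{
    // Returning null gives this remoting object an infinite lease,
    // so it is not collected after the default idle timeout.
    public override object InitializeLifetimeService()
    {
        return null;
    }
}
```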
re: Bitmap types, Binary Resources and Westwind.Globalization
April 12, 2015 @ 3:45am | by David McQuiggin

By the way Rick, I render my HTML for SPA views from MVC for many reasons, but also due to the localisation scenario (HTML text, JavaScript messages, and routes). Subsequently all data is handled via WebAPI. It's a very nice fit.
re: Bitmap types, Binary Resources and Westwind.Globalization
April 12, 2015 @ 3:10am | by David McQuiggin

@Rick - The View Models do not actually contain translations...

a) I use a class to contain read only properties for each 'key' that will be translated, e.g. 'RequiredFieldAttribute', which returns a value from a single method that detects the CurrentUICulture (you could easily pass that through in non-web scenarios), and returns the relevant value from a pass-through cache implementation which is injected.

b) When the T4 for ViewModels is run it creates default translations for all DataAnnotations in the database (if they do not already exist), and as it walks the Code Dom of the target namespace and processes the ViewModels for translatable values, if a translation does not exist in the database, an entry is created for the default culture, and an attribute that calls the Translate method for say, "DisplayName", is updated or added to the property in the View Model, with the relevant key.

Apart from that, the ViewModel is not altered. In fact I am actually in the process of testing a 'buddy class for meta data approach', which I have used for data annotations on DTOs, and may work out to be cleaner.

c) A similar approach is taken for Views, but it is a bit messier as it has to work with token replacement rather than the Code DOM. It basically does the same thing: if an element has an HTML5 data attribute that indicates it should be translated, then any existing text is added to the database and a call to the ViewTranslations class is inserted in its place. Additionally, attributes to support in-place translation are added, similar to the example you post above. They describe the editor type for Mercury Editor, and the result is design-time alteration of code, runtime WYSIWYG editing of text, and good performance through caching, without requiring any custom model binders, custom base view pages, reflection, etc.

d) I use T4 for several other things depending on the scenario - I derive constants for route names, and controllers and actions are callable via strongly typed, parameterised Url and Html.Action etc. helpers (I don't use T4MVC as it's far too heavy for my needs). I use other T4s to create 'strongly typed' AJAX calls to my controllers, to create client-side model definitions from my View Models in TypeScript if I am creating an SPA, etc...

P.S. Looking forward to seeing your article about the use of your framework with SPAs!

re: Bitmap types, Binary Resources and Westwind.Globalization
April 11, 2015 @ 12:01pm | by Rick Strahl

@David - you get no argument from me when it comes to embedding binary resources - I wouldn't do it, but again it's something that is supported in native Resx so I have to support it. I was just curious if others are using embedded binary resources for anything whether it's images or text etc. If there's *anything* stored this way that has a good reason then the feature has to be there.

Luckily the choice of whether you use that particular functionality is entirely up to the developer and like you I would opt for external resources.

Thanks for bringing up your T4 implementation. Sounds interesting. Have you shared this anywhere? Sounds like that would be really interesting to check out.

Personally I'm not a fan of T4 and code generation in general, but I like the idea that the model can hold translation values on it. Of course this would end up making the model much larger than usual because it would have to account for all translated values. And therein lies the rub. Ultimately I don't really want to have to do extra work other than embedding localized values into the UI *in one place*. I also would prefer that it works with plain HTML pages and not just with .NET related code.

I've been playing around with live editing of resources in different ways. The library has had support for WebForms and resource editing since WebForms actually had meta data that described the localization items. I had some logic that would find all the localized controls and then inject edit buttons into the page so you could jump to the appropriate item in the editor.

I've been thinking to do the same for MVC/HTML pages, but I realize that it won't be as smooth an implementation as the WebForms way because there's simply no metadata there - that would have to be embedded into the document. I've been thinking about an approach like this:

<body data-resource-set="MyPage">
<div data-resource-id="HelloWorld">@DbRes.T("HelloWorld")</div>

where these data-resource-xxx attributes are used by JavaScript to provide either pop up editors or link to the resource page. It adds extra noise but it would work for even plain HTML at least.
re: Bitmap types, Binary Resources and Westwind.Globalization
April 11, 2015 @ 3:36am | by David McQuiggin

Personally, although I admire the implementation, I agree with the suggestion that it is better to store a Uri or relative path.

For example, think of the situation where you will be using a Content Delivery Network, or a different medium such as video; a Uri can cope neatly with both scenarios. Typically I have uploaded such media content to Azure Blob storage, which is dirt cheap, with a database entry and a simple utility to allow me to manage it (CRUD operations on content). These can then be versioned and deployed to a CDN.

Slightly off-topic: I have found this to also be the case when investigating embedding Views, JavaScript etc in a DLL that could be used in a Plugin architecture; ultimately you have to add the complexity of locating, extracting and rendering the content, and you have to inflict custom view page base classes, custom resource handlers etc on *every* page or partial you are rendering, when in fact it is only required for a small percentage of an application. The same will most likely be true of images.

But it's a great library; I especially like your admin interface. I have a slightly different approach that ends up with the same result; I use T4 templates to scan ViewModel properties and metadata, validation messages, strings, and HTML elements that have a specific HTML5 data attribute, which are replaceable tokens as I explain below.

This creates a relevant database entry with the default content in the invariant language, and a static class providing static fields for returning text according to a key, and that is localised from the database values via an injected provider that can offer different pass through cache options (NullCache, InMemoryCache, RedisCache). The T4 uses CodeDom FileDom to manipulate say the HTML, to replace the original content (of say a div), with a call to the static property of my 'Translations' class, and also adds attributes to the element.

These attributes enable me to hook in Mercury Editor to perform WYSIWYG edit-in-place, updating, previewing and saving the content via WebApi to update the database for the currently selected culture. This allows editing content and adjusting it to account for differing text lengths to obtain correct flow within the design. The admin area also allows importing, exporting and changing localised route names etc.

The performance is very good.
re: AngularJs and Promises with the $http Service
April 09, 2015 @ 11:28am | by Hardik

Thanks for the perfect explanation; it clears up a lot of the fundamentals of promises in AngularJS.
re: Back to Basics: UTC and TimeZones in .NET Web Apps
April 09, 2015 @ 4:21am | by Matt Roberts

Stellar job, this was a really nice summary of some techniques I've used and then forgotten about and had to re-google, thanks!
re: Using CSS Transitions to SlideUp and SlideDown
April 05, 2015 @ 10:50pm | by Saurabh Udaniya

You are setting the height as
element.css("max-height", height); 
I tried your solution with an Angular directive and it was not working; however, adding "px" to it
element.css("max-height", height + "px"); 

saved my time, so thanks for this article.
re: Bitmap types, Binary Resources and Westwind.Globalization
April 01, 2015 @ 9:32am | by Visar Gashi

I agree with the comment as well, storing the string for a web application should be sufficient. This might be more useful for thick client apps, I am assuming this works for mobile as well. Perhaps for standardization, you can include the functionality, but discourage its use for web applications?

Great library by the way, I have done something similar for multiple projects, evolving the idea with each one, but never going far enough to build a library. I hope to use yours for my next gig.
re: ASP.NET MVC Postbacks and HtmlHelper Controls ignoring Model Changes
March 31, 2015 @ 8:52pm | by Ben Gichamba

Thanks for this post. I had spent a couple of hours wondering what I was doing wrong.
re: Bitmap types, Binary Resources and Westwind.Globalization
March 28, 2015 @ 1:28am | by Rick Strahl

@Frank - yup totally agree. I don't believe it really makes sense to store binary resources, but in this case I have to support it since that's what Resx supports. You can also use this stuff outside of ASP.NET where you might need to have some resource access.
re: Bitmap types, Binary Resources and Westwind.Globalization
March 27, 2015 @ 10:01pm | by frank

It sounds like if someone needs a localized embedded image for a web app, the easiest thing for them to do is have a string resource that contains the data url. Then the processing concerns go away.
re: Azure VM Blues: Fighting a losing Performance Battle
March 25, 2015 @ 7:27pm | by nom-nom

We recently migrated a medium-ish sized system from two dedicated hosted servers at RackSpace over to Azure. We run a data api that serves 3k/6k (off-peak/peak) requests per minute - about 7M requests a day total to about 70,000 unique clients, some international.

We're having a lot of sporadic issues. In fact, as I write this one of the websites in our group has been returning 503's for 25 minutes for no discernible reason.

We're using about a dozen D-series cores of cloud service for our main API app. It does a fair bit of image generation using System.Drawing, GDI+, 3rd-party native libraries, etc. An A2 cloud service runs recurring jobs, with the Scheduler feeding a storage queue to initiate them. Another A2 hosts a few data-ingestion apps and FTP, and mounts a storage file share that's shared with the recurring job processor and the API. Another A2 cloud service handles periodic video encoding and image generation. A handful of websites run in one pool - our public storefront and a site just running ImageResizer off our blob storage, along with some internal tool sites. Plus a 13GB Redis cache.

Our main DB is currently a P3 because Azure has some bug where our database was failing over 10+ times a day and our apps would be unable to connect to our DB for 1-2 full minutes at a time, several times a day. We also use a P1 master and P2 active readable secondary DB.

I can't even begin to enumerate all the many little weird issues we experience, but the end result is that we've barely had a single day go by without all our klaxons blazing at least once. Service downtime is not an exceptional event, it's a matter of course.

There's also no effective way to get questions about these events answered, short of perhaps paying the $1000 a month support plan. Currently we submit tickets and get non-answers after 5-7 days.
re: A jquery-watch Plug-in for watching CSS styles and Attributes
March 25, 2015 @ 10:16am | by Rick Strahl

@Tibi - this was asked before and the behavior is by design. When a property changes you get passed the list of properties with their state so you can decide on what you need to address. There seems to be no need to raise multiple callbacks for each change because you're going to get exactly the same data with each of them.

If you want to handle multiple callbacks, go through the list of props and determine what needs to be done based on the values.
re: Using an alternate JSON Serializer in ASP.NET Web API
March 25, 2015 @ 10:13am | by Rick Strahl

This article refers to a pre-release version, and yes, JSON.NET is the default serializer now. However, this article still serves as a guide for replacing the default serializer with something else.
re: A jquery-watch Plug-in for watching CSS styles and Attributes
March 25, 2015 @ 4:17am | by Tibi Neagu

First off - amazing plugin! Really hits the nail on the head for a lot of us out here.

I've just started using it and noticed that if you add more than one watcher on the same element, only the first callback will be called.

Maybe I'm doing something wrong, or is this a limitation of the Mutation API?


P.S. I've also opened an issue on Github:
re: Using an alternate JSON Serializer in ASP.NET Web API
March 25, 2015 @ 12:21am | by Dennis


Have you updated this article lately? It looks like ASP.NET Web API implements JSON.NET already, or am I confused? Yeah, I know it is two years old and a lot has changed.
re: ASP.NET MVC, Localization and Westwind.Globalization for Db Resources
March 24, 2015 @ 11:49am | by Rick Strahl

@KA - The core library can be used completely outside of the ASP.NET context, so if you have a Windows (non-Web) app you can use database resources in it as well, via the ResourceManager or DbRes.

When running under ASP.NET with the Web package you get two project modes to run under: WebForms or Project. WebForms uses Local/Global Resource folders and naming conventions, while Project uses simple resource files and can be used with any .NET project and application.

All the front end stuff in the localization UI depends on the Web package and it assumes that ASP.NET is available, so if you run the Web Admin form HttpContext is always there. All the tooling that is used however is available in classes that you can call directly from your own applications/code.

For a WCF project you would just use the core library, which has no dependency on HttpContext - only two support classes in the core library rely on System.Web: DbRes (which has helpers that return HtmlString) and the various exporters, which default their paths to the web root *if* an HttpContext is available.

If you find other System.Web dependencies in the core library, please file an issue on GitHub and I'll take a look.

re: ASP.NET MVC, Localization and Westwind.Globalization for Db Resources
March 24, 2015 @ 5:35am | by KA

Hey Rick,

Really cool stuff, I am impressed.
I have one question: I saw that for retrieving resources you use HttpContext.GetGlobalResourceObject or local resources. Is there any other way to retrieve them? For example, if you use a WCF service over TCP, HttpContext is null.
I mean, is there some way to be protocol independent?

re: A dynamic RequireSsl Attribute for ASP.NET MVC
March 22, 2015 @ 1:27pm | by Tim

Hi Rick,

We use this little trick when decorating classes/methods:

#if !DEBUG
[RequireSsl]
#endif

public ActionResult MyAction()
For those not familiar, this uses the DEBUG conditional compilation constant set on the project configuration's Build tab ('Define DEBUG constant'). You can configure whether the constant is defined for each build profile.
re: Creating a dynamic, extensible C# Expando Object
March 20, 2015 @ 11:29am | by Chief

I am using the Expando object to facilitate ad hoc data queries in my application, where the returned data is the result of a SQL query built at run time. Not being tied to strong typing is so very VFP-like.

Thank you.
re: ASP.NET MVC, Localization and Westwind.Globalization for Db Resources
March 18, 2015 @ 10:28am | by Douglas Hammon

Good stuff. Looking forward to your post about SPA scenarios
re: Article: A low-level look at the ASP.NET Architecture
March 15, 2015 @ 3:04am | by Veverke

Excellent, must-read article. I share the philosophy that understanding the inner workings of things contributes greatly to further creations - besides the fact that seeing the big picture gives lots of satisfaction.

A must-read!
re: Azure VM Blues: Fighting a losing Performance Battle
March 14, 2015 @ 8:49am | by roger geisert

I had similar issues. Someone here mentioned setting up an affinity group, which I did. Now it's quite snappy.
re: Prefilling an SMS on Mobile Devices with the sms: Uri Scheme
March 14, 2015 @ 3:01am | by JĂșlio

This post helped me figure out how to get SMS URIs to work.

Here’s a little follow up:
re: Publish Individual Files to your Server in Visual Studio 2012.2
March 13, 2015 @ 2:38am | by Costin

It seems that extra settings in the publishing profile are not taken into account when publishing individual files.

In my publishing profile I have added some custom settings to minify CSS and JS files. When I publish the entire project, everything works as expected, but if I want to publish just the Scripts folder, the JS files don't get minified anymore.

Also, I have set up the profile to precompile source files, so almost everything ends up in the bin folder. However, there's no way to publish only that folder.
re: Using FiddlerCore to capture HTTP Requests with .NET
March 12, 2015 @ 9:40am | by Ira

Great job, thank you!
But I have one problem.
On my local machine Fiddler creates the certificate and the FiddlerCore API works perfectly.

But I also need to build my project on our TeamCity CI server, and there FiddlerCore cannot create the certificate. My code fails with this error message:
"System.IO.FileNotFoundException : Cannot locate: MakeCert.exe. Please move makecert.exe to the Fiddler installation directory.
at Fiddler.DefaultCertificateProvider.CreateCert(String sHostname, Boolean isRoot)
at Fiddler.DefaultCertificateProvider.CreateRootCertificate()".

In my project I have two DLLs: BCMakeCert.dll and CertMaker.dll.

And here is my method:

void InstallCertificate()
{
    if (!CertMaker.rootCertExists())
    {
        if (!CertMaker.createRootCert())
            throw new Exception("Unable to create certificate for FiddlerCore.");
        if (!CertMaker.trustRootCert())
            throw new Exception("Unable to trust certificate for FiddlerCore.");
    }

    X509Store certStore = new X509Store(StoreName.Root, StoreLocation.LocalMachine);
    // ... (rest of method elided)
}

What am I missing?

thank you a lot!
re: RequestValidation Changes in ASP.NET 4.0
March 11, 2015 @ 5:04pm | by fib(Littul)

Just encode with JavaScript?! That amounts to writing a massive encoder/decoder that will break just about every time the W3C changes things. Just try these examples:
xhr.open("POST", "page.aspx?val1=abc&val2=<d", false); // or true
or anything "<(letter)"...
or anything that has '&' in it...
The above is with RequestValidation in force.
Examples: if you encode '<d'... no go!
If you swap '<' or '&' with certain three-character sequences - and of course decoding code galore on the server side - wow, things can be made to work, but the code is horrendous. I don't know; show me what I am missing.
So, hypothetically, <div would become xxx, <p... zzz, etc. Which may fail in a porno context! lol
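For what it's worth, the standard encodeURIComponent function keeps the query string itself well-formed, so a raw '&' inside a value isn't misread as a separator. Note, though, that ASP.NET request validation runs on the *decoded* values, so an encoded "<d" still trips it; relaxing validation per the article (e.g. [ValidateInput(false)]) remains the server-side fix. A minimal sketch of the client-side half; buildQuery is a made-up helper:

```javascript
// Sketch: percent-encode query string keys and values so characters
// like '<' and '&' don't corrupt the query string structure.
function buildQuery(params) {
  return Object.keys(params)
    .map(function (k) {
      return encodeURIComponent(k) + '=' + encodeURIComponent(params[k]);
    })
    .join('&');
}
// buildQuery({ val1: 'abc', val2: '<d' }) → "val1=abc&val2=%3Cd"
```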
re: Web Browser Control – Specifying the IE Version
March 11, 2015 @ 12:22am | by Govindarajan

Hi guys,
I'm running a Windows service that renders HTML content in a WebBrowser control, takes a screenshot of the output, and saves it to a specified local directory. This works while I'm logged into the server, but once I log out I get a script error. I think the control still tries to find the registry entry under HKEY_CURRENT_USER.

I have tried the following:

HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION

Any ideas on this?

re: Cordova and Visual Studio CODE Magazine Article
March 06, 2015 @ 3:30am | by Rick Strahl

Thanks Dave. If you've already been doing Cordova development there's probably not much new here, except maybe the focus on Visual Studio, which rather impressed me with how easy it makes the development process, even for iOS.
re: Cordova and Visual Studio CODE Magazine Article
March 05, 2015 @ 8:27pm | by Dave Ward

Nice. I just pulled my first print copy of CODE magazine out of the mailbox this afternoon and thought that would be an interesting article to read since I've been doing a lot of Cordova work lately myself. Didn't realize you wrote it. I will definitely find some time to read it now.