Recent Comments


re: .NET 4.5 is an in-place replacement for .NET 4.0
Friday @ 5:54am | by Ole

Thanks for this explanation!

As much as I like working with C# and .NET, .NET 4.5 is causing major headaches in our web application projects, mainly ASP.NET MVC.

Some of our backend libs are still 4.0 since they're mainly used in WinForms applications where we don't want to force the clients to do a new download.

But as good as it is, Visual Studio 2013 is a pain when using ASP.NET MVC and you do NOT want to use 4.5, especially if you're using NuGet for all those nifty packages like Bootstrap, jQuery, log4net etc.

I sometimes don't understand why the VS and .NET teams don't have the developers in mind who are actually using their stuff. I would like to keep costs low, but I'm wasting my company's money on such issues.
re: ASP.NET Frameworks and Raw Throughput Performance
Friday @ 5:31am | by Andy

I've always referred to this article over the years in my various discussions of performance :) Any chance you might be able to revisit it for the new 2015 technologies? MS has put a lot of effort into performance since 2012.
re: Using FontAwesome Fonts for HTML Radio Buttons and Checkboxes
Friday @ 3:22am | by Rick Strahl

@Allen - thanks for catching the missing CSS. Fixed.

Not sure why you can't click the checkboxes. Does it work if you take the with-font off? Does the sample form work for you? I can't see a reason why it would work with the keyboard but not the mouse. The mouse target should work for both the label and the actual check box. I use the exact code you have in an actual application without the Angular bindings without problems. You might want to double-check the actual HTML that is rendered with dev tools and Inspect Element to ensure that there isn't something getting injected into the middle.

@Pawel - it works down to IE 9 which is the first IE version that partially supports CSS3 which is what makes this work.

re: Using FontAwesome Fonts for HTML Radio Buttons and Checkboxes
Friday @ 1:08am | by Allen

Great article.

I believe your consolidated css at the bottom does not contain the css to make the actual checkbox/radio button invisible.

Also when I use the following html:

<input name="rememberMe" type="checkbox" class="with-font" data-ng-model="vm.loginData.useRefreshTokens"><label for="rememberMe"> Remember me</label>

I am unable to click on the Font Awesome checkbox. I can use keyboard shortcuts to check and uncheck the checkbox, but not the mouse. Any ideas what I'm doing wrong? This is all in Chrome.

Cheers.
re: Using FontAwesome Fonts for HTML Radio Buttons and Checkboxes
Thursday @ 11:47pm | by Pawel

I wonder if it works well in old IEs?
re: Chrome DevTools Debugging Issues
Tuesday @ 12:53pm | by DNewb

I went to Settings and then Workspace. I had some old workspaces referenced, and once I cleared those out and restarted the browser the issues cleared up for me. It seems like it might be related to the source mapping features, as @Pilotbob pointed out.
re: A Localization Handler to serve ASP.NET Resources to JavaScript
Tuesday @ 12:32pm | by Rick Strahl

@Chris - Not sure. I think that should work, but you'll have to make sure the handlers are registered in the <system.web> section instead of <system.webServer>. I think you may have the configuration backwards? The httpHandlers section is for classic mode; the handlers section is for integrated mode.
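
For reference, the two registration styles look roughly like this - treat the type and assembly names below as placeholders for however the resource handler is actually registered in your project:

```xml
<!-- Classic mode: registered under system.web -->
<system.web>
  <httpHandlers>
    <add verb="*" path="JavascriptResourceHandler.axd"
         type="Westwind.Globalization.JavaScriptResourceHandler,Westwind.Globalization" />
  </httpHandlers>
</system.web>

<!-- Integrated mode: registered under system.webServer -->
<system.webServer>
  <handlers>
    <add name="JavaScriptResourceHandler" verb="*" path="JavascriptResourceHandler.axd"
         type="Westwind.Globalization.JavaScriptResourceHandler,Westwind.Globalization" />
  </handlers>
</system.webServer>
```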
re: Back to Basics: UTC and TimeZones in .NET Web Apps
Tuesday @ 6:34am | by MarcelDevG

Hi Rick,

I'm with you on the datetime offset/datetime issue. On the server I only want to deal with UTC dates.
But I wonder why you don't use the JavaScript method getTimezoneOffset() of a Date on the client to get the user's timezone?

Marcel
re: A Localization Handler to serve ASP.NET Resources to JavaScript
Tuesday @ 1:03am | by chris

Hey Rick, I really like your work. I'm trying to implement the JavaScriptResourceHandler using classic mode, but this doesn't work. The requests are answered with a PlatformNotSupportedException telling me that integrated pipeline mode has to be used. I followed the implementation guide in your post, but I was only able to make it work using integrated mode and adding the handler to the handlers section (not the httpHandlers section). Is classic mode not supported anymore?

Regards
Christoph
re: Azure VM Blues: Fighting a losing Performance Battle
February 23, 2015 @ 12:31pm | by Peter Seewald

We would love to use SQL Database (aka SQL Azure) but the feature set isn't there, and running a VM with SQL on it is overly expensive for our shop. Having all of our data and infrastructure in Azure would be way easier to manage and work with, but we ended up staying with an on-premise environment. We tested the performance of our VM versus Azure's VM for a similar setup and saw a similar pattern to what you saw with your older physical machine.

Bottom line is that if you can't move to Azure Websites and SQL Database in Azure, then it's not worth moving. The cost/performance of VMs in Azure for smaller companies/individuals is holding back quite a bit of migration to the cloud, at least in the Microsoft world. BizSpark is decent, but when the cost is so high it's hard to justify that kind of money.
Thanks!!
February 23, 2015 @ 1:57am | by Jack

Thank you for this.
I was tearing my hair out wondering why IE was mangling my site.
Adding main{display:block} fixed it!
re: Visual Studio 2013 'Could not evaluate Expression' Debugger Abnormality
February 22, 2015 @ 4:19pm | by RobinHood70

I'm currently getting the same issue with a property in an abstract class. The code is stupidly simple.

public abstract class TestBase<T> : TestBase
{
    protected T Output { get; set; }
}


and then the class using it is a simple:

public class LoginTest : TestBase<LoginOutput>
{
    ...
}


In this case, Output is not being evaluated in the debugger window when I hover over "this.Output" in the code. When I hover over "this", I get the window for that, and drilling down to Output I get the "Could not evaluate expression" error with a refresh icon. When I refresh, it tells me that it cannot convert LoginTest to TestBase<LoginOutput>.

If I add in a plain-and-simple backing field, the backing field gets evaluated fine.

So far, none of the solutions presented have helped in the slightest.
re: A Small Utility to Delete Files recursively by Date
February 20, 2015 @ 11:48am | by Rick Strahl

@Marcio - Any more specifics on the error? I run it on 64-bit here and on my server, so I'm pretty sure that works, unless there's a problem with anti-virus or other system protection software interfering.

If you have more info can you file an issue in the GitHub repo, please? Thanks.
re: A Small Utility to Delete Files recursively by Date
February 20, 2015 @ 10:30am | by Marcio

Hi Rick,

I tried using your utility (the binary) on Windows 8 64-bit, and Windows gave me a compatibility error about 64-bit versions of Windows.

Thanks anyway
re: A WebAPI Basic Authentication MessageHandler
February 20, 2015 @ 5:52am | by Jk

If I add a role to the principal:

var principal = new GenericPrincipal(identity, new string[] { role });

how can I authorize methods of the controller with that role?

Thank You

JK
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 20, 2015 @ 3:23am | by Andrei Ignat

I also wrote about JsonResult in MVC and the corresponding problems reading it in HTML. See:
http://msprogrammer.serviciipeweb.ro/2013/03/10/mvc-jsonresult-datetime-and-timezone/
re: ResourceProvider Localization Sample posted
February 20, 2015 @ 2:09am | by Rick Strahl

@Andreas - Resource managers and providers are case sensitive, so you have to match key names exactly, since they are based on dictionary lookups. If you turn the provider off you're getting the default values you have in controls, I'm guessing, but it's not actually using any resources at all. Even Resx resource names are case sensitive.

In the database, key case sensitivity depends on the database character set settings. If you're using SQL Server you can use a case-insensitive character set, which is the default.
re: ResourceProvider Localization Sample posted
February 20, 2015 @ 1:43am | by Andreas

Hi

The key names seem to be case sensitive. Some strings in our application only show the key names, and when we turn off the provider it all works. If we change the resource key name to match what is on the webpage, it works.

Is there a way to turn case sensitivity off?

Andreas
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 19, 2015 @ 3:12pm | by Rick Strahl

@James - I think I understand what DateTimeOffset does - it stores the DT *and* a timezone offset of when the data is captured for that particular tz. Makes sense. But for Web apps you NEVER capture time that way. You want to capture the date in UTC and the offset is irrelevant because it represents the server's time not the client's time.

Even if you DID store the offset from the original users timezone (which means you'd have to do the conversion up front because the server's not running that users timezone) you still get *only that timezone*. Not the timezone that a user of the date might want to see at a later time.

I guess I don't see how DTO helps if the time value's offset is fixed to a specific timezone when the application always adjusts to the user's timezone preference which mostly will not be the original DTO offset. You STILL have to do these conversions for EVERY user and if I do that then what does DTO buy me? Nothing except more storage required.

I also don't agree with this:

> The *only* time you need to do any conversion is strictly when displaying

because if you do certain datetime operations, like date queries that group by day or month or a less granular time increment, you have to adjust those queries for that timezone. Otherwise you're going to include the wrong time range. So there are a number of places where this matters - almost every date query with user input in particular, since those are typically done based on day ranges.
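
To illustrate the query adjustment described above, here's a minimal sketch (the timezone id and the field name in the comment are illustrative only): a "show me Feb 10" request from a Pacific-time user has to have its local day boundaries converted to UTC before filtering the UTC-stored dates:

```csharp
using System;

class DayRangeExample
{
    static void Main()
    {
        var tz = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");

        // The day the user asked for, expressed in *their* local time
        var localDay = new DateTime(2015, 2, 10, 0, 0, 0, DateTimeKind.Unspecified);

        // Convert the local day boundaries to UTC for the query
        DateTime utcStart = TimeZoneInfo.ConvertTimeToUtc(localDay, tz);
        DateTime utcEnd = TimeZoneInfo.ConvertTimeToUtc(localDay.AddDays(1), tz);

        // e.g. WHERE Entered >= @utcStart AND Entered < @utcEnd
        Console.WriteLine("{0:u} to {1:u}", utcStart, utcEnd);
    }
}
```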
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 18, 2015 @ 7:45am | by James Manning

@Rick - WRT "Since DTO only supports a single timezone - I first have to convert to that timezone to save" and "have to get the date into the right TimeZone for saving first" - those are not correct (and it's very unfortunate that those were the reasons you've avoided using it).

The whole point is that since the data type stores the offset with the datetime, then the same column can store any timezone. You don't have to do *any* timezone conversions at any point if you don't want. You get a DateTimeOffset from someone in Hawaii that's got a UTC offset of -10 and you can store it as-is and still happily allow Oregon people to store in -8/-7, EST people to store -5/-4, etc. If SQL Server forced developers/users to convert to a particular timezone for saving the datatype would have no benefit over just saving as datetime2 and telling people to store as UTC (certainly the best practice if you're stuck using datetime/datetime2).

The *only* time you need to do any conversion is strictly when displaying, and only if you want to display it in a different timezone than what it already is (for instance, displaying it in the user's timezone regardless of what the originating timezone was). You can do comparisons, queries, etc all without having to do any conversions.

In your enter-in-Hawaii/display-in-Oregon scenario, with datetime or datetime2, you would typically convert twice, once on the write path to convert to UTC for storage (since your storage has no support to encode the UTC offset, so you would either encode it as a separate column, or more likely, store the version with no offset), then again on the read path to convert the UTC to Oregon.

If you use datetimeoffset instead, you don't need to convert on your write path at all. It comes in as offset -10 from your Hawaii person, -8 half the time from your Oregon people, -7 the rest of the time, etc. and you just store it like that. On your read path, you can display it with the original offset if you want (something that's not really an option if you forced conversion to UTC on the write path), or convert it to local time (just like you would in the datetime/datetime2 scenario)
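
The store-as-entered behavior described above can be sketched like this - two DateTimeOffset values entered with different offsets still compare as the same instant, so no write-path conversion is required:

```csharp
using System;

class DtoExample
{
    static void Main()
    {
        // 10:00 in Hawaii (-10) and 12:00 in Oregon (-8) are the same moment
        var hawaii = new DateTimeOffset(2015, 2, 18, 10, 0, 0, TimeSpan.FromHours(-10));
        var oregon = new DateTimeOffset(2015, 2, 18, 12, 0, 0, TimeSpan.FromHours(-8));

        Console.WriteLine(hawaii == oregon);   // True - equality compares the instant
        Console.WriteLine(hawaii.UtcDateTime); // the common UTC point in time
        Console.WriteLine(hawaii.DateTime);    // the original local wall time
    }
}
```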

I think Bart Duncan said it best, IMHO :)

http://blogs.msdn.com/b/bartd/archive/2009/03/31/the-death-of-datetime.aspx

***
When should you use datetimeoffset instead of datetime? The answer is: you should almost always use datetimeoffset. I’ll make the claim that there is only a single case where datetime is clearly the best data type for the job, and that’s when you actually require an ambiguous time
***
re: Azure VM Blues: Fighting a losing Performance Battle
February 17, 2015 @ 1:16pm | by Rick Strahl

@Mark - yeah I'm using my MSDN account to experiment with this stuff. After all that is what it's supposed to be for. I sure hope they're not throttling MSDN accounts - if they are it's a great way to ensure people won't use Azure because the performance is so terrible :-)

It's interesting to hear responses here that mostly seem to concur on the abysmal performance, but a few here and there seem to suggest that performance is just fine.

Just to clarify: when I re-installed new VMs more recently I've had better luck with performance, and at least the RDP performance is 'usable'. It's better, but overall, with load tests, even these newer installs have been very slow compared to other providers I've tried with smaller (and much cheaper) server configurations.
re: Azure VM Blues: Fighting a losing Performance Battle
February 17, 2015 @ 3:53am | by Mark Randle

Hi Rick

I noticed similar results, especially the extra-slow response when doing anything through RDP.

I did think it was the server setup. However, I tried Azure previously using a standard try-it-for-30-days offer. When I did, response was good - definitely no RDP lag - and comparable to my current dedicated server hires.

I do have a theory though - I, like you, have credit through Visual Studio, and this time I am using it. Surely Microsoft wouldn't throttle this back since we are effectively getting free credit, would they???

Azure was in my migration plan to get away from leasing physical servers, but I'm not so sure now - maybe I should try again with a public (not free) account!
re: Azure VM Blues: Fighting a losing Performance Battle
February 16, 2015 @ 3:41am | by Rick Strahl

@washu - thanks for the detailed feedback.

As to D2: D2 did improve performance significantly, but I wonder if that's simply due to the Windows swap file sitting on the SSD drive. Sounds like the CPUs are also better.

This is my general complaint with VM-based hosting - it's impossible to compare these services, and often even the different levels, because the CPU counts and GHz numbers are nearly useless; it depends on the type and vintage of CPU used. As you say, Azure's specs look good on paper, but when compared to other providers that appear the same, it lags behind terribly.
re: Azure VM Blues: Fighting a losing Performance Battle
February 15, 2015 @ 8:11pm | by washu

@Giedrius - As per my previous comment I don't want to sound like I am supporting Amazon EC2, other than it is much better than Azure. This is just my experience with using both platforms.

Amazon has two instance types that have CPU throttling, t1 and t2. How t1 instances throttle is not published, but generally the overall performance is low. T2 instances have very specific and documented throttling parameters based on a credit system and you can view your credits in the AWS console. I have found t2 instances to work very well for "bursty" workloads that otherwise are mostly idle. All other instance types in EC2 are not throttled.

Azure has the A0 instance which is comparable to the EC2 t1 instance and is the only type that is throttled on Azure. The A0 is a low performance shared core instance. All the instances that Rick tested are not throttled. The performance is consistent, just slow. The A series instances above A0 have very slow and out of date AMD CPUs, but the bigger issue is that all Azure instances have absolutely abysmal I/O performance. I have personally done tests on Azure where I was getting seconds per IOP, not IOPS. This is why Rick still had performance issues after switching to a D2. The CPU in the D series is ok, but the disk I/O is still terrible. Note, the SSDs in the D series only refers to the local temporary drives. Unless you are specifically using them they make no difference. Even the A series have local mechanical temporary drives that are much faster than non-local drives. Also, bigger instances do give more sequential disk performance on Azure (and EC2) but since IOPS are usually more important it doesn't help much.

In my experience the current generation EC2 instances (M3, C3, C4, R3, I2) have much higher CPU performance than the Azure A series and are still a bit faster than the D series. On I/O performance EC2 is night and day faster no matter what the instance type. Again, this is not a recommendation to use Amazon, just a warning to not use Azure. You will find most of the other major providers are closer to Amazon than Azure in performance.
re: Web Browser Control – Specifying the IE Version
February 14, 2015 @ 8:57am | by Hilaly Hassan

I tried to use an embedded web browser control to run a CSS web menu under Win XP + IE8, but in vain, even though I modified the registry key as shown.

I added this DWORD key:
HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION

with:
key name = vfp9.exe, decimal value = 8000

But nothing happens.
Note: this CSS menu works well under Win7 + IE11.

Need help.
Thanks
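
For reference, the FEATURE_BROWSER_EMULATION value discussed above can also be written from code - a rough sketch, with the exe name and mode value taken from this particular setup:

```csharp
using Microsoft.Win32;

class BrowserEmulation
{
    static void Main()
    {
        // 8000 = IE8 standards mode; the value name must match the hosting exe
        Registry.SetValue(
            @"HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION",
            "vfp9.exe",
            8000,
            RegistryValueKind.DWord);
    }
}
```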
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 13, 2015 @ 2:13am | by Luke Puplett

I tend to write more web services than web sites, though they work the same way, so the same logic - and the same problems - should apply.

Web services tend to use date-time formats that capture and serialize the local time and the offset from UTC, and no other time-zone information.

This is stored as DATETIMEOFFSET in MSSQL - indeed, EF for a while removed support for the old DateTime, forcing DTO!

This captures a moment in time with all the information needed to show what a person thought the time was, historically - even if their local daylight saving time rules are later changed, as happened in the USA. I remember the impact on IT when working on a trading floor in 2005:

http://en.wikipedia.org/wiki/Daylight_saving_time_in_the_United_States#2005_revision_to_dates_of_observance

The problem with not persisting the offset is that the programmer is at the mercy of the current algorithm in whatever library when retroactively applying it to a historic timestamp. Will the algorithm know that daylight saving time was different back then?

As you can imagine, this is very important legally. I believe this is why ISO 8601/RFC 3339 has come to be the standard for date exchange and persistence.

....I think. Seriously, it's one of those mind-boggling problems that invokes a fair bit of self-doubt.

In any case, Microsoft is de-emphasising DateTime in favour of DateTimeOffset, probably for similar reasons. Used correctly it is, I guess, "non-destructive".

Thanks for the blog post, it's a useful tool for rationalizing and scenario planning.

All the best,

Luke
re: Using Cordova and Visual Studio to build iOS Mobile Apps
February 12, 2015 @ 12:15pm | by Ian

Rick - how are you managing builds for multiple environments? I find myself having to change the Display Name in config.xml to get multiple application versions, such as MyApp Dev, MyApp QA, and just MyApp (production).
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 11, 2015 @ 8:22pm | by Jeremy

Funny timing - I have been globalizing a vacation tracking system for the past week and am using NodaTime. I first tried the built-in .NET classes you describe above, but I found the list of available time zones (which I believe are read from the server registry) to be terribly lacking and insufficient for my project.

Further research led me to the tz database of IANA/Olson time zones, which was perfect for what I was trying to accomplish. A little more research and I happened upon NodaTime. Within an hour I had it working with an external tz database. This is awesome because I can automate the updates to the db.

Once I got my app working with NodaTime, I spent the majority of my time analyzing the dates and datetime values in the system to determine how best to store them. This portion of the project seems to require the most thought and is still ongoing.

I was surprised to see this blog post today in my inbox and was glad to see that you mentioned NodaTime. Good luck everyone with your projects, and keep up the helpful blog posts.
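
The tz-database lookup described above looks roughly like this in NodaTime (the 1.x-era API, so treat the member names as approximate):

```csharp
using System;
using NodaTime;

class NodaExample
{
    static void Main()
    {
        // IANA/Olson ids rather than Windows registry ids
        DateTimeZone zone = DateTimeZoneProviders.Tzdb["Pacific/Honolulu"];

        Instant now = SystemClock.Instance.Now;
        ZonedDateTime local = now.InZone(zone);

        Console.WriteLine(local);
    }
}
```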
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 11, 2015 @ 2:49pm | by Rick Strahl

@James, @Johan - DateTimeOffset is not all that useful in server applications where the timezone is fluid. If I enter a date in Hawaii, then later go to Oregon and want to see the date in Oregon timezone format, I still have to do conversions. Since DTO only supports a single timezone, I first have to convert to that timezone to save as well, so I have up-front work to do. You also have to get the date into the right timezone for saving first, since the server will capture the date using server local time. So you end up doing the timezone conversion up front. So in a fluid Web date environment where users are not statically tied to a timezone, DateTimeOffset buys very little IMHO.

To be honest I've not used DateTimeOffset much at all, because of the above issues, but maybe I'm missing something obvious :-)

@Johan - yes, client-side data display is easier if you push data down as UTC with JSON and then just display the data, which is nice since the browser knows what timezone to use. But that only addresses one of the scenarios - and basically the easiest one (ie. calling user.GetUserTime()). It doesn't address querying on the server, and it doesn't address grouping or assignment for display values that are often pushed down from views on the server (I tend to pre-format dates as strings on the server in my view objects).

re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 11, 2015 @ 1:59pm | by Johan

As James says above, DateTimeOffset handles the server-side storage for you; client-side you can just stick with the Date object to handle UTC to local time both ways, especially if you use an SPA and do all the binding client side - this makes timezone handling trivial. Convert input from the client to UTC as early as possible and present it back to the client as local time as late as possible; no need to complicate things server side.
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 11, 2015 @ 1:00pm | by Martin Enzelsberger

»[NodaTime] is a little more complex [...] because it forces you to think about what kind of date you are dealing with«

I'm pretty sure that's not a negative.
To me this sentence sounds like »catch blocks are great, but they are a little more complex because they force you to think about what to do when an exception occurs«.

You *should* be thinking about it, so it's a good thing that NodaTime forces you to do so. To me it's more of an advantage than a disadvantage (if you're dealing with time zones).
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 11, 2015 @ 10:06am | by James Manning

"In the end if you pull data out of the database you still have to somehow figure out what timezone is used though."

Nope, that's why the datetimeoffset data type was introduced in SQL Server (linked to in that SO thread, even)

https://msdn.microsoft.com/en-us/library/bb630289.aspx

DateTimeOffset keeps the DateTime along with the UTC offset, which datetimeoffset does in SQL Server as well. It's simpler to use that end-to-end instead of having to remember to convert to and from UTC when dealing with the database.

If you're stuck with a legacy situation where only datetime/datetime2 is supported, then yes, storing as UTC is definitely the way to go. If you're using SQL 2008 or later, though, IMHO you should definitely be using datetimeoffset instead of dealing with timezone conversion to and from UTC.
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 11, 2015 @ 3:03am | by Rick Strahl

@Tom - DateTimeOffset is useful as it provides built-in support for the Offset from Utc, but it doesn't really help simplify the issues addressed in this post. I think DateTimeOffset would be very useful in applications that run in many different physical locations where you both need local dates for user display in a fixed location and Utc dates for storage to the database. In the end if you pull data out of the database you still have to somehow figure out what timezone is used though.

There's a great entry on SO that compares DateTime and DateTimeOffset:
http://stackoverflow.com/questions/4331189/datetime-vs-datetimeoffset
re: Azure VM Blues: Fighting a losing Performance Battle
February 11, 2015 @ 12:11am | by Giedrius

It would be really interesting to see how Amazon EC2 performs in this context. Btw, EC2 had CPU throttling on smaller instances - if you were hitting the CPU too hard, it would limit CPU performance for some time. Could it be that Azure has something similar?
I've also tried some smaller cloud providers in the past and I can say that I had a terrible experience with them. EC2 support is not brilliant either, but what works, works quite well.
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 11, 2015 @ 12:03am | by Tom Deleu

What about DateTimeOffset? Where does that fit in, in your opinion?
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 10, 2015 @ 6:10pm | by Rick Strahl

NodaTime is excellent, but it does require a full commitment in an application, which is often not possible, especially in existing applications. I've added a few notes to the post. Thanks for the reminder.
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 10, 2015 @ 11:08am | by Michael Weinand

Any thoughts on http://nodatime.org/? We've used that in the past with success to make time zones a bit easier to deal with.
re: Back to Basics: UTC and TimeZones in .NET Web Apps
February 10, 2015 @ 9:23am | by Chris Hynes

NodaTime takes a lot of the pain out of dealing with time zones: http://nodatime.org/.

Instead of querying for local dates in memory, with EF you can define custom SQL functions like this: http://blog.3d-logic.com/2014/08/11/the-beta-version-of-store-functions-for-entityframework-6-1-1-code-first-available/. NHibernate has a similar capability.
re: Azure VM Blues: Fighting a losing Performance Battle
February 09, 2015 @ 2:18pm | by Rick Strahl

@washu - it certainly looks that way. I stood up a couple of VMs on Vultr and perf is nearly the same as my physical machine and costs a fraction of what Azure charges for the same setup that doesn't perform half as well. Azure is just not even close to competitive on performance in that comparison. I have to say I did not expect that when I looked at Azure - I figured it would be top of the line experience in terms of performance.

Microsoft better get their shit together on this - great infrastructure, which they certainly have, is not enough to make up for bad performance. After all bad performance means you end up overpaying for resources you otherwise wouldn't need.

re: Azure VM Blues: Fighting a losing Performance Battle
February 09, 2015 @ 8:43am | by Mika Kukkonen

I did a similar study two years ago with Azure North Europe Region.
http://www.flowmarks.com/2013/02/migrating-web-application-to-azure-vm.html

My tests focused on average page load times rather than throughput.
I measured page load times for a single user to see if the best-case performance in Azure was competitive with an existing system.

Results were similar: the physical server (a 2006 low-end machine) generally outperformed Azure.

I concluded that because the application made many requests per page, round-tripping between Helsinki and Dublin (2000 km one way) was the main cause of worse load times. There might have been other Azure-related latencies, but if the bottleneck was the database, Azure was actually faster.

Network latency could also explain your problems with RDP.

But overall the Azure performance was so unreliable that in the end we decided to buy new physical servers to replace the old ones. This was a very good investment, because low-end server performance had improved dramatically. A low end server today would be something like a 4-core Xeon, 64GB RAM and a 256GB SSD Raid-1.

At the time, Azure prices were competitive with other VM providers in North Europe, but I haven't checked lately.
re: Chrome DevTools Debugging Issues
February 09, 2015 @ 3:34am | by Hani

I've been facing the same problem for a month now. My current version is "40.0.2214.111 m". Chrome was my favorite browser for browsing and debugging, but a lot of strange behavior is making the work very hard, especially debugging. No more linkable style sheets in Developer Tools! Very bad performance with jQuery plugins such as Accordion.

Sadly, I might move to another browser such as Firefox!

I feel that somebody inside Google is doing bad things to Google!
re: ASP.NET Web API and Simple Value Parameters from POSTed data
February 09, 2015 @ 3:12am | by vicky

Thanks for your quick reply Rick, and sorry for the delay in my response.
So with the above example I can send XML, JSON or raw content in the body? And why is the return type async - can we return anything from the method?

Regards,
Vicky
re: Azure VM Blues: Fighting a losing Performance Battle
February 08, 2015 @ 4:19pm | by washu

As someone who has used many of the big cloud providers all I can say is this: don't use Azure. I'm not going to suggest who you should use because it really does not matter, they are almost all better and faster than Azure. Try someone else, anyone else and you will see how bad Azure really is.
re: Using Cordova and Visual Studio to build iOS Mobile Apps
February 08, 2015 @ 9:15am | by Cristian

Hey,

Do you know if it's possible to have the desktop web browser as a platform in a Cordova project?
re: Using Cordova and Visual Studio to build iOS Mobile Apps
February 06, 2015 @ 2:03pm | by Rick Strahl

@Zakaria - You can most certainly build an HTML front end application with a FoxPro backend that acts as a service. In fact, this very same example is one that ships with Web Connection here:

http://west-wind.com/wconnect/musicstore/#/albums

This could be ported to run on a phone easily because it's pure HTML on the front end with a FoxPro Web Connection service backend. (http://west-wind.com/webconnection)
re: Using Cordova and Visual Studio to build iOS Mobile Apps
February 06, 2015 @ 1:32pm | by Zakaria

Hi Rick,

I'm wondering whether this porting to phone applications could also be applied to Web Connection FoxPro web applications?
re: WCF WS-Security and WSE Nonce Authentication
February 06, 2015 @ 8:20am | by Francesco

Hi,

First of all, thank you so much for your post - it's very helpful.

I have a problem with your solution in the post message: writer.WriteRaw doesn't append the UsernameToken tag. But if I add <o:Security><o:UsernameToken>...</o:UsernameToken></o:Security> myself, writer.WriteRaw works properly and the message contains two Security tags, like this:

<o:Security><o:Security><o:UsernameToken>...</o:UsernameToken></o:Security></o:Security>

Do you know how I can resolve the problem? I'm using VB.NET instead of C#.

Thank you so much,
Francesco
re: Filtering List Data with a jQuery-searchFilter Plugin
February 06, 2015 @ 5:46am | by Nevin House

Thanks for the post. Yes, I'm one of those people that still finds jQuery quite useful, and it's interesting that despite the hype and popularity of AngularJS, jQuery is indeed still relevant - and both AngularJS and jQuery are being rewritten into new libraries today, in 2015!

My question is: in your humble opinion, if you had to create a website using only one JavaScript file, would you use AngularJS or jQuery (yes, I know you'd rather write your own), and what version of that file would you use?
re: Azure VM Blues: Fighting a losing Performance Battle
February 03, 2015 @ 8:27pm | by Max

Would filling out the Azure penetration testing whitelist form help with this?
https://security-forms.azure.com/penetration-testing
re: Web Browser Control – Specifying the IE Version
February 03, 2015 @ 12:18pm | by Grant Edwards

Brilliant!

After spending a couple of weeks working on the web pages in an embedded device, I was pretty happy with the results -- I had even added some extra code to get things to work as far back as IE8 (which is as far back as we're willing to support). Then I fired up a Windows app that ships with the device and contains an embedded IE control. Of course, it looked like a disaster. For a while I thought I was going to have to start all over again. Then Google led me to your blog, and my problem vanished.

Many thanks...