Interesting topic! I am experimenting with this too. I have two WPF TabControls next to each other (like a split view). My goal is to know which tab is currently active. SelectedItem does not change if the user switches to a tab in the other control. A tab is considered 'active' if a child control (e.g. a TextBox) has user focus.
Maybe you have an idea how I could solve this problem?
@Isaac - hopefully in the future Microsoft will make this easier and the installers will 'just work' as intended so we don't need all of this rigamarole.
Great post, thank you for sharing. I have a Windows ARM developer kit that I keep intending to set something up on, and running a SQL Server instance on it would be a great use case for my home lab. I am (hopefully!) starting a job as a SQL Server Database Administrator soon, so doing this kind of cutting-edge stuff will be valuable for keeping my skillset sharp.
@Steven - oh I like that! Adding to my helper library! Thanks!
Ah, the old "navigate the WPF tree" technique. One thing I've found useful here is to define enumerable primitives for different "axes" (visual/logical tree, ancestors/descendants, whatever) and then you can query the WPF tree using LINQ:
public static IEnumerable<DependencyObject> SelfAndAncestors(this DependencyObject currentControl)
{
    // Walk up the visual tree, starting with the control itself
    while (currentControl != null)
    {
        yield return currentControl;
        currentControl = VisualTreeHelper.GetParent(currentControl);
    }
}
Once you have simple helper methods like the above, then you can query it in any way you want:
var tab = src.SelfAndAncestors().OfType<MetroTabItem>().FirstOrDefault();
That way you have an "ancestor axis" that you can traverse, and the consuming code is more explicit - it's clear that it's looking for the first MetroTabItem and returns null if none is found (without having to check the XML documentation for FindAncestor).
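A matching 'descendants axis' can be built the same way - here's a quick sketch (breadth-first, visual tree only; the TextBox query at the end is just an example for the focus scenario above):

// requires System.Windows.Media.VisualTreeHelper (and System.Linq for the query below)
public static IEnumerable<DependencyObject> Descendants(this DependencyObject root)
{
    // Breadth-first walk down the visual tree
    var queue = new Queue<DependencyObject>();
    queue.Enqueue(root);
    while (queue.Count > 0)
    {
        var current = queue.Dequeue();
        int count = VisualTreeHelper.GetChildrenCount(current);
        for (int i = 0; i < count; i++)
        {
            var child = VisualTreeHelper.GetChild(current, i);
            yield return child;
            queue.Enqueue(child);
        }
    }
}

// e.g. find the focused TextBox under a given tab:
// var focused = tabItem.Descendants().OfType<TextBox>().FirstOrDefault(t => t.IsKeyboardFocused);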
Nope - it's not working for me. I can't connect with server=(localDb)\MsSqlLocalDb. It always just hangs before failing.
@Rick: So the auto instance feature of LocalDB does not work on ARM?
@Stephen - going to try the M1 scripts later today. I started the install earlier and the installer ran, so it looks like it likely works.
@Ralph - I wouldn't mind LocalDb, except for the fact that TCP/IP is not working and a new named pipe has to be used for every restart of the server. That requires you start it up manually.
@Duncan - thanks for the direct links - adding to the Resources of the post.
You can download the 2022 (v16) SqlLocalDb .msi directly from here: https://download.microsoft.com/download/3/8/d/38de7036-2433-4207-8eae-06e247e17b25/SqlLocalDB.msi - how I found the direct download is documented here: https://blog.dotsmart.net/2022/11/24/sql-server-2022-localdb-download/
In my opinion LocalDB is the better SQL Server for developers! The DB driver starts the instance automatically (no manual start necessary), and when it's not in use it doesn't run at all 🙂
You don't even have to use named pipes...
Server=(localdb)\mssqllocaldb;Database=...;Integrated Security=......
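For example, roughly like this (a sketch - assumes the Microsoft.Data.SqlClient package and the default MSSQLLocalDB instance):

using Microsoft.Data.SqlClient;

var connStr = @"Server=(localdb)\MSSQLLocalDB;Database=master;Integrated Security=true";

using var conn = new SqlConnection(connStr);
await conn.OpenAsync();   // first open may take a moment while LocalDB spins up the instance

using var cmd = new SqlCommand("SELECT @@VERSION", conn);
Console.WriteLine(await cmd.ExecuteScalarAsync());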
Have you tried the scripts here? https://github.com/jimm98y/MSSQLEXPRESS-M1-Install/tree/main
I was able to install the latest SQL Server on my Mac (Windows 11 ARM on Parallels) using this method just fine!
@Thomas - yeah choices are good. Thanks for the feedback - I added a small section in the post to clarify that the log output provides this.
@Rick It's probably a matter of preference. But I really like having the option to enable/disable information like this by changing log levels for specific log categories. It requires you to update the appsettings.json file or create environment variables, which kind of sucks. But it's very powerful.
Hey Rick, where is part 3? Great resource!
@Thomas, that's not wrong - except when logging is off. I tend to run without Information-level log output, and in production that is off by default.
But - you're right - I didn't think of that when we were hunting for ports 🙂
Maybe I misunderstood something. But can't you just enable this through logging? ASP.NET Core outputs the URL and port it is hosted on through ILogger. If you enable info logging on the Microsoft.Hosting.Lifetime category it outputs log lines like these:
Now listening on: https://localhost:7028
Now listening on: http://localhost:5010
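The relevant appsettings.json section looks roughly like this (the Microsoft.Hosting.Lifetime category set to Information is the important part):

{
  "Logging": {
    "LogLevel": {
      "Default": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  }
}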
@lytico - cool. I had no idea about IKVM - that might come in handy for other things too. Thanks!
You can use PlantUML natively in .NET via IKVM. The latest IKVM releases support that, and you can generate the images in-process without a server.
The 260-character path limit? Yeah, it can be a real headache. I tried the LongPath Tool program, which helped a lot.
You can bypass this for PDF generation, but it requires a different mechanism. See this post:
Programmatic Html to PDF Generation using the WebView2 Control and .NET
Hi, would the use of CoreWebView2Controller allow us to set up and render WebView2 with NavigateToString without needing to deal with the visibility issue? Or does the same visibility issue exist? I only need to print from the content loaded via NavigateToString. Thanks.
Thanks Matt! That was my answer!! This has been bugging me for ages...
Has anyone noticed WebView2 hanging while running in batch mode?
Dude this is freaking awesome!!!
Thanks for posting all of the WebView2 content, Rick. It has been super helpful. As a Minnesotan I've gotta say I'm a little jealous of your location...
Nice!!! Thanks Bro! It worked flawlessly. You Are Da MAN!!!
We encounter the same issue on a regular basis as well. However, most of the time, especially for new package versions, clearing the http-cache is sufficient.
dotnet nuget locals --clear http-cache
Compared to clearing all the local caches, this significantly reduces the time of the next nuget restore. 🙂
With Git 2.46 you can use * in folder paths, so in your case:
[safe]
    directory = d:/projects/*
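Or equivalently from the command line (this writes the same entry to your global config):

git config --global --add safe.directory "d:/projects/*"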
Checking in from 2024. This is still useful, solved my problem!
I did read the article. WhenAny is a form of observation. Granted, you didn't cancel the delay (wasting resources) or ensure the completion was observed, but you did observe the delay. Not observing it at all would be pointless, no?
But, we're both being pedantic. I think we both mostly agree with each other. I'll continue to be hyperbolic, as I think it's safer, even if not correct. 🙂
@William - Did you not read the post? 🙂 The timeout uses a Task.Delay() with WhenAny() to time out. If the operation completes before the delay, the Task.Delay() keeps running.
I think we agree on the principle of trying to make tasks observed. I just hate absolute 'thou shalt not' rules because they are rarely appropriate for all scenarios. As you point out, it's better to make sure all tasks are somehow awaited/continued, and to use something like .FireAndForget()-style continuations to ensure that non-observed tasks run to completion and that exceptions are properly handled.
@Rick, what's the point of a Task.Delay if you don't observe it? I take your point, though. There are some operations that simply won't cause problems if you terminate them in the middle of processing. However, I will say it's bad design to create and not observe such tasks (there's a reason we have cooperative cancellation and why things like Thread.Abort are considered dangerous). Can you get away with such code? Sure. Is it a good idea? In general, NO. I shouldn't speak in absolutes, but in this case I do so for a reason. I've seen far too many cases of "fire and forget" that can be disastrous when the application shuts down for any reason. Rather than talk about the nuances, corner cases and alternative designs, it's easier to talk in absolutes.
@William - I'm not sure that an absolute 'All Tasks need to be observed' is a necessary requirement.
While I'm with you on making sure that any critical tasks that can potentially cause instability or lock-ups should be observed, surely letting a Task.Delay() run without completion is not going to break anything. Likewise, waiting on a UI operation that may never complete is not something you can await indefinitely, especially since most of those don't have a CancellationToken you can cancel on. Precious few operations outside of the core framework offer CancellationToken support, likely because it's an awkward implementation.
FWIW, in most of my applications I use a .FireAndForget() extension method to 'await' tasks and ignore the results when not explicitly awaiting in mainline code.
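Roughly something like this (a simplified sketch, not the exact implementation - error handling is just a callback here):

public static class TaskExtensions
{
    // Attach a continuation so the task is always observed;
    // exceptions are routed to the callback instead of going unobserved.
    public static async void FireAndForget(this Task task, Action<Exception> onError = null)
    {
        try
        {
            await task.ConfigureAwait(false);
        }
        catch (Exception ex)
        {
            onError?.Invoke(ex);
        }
    }
}

// usage: SomeOperationAsync().FireAndForget(ex => Console.WriteLine(ex));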
@Andrew, the corollary is true as well... just because you don't wait for a task does not mean it completes. There's huge potential for bugs with "fire and forget" tasks. If a task is running and the process ends, the "task" will be prematurely ended, likely in the middle of some critical operation such as writing to disk, corrupting state. All tasks should be "observed" to complete, always.
It's always been possible to implement a timeout using a CancellationTokenSource, Task.WaitAsync just makes it easier. https://learn.microsoft.com/en-us/dotnet/csharp/asynchronous-programming/cancel-async-tasks-after-a-period-of-time
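For comparison, the WaitAsync() version of a timeout looks roughly like this (a sketch; requires .NET 6+, and SomeOperationAsync is a placeholder):

try
{
    // Throws TimeoutException if the task doesn't complete within 2 seconds
    var result = await SomeOperationAsync().WaitAsync(TimeSpan.FromSeconds(2));
}
catch (TimeoutException)
{
    // handle the timeout
}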
Hi, I recently had to deal with something similar; here are some links of interest:
https://devblogs.microsoft.com/oldnewthing/20220505-00/?p=106585 (Raymond Chen, Microsoft)
https://devblogs.microsoft.com/pfxteam/crafting-a-task-timeoutafter-method/
https://learn.microsoft.com/en-us/dotnet/csharp/asynchronous-programming/cancel-async-tasks-after-a-period-of-time
This is a little more old school (C# 7, Framework 4.6)... shouldn't this do similarly?
using (CancellationTokenSource cts = new CancellationTokenSource(TimeSpan.FromSeconds(1.5)))
{
    try
    {
        await SomeOperation(cts.Token).ConfigureAwait(false);
    }
    catch (OperationCanceledException) when (cts.IsCancellationRequested)
    {
        // Timeout occurred, do what you might need to do.
    }
}

private async Task SomeOperation(CancellationToken token = default(CancellationToken))
{
    while (true)
    {
        // Cooperatively bail out when the passed token is cancelled
        if (token.IsCancellationRequested)
        {
            throw new OperationCanceledException(token);
        }
        // Do something async.
    }
}
Dear Rick,
I am most grateful for your post and your code. I had been trying hard to make WebView2 print my HTML without showing it, but without success; I was getting only a blank page. Now I have been able to adapt your code to my problem, and it worked like a charm!
Thank you so much for sharing your knowledge.
Regards, Rafael
Came to point out WaitAsync(), but I guess someone got there first 🙂 It was actually introduced in .NET 6, so it's more widely available 🙂 I wrote a post about it and how it's implemented here: https://andrewlock.net/a-deep-dive-into-the-new-task-waitasync-api-in-dotnet-6/
Maybe also worth pointing out that just because you stopped waiting for the task doesn't mean it stops running in the background: https://andrewlock.net/just-because-you-stopped-waiting-for-it-doesnt-mean-the-task-stopped-running/
Yes, I have been using ExecuteScriptAsync(script) where script="setItem('tmDoc', <div class='itm'>text</div>)";
This is a partial string; the original is large. My 1st param is the html id, the 2nd param is the html. I have never encoded it - why do I need to? My setItem is:
function setItem(id, docStr) { try { ... } catch (e) { showError(e); } }
I wonder what the string size limit of this parameter is? Also, what do you mean by this statement: "you end up doing most of the work with JavaScript either in the original document and calling into it (preferred if possible)"?
@Byron - well, you can kinda do that with ExecuteScriptAsync() but it's definitely more cumbersome as you have to encode any string data you send and you can't pass object references back to .NET - everything has to happen in the DOM itself.
The big difference is that the control doesn't expose the DOM in the same way the WebBrowser control did - there's no direct connection between the control and the DOM. The only way to interact with the DOM is via script code. Once you know how this works you can do this relatively easily, but you end up doing most of the work with JavaScript either in the original document and calling into it (preferred if possible) or scripting via ExecuteScriptAsync().
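To make the encoding part concrete, here's a rough sketch (assumes a setItem() function already exists in the loaded page, as above, and that webView is the WebView2 control instance):

using System.Text.Json;

string html = "<div class='itm'>text</div>";

// JsonSerializer.Serialize() turns the raw HTML into a quoted, escaped
// JavaScript string literal, so quotes and newlines can't break the script
string script = $"setItem('tmDoc', {JsonSerializer.Serialize(html)})";

await webView.CoreWebView2.ExecuteScriptAsync(script);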
I wish they would at least allow element.innerHTML = someHtmlString. I used it all the time in the old IE browser component; still waiting for it.
Thanks @Majkimester - totally missed that this is available, and it would probably work. According to the docs though, the preferable way is to let the data source manage the async load, so the workarounds are a good choice regardless.
You can also do non-blocking loading of the image with an async binding:
See also: https://learn.microsoft.com/en-us/dotnet/api/system.windows.data.binding.isasync?view=windowsdesktop-8.0
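In code it's just the IsAsync flag on the binding (a sketch; MyImageSource is a placeholder view model property and image is the target Image element):

// the XAML equivalent is Source="{Binding MyImageSource, IsAsync=True}"
var binding = new System.Windows.Data.Binding("MyImageSource")
{
    IsAsync = true   // resolve the bound value on a background thread
};
image.SetBinding(System.Windows.Controls.Image.SourceProperty, binding);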
.gitinfo is a mistake, it should be .gitconfig
Is it .gitconfig or .gitinfo? In some sections you switch between the two.
@Richard - But... it's great to discuss. I know I have to really work at reminding myself to look and see if I can utilize Span<T> or Memory<T> to optimize memory usage in a lot of places. In fact, in this very library I should probably go through all of my string functions - I'm sure there are lots of opportunities because that code mostly dates back to .NET 2.0 🙂
@Richard - I am aware, but I try to avoid pulling in any extra dependencies as this code lives in a (trying to be) small utility library. It's not an issue for Core, but for NETFX I try to avoid extra packages where possible to keep the footprint down.
That said, all of this is premature optimization: that code a) isn't ever going to be called in any critical path, b) isn't using any significant amount of memory, and c) allocates on the stack anyway. So it's hardly necessary to optimize. If I recall correctly, later compiler versions targeting net80 may actually elevate that code to a span automatically (I seem to remember Stephen Toub mentioning that recently, but can't recall whether that only applied to the new collection initializers or also to plain typed arrays).
Unfortunately the library compiles to netstandard2.0 and net472 and so I can't use that.
If you can take a dependency on the System.Memory NuGet package, you can still do that. You'll just need to manually change the <LangVersion> in the project file. 🙂
<PropertyGroup>
  <LangVersion>12.0</LangVersion>
</PropertyGroup>
NB: Some language features will just work; some will require polyfills; and some require runtime support, and won't work at all. I tend to use PolySharp for the polyfills.
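With the System.Memory package referenced, older targets can then use patterns like this (purely illustrative, not from the library in question):

// slice the first segment of a delimited string without allocating a substring
static ReadOnlySpan<char> FirstSegment(ReadOnlySpan<char> input, char delimiter)
{
    int idx = input.IndexOf(delimiter);
    return idx < 0 ? input : input.Slice(0, idx);
}

// usage:
ReadOnlySpan<char> segment = FirstSegment("foo;bar;baz".AsSpan(), ';');
Console.WriteLine(segment.ToString());   // "foo"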
@maruthi - No, as that's a security concern. You'd have to manually capture that information and pass it in the extra variables that are sent up with the files.
Is it possible to get the file's path when uploading files? I need to show the file path of where it is located within the internal shared network.