Managed Code Speed vs. Unmanaged - and Rudolph

Richard Grimes has a recent piece comparing the performance of managed and unmanaged code.

He points out specifically: "The problem is that when most people think of .NET they think of other frameworks with a runtime, like Java or Visual Basic; or they may even think about interpreters. They do not think about applications, or what they do; they do not think about limiting factors like network or disk access; in short, they do not think. .NET is not like those frameworks. It has been well thought out and Microsoft has put a lot of effort into making it work well. In this article I will present some code that performs some computationally intensive operation and I will compile it as both managed C++ and unmanaged C++. Then I will measure how each of these libraries perform. As you will see, .NET is not automatically much slower than unmanaged code; indeed, in some cases it is much faster."

Grimes presents FFT (Fast Fourier Transform) sample code in unmanaged C++, managed C++, C++/CLI, and managed C#, along with the respective high-resolution timings. (I found it amusing looking at this because I wrote custom FFT code in Turbo Basic for my Ph.D. dissertation back in 1990!) Grimes provides all the sample code in downloadable form.

The bottom line: the results showed little difference between C# and managed C++ in terms of performance. Indeed, the optimized C# library was actually slightly faster than the optimized managed C++ libraries.

You have to be very careful about the uninformed who are in a position of authority. They even populate the newsgroups. Who knows? One of them might even be your boss....


Only Rudolph the Red Nosed Reindeer really knows everything, and, wisely, he has nothing to say.

Comments

  1. Hey Mr. Peter, I am a .NET newbie looking forward to delving into and understanding this platform. I am currently learning C# and I have just started my blog. I want to learn more about the various .NET technologies like SQL Server, ADO.NET, ASP.NET, XML...
    Any guidance would be very appreciated.

    Joseph.

  2. Sure, Joseph!
    Lay out a plan for what you want to learn. Stick to the plan. Take it a slice at a time. Ask questions on the appropriate newsgroups, and places like the forums at eggheadcafe.com. Expect to spend long hours and sleepless nights learning to solve programming problems. Have a healthy mistrust of authority.

  3. Thx for the tips.

    My target is to become a good .NET web developer, and I have already a plan for that!

C# --> SQL Server --> XML and ADO.NET --> ASP.NET.

    I have been learning C# for 2 months, and I have the best books covering this language, from C# Primer Plus to Beginning C#: From Concept to Code and Wrox Pro C#...
    How much time do you think I need for the other technologies, one month for each?
    Because I need to find work soon, so I can do an MS in computer science...

  4. Rick,
    Agreed. As you point out, one must consider much more vis-a-vis the word "performance" than just numerical processing power. I thought Grimes' piece was a well-constructed first step, though, since it addresses the most common misconceptions in the Code-O-Sphere.

    BTW, one of my favorite "Performant" apps, back in 1997, was something I put together with another dev in FoxPro 2.6... We called it "The Screamer"!

  5. Anonymous, 6:24 PM

    Hmm.
    Then why do our heavily loaded Delphi forms paint like a charm while our .NET 2.0 stub forms flicker like hell? (There are no handlers on them; they were used for UI prototyping because the VS 2005 designer seemed cool to our UI guy.)

    ;-)

  6. Anonymous, 3:04 PM

    Read this wonderful article; it's a must-read for every .NET lover.

    Several of my doubts were cleared after reading it... hope it helps.

  7. Anonymous, 11:17 AM

    This is a bit of an unfortunate example.

    While I fully agree with the conclusion of the article - that managed code is as fast as unmanaged code - using this FFT code does not give a correct picture. We should distinguish between safe managed code and unsafe managed code. I have several examples from linear algebra that, when implemented as unsafe managed code, perform approximately as fast as unmanaged code after some careful optimization, using C++/CLI and laying out things so that the JIT can do its job well.

    However, I cannot say the same for safe code. With safe code you always have these checks for the array bounds that hurt performance. You can only get rid of them by using unsafe code (pointers). There is a limited set of situations where the JIT compiler is able to move the bounds checks out of the loop, but I found they are rare. It seems that this FFT code is one of them. With my code base the safe variants are usually much slower, sometimes up to 16x.

    Also, in contrast to this example, the C# compiler usually does a poor job. The C++/CLI compiler optimizes better and produces IL code that runs faster. So the conclusion should correctly be: UNSAFE managed code can be as fast as unmanaged code, with some effort.

    This example is not optimal for proving the point for another reason:
    when striving for the ultimate performance in numerical algorithms, unmanaged code can make use of special processor instructions (SIMD) that result in a speed gain of a factor of 2 or more. This is something I cannot do in managed code.

  8. Anonymous, 5:54 PM

    Hi everyone. There is another aspect of performance that I think was overlooked so far. In our company we are currently debating whether or not to move to managed code and clearly the issue of performance has come up.

    In our case, the application is NOT CPU intensive. Instead, we are more "data-intensive". Essentially, we have a data recorder, so a lot of stress goes into network and disk controller I/O. Additionally, committed memory usage is also up there. Typically our application (currently written in C++) can easily use 400-600MB and write out a lot of data to disk (I don't remember the numbers). At the same time, because we support a large number of data sources all recorded in parallel, CPU still reaches 25-40% on a very decent PC.

    Our implementation right now is more or less optimized to do as few heap operations as possible during critical steps. It seems that if we go with C#, then even a 12-character string that used to be allocated on the stack would now result in malloc/free calls (or their managed equivalent). Right now we do as much memory management as possible without writing a custom heap. But with managed code, no one even seems to know when the GC will perform the "free" step.

    Is this something to be concerned about if the entire server is rewritten in C#? Also, long term, we should be able to run for months if not years (ideally) and not leak any resources.

  9. Anonymous, 1:28 PM

    I think everyone is missing the point here. .NET/C# is a tool, and like any other tool, one should take careful consideration to use the right tool for the job. That being said, if you take a step back and think about it, the bottom line is that the CPU executes native (x86) instructions. Period. How well this native code is optimized dictates performance (putting aside other limiting factors such as I/O, etc.).

    C/C++ is well known for its high-quality code generation; in fact, that is what made it so popular. Modern 3GL languages make it easier on the programmer and allow more complex applications to be written (as someone else indicated). But if you only want to consider performance, keep in mind that the further away the abstraction (code) is from the processor's native instruction set, the more likely it will take a hit in performance. This is simply the price one pays for convenience. I'm sure none of us wants to start programming in assembler...

  10. Anonymous, 7:31 AM

    You also forget that in a Windows environment .NET is being heavily worked on, whereas when was the last time the Win32 API was even touched? Bugs present in Windows 95 are still present TODAY!

    Maybe if the unmanaged side were still being pushed like the managed side is, we might see some real results.

    After all, we are trying to compare an old dog that isn't being taught any new tricks to a fresh young pup with all the knowledge under the sun being pumped into its head.

