August 2008 - Posts

Thumbs Down!

I don't normally do product reviews on my blog, but recently I have come across some really poor electronics and have had such a bad experience that I want to warn any potential buyer -- these products are beyond poor; they are unacceptable!  So I have decided that, at least for the time being, I will share my experiences online. 

There are all kinds of new electronics out there to buy: flat screen TVs, iPods, iPhones, HD Radios, Blu-ray players, etc.  However, I personally feel that some companies' business model is to produce junk and sell it as these new-style electronics.  In the case of both of the products below, it is obvious that they were rushed to market in a way that makes them completely worthless!  This is reminiscent of the late '90s, when computer manufacturers were pushing out components before they were ready, resulting in very unstable PCs!

Another thing to note is that I am a little more savvy than the average consumer.  Not to tout my credentials as a "product test guy" too much, but I used to work for a service company called Service West, where I fixed electronics.  When you see the guts of some electronics you really see the difference between brands!  You could (at least then) really tell the difference between a Sony and a JVC.  The Sony was elegant and beautiful inside, and I rarely saw one come in for repair!  On some of the JVCs I worked on, I had to de-solder wires just to get the unit open to perform a simple mechanical adjustment, and they were always in the shop because the user had tried to do two things at once (like change disks and press play in rapid succession).  In short, you get what you pay for.  Unlike computers, where you buy a Dell for the name, or cereal, where you pay a lot more for a little improvement in quality, electronics are a little different.  If you can make a cheaper brand of electronics work for you, that's great, but it is not going to be as good as a better brand.

Westinghouse 32H570D Flat Panel TV

When we first purchased this TV we loved it!  The best feature was the DVD player: it is built in such a way that you can pop a DVD into the front of the TV (front loading) and it will turn the TV on and start playing the DVD.  How they were able to do a front-loading DVD slot and keep the TV so low-profile I'll never know.  I bought this from Target back in April 2008 and returned it exactly 90 days later.  I was hoping to find something else to replace it with, but we're going to be doing a lot more homework before trying that again. 

Westinghouse TV

The Pros:

  • Front loading DVD slot!
  • The price
  • Very low profile
  • Great picture
  • Easy to play DVDs


The Cons:

  • The TV would hang every once in a while.  Not an innocent "the picture is frozen" hang -- it completely LOCKED UP!  You would have to get up, unplug it, and plug it back in!  You couldn't even power it off with the remote!  Yes, there was adequate ventilation, and the TV did not appear to be hot.
  • The audio inputs did not work!  I spent literally DAYS trying to get audio from my computer (where we would watch online content) to the TV!  Listening through the laptop speakers was not desirable! 
  • The HD tuner was clumsily laid out.  It could literally take a full 20 seconds just to flip past channel 7!
  • The remote was nothing special!  It had a poor layout as well and did not have buttons for things you would do often.
  • The menu did not offer much in the way of customization and function.  This may be part of the "easy to use" design, but I was unable to select an audio source.


I would not recommend this unit to a friend!  It's possible they will fix the "bugs" and have a good product in the future, but other reviews say their technical support is very poor, and sending the TV in for repair is very expensive.  I cannot recommend this brand to a friend either.

Jensen HD5112

This radio supposedly had it all!  Literally!  HD Radio, MP3/WMA playback on CD/CD-R/CD-RW, SD memory, USB, iPod link, aux input, pre-outs, satellite ready, EVERYTHING!  The problem?  None of it worked! (more below)  I really wanted this product to work for me -- especially since these puppies aren't easy to install!  Plus it was pretty inexpensive, and I knew that anything else wouldn't have as many features.  I thought that if I could just find one brand of SD card, USB drive, or CD that the unit would read, then I would be fine.  I couldn't find any combination that worked!  I purchased this from WalMart in August 2008 and returned it days later.


The Pros:

  • Very full-featured (on the surface)
  • Price is right!
  • I really liked the HD radio.  It seems like the satellite radio companies are fighting very hard to limit the number of units with this feature, so it was nice to find one.  HD Radio is pretty cool and was one of the main reasons for buying this unit.
  • I didn't get to try the iPod link (as I do not yet have an iPod and would probably opt for a Zune)
  • Once I got the correct dash kit (don't say that you support *any* Pontiac when you don't, stupid first dash kit!), installation was pretty easy.  The wires were ISO compliant, so I just had to match colors with my GM/Pontiac pigtail.
  • The unit seemed to have pretty good power.  It got much louder than my stock radio.

The Cons:

  • I could not install it without removing the "warranty void if removed" sticker!  It was right where the sleeve slides over the unit.  Slide it over that spot more than twice (which is usually required) and that sticker is gone, man!
  • The installation instructions couldn't be accessed.  They are usually provided online through a 3rd party: you enter your serial number and it lets you download installation instructions for your vehicle.  I suspect I was the Nth person to buy this particular unit, and so it wouldn't let me download the instructions!
  • I tried 6 different SD cards in this unit and could not get any of them to read.  I tried formatting them every way possible and even tried a number of different MP3 encodings, just in case it was not capable of playing VBR.  I tried renaming the files to have as few characters as possible.  I never got this feature working!
  • I tried 15 different CD brands!  Nothing would play.  Though in its defense (if you can call it that), this unit must have been defective, because CD playback didn't work even with a regular music CD (CDDA) that came straight from the store.  That's right, I couldn't play my Blink-182 album!
  • I tried all 6 thumb drives I had and it wouldn't read any of them, so I bought a new one, and it wouldn't read that either!  Reading other reviews, these last three problems are common.  One guy could only get one thing to work: an SD card reader in the USB slot.  I believe he described it as "an abomination sticking out of my dash!" and said that a sudden stop would snap his card reader and radio like a twig!
  • The clock looked weird and it would never display the information I wanted. 
  • The aux port worked but only if the faceplate was not open (expected but still annoying)
  • The faceplate didn't detach or attach without a fight! 
  • The snaps that hold the radio to the sleeve weren't great.  After I installed it I couldn't get the right side to snap into place.
  • The product manual was very poor!  The website had few answers.

I would not recommend this unit to a friend.  Based on the sheer number of poorly implemented features, I could not buy this brand of electronics again.

I have bought the XOVision DVD player for my car to replace this CD player.  It should get here in the next week, so look for a review of that.


Managed vs Unmanaged

I am asked all the time about the performance of managed vs. unmanaged code and "how much slower is it?"  This is one of the questions I am going to attempt to answer through experimentation.  In this post I'll talk about some of the theory and make some predictions (I haven't written any code yet), and we'll see how closely the theory matches the experiments. 

Managed Code Defined

Managed code is a somewhat misunderstood concept.  Managed code is simply code that targets the CLR runtime.  The CLR runtime is a bit more complex to explain, and people often see the term "virtual machine" and misunderstand the meaning of that statement.  The flow of execution for a managed application is a three-step process.  First, the code is compiled to MSIL (Microsoft Intermediate Language).  This could be any managed code: C#, VB.NET, etc.  All managed code compiles initially to MSIL.  MSIL is a low-level language, like assembly.  It is often described as "object-oriented assembly", and rather than using registers you act upon memory in a highly optimized stack called the evaluation stack.  The beauty of IL is that it is low-level enough to allow a very fast JIT (Just-In-Time) compilation to native code, while per-CPU optimizations can still be applied in this final compilation step.  CPU optimizations like SSE2, SSE3, etc. that can greatly speed up code execution can be "given" to the users of managed code for free.  This could also include other optimizations, such as GPU optimizations. 

When you run a piece of managed code, it will normally take the first bit of execution to convert the MSIL to native code specifically optimized for your platform.  This is known as JIT compilation; it usually happens very quickly, and most people don't even realize it is happening every time they launch their application.  Of course, if your code is large, complex, and has many dependencies, you may be able to shave valuable milliseconds, or in some cases even seconds, by using Ngen on your code.  Ngen is a tool that ships with the .NET Framework that does the same compilation from IL to native that the JIT compiler does, but also keeps a copy of the native image that is generated.  This way a managed application is loaded much the same way a native application is loaded -- without JITing. 


Figure 1 :: Managed Code Execution Lifecycle

So when you hear that .NET code runs in a virtual machine, what that refers to is that you do not have to program to the specifics of a CPU; in that way the code is virtualized.  There are also services provided to you, such as memory management, thread management, exception handling, garbage collection, and security.  However, in my opinion the term virtual machine is a pretty poor fit.  These services all run side-by-side under the same process and even the same AppDomain.  Therefore these services can be better thought of as a standard code template that gets compiled into every application rather than anything that is virtualized. 

On the Left - Managed Code

Managed code has some advantages!  Not the least of which is productivity.  Let's review some areas where managed code actually has an advantage. 

CPU Optimizations

As mentioned earlier, managed code is JIT compiled to target the specific platform on which you are running.  CPU optimizations can make a big difference in performance!  While native code can also take advantage of CPU optimizations, you have to make a trade-off: either ship your code without such optimizations enabled, for fear that someone whose CPU lacks them will try to run your code, or build code and logic around working with or without each optimization you plan to target. 

Little is known, however, about the optimizations performed during JIT compilation.  Also, most modern CPUs are likely to share a base set of the most common CPU optimizations, so this argument gets somewhat weaker if there are few differences between chips. 

Managed Memory

Managed code has an awesome memory manager!  Suppose we have an array of integers like int[].  This is a data structure of contiguous integers.  One of the advantages of such a structure is that it is very quick to access.  You can use syntax like MyInts[5] for access that is nearly as quick as referencing a local variable.  Internally, pointer arithmetic is taking place: MyInts[index] resolves to *(base address of MyInts + sizeof(int) * index), which is Θ(1) and makes this the fastest method of accessing dynamically allocated memory.  The down-side to this structure is that adding elements is extremely expensive!  Unlike dynamically-linked structures, adding an element to this array usually requires allocating a completely new, bigger array and copying every element from the old array to the new one.  This has complexity Θ(n), which is extremely expensive for a single add operation! 

This is where managed code has another advantage!  Because there are no raw pointers, only system-managed references, it is possible for the memory manager to simply expand the array construct, and anything in the way can be safely moved without harm to the execution of the application.  The worst case for this operation is now Θ(1); that is, there is a constant cost for the operation regardless of the size of the current structure.  So even in a worst case scenario we have the best possible performance.  This is good news for immutable data structures! 

Another memory-related advantage managed code has over native code is that it takes advantage of the memory vs. time tradeoff.  Basically, all .NET applications consume more OS memory than they strictly need.  There is a complex memory algorithm at the heart of this that tries to minimize calls to the OS for more memory and ensures that such calls gain larger blocks of memory.  Calls to the OS for more memory are very expensive, whereas calls to the memory manager in managed code are extremely fast.  Therefore, in theory, we should be able to dynamically create objects substantially faster in a managed language than in a native one. 

Some of the other advantages of managed memory are the virtual non-existence of memory leaks and automatic memory defragmentation.  And while the last item on this list is somewhat controversial, I list it here as an advantage: the garbage collector.  The garbage collector is the nebulous cloud that hangs over most C/C++ developers looking to write managed code.  They have been taught to manage memory themselves and do not like the idea of giving that control up to a complicated set of algorithms.  They all ask the same thing -- what happens if garbage collection is triggered at an inopportune time?  The answer to that question is a little complicated, but basically garbage collection is very fast -- usually pausing execution for less than a few milliseconds.  Also, because the GC operates on many objects at once, it can be more efficient and less error-prone than self-styled memory management.  Basically, the GC is not going to cause you any grief unless you do something careless like leave your streams open (the only thing I can think of in managed code that will cause a "leak"), and it will mostly be a great burden off of your shoulders!


Security

One of the major reasons for the push for managed code was security!  Managed code can make certain guarantees about its vulnerability to attack.  Even if an exploit is found (and very few, if any, have been), the CLR can usually be quickly patched to provide protection to ALL .NET assemblies.  I believe this is one of the reasons Microsoft prefers you to allow the JIT to happen for each execution.  Part of these guarantees comes from type safety and bounds checking, which close off very common vectors for overflow attacks.

If you choose to sign your assemblies, you are no longer vulnerable to "dll replacement" attacks, as the CLR will verify the signature of the called assembly.  Additionally, with the GAC (Global Assembly Cache), different versions of the same assembly (dll) can live harmoniously side-by-side, thus eliminating the infamous "dll hell".

On the Right - Unmanaged Code

Native code enjoys the legacy and reputation of performance.  This style of programming is considered "metal to metal", indicating you have complete control over the hardware of the system.  These applications are lean and efficient and have no "magic" services, which is going to make this section very lean as well!  They use, in general, very little memory compared to their managed counterparts and have the ability to use memory in ways that managed code doesn't like or won't allow. 

Native code allows "unsafe" type casting, which can result in better performance, especially in cases where .NET would employ boxing/unboxing techniques, which are very costly.  Because nothing happens without you making it happen, you should end up with faster code.  In smaller applications, managed memory management could be overkill.  You also have much more control over the lifetime of an object.  Rather than waiting for a GC to collect the memory at some unknown time, you explicitly delete objects and the memory is reclaimed immediately.

There isn't much to say about native code, and I think that's why people are more comfortable with its performance. 


The Matchup

I hope to be able to write the following tests in C# for managed code and C++ and/or Delphi for unmanaged code.  I will try to post the code for each "round" and am very open to criticism on fairness.  In some respects this is a little bit like comparing apples to oranges but that doesn't mean they can't compete!


  1. Round 1 : Theoretical (this round)
  2. Round 2 : Computational (bit manipulation, looping, adding, subtracting, searching, sorting, etc.)
  3. Round 3 : Dynamic Memory (object creation, array resizing, memory allocation, memory de-allocation, etc.)
  4. Round 4 : Windows Forms & Messages (dynamic creation of windows, buttons, etc.)
  5. Round 5 : IO (File System Access, Network Streams, etc.)

The Prediction

Based on the theory, I would say that after a managed assembly has been loaded it should execute faster than its native counterpart.  I base this idea mostly on the memory management provided to managed code, though I would be a little surprised if managed code were to win a head-to-head challenge.  I believe the end result will be that native code performs faster than managed code, but by a statistically insignificant amount. 

I officially call this round: Winner - Managed Code (by decision).

Round By Round Predictions:

  1. Round 1 : Won by Decision: Managed Code
  2. Round 2 : Win by Native Code
  3. Round 3 : Win by Managed Code
  4. Round 4 : Tie (1 point each)
  5. Round 5 : Win by Native Code



Pat is an excellent presenter and a friend.  I am also excited about SQL Server and pizza, so this meeting was a winning combination for me!  It was also nice to see some familiar faces, and we had a small but decent turnout. 


The meeting started late (as expected with almost anything I attend nowadays) but was forgivable due to unexpected traffic situations.  I personally arrived five minutes late and was still there in enough time to help set up.  The pizza came pretty late in the meeting, which was a little annoying because I usually eat well before 7:30pm.  Also, the drinks were all caffeinated (which is usually fine for programmers), but I like the non-caffeinated sort.  And I liked how in the past those rooms were set up with tables and chairs rather than just chairs.  I usually like to take notes on my laptop but find the literal term "laptop" rather unworkable. 

Microsoft SQL Server 2008

The first question, asked almost universally, is "when is it going to come out?"  Although none of us had any insight into Microsoft, it was generally agreed that we had heard 4th quarter 2008 -- perhaps even in time for Microsoft's PDC event.  Also, RC0 is currently available for download.

New features in SQL 2008 include:

  • New data types for Date and Time with enhanced precision
  • Compressed Backups (YAY)
  • New MERGE keyword for simultaneous Insert Update & Delete
  • New table-valued parameters allow you to pass arrays of rows into procs, reducing round-trips
  • New Code Insight for the SSMS studio
  • New Transparent Data Encryption provides more control over encryption in the database
  • New Change Data Capture (CDC) functionality for audit logging
  • New Data compression for compressed tables
  • New Policy-based management to enforce the stupid naming standards people sometimes put into a database (ughh!)
  • New Hierarchy ID field for Parent / Child / Grandchild, etc. Data
  • New FileStream Data (remains of WinFS) allows large amounts of binary data to be stored with (but not really in) the database
  • New large user-defined types overcome the 8KB limit for UDTs
  • New Spatial Data Types to store lat/long style data
  • New Grouping Sets allow multiple GROUP BY statements
  • Improvements to SQL Server Reporting Services


We did not talk about every point in that list, spending most of our time split between the new data types and their capabilities and the new MERGE statement.  We also had a fun discussion about all of the "unfortunate" database architectures we have encountered.


It was a really fun meeting.  It was a small crowd with an open atmosphere where we felt like we could ask questions.  Pat is more than knowledgeable about such questions and did a great job answering them.  Of course, some of his answers left us scratching our heads, as we can't follow a real DBA that deeply! 




Content Download: 2008DevPresentToNorthernNET.rar

Pat Wright Blog: