I did not, at any time, imply that you did it intentionally. But if you present inaccurate information, you may mislead people unknowingly. You yourself were probably misled by others who gave you inaccurate information, and are now propagating the wrong facts.
So let me ask you: if CMOS is that great, why are professional HD video cameras still using CCD to this day? Why are medium format cameras still using CCD? This is professional-level equipment, the space where the best technology gets used first. And price is no obstacle, since users in this space will pay for top-quality gear because it serves a commercial purpose. Ah... did you notice that point? Price is the key.
Please read up on CMOS and CCD and understand their technology. Each technology has its pros and cons. CCD architecture has less potential for noise: image data is read out row by row through a shared output stage, which is also why CCDs exhibit less pattern noise. A CMOS sensor is read at the pixel level, and due to the nature of semiconductors you can get uneven sensitivity across different pixels, so CMOS is more susceptible to fixed-pattern noise. CMOS also has more potential for noise because each pixel carries its own readout electronics layered onto the chip (like a semiconductor). CCDs are much more expensive to make; CMOS is much cheaper. CCDs are actually more sensitive to light and CMOS less so (one reason the D80's base ISO is 100 while the D90's is 200). CCD consumes more power; CMOS consumes less.
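To make the fixed-pattern-noise point concrete, here is a toy simulation (my own illustration, with made-up numbers, not data from any real sensor): each "CMOS" pixel gets its own slightly mismatched amplifier gain, while the "CCD" pixels share one output amplifier. Averaging many frames washes out random noise but leaves the per-pixel gain pattern behind.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = np.full((8, 8), 100.0)   # uniform gray patch on an 8x8 "sensor"
n_frames = 100

# CMOS-style: every pixel has its own amplifier, so gain varies slightly
# per pixel (assumed 2% spread) -> a fixed spatial pattern.
pixel_gain = rng.normal(1.0, 0.02, size=scene.shape)
cmos_avg = np.mean(
    [scene * pixel_gain + rng.normal(0, 1, scene.shape)
     for _ in range(n_frames)], axis=0)

# CCD-style: charge is shifted out through one shared output amplifier,
# so any gain error is common to all pixels and leaves no spatial pattern.
global_gain = rng.normal(1.0, 0.02)
ccd_avg = np.mean(
    [scene * global_gain + rng.normal(0, 1, scene.shape)
     for _ in range(n_frames)], axis=0)

def pattern_noise(frame):
    """Spatial std after removing the frame mean: what remains once
    temporal (random) noise has been averaged away."""
    return float(np.std(frame - frame.mean()))

print("CMOS pattern noise:", pattern_noise(cmos_avg))
print("CCD  pattern noise:", pattern_noise(ccd_avg))
```

On this toy model the CMOS pattern noise comes out roughly an order of magnitude larger than the CCD's, because the per-pixel gain spread survives frame averaging while the shared-amplifier error does not. Real sensors complicate this with on-chip calibration, which the sketch ignores.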
In the DSLR space, where price has become one of the competing parameters, CMOS has come further because more development has been done on it, precisely because it is cheap. Newer generations of CMOS exhibit less noise than previous generations of CCD thanks to supplementary circuits designed into the CMOS chip that do noise reduction at the pixel level. And since those circuits are just layered onto the chip itself (like any other semiconductor), it stays cheap.
So there is no clear winner. Each technology has its own inherent pros and cons; which one gets selected for each application (product line) depends on many different factors, and in the DSLR case cost seems to be one of the major driving forces. And don't forget Moore's law: semiconductor technology improves very quickly once it is under active development, with transistor counts roughly doubling every two years. You don't see super-low-noise CCDs in DSLRs simply because they are not used there, to save cost. Remember, new is almost always better than old; that is the nature of electronics. So you really cannot compare the D3000 to the D5000. The D3000 uses the same sensor that first debuted in the D200. That sensor was later used in the D80, then the D40x, then the D60, then the D3000. The D5000's sensor first appeared, in a slightly different form, in the D300. It was then modified (for video, I believe) and ended up in the D90, the Sony A500, then the D5000 and Pentax K-x, and now rests in the D300s. You are talking about sensors from two different generations.
Please understand the technology first. Otherwise, what you are saying amounts to claiming that cheese is better than butter.