Why don't programmers write applications in microcode?
Is it because microcode is hard to understand and write, or because most processors aren't microcode-compatible?
I want to know the reason.
 
    
The machine-code instruction set and the details of what each instruction does are part of the processor specification. Usually, later processors in a given family support the same instructions as earlier models, unless there's a good reason not to. In other words, the instruction set is a stable target for programmers to code against. Microcode, on the other hand, isn't ordinarily available to the programmer, or specified at all. It's an implementation detail, and the manufacturer has the freedom to change it radically between chip designs if it suits their purposes. If microcode were part of the ISA, that freedom would be lost, and we would probably end up needing micro-microcode (nanocode?) to make up for it.
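To make that contract concrete, here is a toy sketch in C (every name and table below is invented for illustration; no real CPU works exactly like this). The published instruction set is the fixed part; the internal expansion into micro-operations is the part the manufacturer is free to rearrange between generations:

    #include <stdio.h>

    /* Hypothetical, simplified ISA: the stable, documented contract that
     * compilers and programmers target. */
    enum isa_op { ISA_ADD, ISA_LOAD, ISA_STORE };

    /* Hypothetical internal micro-operations: undocumented and unstable. */
    enum micro_op { UOP_READ_REG, UOP_ALU_ADD, UOP_WRITE_REG,
                    UOP_MEM_READ, UOP_MEM_WRITE, UOP_END };

    static const char *uop_name[] = { "READ_REG", "ALU_ADD", "WRITE_REG",
                                      "MEM_READ", "MEM_WRITE" };

    /* "Generation A" might expand ISA_ADD into three separate micro-ops... */
    static const enum micro_op gen_a_add[] =
        { UOP_READ_REG, UOP_ALU_ADD, UOP_WRITE_REG, UOP_END };

    /* ...while "Generation B" fuses the register read into the ALU step.
     * Programs never notice: both chips execute the same documented ISA_ADD. */
    static const enum micro_op gen_b_add[] =
        { UOP_ALU_ADD, UOP_WRITE_REG, UOP_END };

    static void show(const char *label, const enum micro_op *seq)
    {
        printf("%s expands ISA_ADD as:\n", label);
        for (int i = 0; seq[i] != UOP_END; i++)
            printf("  %s\n", uop_name[seq[i]]);
    }

    int main(void)
    {
        show("Generation A", gen_a_add);
        show("Generation B", gen_b_add);
        return 0;
    }

If those expansion tables were published as part of the ISA, "Generation B" could not drop or reorder micro-ops without breaking existing software, which is exactly the freedom the manufacturer wants to keep.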
 
    
The reason is that almost all microcode exists as firmware inside CPUs and MPUs. Instructions are usually fed to the CPU as machine code, translated at build time from C or C++ (or hand-written assembly). Microcode is mostly a concern for the manufacturers of CPUs, MPUs, and some FPGAs.
It is the decoded form of the basic instructions fed to the CPU, often 40 bits wide or more. Individual bits select the source and destination of data inside the CPU; other bits control the ALU and prefetch sub-systems, and the L1 (and possibly L2) caches as well.
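To picture what such a control word might look like, here is a rough sketch in C (the field names and widths are invented for illustration, not taken from any real processor); each group of bits in the 40-bit word directly steers one piece of the datapath, with no further decoding:

    #include <stdio.h>

    /* Hypothetical horizontal microinstruction, 40 bits of control in total. */
    struct micro_word {
        unsigned src_a_select : 5;  /* register/bus feeding ALU input A     */
        unsigned src_b_select : 5;  /* register/bus feeding ALU input B     */
        unsigned dest_select  : 5;  /* where the result is written back     */
        unsigned alu_op       : 4;  /* add, sub, and, or, shift, ...        */
        unsigned mem_read     : 1;  /* assert the memory-read strobe        */
        unsigned mem_write    : 1;  /* assert the memory-write strobe       */
        unsigned prefetch_en  : 1;  /* let the prefetch unit run ahead      */
        unsigned l1_line_fill : 1;  /* trigger an L1 cache line fill        */
        unsigned branch_ctrl  : 3;  /* microinstruction sequencing mode     */
        unsigned next_addr    : 14; /* address of the next microinstruction */
    };

    int main(void)
    {
        /* One step of a hypothetical ADD: route two registers into the ALU,
         * select the add operation, and write the result back. */
        struct micro_word add_step = {
            .src_a_select = 1, .src_b_select = 2, .dest_select = 1,
            .alu_op = 0, .branch_ctrl = 0, .next_addr = 0x2A
        };
        printf("ALU op %u, dest %u, next micro-address 0x%X\n",
               (unsigned)add_step.alu_op, (unsigned)add_step.dest_select,
               (unsigned)add_step.next_addr);
        return 0;
    }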
In MPUs and FPGAs there are many sub-systems to control. Today's microcode is handled by existing firmware that creates and pre-programs the CPUs before wafer testing; the more of this microcode process that machines control and implement, the higher the yield of good chips from the wafer.
Humans may still need to get involved at the abstract layers when creating new CPUs that need new firmware and microcode, but each year more of the work is done by robotic assemblers, removing as much chance of human error as possible.
Does anyone remember the Pentium 90 and its problems with floating-point math (the FDIV bug)?
The general approach to modern software programming is to abstract away the previous level of complexity. When writing a television application such as a paid streaming service, the programmer may find it advantageous to abstract the internet communications and concentrate on video playback. Likewise, a compiler architect who is very familiar with a processor's machine code may prefer not to involve themselves in the microcode that manages the processor's gates and logic.
