-4

Why don't programmers write applications in microcode?

Is it because microcode is hard to understand and write, or because most processors aren't microcode-compatible?

I want to know the reason.

awan
  • 1
  • 2
    Because then they would be microprogrammers writing microprograms. Seriously, there's not a lot of space for microcode to do program-scale tasks; the idea is that it consists of little subroutines which accomplish complex instructions. If you like the idea of everything being a bit closer to the metal, look into classic RISC. – Chris Stratton May 06 '16 at 02:50
  • 1
    Why don't websites send uncompressed video across the Internet? Uncompressed video is easier to process and gives more control over the display... but storing it and moving it around is very inefficient. – Ben Voigt May 06 '16 at 03:15
  • Furthermore, allowing runtime patching of processor behavior needs a path to load micro-op decoder tables from RAM into the processor lookup tables. It doesn't require any path from RAM into the micro-op execution pipeline (and there likely isn't an efficient one). – Ben Voigt May 06 '16 at 03:20
  • This is where RISC came from: somewhere between a microcoded CISC and what the masses of programmers could handle. Or at least a reason for its popularity. – old_timer May 06 '16 at 10:35

3 Answers

2

The machine-code instruction set and the details of what each instruction does are part of the processor's specification. Usually, later processors in a given family support the same instructions as previous models unless there's a good reason not to. In other words, the instruction set is a stable target for programmers to code against. Microcode, on the other hand, isn't ordinarily available to the programmer, or specified at all. It's an implementation detail, and the manufacturer is free to change it radically between different chip designs if it suits their purposes. If microcode were part of the ISA, that freedom would be lost, and we would probably end up needing micro-microcode (nanocode?) to make up for it.
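To make that concrete, here's a toy Python sketch (the instruction and micro-op names below are made up for illustration, not taken from any real processor): the same ISA instruction can expand to different micro-op sequences on different chip generations, which is exactly the freedom the manufacturer wants to keep.

```python
# Hypothetical microcode tables for two generations of a chip family.
# The ISA-level instruction "ADD_MEM_REG" is the stable contract; how
# it decomposes into micro-ops is a private implementation detail.

MICROCODE_GEN1 = {
    # first design: add-from-memory done as three simple steps
    "ADD_MEM_REG": ["LOAD_TMP", "ALU_ADD", "STORE_TMP"],
}

MICROCODE_GEN2 = {
    # later design: the load is fused with the ALU step
    "ADD_MEM_REG": ["LOAD_ADD_FUSED", "STORE_TMP"],
}

def decode(instruction, microcode_table):
    """Expand one ISA instruction into that design's micro-op sequence."""
    return microcode_table[instruction]

# Same program, same ISA instruction, different internal sequences:
print(decode("ADD_MEM_REG", MICROCODE_GEN1))
print(decode("ADD_MEM_REG", MICROCODE_GEN2))
```

A program written against "ADD_MEM_REG" runs on both designs; a program written against "LOAD_TMP" would break the moment the vendor fused the load, which is why microcode is never exposed.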

hobbs
  • 7,397
  • 1
  • 20
  • 31
  • +1 this one makes the most sense for a primarily opinion-based question. Programmers for some reason find assembly hard; microcode is harder, and you would absolutely have to keep drawing a line somewhere. The ISA is the lowest-level published and supported instruction set for the processor. – old_timer May 06 '16 at 10:34
0

The reason is that almost all microcode exists as firmware inside CPUs and MPUs. Usually instructions are fed to the CPU as machine code, which was translated at build time from C or C++ (assembly being the human-readable form of that machine code). Microcode is mostly a concern for the manufacturers of CPUs, MPUs, and some FPGAs.

Microcode is the decoded form of the basic instructions fed to the CPU; a microinstruction word is often 40 bits wide or more. Individual bits select the source and destination of data inside the CPU, while other bits control the ALU, the prefetch subsystem, and L1 (and perhaps L2) cache logic.
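As a rough sketch of what such a wide control word can look like, here is a hypothetical 40-bit horizontal microinstruction layout in Python. The field names and widths are invented for illustration and do not come from any real datasheet; the point is that each field directly drives one piece of the datapath.

```python
# Hypothetical 40-bit horizontal microinstruction (illustrative field
# names/widths, not from any real CPU): each field directly controls
# one part of the datapath.

FIELDS = [                # (name, width in bits), most significant first
    ("src_select",  4),   # which register/bus drives the data path
    ("dst_select",  4),   # which latch captures the result
    ("alu_op",      5),   # add, sub, and, or, shift, ...
    ("mem_ctrl",    3),   # read/write/prefetch strobes
    ("next_addr",  12),   # address of the next microinstruction
    ("misc",       12),   # flags, cache control lines, etc.
]

def pack(values):
    """Pack named field values into one wide microinstruction word."""
    word = 0
    for name, width in FIELDS:
        assert values[name] < (1 << width), f"{name} out of range"
        word = (word << width) | values[name]
    return word

uop = pack({"src_select": 2, "dst_select": 7, "alu_op": 1,
            "mem_ctrl": 4, "next_addr": 0x0A3, "misc": 0})
print(f"{uop:010x}")  # prints 270c0a3000, one 40-bit control word
```

Note there is no real "decoding" step left at this level: the bits *are* the control signals, which is why microinstructions are so wide compared with the compact ISA encodings programmers see.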

In MPUs and FPGAs there are many subsystems to control. Today's microcode is handled by existing firmware that creates and pre-programs the CPUs before wafer testing. The more that machines control and implement this microcode process, the higher the yield of good chips off the wafer.

Humans may still need to get involved at the abstract layers when creating new CPUs, which need new firmware and microcode, but each year more of the work is done by automated tooling, removing as much chance of human error as possible.

Does anyone remember the Pentium 90 and its problems with FP math?

  • I have rewritten the entire answer to refer to micro-code, not assembler –  May 06 '16 at 03:33
  • The floating-point math problem was not a microcode thing. Actually, microcode is the kind of thing that might have fixed it, had they predicted a mistake like that. Floating-point bugs still exist; that was just the famous one. – old_timer May 06 '16 at 10:29
  • I wouldn't generically say CPUs/MPUs, as microcoding is the exception, not the rule. x86 is well known for it, of course, but for every x86 you have many/dozens (in the same box as well as outside) of other processors, many if not all of which are not microcoded. – old_timer May 06 '16 at 10:31
0

The general approach in modern software development is to abstract away the previous level of complexity. A programmer writing a television application, such as a paid streaming service, may find it advantageous to abstract away internet communications and concentrate on video playback. Likewise, a compiler architect who is very familiar with a processor's machine code may prefer not to involve him- or herself in the microcode that manages the processor's gates and logic.

st2000
  • 3,334
  • 10
  • 12