One practical difficulty is that modern logic processes can't tolerate the 12V supply voltage (and typically 18V absolute-maximum rating) of classic 4000-series devices.
Anything breaks down (including silicon dioxide and other insulators) in a sufficiently high electric field (measured in volts/metre), and as you scale features down, you find that the same voltage produces a lot of volts/metre across a micron, let alone across 14 nm.
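As a quick back-of-envelope sketch (the 10 um and 14 nm distances are just illustrative geometry, not data for any particular process), the same 12V gives wildly different fields depending on the distance it appears across:

# Rough field-strength comparison: E = V / d
# Figures are order-of-magnitude illustrations, not real process data.
V = 12.0            # volts, a classic CD4000-style supply

d_old = 10e-6       # ~10 um, a 1970s-era CMOS feature size
d_new = 14e-9       # ~14 nm, a modern node's headline dimension

E_old = V / d_old   # ~1.2e6 V/m  (about 1.2 MV/m)
E_new = V / d_new   # ~8.6e8 V/m  (about 0.86 GV/m)

# The commonly quoted breakdown field for SiO2 is on the order of
# 1e9 V/m (10 MV/cm), so 12V across anything 14 nm-ish is already
# knocking on the door of breakdown.
print(f"E across 10 um: {E_old:.2e} V/m")
print(f"E across 14 nm: {E_new:.2e} V/m")

The exact numbers aren't the point; the roughly three-orders-of-magnitude jump in field strength for the same voltage is.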
It seems like only yesterday to an old codger like me that 3.3V, and later 2.5V and 1.8V, took over from 5V as the supply rails for general-purpose logic, because the I/O transistors could no longer withstand voltages as high as 5V ... let alone 12V.
And of course the core of a CPU or high-end FPGA now runs on a fraction of a volt, with only the especially large and beefy transistors around the I/O pads capable of tolerating 2.5V or so. Newer FPGA families stopped supporting 5V I/O about ten years ago.
I believe that the CD4000 series was designed around a 10 um or 15 um gate width, easily achieved with what was basically optical microscope technology used in reverse...